Apr 21 07:03:03.260913 ip-10-0-137-163 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 07:03:03.725177 ip-10-0-137-163 kubenswrapper[2581]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 07:03:03.725177 ip-10-0-137-163 kubenswrapper[2581]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 07:03:03.725177 ip-10-0-137-163 kubenswrapper[2581]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 07:03:03.725177 ip-10-0-137-163 kubenswrapper[2581]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 07:03:03.725177 ip-10-0-137-163 kubenswrapper[2581]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 07:03:03.727099 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.727004 2581 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 07:03:03.733053 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733031 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 07:03:03.733053 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733051 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 07:03:03.733053 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733055 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 07:03:03.733053 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733058 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 07:03:03.733191 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733061 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 07:03:03.733191 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733064 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 07:03:03.733191 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733067 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 07:03:03.733191 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733070 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 07:03:03.733191 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733073 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 07:03:03.733191 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733076 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 07:03:03.733191 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733079 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 07:03:03.733191 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733081 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 07:03:03.733191 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733084 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 07:03:03.733191 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733086 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 07:03:03.733191 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733089 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 07:03:03.733191 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733092 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 07:03:03.733191 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733094 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 07:03:03.733191 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733098 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 07:03:03.733191 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733102 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 07:03:03.733191 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733105 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 07:03:03.733191 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733108 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 07:03:03.733191 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733121 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 07:03:03.733191 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733124 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 07:03:03.733754 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733126 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 07:03:03.733754 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733129 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 07:03:03.733754 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733132 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 07:03:03.733754 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733135 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 07:03:03.733754 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733137 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 07:03:03.733754 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733140 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 07:03:03.733754 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733142 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 07:03:03.733754 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733145 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 07:03:03.733754 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733148 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 07:03:03.733754 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733150 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 07:03:03.733754 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733153 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 07:03:03.733754 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733155 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 07:03:03.733754 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733158 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 07:03:03.733754 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733160 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 07:03:03.733754 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733164 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 07:03:03.733754 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733167 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 07:03:03.733754 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733169 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 07:03:03.733754 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733172 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 07:03:03.733754 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733174 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 07:03:03.733754 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733177 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 07:03:03.734237 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733179 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 07:03:03.734237 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733183 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 07:03:03.734237 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733185 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 07:03:03.734237 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733188 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 07:03:03.734237 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733190 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 07:03:03.734237 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733193 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 07:03:03.734237 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733196 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 07:03:03.734237 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733199 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 07:03:03.734237 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733202 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 07:03:03.734237 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733204 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 07:03:03.734237 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733207 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 07:03:03.734237 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733210 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 07:03:03.734237 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733212 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 07:03:03.734237 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733215 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 07:03:03.734237 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733218 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 07:03:03.734237 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733220 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 07:03:03.734237 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733223 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 07:03:03.734237 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733225 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 07:03:03.734237 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733228 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 07:03:03.734708 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733230 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 07:03:03.734708 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733232 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 07:03:03.734708 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733235 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 07:03:03.734708 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733238 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 07:03:03.734708 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733240 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 07:03:03.734708 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733242 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 07:03:03.734708 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733246 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 07:03:03.734708 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733249 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 07:03:03.734708 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733252 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 07:03:03.734708 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733255 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 07:03:03.734708 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733258 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 07:03:03.734708 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733260 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 07:03:03.734708 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733264 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 07:03:03.734708 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733266 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 07:03:03.734708 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733269 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 07:03:03.734708 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733271 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 07:03:03.734708 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733275 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 07:03:03.734708 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733279 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 07:03:03.734708 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733283 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 07:03:03.735206 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733287 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 07:03:03.735206 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733290 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 07:03:03.735206 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733293 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 07:03:03.735206 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733296 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 07:03:03.735206 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733299 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 07:03:03.735206 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733738 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 07:03:03.735206 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733743 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 07:03:03.735206 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733746 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 07:03:03.735206 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733748 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 07:03:03.735206 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733751 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 07:03:03.735206 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733754 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 07:03:03.735206 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733756 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 07:03:03.735206 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733759 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 07:03:03.735206 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733762 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 07:03:03.735206 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733764 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 07:03:03.735206 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733767 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 07:03:03.735206 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733770 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 07:03:03.735206 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733772 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 07:03:03.735206 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733774 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 07:03:03.735206 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733777 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 07:03:03.735715 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733781 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 07:03:03.735715 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733784 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 07:03:03.735715 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733787 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 07:03:03.735715 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733790 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 07:03:03.735715 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733792 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 07:03:03.735715 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733796 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 07:03:03.735715 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733798 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 07:03:03.735715 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733801 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 07:03:03.735715 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733803 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 07:03:03.735715 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733806 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 07:03:03.735715 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733808 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 07:03:03.735715 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733811 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 07:03:03.735715 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733813 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 07:03:03.735715 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733816 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 07:03:03.735715 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733818 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 07:03:03.735715 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733820 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 07:03:03.735715 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733823 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 07:03:03.735715 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733825 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 07:03:03.735715 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733827 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 07:03:03.735715 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733830 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 07:03:03.736210 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733833 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 07:03:03.736210 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733835 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 07:03:03.736210 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733838 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 07:03:03.736210 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733842 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 07:03:03.736210 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733844 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 07:03:03.736210 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733846 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 07:03:03.736210 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733849 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 07:03:03.736210 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733852 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 07:03:03.736210 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733854 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 07:03:03.736210 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733856 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 07:03:03.736210 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733859 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 07:03:03.736210 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733861 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 07:03:03.736210 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733864 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 07:03:03.736210 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733867 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 07:03:03.736210 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733869 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 07:03:03.736210 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733872 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 07:03:03.736210 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733875 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 07:03:03.736210 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733877 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 07:03:03.736210 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733880 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 07:03:03.736210 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733883 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 07:03:03.736798 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733885 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 07:03:03.736798 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733888 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 07:03:03.736798 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733890 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 07:03:03.736798 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733893 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 07:03:03.736798 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733895 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 07:03:03.736798 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733898 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 07:03:03.736798 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733901 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 07:03:03.736798 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733903 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 07:03:03.736798 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733907 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 07:03:03.736798 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733911 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 07:03:03.736798 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733913 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 07:03:03.736798 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733916 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 07:03:03.736798 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733918 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 07:03:03.736798 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733920 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 07:03:03.736798 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733923 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 07:03:03.736798 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733925 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 07:03:03.736798 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733928 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 07:03:03.736798 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733931 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 07:03:03.736798 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733933 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 07:03:03.736798 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733936 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 07:03:03.737292 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733939 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 07:03:03.737292 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733941 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 07:03:03.737292 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733944 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 07:03:03.737292 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733947 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 07:03:03.737292 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733950 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 07:03:03.737292 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733952 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 07:03:03.737292 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733955 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 07:03:03.737292 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733957 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 07:03:03.737292 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733959 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 07:03:03.737292 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733963 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 07:03:03.737292 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.733967 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 07:03:03.737292 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.735895 2581 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 07:03:03.737292 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.735910 2581 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 07:03:03.737292 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.735917 2581 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 07:03:03.737292 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.735922 2581 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 07:03:03.737292 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.735927 2581 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 07:03:03.737292 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.735930 2581 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 07:03:03.737292 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.735935 2581 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 07:03:03.737292 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.735940 2581 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 07:03:03.737292 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.735943 2581 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 07:03:03.737799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.735946 2581 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 07:03:03.737799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.735951 2581 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 07:03:03.737799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.735954 2581 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 07:03:03.737799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.735957 2581 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 07:03:03.737799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.735960 2581 flags.go:64] FLAG: --cgroup-root=""
Apr 21 07:03:03.737799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.735963 2581 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 07:03:03.737799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.735966 2581 flags.go:64] FLAG: --client-ca-file=""
Apr 21 07:03:03.737799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.735969 2581 flags.go:64] FLAG: --cloud-config=""
Apr 21 07:03:03.737799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.735972 2581 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 07:03:03.737799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.735975 2581 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 07:03:03.737799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736493 2581 flags.go:64] FLAG: --cluster-domain=""
Apr 21 07:03:03.737799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736497 2581 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 07:03:03.737799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736500 2581 flags.go:64] FLAG: --config-dir=""
Apr 21 07:03:03.737799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736504 2581 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 07:03:03.737799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736507 2581 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 07:03:03.737799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736516 2581 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 07:03:03.737799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736519 2581 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 07:03:03.737799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736523 2581 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 07:03:03.737799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736526 2581 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 07:03:03.737799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736529 2581 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 07:03:03.737799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736533 2581 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 07:03:03.737799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736536 2581 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 07:03:03.737799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736539 2581 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 07:03:03.737799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736542 2581 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 07:03:03.737799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736548 2581 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 07:03:03.738399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736551 2581 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 07:03:03.738399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736554 2581 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 07:03:03.738399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736557 2581 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 07:03:03.738399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736572 2581 flags.go:64] FLAG: --enable-server="true"
Apr 21 07:03:03.738399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736575 2581 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 07:03:03.738399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736580 2581 flags.go:64] FLAG: --event-burst="100"
Apr 21 07:03:03.738399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736583 2581 flags.go:64] FLAG: --event-qps="50"
Apr 21 07:03:03.738399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736586 2581 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 07:03:03.738399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736589 2581 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 07:03:03.738399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736593 2581 flags.go:64] FLAG: --eviction-hard=""
Apr 21 07:03:03.738399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736597 2581 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 07:03:03.738399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736600 2581 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 07:03:03.738399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736603 2581 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 07:03:03.738399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736607 2581 flags.go:64] FLAG: --eviction-soft=""
Apr 21 07:03:03.738399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736610 2581 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 07:03:03.738399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736613 2581 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 07:03:03.738399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736616 2581 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 07:03:03.738399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736619 2581 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 07:03:03.738399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736622 2581 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 07:03:03.738399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736624 2581 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 07:03:03.738399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736627 2581 flags.go:64] FLAG: --feature-gates=""
Apr 21 07:03:03.738399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736631 2581 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 21 07:03:03.738399 ip-10-0-137-163
kubenswrapper[2581]: I0421 07:03:03.736634 2581 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 07:03:03.738399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736637 2581 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 07:03:03.738399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736641 2581 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 07:03:03.739039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736644 2581 flags.go:64] FLAG: --healthz-port="10248" Apr 21 07:03:03.739039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736647 2581 flags.go:64] FLAG: --help="false" Apr 21 07:03:03.739039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736650 2581 flags.go:64] FLAG: --hostname-override="ip-10-0-137-163.ec2.internal" Apr 21 07:03:03.739039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736653 2581 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 07:03:03.739039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736656 2581 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 07:03:03.739039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736658 2581 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 07:03:03.739039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736662 2581 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 07:03:03.739039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736666 2581 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 07:03:03.739039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736669 2581 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 07:03:03.739039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736672 2581 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 07:03:03.739039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736674 2581 flags.go:64] FLAG: 
--kernel-memcg-notification="false" Apr 21 07:03:03.739039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736677 2581 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 07:03:03.739039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736680 2581 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 07:03:03.739039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736683 2581 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 07:03:03.739039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736686 2581 flags.go:64] FLAG: --kube-reserved="" Apr 21 07:03:03.739039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736689 2581 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 07:03:03.739039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736691 2581 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 07:03:03.739039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736694 2581 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 07:03:03.739039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736697 2581 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 07:03:03.739039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736700 2581 flags.go:64] FLAG: --lock-file="" Apr 21 07:03:03.739039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736703 2581 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 07:03:03.739039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736706 2581 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 07:03:03.739039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736709 2581 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 07:03:03.739039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736714 2581 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 07:03:03.739626 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736717 2581 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 07:03:03.739626 ip-10-0-137-163 
kubenswrapper[2581]: I0421 07:03:03.736720 2581 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 07:03:03.739626 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736723 2581 flags.go:64] FLAG: --logging-format="text" Apr 21 07:03:03.739626 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736726 2581 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 07:03:03.739626 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736729 2581 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 07:03:03.739626 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736732 2581 flags.go:64] FLAG: --manifest-url="" Apr 21 07:03:03.739626 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736735 2581 flags.go:64] FLAG: --manifest-url-header="" Apr 21 07:03:03.739626 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736739 2581 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 07:03:03.739626 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736742 2581 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 07:03:03.739626 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736747 2581 flags.go:64] FLAG: --max-pods="110" Apr 21 07:03:03.739626 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736750 2581 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 07:03:03.739626 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736753 2581 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 07:03:03.739626 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736755 2581 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 07:03:03.739626 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736759 2581 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 07:03:03.739626 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736762 2581 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 07:03:03.739626 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736765 2581 flags.go:64] 
FLAG: --node-ip="0.0.0.0" Apr 21 07:03:03.739626 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736769 2581 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 07:03:03.739626 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736778 2581 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 07:03:03.739626 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736781 2581 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 07:03:03.739626 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736784 2581 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 07:03:03.739626 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736787 2581 flags.go:64] FLAG: --pod-cidr="" Apr 21 07:03:03.739626 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736790 2581 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 07:03:03.739626 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736796 2581 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736799 2581 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736802 2581 flags.go:64] FLAG: --pods-per-core="0" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736805 2581 flags.go:64] FLAG: --port="10250" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736809 2581 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736813 2581 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0d4fe20803d70643b" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736816 2581 flags.go:64] FLAG: --qos-reserved="" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736820 
2581 flags.go:64] FLAG: --read-only-port="10255" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736823 2581 flags.go:64] FLAG: --register-node="true" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736826 2581 flags.go:64] FLAG: --register-schedulable="true" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736829 2581 flags.go:64] FLAG: --register-with-taints="" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736833 2581 flags.go:64] FLAG: --registry-burst="10" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736835 2581 flags.go:64] FLAG: --registry-qps="5" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736838 2581 flags.go:64] FLAG: --reserved-cpus="" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736841 2581 flags.go:64] FLAG: --reserved-memory="" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736844 2581 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736848 2581 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736851 2581 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736853 2581 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736856 2581 flags.go:64] FLAG: --runonce="false" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736859 2581 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736862 2581 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 
07:03:03.736865 2581 flags.go:64] FLAG: --seccomp-default="false" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736868 2581 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736871 2581 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736874 2581 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 07:03:03.740213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736877 2581 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 07:03:03.740848 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736881 2581 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 07:03:03.740848 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736884 2581 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 07:03:03.740848 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736887 2581 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 07:03:03.740848 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736890 2581 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 07:03:03.740848 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736893 2581 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 07:03:03.740848 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736896 2581 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 07:03:03.740848 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736899 2581 flags.go:64] FLAG: --system-cgroups="" Apr 21 07:03:03.740848 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736902 2581 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 07:03:03.740848 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736907 2581 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 07:03:03.740848 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736910 2581 flags.go:64] FLAG: 
--tls-cert-file="" Apr 21 07:03:03.740848 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736913 2581 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 07:03:03.740848 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736918 2581 flags.go:64] FLAG: --tls-min-version="" Apr 21 07:03:03.740848 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736921 2581 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 07:03:03.740848 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736924 2581 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 07:03:03.740848 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736926 2581 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 07:03:03.740848 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736930 2581 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 07:03:03.740848 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736933 2581 flags.go:64] FLAG: --v="2" Apr 21 07:03:03.740848 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736937 2581 flags.go:64] FLAG: --version="false" Apr 21 07:03:03.740848 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736941 2581 flags.go:64] FLAG: --vmodule="" Apr 21 07:03:03.740848 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736946 2581 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 07:03:03.740848 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.736949 2581 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 07:03:03.740848 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737057 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 07:03:03.740848 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737060 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 07:03:03.740848 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737064 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 
07:03:03.741415 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737067 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 07:03:03.741415 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737070 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 07:03:03.741415 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737073 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 07:03:03.741415 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737076 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 07:03:03.741415 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737079 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 07:03:03.741415 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737081 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 07:03:03.741415 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737084 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 07:03:03.741415 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737087 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 07:03:03.741415 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737089 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 07:03:03.741415 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737093 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 07:03:03.741415 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737096 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 07:03:03.741415 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737098 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 07:03:03.741415 ip-10-0-137-163 
kubenswrapper[2581]: W0421 07:03:03.737101 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 07:03:03.741415 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737104 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 07:03:03.741415 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737106 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 07:03:03.741415 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737109 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 07:03:03.741415 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737111 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 07:03:03.741415 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737114 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 07:03:03.741415 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737116 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 07:03:03.741913 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737119 2581 feature_gate.go:328] unrecognized feature gate: Example Apr 21 07:03:03.741913 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737121 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 07:03:03.741913 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737124 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 07:03:03.741913 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737126 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 07:03:03.741913 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737129 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 07:03:03.741913 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737132 2581 feature_gate.go:328] 
unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 07:03:03.741913 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737134 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 07:03:03.741913 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737137 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 07:03:03.741913 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737139 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 07:03:03.741913 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737142 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 07:03:03.741913 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737144 2581 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 07:03:03.741913 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737147 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 07:03:03.741913 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737149 2581 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 07:03:03.741913 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737152 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 07:03:03.741913 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737154 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 07:03:03.741913 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737157 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 07:03:03.741913 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737159 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 07:03:03.741913 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737161 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 07:03:03.741913 ip-10-0-137-163 
kubenswrapper[2581]: W0421 07:03:03.737164 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 07:03:03.741913 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737166 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 07:03:03.742416 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737168 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 07:03:03.742416 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737175 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 07:03:03.742416 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737178 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 07:03:03.742416 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737181 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 07:03:03.742416 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737185 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 07:03:03.742416 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737189 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 07:03:03.742416 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737192 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 07:03:03.742416 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737195 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 07:03:03.742416 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737197 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 07:03:03.742416 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737200 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 07:03:03.742416 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737203 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 07:03:03.742416 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737205 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 07:03:03.742416 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737208 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 07:03:03.742416 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737210 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 07:03:03.742416 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737213 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 07:03:03.742416 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737215 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 07:03:03.742416 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737218 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 07:03:03.742416 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737220 2581 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 07:03:03.742416 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737223 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 07:03:03.742416 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737225 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 07:03:03.742927 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737228 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 07:03:03.742927 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737230 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 07:03:03.742927 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737233 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 07:03:03.742927 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737235 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 07:03:03.742927 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737238 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 07:03:03.742927 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737240 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 07:03:03.742927 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737243 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 07:03:03.742927 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737245 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 07:03:03.742927 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737248 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 07:03:03.742927 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737250 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 07:03:03.742927 ip-10-0-137-163 
kubenswrapper[2581]: W0421 07:03:03.737252 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 07:03:03.742927 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737256 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 07:03:03.742927 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737260 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 07:03:03.742927 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737264 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 07:03:03.742927 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737267 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 07:03:03.742927 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737270 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 07:03:03.742927 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737274 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 07:03:03.742927 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737276 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 07:03:03.742927 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737279 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 07:03:03.742927 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737281 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 07:03:03.743443 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737284 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 07:03:03.743443 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737287 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 07:03:03.743443 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737289 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 07:03:03.743443 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.737291 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 07:03:03.743443 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.737925 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 07:03:03.745595 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.745553 2581 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 07:03:03.745640 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.745598 2581 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 07:03:03.745668 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745652 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 07:03:03.745668 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745658 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 07:03:03.745668 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745661 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 07:03:03.745668 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745664 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 07:03:03.745668 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745667 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 07:03:03.745668 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745670 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 07:03:03.745819 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745672 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 07:03:03.745819 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745677 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 07:03:03.745819 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745681 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 07:03:03.745819 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745684 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 07:03:03.745819 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745686 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 07:03:03.745819 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745689 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 07:03:03.745819 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745692 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 07:03:03.745819 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745695 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 07:03:03.745819 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745699 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 07:03:03.745819 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745705 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 07:03:03.745819 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745708 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 07:03:03.745819 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745711 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 07:03:03.745819 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745713 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 07:03:03.745819 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745717 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 07:03:03.745819 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745720 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 07:03:03.745819 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745722 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 07:03:03.745819 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745725 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 07:03:03.745819 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745728 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 07:03:03.745819 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745731 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 07:03:03.746286 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745734 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 07:03:03.746286 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745736 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 07:03:03.746286 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745739 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 07:03:03.746286 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745741 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 07:03:03.746286 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745744 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 07:03:03.746286 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745747 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 07:03:03.746286 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745750 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 07:03:03.746286 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745753 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 07:03:03.746286 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745755 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 07:03:03.746286 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745758 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 07:03:03.746286 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745761 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 07:03:03.746286 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745763 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 07:03:03.746286 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745766 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 07:03:03.746286 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745768 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 07:03:03.746286 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745771 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 07:03:03.746286 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745774 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 07:03:03.746286 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745777 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 07:03:03.746286 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745779 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 07:03:03.746286 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745782 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 07:03:03.746775 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745785 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 07:03:03.746775 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745788 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 07:03:03.746775 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745790 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 07:03:03.746775 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745793 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 07:03:03.746775 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745795 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 07:03:03.746775 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745798 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 07:03:03.746775 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745800 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 07:03:03.746775 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745803 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 07:03:03.746775 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745805 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 07:03:03.746775 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745808 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 07:03:03.746775 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745811 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 07:03:03.746775 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745814 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 07:03:03.746775 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745816 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 07:03:03.746775 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745818 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 07:03:03.746775 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745821 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 07:03:03.746775 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745824 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 07:03:03.746775 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745826 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 07:03:03.746775 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745830 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 07:03:03.746775 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745832 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 07:03:03.746775 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745835 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 07:03:03.747251 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745838 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 07:03:03.747251 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745840 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 07:03:03.747251 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745843 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 07:03:03.747251 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745845 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 07:03:03.747251 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745847 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 07:03:03.747251 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745850 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 07:03:03.747251 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745852 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 07:03:03.747251 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745856 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 07:03:03.747251 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745859 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 07:03:03.747251 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745861 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 07:03:03.747251 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745864 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 07:03:03.747251 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745867 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 07:03:03.747251 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745869 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 07:03:03.747251 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745871 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 07:03:03.747251 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745874 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 07:03:03.747251 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745876 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 07:03:03.747251 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745879 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 07:03:03.747251 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745882 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 07:03:03.747251 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745884 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 07:03:03.747251 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745887 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 07:03:03.747768 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745889 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 07:03:03.747768 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.745891 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 07:03:03.747768 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.745897 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 07:03:03.747768 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746004 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 07:03:03.747768 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746009 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 07:03:03.747768 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746012 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 07:03:03.747768 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746015 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 07:03:03.747768 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746018 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 07:03:03.747768 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746020 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 07:03:03.747768 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746023 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 07:03:03.747768 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746026 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 07:03:03.747768 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746030 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 07:03:03.747768 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746034 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 07:03:03.747768 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746037 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 07:03:03.747768 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746040 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 07:03:03.748136 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746043 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 07:03:03.748136 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746045 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 07:03:03.748136 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746048 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 07:03:03.748136 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746051 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 07:03:03.748136 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746054 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 07:03:03.748136 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746057 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 07:03:03.748136 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746059 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 07:03:03.748136 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746062 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 07:03:03.748136 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746065 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 07:03:03.748136 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746068 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 07:03:03.748136 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746070 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 07:03:03.748136 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746073 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 07:03:03.748136 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746075 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 07:03:03.748136 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746078 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 07:03:03.748136 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746081 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 07:03:03.748136 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746083 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 07:03:03.748136 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746086 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 07:03:03.748136 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746088 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 07:03:03.748136 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746091 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 07:03:03.748611 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746093 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 07:03:03.748611 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746096 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 07:03:03.748611 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746099 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 07:03:03.748611 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746102 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 07:03:03.748611 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746104 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 07:03:03.748611 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746107 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 07:03:03.748611 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746109 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 07:03:03.748611 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746112 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 07:03:03.748611 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746115 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 07:03:03.748611 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746117 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 07:03:03.748611 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746119 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 07:03:03.748611 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746122 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 07:03:03.748611 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746130 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 07:03:03.748611 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746132 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 07:03:03.748611 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746135 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 07:03:03.748611 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746137 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 07:03:03.748611 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746140 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 07:03:03.748611 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746143 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 07:03:03.748611 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746146 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 07:03:03.749082 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746148 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 07:03:03.749082 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746151 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 07:03:03.749082 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746153 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 07:03:03.749082 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746156 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 07:03:03.749082 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746159 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 07:03:03.749082 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746161 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 07:03:03.749082 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746164 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 07:03:03.749082 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746166 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 07:03:03.749082 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746169 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 07:03:03.749082 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746171 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 07:03:03.749082 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746174 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 07:03:03.749082 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746176 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 07:03:03.749082 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746179 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 07:03:03.749082 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746182 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 07:03:03.749082 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746184 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 07:03:03.749082 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746186 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 07:03:03.749082 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746189 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 07:03:03.749082 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746191 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 07:03:03.749082 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746194 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 07:03:03.749082 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746196 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 07:03:03.749579 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746199 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 07:03:03.749579 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746201 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 07:03:03.749579 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746204 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 07:03:03.749579 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746206 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 07:03:03.749579 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746209 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 07:03:03.749579 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746211 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 07:03:03.749579 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746217 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 07:03:03.749579 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746220 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 07:03:03.749579 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746222 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 07:03:03.749579 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746225 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 07:03:03.749579 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746227 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 07:03:03.749579 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746230 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 07:03:03.749579 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746233 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 07:03:03.749579 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746236 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 07:03:03.749579 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746238 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 07:03:03.749579 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:03.746241 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 07:03:03.749988 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.746246 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 07:03:03.749988 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.747804 2581 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 07:03:03.749988 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.749850 2581 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 07:03:03.750889 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.750876 2581 server.go:1019] "Starting client certificate rotation"
Apr 21 07:03:03.751593 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.751555 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 07:03:03.752075 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.752064 2581 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 07:03:03.774960 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.774934 2581 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 07:03:03.779052 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.779022 2581 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 07:03:03.797248 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.797216 2581 log.go:25] "Validated CRI v1 runtime API"
Apr 21 07:03:03.803247 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.803224 2581 log.go:25] "Validated CRI v1 image API"
Apr 21 07:03:03.803400 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.803267 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 07:03:03.804548 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.804526 2581 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 07:03:03.809711 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.809690 2581 fs.go:135] Filesystem UUIDs: map[0eaaced9-7a14-450b-9a17-30750e79083b:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 dd6ca1c9-a744-4527-9d0d-e5f53fd1c24f:/dev/nvme0n1p4]
Apr 21 07:03:03.809793 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.809710 2581 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 07:03:03.816179 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.816055 2581 manager.go:217] Machine: {Timestamp:2026-04-21 07:03:03.813527756 +0000 UTC m=+0.432861848 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100107 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec231b9279a5a772b05b9a9c6aba9666 SystemUUID:ec231b92-79a5-a772-b05b-9a9c6aba9666 BootID:7e7964a4-2f97-40d4-b74f-a0528e21250e Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:64:a2:25:9a:e3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:64:a2:25:9a:e3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5a:c9:fe:d7:cd:9b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 07:03:03.816179 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.816173 2581 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 07:03:03.816296 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.816271 2581 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 07:03:03.817531 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.817502 2581 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 07:03:03.817736 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.817534 2581 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-163.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 07:03:03.817780 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.817747 2581 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 07:03:03.817780 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.817757 2581 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 07:03:03.817780 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.817774
2581 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 07:03:03.817865 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.817791 2581 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 07:03:03.818703 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.818692 2581 state_mem.go:36] "Initialized new in-memory state store" Apr 21 07:03:03.819015 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.819005 2581 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 07:03:03.820722 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.820706 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vj47s" Apr 21 07:03:03.821975 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.821964 2581 kubelet.go:491] "Attempting to sync node with API server" Apr 21 07:03:03.822014 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.821986 2581 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 07:03:03.822014 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.821999 2581 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 07:03:03.822014 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.822009 2581 kubelet.go:397] "Adding apiserver pod source" Apr 21 07:03:03.822133 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.822018 2581 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 21 07:03:03.823142 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.823129 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 07:03:03.823185 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.823149 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 07:03:03.826326 ip-10-0-137-163 
kubenswrapper[2581]: I0421 07:03:03.826307 2581 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 07:03:03.828198 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.828183 2581 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 07:03:03.828754 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.828739 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vj47s" Apr 21 07:03:03.829504 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.829492 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 07:03:03.829550 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.829510 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 07:03:03.829550 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.829516 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 07:03:03.829550 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.829522 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 07:03:03.829550 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.829528 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 07:03:03.829550 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.829534 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 07:03:03.829550 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.829540 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 07:03:03.829550 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.829545 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 07:03:03.829550 ip-10-0-137-163 
kubenswrapper[2581]: I0421 07:03:03.829552 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 07:03:03.829767 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.829558 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 07:03:03.829767 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.829594 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 07:03:03.829767 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.829603 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 07:03:03.830540 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.830510 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 07:03:03.830540 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.830530 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 07:03:03.834555 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.834528 2581 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 07:03:03.834682 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.834617 2581 server.go:1295] "Started kubelet" Apr 21 07:03:03.835256 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.835193 2581 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 07:03:03.835345 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.835314 2581 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 07:03:03.835757 ip-10-0-137-163 systemd[1]: Started Kubernetes Kubelet. 
Apr 21 07:03:03.836540 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.836525 2581 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 07:03:03.836626 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.835220 2581 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 07:03:03.839126 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.839095 2581 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:03:03.839331 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.839317 2581 server.go:317] "Adding debug handlers to kubelet server" Apr 21 07:03:03.839762 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.839748 2581 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:03:03.842774 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.842749 2581 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-137-163.ec2.internal" not found Apr 21 07:03:03.843243 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.843228 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 07:03:03.843799 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.843778 2581 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 07:03:03.844531 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.844509 2581 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 07:03:03.844531 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.844533 2581 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 07:03:03.844691 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.844512 2581 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 07:03:03.844691 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.844685 2581 reconstruct.go:97] "Volume 
reconstruction finished" Apr 21 07:03:03.844691 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.844692 2581 reconciler.go:26] "Reconciler: start to sync state" Apr 21 07:03:03.845731 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:03.845700 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-163.ec2.internal\" not found" Apr 21 07:03:03.845890 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.845855 2581 factory.go:55] Registering systemd factory Apr 21 07:03:03.845961 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.845910 2581 factory.go:223] Registration of the systemd container factory successfully Apr 21 07:03:03.845961 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.845912 2581 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:03:03.846151 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.846134 2581 factory.go:153] Registering CRI-O factory Apr 21 07:03:03.846196 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.846155 2581 factory.go:223] Registration of the crio container factory successfully Apr 21 07:03:03.846241 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.846215 2581 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 07:03:03.846286 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.846243 2581 factory.go:103] Registering Raw factory Apr 21 07:03:03.846286 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.846260 2581 manager.go:1196] Started watching for new ooms in manager Apr 21 07:03:03.846741 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.846724 2581 manager.go:319] Starting recovery of all containers Apr 21 07:03:03.848907 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:03.848880 2581 nodelease.go:49] 
"Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-137-163.ec2.internal\" not found" node="ip-10-0-137-163.ec2.internal" Apr 21 07:03:03.849039 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:03.848995 2581 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 07:03:03.854743 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.854494 2581 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 07:03:03.855523 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.855498 2581 manager.go:324] Recovery completed Apr 21 07:03:03.859178 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.859158 2581 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-137-163.ec2.internal" not found Apr 21 07:03:03.860609 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.860596 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:03:03.862783 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.862661 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-163.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:03:03.862783 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.862691 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-163.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:03:03.862783 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.862703 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-163.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:03:03.863247 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.863234 2581 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 07:03:03.863247 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.863246 2581 cpu_manager.go:223] "Reconciling" 
reconcilePeriod="10s" Apr 21 07:03:03.863348 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.863262 2581 state_mem.go:36] "Initialized new in-memory state store" Apr 21 07:03:03.865481 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.865470 2581 policy_none.go:49] "None policy: Start" Apr 21 07:03:03.865521 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.865489 2581 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 07:03:03.865521 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.865500 2581 state_mem.go:35] "Initializing new in-memory state store" Apr 21 07:03:03.903293 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.903262 2581 manager.go:341] "Starting Device Plugin manager" Apr 21 07:03:03.918900 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:03.903308 2581 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 07:03:03.918900 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.903319 2581 server.go:85] "Starting device plugin registration server" Apr 21 07:03:03.918900 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.903620 2581 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 07:03:03.918900 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.903633 2581 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 07:03:03.918900 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.903732 2581 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 07:03:03.918900 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.903813 2581 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 07:03:03.918900 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.903822 2581 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 07:03:03.918900 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:03.904317 
2581 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 07:03:03.918900 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:03.904361 2581 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-163.ec2.internal\" not found" Apr 21 07:03:03.918900 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.914928 2581 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-137-163.ec2.internal" not found Apr 21 07:03:03.974109 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.974076 2581 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 07:03:03.974109 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.974115 2581 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 07:03:03.974335 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.974144 2581 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 21 07:03:03.974335 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.974152 2581 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 07:03:03.974335 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:03.974185 2581 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 07:03:03.977376 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:03.977308 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:03:04.004154 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.004124 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:03:04.005451 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.005426 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-163.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:03:04.005596 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.005461 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-163.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:03:04.005596 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.005477 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-163.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:03:04.005596 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.005507 2581 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-163.ec2.internal" Apr 21 07:03:04.011616 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.011596 2581 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-163.ec2.internal" Apr 21 07:03:04.011687 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:04.011624 2581 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-163.ec2.internal\": node \"ip-10-0-137-163.ec2.internal\" not found" Apr 21 
07:03:04.074916 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.074871 2581 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-163.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-163.ec2.internal"] Apr 21 07:03:04.077178 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.077160 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-163.ec2.internal" Apr 21 07:03:04.077266 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.077165 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-163.ec2.internal" Apr 21 07:03:04.105459 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.105431 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-163.ec2.internal" Apr 21 07:03:04.108885 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.108869 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-163.ec2.internal" Apr 21 07:03:04.115838 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.115819 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 07:03:04.127139 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.127117 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 07:03:04.246377 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.246294 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3db10428280db186de36082b1aff4988-etc-kube\") 
pod \"kube-rbac-proxy-crio-ip-10-0-137-163.ec2.internal\" (UID: \"3db10428280db186de36082b1aff4988\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-163.ec2.internal" Apr 21 07:03:04.246377 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.246327 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3db10428280db186de36082b1aff4988-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-163.ec2.internal\" (UID: \"3db10428280db186de36082b1aff4988\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-163.ec2.internal" Apr 21 07:03:04.246377 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.246351 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bb1b7fe43ed1fe0c8375d218f68d3580-config\") pod \"kube-apiserver-proxy-ip-10-0-137-163.ec2.internal\" (UID: \"bb1b7fe43ed1fe0c8375d218f68d3580\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-163.ec2.internal" Apr 21 07:03:04.347130 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.347099 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bb1b7fe43ed1fe0c8375d218f68d3580-config\") pod \"kube-apiserver-proxy-ip-10-0-137-163.ec2.internal\" (UID: \"bb1b7fe43ed1fe0c8375d218f68d3580\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-163.ec2.internal" Apr 21 07:03:04.347130 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.347132 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3db10428280db186de36082b1aff4988-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-163.ec2.internal\" (UID: \"3db10428280db186de36082b1aff4988\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-163.ec2.internal" Apr 21 07:03:04.347321 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.347152 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3db10428280db186de36082b1aff4988-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-163.ec2.internal\" (UID: \"3db10428280db186de36082b1aff4988\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-163.ec2.internal" Apr 21 07:03:04.347321 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.347197 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/3db10428280db186de36082b1aff4988-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-163.ec2.internal\" (UID: \"3db10428280db186de36082b1aff4988\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-163.ec2.internal" Apr 21 07:03:04.347321 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.347214 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bb1b7fe43ed1fe0c8375d218f68d3580-config\") pod \"kube-apiserver-proxy-ip-10-0-137-163.ec2.internal\" (UID: \"bb1b7fe43ed1fe0c8375d218f68d3580\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-163.ec2.internal" Apr 21 07:03:04.347321 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.347242 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3db10428280db186de36082b1aff4988-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-163.ec2.internal\" (UID: \"3db10428280db186de36082b1aff4988\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-163.ec2.internal" Apr 21 07:03:04.418355 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.418275 2581 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-163.ec2.internal" Apr 21 07:03:04.428878 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.428845 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-163.ec2.internal" Apr 21 07:03:04.750730 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.750710 2581 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 21 07:03:04.751471 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.750861 2581 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 07:03:04.751471 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.750880 2581 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 07:03:04.751471 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.750902 2581 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 07:03:04.822384 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.822345 2581 apiserver.go:52] "Watching apiserver" Apr 21 07:03:04.829429 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.829399 2581 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 07:03:04.829865 ip-10-0-137-163 
kubenswrapper[2581]: I0421 07:03:04.829841 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7cnmr","kube-system/kube-apiserver-proxy-ip-10-0-137-163.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r","openshift-cluster-node-tuning-operator/tuned-4jl8x","openshift-dns/node-resolver-ptn2z","openshift-image-registry/node-ca-vrvfp","openshift-multus/multus-additional-cni-plugins-b76h9","openshift-multus/multus-mhtw2","openshift-multus/network-metrics-daemon-r4v6n","kube-system/konnectivity-agent-nq67t","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-163.ec2.internal","openshift-network-diagnostics/network-check-target-4qpb2","openshift-network-operator/iptables-alerter-dr8n8"] Apr 21 07:03:04.831460 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.831412 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 06:58:03 +0000 UTC" deadline="2027-11-21 07:35:57.388454809 +0000 UTC" Apr 21 07:03:04.831460 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.831458 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13896h32m52.557000171s" Apr 21 07:03:04.831742 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.831712 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.833069 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.833048 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r" Apr 21 07:03:04.834846 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.834449 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:04.834846 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.834517 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 07:03:04.834846 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.834517 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 07:03:04.836024 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.835985 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rlf59\"" Apr 21 07:03:04.836210 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.836191 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 07:03:04.836279 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.836256 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 07:03:04.836325 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.836303 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 07:03:04.836467 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.836451 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 07:03:04.836593 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.836553 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 07:03:04.836659 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.836621 2581 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-st4qj\"" Apr 21 07:03:04.836806 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.836786 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 07:03:04.837274 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.836880 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 07:03:04.837335 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.837270 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-x786t\"" Apr 21 07:03:04.837335 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.837301 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 07:03:04.837434 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.837355 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 07:03:04.838466 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.838445 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ptn2z" Apr 21 07:03:04.838586 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.838526 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vrvfp" Apr 21 07:03:04.839882 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.839862 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b76h9" Apr 21 07:03:04.840512 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.840495 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 07:03:04.840638 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.840583 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-57kdv\"" Apr 21 07:03:04.840760 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.840742 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 07:03:04.840940 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.840923 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 07:03:04.841051 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.841000 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.841303 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.841284 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 07:03:04.841385 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.841284 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-p2lfm\"" Apr 21 07:03:04.841615 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.841602 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 07:03:04.841959 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.841943 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 07:03:04.842011 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.841994 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 07:03:04.842083 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.842065 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-66w7g\"" Apr 21 07:03:04.842291 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.842269 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r4v6n" Apr 21 07:03:04.842373 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.842359 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 07:03:04.842425 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.842403 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 07:03:04.842477 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:04.842357 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r4v6n" podUID="42b45a2d-c99c-40f2-97f6-2d31aff6854f" Apr 21 07:03:04.842729 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.842713 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 07:03:04.843373 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.843355 2581 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 07:03:04.843634 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.843551 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 07:03:04.843721 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.843656 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-nq67t" Apr 21 07:03:04.843840 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.843659 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-b6xpz\"" Apr 21 07:03:04.845455 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.845436 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qpb2" Apr 21 07:03:04.845534 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:04.845507 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qpb2" podUID="f00d904c-86da-4e00-801a-3bd1d7dbe5f4" Apr 21 07:03:04.845652 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.845635 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 07:03:04.845810 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.845794 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-rgdw7\"" Apr 21 07:03:04.846579 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.846549 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 07:03:04.847064 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.847047 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-dr8n8" Apr 21 07:03:04.849474 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.849452 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-log-socket\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.849577 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.849485 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-run\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:04.849577 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.849511 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-etc-kubernetes\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.849577 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.849546 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-node-log\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.849712 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.849614 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-etc-systemd\") pod 
\"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:04.849712 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.849648 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d6e8c99f-04e0-4c02-b29b-c5d5e6e76763-cni-binary-copy\") pod \"multus-additional-cni-plugins-b76h9\" (UID: \"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763\") " pod="openshift-multus/multus-additional-cni-plugins-b76h9" Apr 21 07:03:04.849712 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.849681 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-host-run-k8s-cni-cncf-io\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.849712 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.849701 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-multus-conf-dir\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.849909 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.849734 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-run-ovn\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.849909 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.849759 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" 
(UniqueName: \"kubernetes.io/host-path/49a3e211-6f6c-4501-878b-c01a12dfbbb1-sys-fs\") pod \"aws-ebs-csi-driver-node-tw92r\" (UID: \"49a3e211-6f6c-4501-878b-c01a12dfbbb1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r" Apr 21 07:03:04.849909 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.849783 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-var-lib-kubelet\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:04.849909 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.849788 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 07:03:04.849909 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.849806 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-host\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:04.849909 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.849829 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-host-var-lib-kubelet\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.849909 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.849851 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-host-cni-netd\") 
pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.849909 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.849874 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d553e50a-31de-42be-99de-2bc791bca6e2-cni-binary-copy\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.849909 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.849888 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-2zr5j\"" Apr 21 07:03:04.849909 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.849900 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-var-lib-openvswitch\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.850325 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.849923 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-host-cni-bin\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.850325 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.849946 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 07:03:04.850325 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.849973 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/49a3e211-6f6c-4501-878b-c01a12dfbbb1-socket-dir\") pod \"aws-ebs-csi-driver-node-tw92r\" (UID: \"49a3e211-6f6c-4501-878b-c01a12dfbbb1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r" Apr 21 07:03:04.850325 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850006 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/49a3e211-6f6c-4501-878b-c01a12dfbbb1-device-dir\") pod \"aws-ebs-csi-driver-node-tw92r\" (UID: \"49a3e211-6f6c-4501-878b-c01a12dfbbb1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r" Apr 21 07:03:04.850325 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850032 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qqh6\" (UniqueName: \"kubernetes.io/projected/49a3e211-6f6c-4501-878b-c01a12dfbbb1-kube-api-access-7qqh6\") pod \"aws-ebs-csi-driver-node-tw92r\" (UID: \"49a3e211-6f6c-4501-878b-c01a12dfbbb1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r" Apr 21 07:03:04.850325 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850044 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 07:03:04.850325 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850061 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d6e8c99f-04e0-4c02-b29b-c5d5e6e76763-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b76h9\" (UID: \"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763\") " pod="openshift-multus/multus-additional-cni-plugins-b76h9" Apr 21 07:03:04.850325 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850086 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d6e8c99f-04e0-4c02-b29b-c5d5e6e76763-cnibin\") pod \"multus-additional-cni-plugins-b76h9\" (UID: \"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763\") " pod="openshift-multus/multus-additional-cni-plugins-b76h9" Apr 21 07:03:04.850325 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850111 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42b45a2d-c99c-40f2-97f6-2d31aff6854f-metrics-certs\") pod \"network-metrics-daemon-r4v6n\" (UID: \"42b45a2d-c99c-40f2-97f6-2d31aff6854f\") " pod="openshift-multus/network-metrics-daemon-r4v6n" Apr 21 07:03:04.850325 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850137 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-host-kubelet\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.850325 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850179 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-host-slash\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.850325 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850211 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-ovnkube-config\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.850325 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850233 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49a3e211-6f6c-4501-878b-c01a12dfbbb1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tw92r\" (UID: \"49a3e211-6f6c-4501-878b-c01a12dfbbb1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r" Apr 21 07:03:04.850325 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850273 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-etc-modprobe-d\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:04.850325 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850314 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-etc-kubernetes\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:04.851017 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850346 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/21307e09-27b9-492e-ac26-b3d09e5794af-hosts-file\") pod \"node-resolver-ptn2z\" (UID: \"21307e09-27b9-492e-ac26-b3d09e5794af\") " pod="openshift-dns/node-resolver-ptn2z" Apr 21 07:03:04.851017 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850372 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-system-cni-dir\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.851017 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850396 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-host-var-lib-cni-bin\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.851017 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850419 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-hostroot\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.851017 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850445 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-run-openvswitch\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.851017 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850484 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/86aeed30-b464-4b2f-a813-f3fb4f3e9998-tmp\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:04.851017 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850508 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/21307e09-27b9-492e-ac26-b3d09e5794af-tmp-dir\") pod \"node-resolver-ptn2z\" (UID: \"21307e09-27b9-492e-ac26-b3d09e5794af\") " pod="openshift-dns/node-resolver-ptn2z" Apr 21 07:03:04.851017 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850532 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-cnibin\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.851017 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850557 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d553e50a-31de-42be-99de-2bc791bca6e2-multus-daemon-config\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.851017 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850598 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-host-run-netns\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.851017 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850623 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-ovnkube-script-lib\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.851017 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850647 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-sys\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:04.851017 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850671 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-lib-modules\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:04.851017 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850697 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28j6s\" (UniqueName: \"kubernetes.io/projected/42b45a2d-c99c-40f2-97f6-2d31aff6854f-kube-api-access-28j6s\") pod \"network-metrics-daemon-r4v6n\" (UID: \"42b45a2d-c99c-40f2-97f6-2d31aff6854f\") " pod="openshift-multus/network-metrics-daemon-r4v6n" Apr 21 07:03:04.851017 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850730 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-multus-socket-dir-parent\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.851017 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850796 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-host-run-multus-certs\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 
07:03:04.851017 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850825 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/49a3e211-6f6c-4501-878b-c01a12dfbbb1-etc-selinux\") pod \"aws-ebs-csi-driver-node-tw92r\" (UID: \"49a3e211-6f6c-4501-878b-c01a12dfbbb1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r" Apr 21 07:03:04.851810 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850848 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/86aeed30-b464-4b2f-a813-f3fb4f3e9998-etc-tuned\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:04.851810 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850870 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/df97fb0c-eb01-481b-ab26-0073456033cd-serviceca\") pod \"node-ca-vrvfp\" (UID: \"df97fb0c-eb01-481b-ab26-0073456033cd\") " pod="openshift-image-registry/node-ca-vrvfp" Apr 21 07:03:04.851810 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850893 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66plz\" (UniqueName: \"kubernetes.io/projected/df97fb0c-eb01-481b-ab26-0073456033cd-kube-api-access-66plz\") pod \"node-ca-vrvfp\" (UID: \"df97fb0c-eb01-481b-ab26-0073456033cd\") " pod="openshift-image-registry/node-ca-vrvfp" Apr 21 07:03:04.851810 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850934 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d6e8c99f-04e0-4c02-b29b-c5d5e6e76763-system-cni-dir\") pod 
\"multus-additional-cni-plugins-b76h9\" (UID: \"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763\") " pod="openshift-multus/multus-additional-cni-plugins-b76h9" Apr 21 07:03:04.851810 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850972 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6852\" (UniqueName: \"kubernetes.io/projected/21307e09-27b9-492e-ac26-b3d09e5794af-kube-api-access-k6852\") pod \"node-resolver-ptn2z\" (UID: \"21307e09-27b9-492e-ac26-b3d09e5794af\") " pod="openshift-dns/node-resolver-ptn2z" Apr 21 07:03:04.851810 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.850995 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-run-systemd\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.851810 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.851010 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-env-overrides\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.851810 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.851026 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/49a3e211-6f6c-4501-878b-c01a12dfbbb1-registration-dir\") pod \"aws-ebs-csi-driver-node-tw92r\" (UID: \"49a3e211-6f6c-4501-878b-c01a12dfbbb1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r" Apr 21 07:03:04.851810 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.851040 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-multus-cni-dir\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2"
Apr 21 07:03:04.851810 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.851055 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-host-var-lib-cni-multus\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2"
Apr 21 07:03:04.851810 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.851092 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-systemd-units\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr"
Apr 21 07:03:04.851810 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.851146 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-etc-openvswitch\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr"
Apr 21 07:03:04.851810 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.851176 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-ovn-node-metrics-cert\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr"
Apr 21 07:03:04.851810 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.851202 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvtpj\" (UniqueName: \"kubernetes.io/projected/d6e8c99f-04e0-4c02-b29b-c5d5e6e76763-kube-api-access-gvtpj\") pod \"multus-additional-cni-plugins-b76h9\" (UID: \"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763\") " pod="openshift-multus/multus-additional-cni-plugins-b76h9"
Apr 21 07:03:04.851810 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.851225 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-766ml\" (UniqueName: \"kubernetes.io/projected/f00d904c-86da-4e00-801a-3bd1d7dbe5f4-kube-api-access-766ml\") pod \"network-check-target-4qpb2\" (UID: \"f00d904c-86da-4e00-801a-3bd1d7dbe5f4\") " pod="openshift-network-diagnostics/network-check-target-4qpb2"
Apr 21 07:03:04.851810 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.851251 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k5zk\" (UniqueName: \"kubernetes.io/projected/d553e50a-31de-42be-99de-2bc791bca6e2-kube-api-access-7k5zk\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2"
Apr 21 07:03:04.852720 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.851275 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/140fdd65-e7b7-4a63-bcd0-c990e87edf65-agent-certs\") pod \"konnectivity-agent-nq67t\" (UID: \"140fdd65-e7b7-4a63-bcd0-c990e87edf65\") " pod="kube-system/konnectivity-agent-nq67t"
Apr 21 07:03:04.852720 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.851322 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/140fdd65-e7b7-4a63-bcd0-c990e87edf65-konnectivity-ca\") pod \"konnectivity-agent-nq67t\" (UID: \"140fdd65-e7b7-4a63-bcd0-c990e87edf65\") " pod="kube-system/konnectivity-agent-nq67t"
Apr 21 07:03:04.852720 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.851354 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-etc-sysctl-conf\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x"
Apr 21 07:03:04.852720 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.851376 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df97fb0c-eb01-481b-ab26-0073456033cd-host\") pod \"node-ca-vrvfp\" (UID: \"df97fb0c-eb01-481b-ab26-0073456033cd\") " pod="openshift-image-registry/node-ca-vrvfp"
Apr 21 07:03:04.852720 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.851399 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbgfd\" (UniqueName: \"kubernetes.io/projected/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-kube-api-access-jbgfd\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr"
Apr 21 07:03:04.852720 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.851421 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-etc-sysconfig\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x"
Apr 21 07:03:04.852720 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.851448 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr"
Apr 21 07:03:04.852720 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.851471 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-etc-sysctl-d\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x"
Apr 21 07:03:04.852720 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.851495 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlfvr\" (UniqueName: \"kubernetes.io/projected/86aeed30-b464-4b2f-a813-f3fb4f3e9998-kube-api-access-hlfvr\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x"
Apr 21 07:03:04.852720 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.851518 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-os-release\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2"
Apr 21 07:03:04.852720 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.851581 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-host-run-netns\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2"
Apr 21 07:03:04.852720 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.851616 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-host-run-ovn-kubernetes\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr"
Apr 21 07:03:04.852720 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.851641 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d6e8c99f-04e0-4c02-b29b-c5d5e6e76763-os-release\") pod \"multus-additional-cni-plugins-b76h9\" (UID: \"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763\") " pod="openshift-multus/multus-additional-cni-plugins-b76h9"
Apr 21 07:03:04.852720 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.851666 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d6e8c99f-04e0-4c02-b29b-c5d5e6e76763-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b76h9\" (UID: \"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763\") " pod="openshift-multus/multus-additional-cni-plugins-b76h9"
Apr 21 07:03:04.852720 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.851691 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d6e8c99f-04e0-4c02-b29b-c5d5e6e76763-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b76h9\" (UID: \"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763\") " pod="openshift-multus/multus-additional-cni-plugins-b76h9"
Apr 21 07:03:04.857231 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.857211 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 07:03:04.893633 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.893603 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-zl7gh"
Apr 21 07:03:04.900822 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.900793 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-zl7gh"
Apr 21 07:03:04.945205 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.945183 2581 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 21 07:03:04.951927 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.951895 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-run-ovn\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr"
Apr 21 07:03:04.952068 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.951932 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/49a3e211-6f6c-4501-878b-c01a12dfbbb1-sys-fs\") pod \"aws-ebs-csi-driver-node-tw92r\" (UID: \"49a3e211-6f6c-4501-878b-c01a12dfbbb1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r"
Apr 21 07:03:04.952068 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.951958 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-var-lib-kubelet\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x"
Apr 21 07:03:04.952068 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.951979 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-host\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x"
Apr 21 07:03:04.952068 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952001 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-host-var-lib-kubelet\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2"
Apr 21 07:03:04.952068 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952028 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0a0d5c34-e09f-40bc-8eec-7f880a3de770-host-slash\") pod \"iptables-alerter-dr8n8\" (UID: \"0a0d5c34-e09f-40bc-8eec-7f880a3de770\") " pod="openshift-network-operator/iptables-alerter-dr8n8"
Apr 21 07:03:04.952068 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952027 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-run-ovn\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr"
Apr 21 07:03:04.952068 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952035 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-var-lib-kubelet\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x"
Apr 21 07:03:04.952068 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952043 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/49a3e211-6f6c-4501-878b-c01a12dfbbb1-sys-fs\") pod \"aws-ebs-csi-driver-node-tw92r\" (UID: \"49a3e211-6f6c-4501-878b-c01a12dfbbb1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r"
Apr 21 07:03:04.952068 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952065 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-host-cni-netd\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr"
Apr 21 07:03:04.952381 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952078 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-host\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x"
Apr 21 07:03:04.952381 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952119 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-host-var-lib-kubelet\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2"
Apr 21 07:03:04.952381 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952154 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d553e50a-31de-42be-99de-2bc791bca6e2-cni-binary-copy\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2"
Apr 21 07:03:04.952381 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952159 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-host-cni-netd\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr"
Apr 21 07:03:04.952381 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952190 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-var-lib-openvswitch\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr"
Apr 21 07:03:04.952381 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952228 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-host-cni-bin\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr"
Apr 21 07:03:04.952381 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952246 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-var-lib-openvswitch\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr"
Apr 21 07:03:04.952381 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952288 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/49a3e211-6f6c-4501-878b-c01a12dfbbb1-socket-dir\") pod \"aws-ebs-csi-driver-node-tw92r\" (UID: \"49a3e211-6f6c-4501-878b-c01a12dfbbb1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r"
Apr 21 07:03:04.952381 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952330 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-host-cni-bin\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr"
Apr 21 07:03:04.952817 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952385 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/49a3e211-6f6c-4501-878b-c01a12dfbbb1-device-dir\") pod \"aws-ebs-csi-driver-node-tw92r\" (UID: \"49a3e211-6f6c-4501-878b-c01a12dfbbb1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r"
Apr 21 07:03:04.952817 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952452 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qqh6\" (UniqueName: \"kubernetes.io/projected/49a3e211-6f6c-4501-878b-c01a12dfbbb1-kube-api-access-7qqh6\") pod \"aws-ebs-csi-driver-node-tw92r\" (UID: \"49a3e211-6f6c-4501-878b-c01a12dfbbb1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r"
Apr 21 07:03:04.952817 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952457 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/49a3e211-6f6c-4501-878b-c01a12dfbbb1-device-dir\") pod \"aws-ebs-csi-driver-node-tw92r\" (UID: \"49a3e211-6f6c-4501-878b-c01a12dfbbb1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r"
Apr 21 07:03:04.952817 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952474 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/49a3e211-6f6c-4501-878b-c01a12dfbbb1-socket-dir\") pod \"aws-ebs-csi-driver-node-tw92r\" (UID: \"49a3e211-6f6c-4501-878b-c01a12dfbbb1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r"
Apr 21 07:03:04.952817 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952483 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d6e8c99f-04e0-4c02-b29b-c5d5e6e76763-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b76h9\" (UID: \"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763\") " pod="openshift-multus/multus-additional-cni-plugins-b76h9"
Apr 21 07:03:04.952817 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952525 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d6e8c99f-04e0-4c02-b29b-c5d5e6e76763-cnibin\") pod \"multus-additional-cni-plugins-b76h9\" (UID: \"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763\") " pod="openshift-multus/multus-additional-cni-plugins-b76h9"
Apr 21 07:03:04.952817 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952548 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42b45a2d-c99c-40f2-97f6-2d31aff6854f-metrics-certs\") pod \"network-metrics-daemon-r4v6n\" (UID: \"42b45a2d-c99c-40f2-97f6-2d31aff6854f\") " pod="openshift-multus/network-metrics-daemon-r4v6n"
Apr 21 07:03:04.952817 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952606 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d6e8c99f-04e0-4c02-b29b-c5d5e6e76763-cnibin\") pod \"multus-additional-cni-plugins-b76h9\" (UID: \"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763\") " pod="openshift-multus/multus-additional-cni-plugins-b76h9"
Apr 21 07:03:04.952817 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952632 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-host-kubelet\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr"
Apr 21 07:03:04.952817 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952656 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-host-slash\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr"
Apr 21 07:03:04.952817 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:04.952665 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:03:04.952817 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952678 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-ovnkube-config\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr"
Apr 21 07:03:04.952817 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952703 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49a3e211-6f6c-4501-878b-c01a12dfbbb1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tw92r\" (UID: \"49a3e211-6f6c-4501-878b-c01a12dfbbb1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r"
Apr 21 07:03:04.952817 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952748 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49a3e211-6f6c-4501-878b-c01a12dfbbb1-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tw92r\" (UID: \"49a3e211-6f6c-4501-878b-c01a12dfbbb1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r"
Apr 21 07:03:04.952817 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:04.952767 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b45a2d-c99c-40f2-97f6-2d31aff6854f-metrics-certs podName:42b45a2d-c99c-40f2-97f6-2d31aff6854f nodeName:}" failed. No retries permitted until 2026-04-21 07:03:05.452708287 +0000 UTC m=+2.072042358 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42b45a2d-c99c-40f2-97f6-2d31aff6854f-metrics-certs") pod "network-metrics-daemon-r4v6n" (UID: "42b45a2d-c99c-40f2-97f6-2d31aff6854f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:03:04.952817 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952780 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-host-slash\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr"
Apr 21 07:03:04.952817 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952795 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-etc-modprobe-d\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x"
Apr 21 07:03:04.953599 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952816 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-host-kubelet\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr"
Apr 21 07:03:04.953599 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952828 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-etc-kubernetes\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x"
Apr 21 07:03:04.953599 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952854 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/21307e09-27b9-492e-ac26-b3d09e5794af-hosts-file\") pod \"node-resolver-ptn2z\" (UID: \"21307e09-27b9-492e-ac26-b3d09e5794af\") " pod="openshift-dns/node-resolver-ptn2z"
Apr 21 07:03:04.953599 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952875 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-system-cni-dir\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2"
Apr 21 07:03:04.953599 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952897 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d553e50a-31de-42be-99de-2bc791bca6e2-cni-binary-copy\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2"
Apr 21 07:03:04.953599 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952954 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-host-var-lib-cni-bin\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2"
Apr 21 07:03:04.953599 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952961 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-etc-kubernetes\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x"
Apr 21 07:03:04.953599 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952900 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-host-var-lib-cni-bin\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2"
Apr 21 07:03:04.953599 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952976 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/21307e09-27b9-492e-ac26-b3d09e5794af-hosts-file\") pod \"node-resolver-ptn2z\" (UID: \"21307e09-27b9-492e-ac26-b3d09e5794af\") " pod="openshift-dns/node-resolver-ptn2z"
Apr 21 07:03:04.953599 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.952987 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-hostroot\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2"
Apr 21 07:03:04.953599 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953003 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-run-openvswitch\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr"
Apr 21 07:03:04.953599 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953011 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-system-cni-dir\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2"
Apr 21 07:03:04.953599 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953021 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/86aeed30-b464-4b2f-a813-f3fb4f3e9998-tmp\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x"
Apr 21 07:03:04.953599 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953036 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-hostroot\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2"
Apr 21 07:03:04.953599 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953036 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/21307e09-27b9-492e-ac26-b3d09e5794af-tmp-dir\") pod \"node-resolver-ptn2z\" (UID: \"21307e09-27b9-492e-ac26-b3d09e5794af\") " pod="openshift-dns/node-resolver-ptn2z"
Apr 21 07:03:04.953599 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953059 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-run-openvswitch\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr"
Apr 21 07:03:04.953599 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953062 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-cnibin\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2"
Apr 21 07:03:04.953599 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953069 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d6e8c99f-04e0-4c02-b29b-c5d5e6e76763-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b76h9\" (UID: \"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763\") " pod="openshift-multus/multus-additional-cni-plugins-b76h9"
Apr 21 07:03:04.954427 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953081 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d553e50a-31de-42be-99de-2bc791bca6e2-multus-daemon-config\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2"
Apr 21 07:03:04.954427 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953086 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-cnibin\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2"
Apr 21 07:03:04.954427 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953084 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-etc-modprobe-d\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x"
Apr 21 07:03:04.954427 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953100 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-host-run-netns\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr"
Apr 21 07:03:04.954427 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953124 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-host-run-netns\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr"
Apr 21 07:03:04.954427 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953135 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-ovnkube-script-lib\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr"
Apr 21 07:03:04.954427 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953164 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-sys\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x"
Apr 21 07:03:04.954427 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953189 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-lib-modules\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x"
Apr 21 07:03:04.954427 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953216 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28j6s\" (UniqueName: \"kubernetes.io/projected/42b45a2d-c99c-40f2-97f6-2d31aff6854f-kube-api-access-28j6s\") pod \"network-metrics-daemon-r4v6n\" (UID: \"42b45a2d-c99c-40f2-97f6-2d31aff6854f\") " pod="openshift-multus/network-metrics-daemon-r4v6n"
Apr 21 07:03:04.954427 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953264 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-multus-socket-dir-parent\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2"
Apr 21 07:03:04.954427 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953291 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-host-run-multus-certs\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2"
Apr 21 07:03:04.954427 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953326 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/49a3e211-6f6c-4501-878b-c01a12dfbbb1-etc-selinux\") pod \"aws-ebs-csi-driver-node-tw92r\" (UID: \"49a3e211-6f6c-4501-878b-c01a12dfbbb1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r"
Apr 21 07:03:04.954427 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953332 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/21307e09-27b9-492e-ac26-b3d09e5794af-tmp-dir\") pod \"node-resolver-ptn2z\" (UID: \"21307e09-27b9-492e-ac26-b3d09e5794af\") " pod="openshift-dns/node-resolver-ptn2z"
Apr 21 07:03:04.954427 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953346 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-ovnkube-config\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr"
Apr 21 07:03:04.954427 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953370 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/86aeed30-b464-4b2f-a813-f3fb4f3e9998-etc-tuned\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x"
Apr 21 07:03:04.954427 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953389 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/df97fb0c-eb01-481b-ab26-0073456033cd-serviceca\") pod \"node-ca-vrvfp\" (UID: \"df97fb0c-eb01-481b-ab26-0073456033cd\") " pod="openshift-image-registry/node-ca-vrvfp"
Apr 21 07:03:04.954427 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953399 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-sys\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x"
Apr 21 07:03:04.954427 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953404 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66plz\" (UniqueName: \"kubernetes.io/projected/df97fb0c-eb01-481b-ab26-0073456033cd-kube-api-access-66plz\") pod \"node-ca-vrvfp\" (UID: \"df97fb0c-eb01-481b-ab26-0073456033cd\") " pod="openshift-image-registry/node-ca-vrvfp"
Apr 21 07:03:04.955493 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953344 2581 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 21 07:03:04.955493 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953422 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d6e8c99f-04e0-4c02-b29b-c5d5e6e76763-system-cni-dir\") pod \"multus-additional-cni-plugins-b76h9\" (UID: \"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763\") " pod="openshift-multus/multus-additional-cni-plugins-b76h9"
Apr 21 07:03:04.955493 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953434 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/49a3e211-6f6c-4501-878b-c01a12dfbbb1-etc-selinux\") pod \"aws-ebs-csi-driver-node-tw92r\" (UID: \"49a3e211-6f6c-4501-878b-c01a12dfbbb1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r"
Apr 21 07:03:04.955493 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953459 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d553e50a-31de-42be-99de-2bc791bca6e2-multus-daemon-config\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2"
Apr 21 07:03:04.955493 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953489 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-lib-modules\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x"
Apr 21 07:03:04.955493 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953647 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName:
\"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-multus-socket-dir-parent\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.955493 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953690 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-host-run-multus-certs\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.955493 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953438 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k6852\" (UniqueName: \"kubernetes.io/projected/21307e09-27b9-492e-ac26-b3d09e5794af-kube-api-access-k6852\") pod \"node-resolver-ptn2z\" (UID: \"21307e09-27b9-492e-ac26-b3d09e5794af\") " pod="openshift-dns/node-resolver-ptn2z" Apr 21 07:03:04.955493 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953728 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-run-systemd\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.955493 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953754 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-env-overrides\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.955493 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953779 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/49a3e211-6f6c-4501-878b-c01a12dfbbb1-registration-dir\") pod \"aws-ebs-csi-driver-node-tw92r\" (UID: \"49a3e211-6f6c-4501-878b-c01a12dfbbb1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r" Apr 21 07:03:04.955493 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953788 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-run-systemd\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.955493 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953807 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-multus-cni-dir\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.955493 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953833 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-host-var-lib-cni-multus\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.955493 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953872 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/49a3e211-6f6c-4501-878b-c01a12dfbbb1-registration-dir\") pod \"aws-ebs-csi-driver-node-tw92r\" (UID: \"49a3e211-6f6c-4501-878b-c01a12dfbbb1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r" Apr 21 07:03:04.955493 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953886 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-pvdzc\" (UniqueName: \"kubernetes.io/projected/0a0d5c34-e09f-40bc-8eec-7f880a3de770-kube-api-access-pvdzc\") pod \"iptables-alerter-dr8n8\" (UID: \"0a0d5c34-e09f-40bc-8eec-7f880a3de770\") " pod="openshift-network-operator/iptables-alerter-dr8n8" Apr 21 07:03:04.955493 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953916 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-systemd-units\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.955493 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953942 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-etc-openvswitch\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.956366 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953967 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-ovn-node-metrics-cert\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.956366 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953995 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvtpj\" (UniqueName: \"kubernetes.io/projected/d6e8c99f-04e0-4c02-b29b-c5d5e6e76763-kube-api-access-gvtpj\") pod \"multus-additional-cni-plugins-b76h9\" (UID: \"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763\") " pod="openshift-multus/multus-additional-cni-plugins-b76h9" Apr 21 07:03:04.956366 ip-10-0-137-163 
kubenswrapper[2581]: I0421 07:03:04.953999 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-ovnkube-script-lib\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.956366 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954007 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/df97fb0c-eb01-481b-ab26-0073456033cd-serviceca\") pod \"node-ca-vrvfp\" (UID: \"df97fb0c-eb01-481b-ab26-0073456033cd\") " pod="openshift-image-registry/node-ca-vrvfp" Apr 21 07:03:04.956366 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954032 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-766ml\" (UniqueName: \"kubernetes.io/projected/f00d904c-86da-4e00-801a-3bd1d7dbe5f4-kube-api-access-766ml\") pod \"network-check-target-4qpb2\" (UID: \"f00d904c-86da-4e00-801a-3bd1d7dbe5f4\") " pod="openshift-network-diagnostics/network-check-target-4qpb2" Apr 21 07:03:04.956366 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954051 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7k5zk\" (UniqueName: \"kubernetes.io/projected/d553e50a-31de-42be-99de-2bc791bca6e2-kube-api-access-7k5zk\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.956366 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954065 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-multus-cni-dir\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.956366 ip-10-0-137-163 
kubenswrapper[2581]: I0421 07:03:04.954068 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/140fdd65-e7b7-4a63-bcd0-c990e87edf65-agent-certs\") pod \"konnectivity-agent-nq67t\" (UID: \"140fdd65-e7b7-4a63-bcd0-c990e87edf65\") " pod="kube-system/konnectivity-agent-nq67t" Apr 21 07:03:04.956366 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954105 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/140fdd65-e7b7-4a63-bcd0-c990e87edf65-konnectivity-ca\") pod \"konnectivity-agent-nq67t\" (UID: \"140fdd65-e7b7-4a63-bcd0-c990e87edf65\") " pod="kube-system/konnectivity-agent-nq67t" Apr 21 07:03:04.956366 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954130 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-etc-sysctl-conf\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:04.956366 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954156 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df97fb0c-eb01-481b-ab26-0073456033cd-host\") pod \"node-ca-vrvfp\" (UID: \"df97fb0c-eb01-481b-ab26-0073456033cd\") " pod="openshift-image-registry/node-ca-vrvfp" Apr 21 07:03:04.956366 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954182 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbgfd\" (UniqueName: \"kubernetes.io/projected/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-kube-api-access-jbgfd\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.956366 ip-10-0-137-163 
kubenswrapper[2581]: I0421 07:03:04.954187 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-env-overrides\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.956366 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954207 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-etc-sysconfig\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:04.956366 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954236 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-etc-openvswitch\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.956366 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954236 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.956366 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954274 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-etc-sysctl-d\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " 
pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:04.957159 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954303 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlfvr\" (UniqueName: \"kubernetes.io/projected/86aeed30-b464-4b2f-a813-f3fb4f3e9998-kube-api-access-hlfvr\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:04.957159 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954330 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-os-release\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.957159 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954393 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-host-run-netns\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.957159 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954551 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-host-run-ovn-kubernetes\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.957159 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954596 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d6e8c99f-04e0-4c02-b29b-c5d5e6e76763-os-release\") pod \"multus-additional-cni-plugins-b76h9\" (UID: 
\"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763\") " pod="openshift-multus/multus-additional-cni-plugins-b76h9" Apr 21 07:03:04.957159 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954622 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d6e8c99f-04e0-4c02-b29b-c5d5e6e76763-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b76h9\" (UID: \"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763\") " pod="openshift-multus/multus-additional-cni-plugins-b76h9" Apr 21 07:03:04.957159 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954646 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d6e8c99f-04e0-4c02-b29b-c5d5e6e76763-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b76h9\" (UID: \"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763\") " pod="openshift-multus/multus-additional-cni-plugins-b76h9" Apr 21 07:03:04.957159 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954676 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-log-socket\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.957159 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954676 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-etc-sysctl-d\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:04.957159 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954698 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-run\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:04.957159 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954719 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-etc-kubernetes\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.957159 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954726 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-host-var-lib-cni-multus\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.957159 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.953834 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d6e8c99f-04e0-4c02-b29b-c5d5e6e76763-system-cni-dir\") pod \"multus-additional-cni-plugins-b76h9\" (UID: \"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763\") " pod="openshift-multus/multus-additional-cni-plugins-b76h9" Apr 21 07:03:04.957159 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954739 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-node-log\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.957159 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954781 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-etc-systemd\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:04.957159 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954789 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-systemd-units\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.957159 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954810 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d6e8c99f-04e0-4c02-b29b-c5d5e6e76763-cni-binary-copy\") pod \"multus-additional-cni-plugins-b76h9\" (UID: \"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763\") " pod="openshift-multus/multus-additional-cni-plugins-b76h9" Apr 21 07:03:04.957715 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954833 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-host-run-k8s-cni-cncf-io\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.957715 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954871 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-multus-conf-dir\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.957715 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954907 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0a0d5c34-e09f-40bc-8eec-7f880a3de770-iptables-alerter-script\") pod \"iptables-alerter-dr8n8\" (UID: \"0a0d5c34-e09f-40bc-8eec-7f880a3de770\") " pod="openshift-network-operator/iptables-alerter-dr8n8" Apr 21 07:03:04.957715 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.954276 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.957715 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.955210 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-os-release\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.957715 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.955257 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-host-run-netns\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.957715 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.955322 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-host-run-ovn-kubernetes\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.957715 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.955376 2581 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d6e8c99f-04e0-4c02-b29b-c5d5e6e76763-os-release\") pod \"multus-additional-cni-plugins-b76h9\" (UID: \"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763\") " pod="openshift-multus/multus-additional-cni-plugins-b76h9" Apr 21 07:03:04.957715 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.955491 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d6e8c99f-04e0-4c02-b29b-c5d5e6e76763-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b76h9\" (UID: \"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763\") " pod="openshift-multus/multus-additional-cni-plugins-b76h9" Apr 21 07:03:04.957715 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.955503 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/140fdd65-e7b7-4a63-bcd0-c990e87edf65-konnectivity-ca\") pod \"konnectivity-agent-nq67t\" (UID: \"140fdd65-e7b7-4a63-bcd0-c990e87edf65\") " pod="kube-system/konnectivity-agent-nq67t" Apr 21 07:03:04.957715 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.955853 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-etc-sysctl-conf\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:04.957715 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.955910 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-etc-sysconfig\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:04.957715 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.955940 2581 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d6e8c99f-04e0-4c02-b29b-c5d5e6e76763-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b76h9\" (UID: \"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763\") " pod="openshift-multus/multus-additional-cni-plugins-b76h9" Apr 21 07:03:04.957715 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.955950 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-node-log\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.957715 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.955994 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-log-socket\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.957715 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.956013 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-etc-systemd\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:04.957715 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.956042 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/86aeed30-b464-4b2f-a813-f3fb4f3e9998-run\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:04.958177 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.956086 2581 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-etc-kubernetes\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.958177 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.956131 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-host-run-k8s-cni-cncf-io\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.958177 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.956174 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d553e50a-31de-42be-99de-2bc791bca6e2-multus-conf-dir\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.958177 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.956257 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/df97fb0c-eb01-481b-ab26-0073456033cd-host\") pod \"node-ca-vrvfp\" (UID: \"df97fb0c-eb01-481b-ab26-0073456033cd\") " pod="openshift-image-registry/node-ca-vrvfp" Apr 21 07:03:04.958177 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.956456 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d6e8c99f-04e0-4c02-b29b-c5d5e6e76763-cni-binary-copy\") pod \"multus-additional-cni-plugins-b76h9\" (UID: \"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763\") " pod="openshift-multus/multus-additional-cni-plugins-b76h9" Apr 21 07:03:04.958177 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.957023 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/86aeed30-b464-4b2f-a813-f3fb4f3e9998-tmp\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:04.958177 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.957071 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/86aeed30-b464-4b2f-a813-f3fb4f3e9998-etc-tuned\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:04.958177 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.957282 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-ovn-node-metrics-cert\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.958177 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.957394 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/140fdd65-e7b7-4a63-bcd0-c990e87edf65-agent-certs\") pod \"konnectivity-agent-nq67t\" (UID: \"140fdd65-e7b7-4a63-bcd0-c990e87edf65\") " pod="kube-system/konnectivity-agent-nq67t" Apr 21 07:03:04.966257 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.966233 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qqh6\" (UniqueName: \"kubernetes.io/projected/49a3e211-6f6c-4501-878b-c01a12dfbbb1-kube-api-access-7qqh6\") pod \"aws-ebs-csi-driver-node-tw92r\" (UID: \"49a3e211-6f6c-4501-878b-c01a12dfbbb1\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r" Apr 21 07:03:04.967406 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:04.967385 2581 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:03:04.967406 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:04.967411 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:03:04.967605 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:04.967425 2581 projected.go:194] Error preparing data for projected volume kube-api-access-766ml for pod openshift-network-diagnostics/network-check-target-4qpb2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:03:04.967605 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:04.967505 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f00d904c-86da-4e00-801a-3bd1d7dbe5f4-kube-api-access-766ml podName:f00d904c-86da-4e00-801a-3bd1d7dbe5f4 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:05.467486052 +0000 UTC m=+2.086820123 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-766ml" (UniqueName: "kubernetes.io/projected/f00d904c-86da-4e00-801a-3bd1d7dbe5f4-kube-api-access-766ml") pod "network-check-target-4qpb2" (UID: "f00d904c-86da-4e00-801a-3bd1d7dbe5f4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:03:04.971610 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.971502 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-28j6s\" (UniqueName: \"kubernetes.io/projected/42b45a2d-c99c-40f2-97f6-2d31aff6854f-kube-api-access-28j6s\") pod \"network-metrics-daemon-r4v6n\" (UID: \"42b45a2d-c99c-40f2-97f6-2d31aff6854f\") " pod="openshift-multus/network-metrics-daemon-r4v6n" Apr 21 07:03:04.971610 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.971506 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66plz\" (UniqueName: \"kubernetes.io/projected/df97fb0c-eb01-481b-ab26-0073456033cd-kube-api-access-66plz\") pod \"node-ca-vrvfp\" (UID: \"df97fb0c-eb01-481b-ab26-0073456033cd\") " pod="openshift-image-registry/node-ca-vrvfp" Apr 21 07:03:04.971755 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.971627 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbgfd\" (UniqueName: \"kubernetes.io/projected/ff68b29a-db87-4ff2-882d-9f1e312dd5ce-kube-api-access-jbgfd\") pod \"ovnkube-node-7cnmr\" (UID: \"ff68b29a-db87-4ff2-882d-9f1e312dd5ce\") " pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:04.972218 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.972191 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k5zk\" (UniqueName: \"kubernetes.io/projected/d553e50a-31de-42be-99de-2bc791bca6e2-kube-api-access-7k5zk\") pod \"multus-mhtw2\" (UID: \"d553e50a-31de-42be-99de-2bc791bca6e2\") " 
pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.973581 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.972821 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvtpj\" (UniqueName: \"kubernetes.io/projected/d6e8c99f-04e0-4c02-b29b-c5d5e6e76763-kube-api-access-gvtpj\") pod \"multus-additional-cni-plugins-b76h9\" (UID: \"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763\") " pod="openshift-multus/multus-additional-cni-plugins-b76h9" Apr 21 07:03:04.973581 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.973145 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mhtw2" Apr 21 07:03:04.973581 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.973267 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlfvr\" (UniqueName: \"kubernetes.io/projected/86aeed30-b464-4b2f-a813-f3fb4f3e9998-kube-api-access-hlfvr\") pod \"tuned-4jl8x\" (UID: \"86aeed30-b464-4b2f-a813-f3fb4f3e9998\") " pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:04.973910 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:04.973875 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3db10428280db186de36082b1aff4988.slice/crio-676cd40cf7012e1b6dee1df9a5750d0e86684eefe6a1a8405f059d589628fcf4 WatchSource:0}: Error finding container 676cd40cf7012e1b6dee1df9a5750d0e86684eefe6a1a8405f059d589628fcf4: Status 404 returned error can't find the container with id 676cd40cf7012e1b6dee1df9a5750d0e86684eefe6a1a8405f059d589628fcf4 Apr 21 07:03:04.974203 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.974182 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6852\" (UniqueName: \"kubernetes.io/projected/21307e09-27b9-492e-ac26-b3d09e5794af-kube-api-access-k6852\") pod \"node-resolver-ptn2z\" (UID: \"21307e09-27b9-492e-ac26-b3d09e5794af\") " 
pod="openshift-dns/node-resolver-ptn2z" Apr 21 07:03:04.974333 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:04.974318 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb1b7fe43ed1fe0c8375d218f68d3580.slice/crio-0a33474fbbea28f73d0f14aea5337e5d971e3b176f0dbc2627ce05150d8318e2 WatchSource:0}: Error finding container 0a33474fbbea28f73d0f14aea5337e5d971e3b176f0dbc2627ce05150d8318e2: Status 404 returned error can't find the container with id 0a33474fbbea28f73d0f14aea5337e5d971e3b176f0dbc2627ce05150d8318e2 Apr 21 07:03:04.979672 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.979500 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nq67t" Apr 21 07:03:04.980066 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:04.980047 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 07:03:04.989345 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:04.989316 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod140fdd65_e7b7_4a63_bcd0_c990e87edf65.slice/crio-031a4e369e74b3c9252dbf249532c492573c3729224c4283c5147c435854ec5d WatchSource:0}: Error finding container 031a4e369e74b3c9252dbf249532c492573c3729224c4283c5147c435854ec5d: Status 404 returned error can't find the container with id 031a4e369e74b3c9252dbf249532c492573c3729224c4283c5147c435854ec5d Apr 21 07:03:05.055875 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:05.055784 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pvdzc\" (UniqueName: \"kubernetes.io/projected/0a0d5c34-e09f-40bc-8eec-7f880a3de770-kube-api-access-pvdzc\") pod \"iptables-alerter-dr8n8\" (UID: \"0a0d5c34-e09f-40bc-8eec-7f880a3de770\") " pod="openshift-network-operator/iptables-alerter-dr8n8" Apr 21 07:03:05.056039 
ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:05.055984 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0a0d5c34-e09f-40bc-8eec-7f880a3de770-iptables-alerter-script\") pod \"iptables-alerter-dr8n8\" (UID: \"0a0d5c34-e09f-40bc-8eec-7f880a3de770\") " pod="openshift-network-operator/iptables-alerter-dr8n8" Apr 21 07:03:05.056101 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:05.056017 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0a0d5c34-e09f-40bc-8eec-7f880a3de770-host-slash\") pod \"iptables-alerter-dr8n8\" (UID: \"0a0d5c34-e09f-40bc-8eec-7f880a3de770\") " pod="openshift-network-operator/iptables-alerter-dr8n8" Apr 21 07:03:05.056219 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:05.056191 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0a0d5c34-e09f-40bc-8eec-7f880a3de770-host-slash\") pod \"iptables-alerter-dr8n8\" (UID: \"0a0d5c34-e09f-40bc-8eec-7f880a3de770\") " pod="openshift-network-operator/iptables-alerter-dr8n8" Apr 21 07:03:05.056618 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:05.056596 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0a0d5c34-e09f-40bc-8eec-7f880a3de770-iptables-alerter-script\") pod \"iptables-alerter-dr8n8\" (UID: \"0a0d5c34-e09f-40bc-8eec-7f880a3de770\") " pod="openshift-network-operator/iptables-alerter-dr8n8" Apr 21 07:03:05.066092 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:05.066070 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvdzc\" (UniqueName: \"kubernetes.io/projected/0a0d5c34-e09f-40bc-8eec-7f880a3de770-kube-api-access-pvdzc\") pod \"iptables-alerter-dr8n8\" (UID: \"0a0d5c34-e09f-40bc-8eec-7f880a3de770\") " 
pod="openshift-network-operator/iptables-alerter-dr8n8" Apr 21 07:03:05.162217 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:05.162180 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:05.169058 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:05.169023 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff68b29a_db87_4ff2_882d_9f1e312dd5ce.slice/crio-9d2cf96ab3a5a5b770ed2195012a328527a39a576e7ce78ddcddc09dfcd709be WatchSource:0}: Error finding container 9d2cf96ab3a5a5b770ed2195012a328527a39a576e7ce78ddcddc09dfcd709be: Status 404 returned error can't find the container with id 9d2cf96ab3a5a5b770ed2195012a328527a39a576e7ce78ddcddc09dfcd709be Apr 21 07:03:05.173189 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:05.173147 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r" Apr 21 07:03:05.179892 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:05.179860 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49a3e211_6f6c_4501_878b_c01a12dfbbb1.slice/crio-1a0eda009586fbb0f5391a1ff725f48f3a68843fbc2d93130cbc0360ccdbe99e WatchSource:0}: Error finding container 1a0eda009586fbb0f5391a1ff725f48f3a68843fbc2d93130cbc0360ccdbe99e: Status 404 returned error can't find the container with id 1a0eda009586fbb0f5391a1ff725f48f3a68843fbc2d93130cbc0360ccdbe99e Apr 21 07:03:05.205907 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:05.205877 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" Apr 21 07:03:05.207510 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:05.207496 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-ptn2z" Apr 21 07:03:05.212386 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:05.212359 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86aeed30_b464_4b2f_a813_f3fb4f3e9998.slice/crio-63f1900f96ac0012533602d73e85b2b7d0a2ba8bd0ad57c5827be732d6d5eca4 WatchSource:0}: Error finding container 63f1900f96ac0012533602d73e85b2b7d0a2ba8bd0ad57c5827be732d6d5eca4: Status 404 returned error can't find the container with id 63f1900f96ac0012533602d73e85b2b7d0a2ba8bd0ad57c5827be732d6d5eca4 Apr 21 07:03:05.214665 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:05.214637 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21307e09_27b9_492e_ac26_b3d09e5794af.slice/crio-72b96c347d6add93a04732793aaa20dd3f35ea595d621642abe48d8c562bf7a7 WatchSource:0}: Error finding container 72b96c347d6add93a04732793aaa20dd3f35ea595d621642abe48d8c562bf7a7: Status 404 returned error can't find the container with id 72b96c347d6add93a04732793aaa20dd3f35ea595d621642abe48d8c562bf7a7 Apr 21 07:03:05.219420 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:05.219403 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-vrvfp" Apr 21 07:03:05.225642 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:05.225617 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf97fb0c_eb01_481b_ab26_0073456033cd.slice/crio-6c8423bd8f27edb903810be50b027de633129b1afde6aba73f19924ff33a7198 WatchSource:0}: Error finding container 6c8423bd8f27edb903810be50b027de633129b1afde6aba73f19924ff33a7198: Status 404 returned error can't find the container with id 6c8423bd8f27edb903810be50b027de633129b1afde6aba73f19924ff33a7198 Apr 21 07:03:05.241929 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:05.241905 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b76h9" Apr 21 07:03:05.249381 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:05.249355 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6e8c99f_04e0_4c02_b29b_c5d5e6e76763.slice/crio-015f3b6991b35252375ae971532f74e074267db6b7ead8faf39441d15958fe9e WatchSource:0}: Error finding container 015f3b6991b35252375ae971532f74e074267db6b7ead8faf39441d15958fe9e: Status 404 returned error can't find the container with id 015f3b6991b35252375ae971532f74e074267db6b7ead8faf39441d15958fe9e Apr 21 07:03:05.290333 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:05.290300 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-dr8n8" Apr 21 07:03:05.298269 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:05.298235 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a0d5c34_e09f_40bc_8eec_7f880a3de770.slice/crio-bd01d8bcaa2ef3116d7208fe72643c23606e374552c9ac19ba89c306812631df WatchSource:0}: Error finding container bd01d8bcaa2ef3116d7208fe72643c23606e374552c9ac19ba89c306812631df: Status 404 returned error can't find the container with id bd01d8bcaa2ef3116d7208fe72643c23606e374552c9ac19ba89c306812631df Apr 21 07:03:05.459620 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:05.459505 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42b45a2d-c99c-40f2-97f6-2d31aff6854f-metrics-certs\") pod \"network-metrics-daemon-r4v6n\" (UID: \"42b45a2d-c99c-40f2-97f6-2d31aff6854f\") " pod="openshift-multus/network-metrics-daemon-r4v6n" Apr 21 07:03:05.459957 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:05.459647 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:03:05.459957 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:05.459729 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b45a2d-c99c-40f2-97f6-2d31aff6854f-metrics-certs podName:42b45a2d-c99c-40f2-97f6-2d31aff6854f nodeName:}" failed. No retries permitted until 2026-04-21 07:03:06.459704803 +0000 UTC m=+3.079038874 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42b45a2d-c99c-40f2-97f6-2d31aff6854f-metrics-certs") pod "network-metrics-daemon-r4v6n" (UID: "42b45a2d-c99c-40f2-97f6-2d31aff6854f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:03:05.560905 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:05.560135 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-766ml\" (UniqueName: \"kubernetes.io/projected/f00d904c-86da-4e00-801a-3bd1d7dbe5f4-kube-api-access-766ml\") pod \"network-check-target-4qpb2\" (UID: \"f00d904c-86da-4e00-801a-3bd1d7dbe5f4\") " pod="openshift-network-diagnostics/network-check-target-4qpb2" Apr 21 07:03:05.560905 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:05.560323 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:03:05.560905 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:05.560346 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:03:05.560905 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:05.560358 2581 projected.go:194] Error preparing data for projected volume kube-api-access-766ml for pod openshift-network-diagnostics/network-check-target-4qpb2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:03:05.560905 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:05.560416 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f00d904c-86da-4e00-801a-3bd1d7dbe5f4-kube-api-access-766ml podName:f00d904c-86da-4e00-801a-3bd1d7dbe5f4 nodeName:}" failed. 
No retries permitted until 2026-04-21 07:03:06.560395317 +0000 UTC m=+3.179729392 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-766ml" (UniqueName: "kubernetes.io/projected/f00d904c-86da-4e00-801a-3bd1d7dbe5f4-kube-api-access-766ml") pod "network-check-target-4qpb2" (UID: "f00d904c-86da-4e00-801a-3bd1d7dbe5f4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:03:05.594622 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:05.594551 2581 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:03:05.648172 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:05.648136 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:03:05.681840 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:05.681807 2581 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:03:05.901773 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:05.901672 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 06:58:04 +0000 UTC" deadline="2028-01-08 04:23:02.48626795 +0000 UTC" Apr 21 07:03:05.901773 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:05.901718 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15045h19m56.58455347s" Apr 21 07:03:05.977783 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:05.977749 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r4v6n" Apr 21 07:03:05.977971 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:05.977895 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r4v6n" podUID="42b45a2d-c99c-40f2-97f6-2d31aff6854f" Apr 21 07:03:06.002351 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:06.002256 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vrvfp" event={"ID":"df97fb0c-eb01-481b-ab26-0073456033cd","Type":"ContainerStarted","Data":"6c8423bd8f27edb903810be50b027de633129b1afde6aba73f19924ff33a7198"} Apr 21 07:03:06.005458 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:06.005420 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ptn2z" event={"ID":"21307e09-27b9-492e-ac26-b3d09e5794af","Type":"ContainerStarted","Data":"72b96c347d6add93a04732793aaa20dd3f35ea595d621642abe48d8c562bf7a7"} Apr 21 07:03:06.013214 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:06.013176 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" event={"ID":"86aeed30-b464-4b2f-a813-f3fb4f3e9998","Type":"ContainerStarted","Data":"63f1900f96ac0012533602d73e85b2b7d0a2ba8bd0ad57c5827be732d6d5eca4"} Apr 21 07:03:06.025926 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:06.025855 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" event={"ID":"ff68b29a-db87-4ff2-882d-9f1e312dd5ce","Type":"ContainerStarted","Data":"9d2cf96ab3a5a5b770ed2195012a328527a39a576e7ce78ddcddc09dfcd709be"} Apr 21 07:03:06.028199 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:06.028132 2581 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nq67t" event={"ID":"140fdd65-e7b7-4a63-bcd0-c990e87edf65","Type":"ContainerStarted","Data":"031a4e369e74b3c9252dbf249532c492573c3729224c4283c5147c435854ec5d"} Apr 21 07:03:06.032751 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:06.032672 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mhtw2" event={"ID":"d553e50a-31de-42be-99de-2bc791bca6e2","Type":"ContainerStarted","Data":"edfe899b4af8f29294ae1ea8280f42bae53043d3f665438663cf396602830d58"} Apr 21 07:03:06.046859 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:06.046819 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-163.ec2.internal" event={"ID":"bb1b7fe43ed1fe0c8375d218f68d3580","Type":"ContainerStarted","Data":"0a33474fbbea28f73d0f14aea5337e5d971e3b176f0dbc2627ce05150d8318e2"} Apr 21 07:03:06.065769 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:06.065690 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-163.ec2.internal" event={"ID":"3db10428280db186de36082b1aff4988","Type":"ContainerStarted","Data":"676cd40cf7012e1b6dee1df9a5750d0e86684eefe6a1a8405f059d589628fcf4"} Apr 21 07:03:06.078706 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:06.078587 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dr8n8" event={"ID":"0a0d5c34-e09f-40bc-8eec-7f880a3de770","Type":"ContainerStarted","Data":"bd01d8bcaa2ef3116d7208fe72643c23606e374552c9ac19ba89c306812631df"} Apr 21 07:03:06.085809 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:06.085736 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b76h9" event={"ID":"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763","Type":"ContainerStarted","Data":"015f3b6991b35252375ae971532f74e074267db6b7ead8faf39441d15958fe9e"} Apr 21 
07:03:06.097360 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:06.097236 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r" event={"ID":"49a3e211-6f6c-4501-878b-c01a12dfbbb1","Type":"ContainerStarted","Data":"1a0eda009586fbb0f5391a1ff725f48f3a68843fbc2d93130cbc0360ccdbe99e"} Apr 21 07:03:06.466920 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:06.466878 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42b45a2d-c99c-40f2-97f6-2d31aff6854f-metrics-certs\") pod \"network-metrics-daemon-r4v6n\" (UID: \"42b45a2d-c99c-40f2-97f6-2d31aff6854f\") " pod="openshift-multus/network-metrics-daemon-r4v6n" Apr 21 07:03:06.467168 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:06.467040 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:03:06.467168 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:06.467110 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b45a2d-c99c-40f2-97f6-2d31aff6854f-metrics-certs podName:42b45a2d-c99c-40f2-97f6-2d31aff6854f nodeName:}" failed. No retries permitted until 2026-04-21 07:03:08.467090605 +0000 UTC m=+5.086424697 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42b45a2d-c99c-40f2-97f6-2d31aff6854f-metrics-certs") pod "network-metrics-daemon-r4v6n" (UID: "42b45a2d-c99c-40f2-97f6-2d31aff6854f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:03:06.568031 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:06.567990 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-766ml\" (UniqueName: \"kubernetes.io/projected/f00d904c-86da-4e00-801a-3bd1d7dbe5f4-kube-api-access-766ml\") pod \"network-check-target-4qpb2\" (UID: \"f00d904c-86da-4e00-801a-3bd1d7dbe5f4\") " pod="openshift-network-diagnostics/network-check-target-4qpb2" Apr 21 07:03:06.568227 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:06.568158 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:03:06.568227 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:06.568178 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:03:06.568227 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:06.568191 2581 projected.go:194] Error preparing data for projected volume kube-api-access-766ml for pod openshift-network-diagnostics/network-check-target-4qpb2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:03:06.568376 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:06.568251 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f00d904c-86da-4e00-801a-3bd1d7dbe5f4-kube-api-access-766ml podName:f00d904c-86da-4e00-801a-3bd1d7dbe5f4 nodeName:}" failed. 
No retries permitted until 2026-04-21 07:03:08.568231813 +0000 UTC m=+5.187565888 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-766ml" (UniqueName: "kubernetes.io/projected/f00d904c-86da-4e00-801a-3bd1d7dbe5f4-kube-api-access-766ml") pod "network-check-target-4qpb2" (UID: "f00d904c-86da-4e00-801a-3bd1d7dbe5f4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:03:06.903268 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:06.903168 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 06:58:04 +0000 UTC" deadline="2028-01-16 10:02:30.83115688 +0000 UTC" Apr 21 07:03:06.903268 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:06.903212 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15242h59m23.927949062s" Apr 21 07:03:06.974606 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:06.974518 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qpb2" Apr 21 07:03:06.974799 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:06.974671 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qpb2" podUID="f00d904c-86da-4e00-801a-3bd1d7dbe5f4" Apr 21 07:03:07.979969 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:07.979934 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r4v6n"
Apr 21 07:03:07.980401 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:07.980081 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r4v6n" podUID="42b45a2d-c99c-40f2-97f6-2d31aff6854f"
Apr 21 07:03:08.482927 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:08.482870 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42b45a2d-c99c-40f2-97f6-2d31aff6854f-metrics-certs\") pod \"network-metrics-daemon-r4v6n\" (UID: \"42b45a2d-c99c-40f2-97f6-2d31aff6854f\") " pod="openshift-multus/network-metrics-daemon-r4v6n"
Apr 21 07:03:08.483221 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:08.483081 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:03:08.483221 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:08.483191 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b45a2d-c99c-40f2-97f6-2d31aff6854f-metrics-certs podName:42b45a2d-c99c-40f2-97f6-2d31aff6854f nodeName:}" failed. No retries permitted until 2026-04-21 07:03:12.483169732 +0000 UTC m=+9.102503803 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42b45a2d-c99c-40f2-97f6-2d31aff6854f-metrics-certs") pod "network-metrics-daemon-r4v6n" (UID: "42b45a2d-c99c-40f2-97f6-2d31aff6854f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:03:08.583992 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:08.583957 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-766ml\" (UniqueName: \"kubernetes.io/projected/f00d904c-86da-4e00-801a-3bd1d7dbe5f4-kube-api-access-766ml\") pod \"network-check-target-4qpb2\" (UID: \"f00d904c-86da-4e00-801a-3bd1d7dbe5f4\") " pod="openshift-network-diagnostics/network-check-target-4qpb2"
Apr 21 07:03:08.584170 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:08.584088 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 07:03:08.584170 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:08.584111 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 07:03:08.584170 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:08.584120 2581 projected.go:194] Error preparing data for projected volume kube-api-access-766ml for pod openshift-network-diagnostics/network-check-target-4qpb2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:03:08.584170 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:08.584170 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f00d904c-86da-4e00-801a-3bd1d7dbe5f4-kube-api-access-766ml podName:f00d904c-86da-4e00-801a-3bd1d7dbe5f4 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:12.584156734 +0000 UTC m=+9.203490802 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-766ml" (UniqueName: "kubernetes.io/projected/f00d904c-86da-4e00-801a-3bd1d7dbe5f4-kube-api-access-766ml") pod "network-check-target-4qpb2" (UID: "f00d904c-86da-4e00-801a-3bd1d7dbe5f4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:03:08.975423 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:08.974890 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qpb2"
Apr 21 07:03:08.975423 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:08.975013 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qpb2" podUID="f00d904c-86da-4e00-801a-3bd1d7dbe5f4"
Apr 21 07:03:09.978530 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:09.978497 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r4v6n"
Apr 21 07:03:09.979191 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:09.978652 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-r4v6n" podUID="42b45a2d-c99c-40f2-97f6-2d31aff6854f"
Apr 21 07:03:10.974833 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:10.974791 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qpb2"
Apr 21 07:03:10.975073 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:10.974933 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qpb2" podUID="f00d904c-86da-4e00-801a-3bd1d7dbe5f4"
Apr 21 07:03:11.975844 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:11.975798 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r4v6n"
Apr 21 07:03:11.976394 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:11.975960 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r4v6n" podUID="42b45a2d-c99c-40f2-97f6-2d31aff6854f"
Apr 21 07:03:12.517648 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:12.517600 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42b45a2d-c99c-40f2-97f6-2d31aff6854f-metrics-certs\") pod \"network-metrics-daemon-r4v6n\" (UID: \"42b45a2d-c99c-40f2-97f6-2d31aff6854f\") " pod="openshift-multus/network-metrics-daemon-r4v6n"
Apr 21 07:03:12.517913 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:12.517774 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:03:12.517913 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:12.517852 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b45a2d-c99c-40f2-97f6-2d31aff6854f-metrics-certs podName:42b45a2d-c99c-40f2-97f6-2d31aff6854f nodeName:}" failed. No retries permitted until 2026-04-21 07:03:20.517832366 +0000 UTC m=+17.137166448 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42b45a2d-c99c-40f2-97f6-2d31aff6854f-metrics-certs") pod "network-metrics-daemon-r4v6n" (UID: "42b45a2d-c99c-40f2-97f6-2d31aff6854f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:03:12.619144 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:12.618493 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-766ml\" (UniqueName: \"kubernetes.io/projected/f00d904c-86da-4e00-801a-3bd1d7dbe5f4-kube-api-access-766ml\") pod \"network-check-target-4qpb2\" (UID: \"f00d904c-86da-4e00-801a-3bd1d7dbe5f4\") " pod="openshift-network-diagnostics/network-check-target-4qpb2"
Apr 21 07:03:12.619144 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:12.618683 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 07:03:12.619144 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:12.618702 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 07:03:12.619144 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:12.618715 2581 projected.go:194] Error preparing data for projected volume kube-api-access-766ml for pod openshift-network-diagnostics/network-check-target-4qpb2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:03:12.619144 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:12.618780 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f00d904c-86da-4e00-801a-3bd1d7dbe5f4-kube-api-access-766ml podName:f00d904c-86da-4e00-801a-3bd1d7dbe5f4 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:20.618758554 +0000 UTC m=+17.238092648 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-766ml" (UniqueName: "kubernetes.io/projected/f00d904c-86da-4e00-801a-3bd1d7dbe5f4-kube-api-access-766ml") pod "network-check-target-4qpb2" (UID: "f00d904c-86da-4e00-801a-3bd1d7dbe5f4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:03:12.975447 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:12.975349 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qpb2"
Apr 21 07:03:12.975655 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:12.975490 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qpb2" podUID="f00d904c-86da-4e00-801a-3bd1d7dbe5f4"
Apr 21 07:03:13.976098 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:13.976058 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r4v6n"
Apr 21 07:03:13.976595 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:13.976189 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-r4v6n" podUID="42b45a2d-c99c-40f2-97f6-2d31aff6854f"
Apr 21 07:03:14.975611 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:14.975083 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qpb2"
Apr 21 07:03:14.975611 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:14.975202 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qpb2" podUID="f00d904c-86da-4e00-801a-3bd1d7dbe5f4"
Apr 21 07:03:15.975097 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:15.975060 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r4v6n"
Apr 21 07:03:15.975608 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:15.975187 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r4v6n" podUID="42b45a2d-c99c-40f2-97f6-2d31aff6854f"
Apr 21 07:03:16.975191 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:16.975154 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qpb2"
Apr 21 07:03:16.975725 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:16.975312 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qpb2" podUID="f00d904c-86da-4e00-801a-3bd1d7dbe5f4"
Apr 21 07:03:17.974662 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:17.974617 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r4v6n"
Apr 21 07:03:17.974865 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:17.974749 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r4v6n" podUID="42b45a2d-c99c-40f2-97f6-2d31aff6854f"
Apr 21 07:03:18.974531 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:18.974492 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qpb2"
Apr 21 07:03:18.974963 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:18.974651 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qpb2" podUID="f00d904c-86da-4e00-801a-3bd1d7dbe5f4"
Apr 21 07:03:19.975416 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:19.975126 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r4v6n"
Apr 21 07:03:19.975905 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:19.975578 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r4v6n" podUID="42b45a2d-c99c-40f2-97f6-2d31aff6854f"
Apr 21 07:03:20.580905 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:20.580864 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42b45a2d-c99c-40f2-97f6-2d31aff6854f-metrics-certs\") pod \"network-metrics-daemon-r4v6n\" (UID: \"42b45a2d-c99c-40f2-97f6-2d31aff6854f\") " pod="openshift-multus/network-metrics-daemon-r4v6n"
Apr 21 07:03:20.581106 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:20.581033 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:03:20.581157 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:20.581123 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b45a2d-c99c-40f2-97f6-2d31aff6854f-metrics-certs podName:42b45a2d-c99c-40f2-97f6-2d31aff6854f nodeName:}" failed. No retries permitted until 2026-04-21 07:03:36.581097978 +0000 UTC m=+33.200432050 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42b45a2d-c99c-40f2-97f6-2d31aff6854f-metrics-certs") pod "network-metrics-daemon-r4v6n" (UID: "42b45a2d-c99c-40f2-97f6-2d31aff6854f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:03:20.681474 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:20.681433 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-766ml\" (UniqueName: \"kubernetes.io/projected/f00d904c-86da-4e00-801a-3bd1d7dbe5f4-kube-api-access-766ml\") pod \"network-check-target-4qpb2\" (UID: \"f00d904c-86da-4e00-801a-3bd1d7dbe5f4\") " pod="openshift-network-diagnostics/network-check-target-4qpb2"
Apr 21 07:03:20.681690 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:20.681621 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 07:03:20.681690 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:20.681644 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 07:03:20.681690 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:20.681657 2581 projected.go:194] Error preparing data for projected volume kube-api-access-766ml for pod openshift-network-diagnostics/network-check-target-4qpb2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:03:20.681853 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:20.681723 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f00d904c-86da-4e00-801a-3bd1d7dbe5f4-kube-api-access-766ml podName:f00d904c-86da-4e00-801a-3bd1d7dbe5f4 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:36.681702555 +0000 UTC m=+33.301036624 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-766ml" (UniqueName: "kubernetes.io/projected/f00d904c-86da-4e00-801a-3bd1d7dbe5f4-kube-api-access-766ml") pod "network-check-target-4qpb2" (UID: "f00d904c-86da-4e00-801a-3bd1d7dbe5f4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:03:20.975280 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:20.975240 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qpb2"
Apr 21 07:03:20.975497 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:20.975378 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qpb2" podUID="f00d904c-86da-4e00-801a-3bd1d7dbe5f4"
Apr 21 07:03:21.975062 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:21.975021 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r4v6n"
Apr 21 07:03:21.975244 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:21.975178 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r4v6n" podUID="42b45a2d-c99c-40f2-97f6-2d31aff6854f"
Apr 21 07:03:22.975134 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:22.975086 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qpb2"
Apr 21 07:03:22.975591 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:22.975278 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qpb2" podUID="f00d904c-86da-4e00-801a-3bd1d7dbe5f4"
Apr 21 07:03:23.975378 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:23.975176 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r4v6n"
Apr 21 07:03:23.976241 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:23.975463 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-r4v6n" podUID="42b45a2d-c99c-40f2-97f6-2d31aff6854f"
Apr 21 07:03:24.143797 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:24.143761 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" event={"ID":"86aeed30-b464-4b2f-a813-f3fb4f3e9998","Type":"ContainerStarted","Data":"ea4801625908552314af8cab495e779be31381fa15d099b8a832b3a897a00ea9"}
Apr 21 07:03:24.155663 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:24.155628 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" event={"ID":"ff68b29a-db87-4ff2-882d-9f1e312dd5ce","Type":"ContainerStarted","Data":"0efb7d243f7794336253f070284138256855326b805b2bcdb9e91fac516fbb42"}
Apr 21 07:03:24.155798 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:24.155682 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" event={"ID":"ff68b29a-db87-4ff2-882d-9f1e312dd5ce","Type":"ContainerStarted","Data":"1fbacff5a392b61df204cde6db37827e7884b8b0dc67f33a6816b2b212d22260"}
Apr 21 07:03:24.155798 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:24.155698 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" event={"ID":"ff68b29a-db87-4ff2-882d-9f1e312dd5ce","Type":"ContainerStarted","Data":"dc614aee23455e4f6275130ace85c0a161696331aef2635d85344d1e5f829859"}
Apr 21 07:03:24.155798 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:24.155711 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" event={"ID":"ff68b29a-db87-4ff2-882d-9f1e312dd5ce","Type":"ContainerStarted","Data":"ba00f7b80f21b6e3a1b4f6b74a37f6c611a039a382dd4e148bbb64d772f50ca9"}
Apr 21 07:03:24.157907 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:24.157804 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mhtw2" event={"ID":"d553e50a-31de-42be-99de-2bc791bca6e2","Type":"ContainerStarted","Data":"10f4f1deef2419a0d9dbca7a79652a9ba7042d7ab060e9467904371ec401effb"}
Apr 21 07:03:24.159594 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:24.159456 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-4jl8x" podStartSLOduration=1.870771452 podStartE2EDuration="20.159440681s" podCreationTimestamp="2026-04-21 07:03:04 +0000 UTC" firstStartedPulling="2026-04-21 07:03:05.214012526 +0000 UTC m=+1.833346594" lastFinishedPulling="2026-04-21 07:03:23.502681755 +0000 UTC m=+20.122015823" observedRunningTime="2026-04-21 07:03:24.159376484 +0000 UTC m=+20.778710575" watchObservedRunningTime="2026-04-21 07:03:24.159440681 +0000 UTC m=+20.778774771"
Apr 21 07:03:24.161134 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:24.161106 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-163.ec2.internal" event={"ID":"bb1b7fe43ed1fe0c8375d218f68d3580","Type":"ContainerStarted","Data":"0a37b52d68ae3e994e8221c8adee62bf5de4c2a2c77a241aeb2ccdb5bc169b33"}
Apr 21 07:03:24.177036 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:24.176903 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mhtw2" podStartSLOduration=1.574766479 podStartE2EDuration="20.176887165s" podCreationTimestamp="2026-04-21 07:03:04 +0000 UTC" firstStartedPulling="2026-04-21 07:03:04.986250596 +0000 UTC m=+1.605584679" lastFinishedPulling="2026-04-21 07:03:23.588371298 +0000 UTC m=+20.207705365" observedRunningTime="2026-04-21 07:03:24.17632234 +0000 UTC m=+20.795656431" watchObservedRunningTime="2026-04-21 07:03:24.176887165 +0000 UTC m=+20.796221254"
Apr 21 07:03:24.193048 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:24.193003 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-163.ec2.internal" podStartSLOduration=20.19298537 podStartE2EDuration="20.19298537s" podCreationTimestamp="2026-04-21 07:03:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:03:24.19239452 +0000 UTC m=+20.811728610" watchObservedRunningTime="2026-04-21 07:03:24.19298537 +0000 UTC m=+20.812319460"
Apr 21 07:03:24.974540 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:24.974353 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qpb2"
Apr 21 07:03:24.974701 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:24.974628 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qpb2" podUID="f00d904c-86da-4e00-801a-3bd1d7dbe5f4"
Apr 21 07:03:25.164895 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:25.164860 2581 generic.go:358] "Generic (PLEG): container finished" podID="3db10428280db186de36082b1aff4988" containerID="8c6fa0d131a4bba089e0a6ea8d3aed675892b66e4b9773cb3c2fd55eb20c8d97" exitCode=0
Apr 21 07:03:25.165450 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:25.164943 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-163.ec2.internal" event={"ID":"3db10428280db186de36082b1aff4988","Type":"ContainerDied","Data":"8c6fa0d131a4bba089e0a6ea8d3aed675892b66e4b9773cb3c2fd55eb20c8d97"}
Apr 21 07:03:25.166626 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:25.166598 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-dr8n8" event={"ID":"0a0d5c34-e09f-40bc-8eec-7f880a3de770","Type":"ContainerStarted","Data":"f583261c8abfa433179ca38782160c3ad2b0d0826caf8f8473aa2fd3c58c2639"}
Apr 21 07:03:25.168031 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:25.168003 2581 generic.go:358] "Generic (PLEG): container finished" podID="d6e8c99f-04e0-4c02-b29b-c5d5e6e76763" containerID="020b2f83cf3259c0f10ed942038ce9d70dcd353edbee6366f9592b25c52ef3c2" exitCode=0
Apr 21 07:03:25.168160 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:25.168083 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b76h9" event={"ID":"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763","Type":"ContainerDied","Data":"020b2f83cf3259c0f10ed942038ce9d70dcd353edbee6366f9592b25c52ef3c2"}
Apr 21 07:03:25.169531 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:25.169422 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r" event={"ID":"49a3e211-6f6c-4501-878b-c01a12dfbbb1","Type":"ContainerStarted","Data":"59e120ff70a5a1611bd24c8faee2a2f0499ed731a45d20c01aa11fefc1dcfd08"}
Apr 21 07:03:25.170906 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:25.170879 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vrvfp" event={"ID":"df97fb0c-eb01-481b-ab26-0073456033cd","Type":"ContainerStarted","Data":"0d285ec195ac8537863245e7f20835f45a08ad2c88985b31a0beb89cf8d804e9"}
Apr 21 07:03:25.172318 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:25.172287 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ptn2z" event={"ID":"21307e09-27b9-492e-ac26-b3d09e5794af","Type":"ContainerStarted","Data":"23d53bff54c6a8212811a5b83ac1c9f1a6a892b9ac4e69990d6d2b27caf263d4"}
Apr 21 07:03:25.175263 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:25.175241 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" event={"ID":"ff68b29a-db87-4ff2-882d-9f1e312dd5ce","Type":"ContainerStarted","Data":"7ea8596d64f860a4064e6c3f5dfc6757b9b2f10d959e813455d2ff0db981720e"}
Apr 21 07:03:25.175263 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:25.175268 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" event={"ID":"ff68b29a-db87-4ff2-882d-9f1e312dd5ce","Type":"ContainerStarted","Data":"ddaced7c2a6253a0ffaf5cf2136216c9684a98d89c6ac46fd6e08a5b390794c4"}
Apr 21 07:03:25.178319 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:25.178283 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nq67t" event={"ID":"140fdd65-e7b7-4a63-bcd0-c990e87edf65","Type":"ContainerStarted","Data":"555267ad9635dadd91441ebd024406b7082d05ba0b3796f9c7e2779ecccbf1fe"}
Apr 21 07:03:25.196401 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:25.196346 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-ptn2z" podStartSLOduration=2.910691504 podStartE2EDuration="21.196325419s" podCreationTimestamp="2026-04-21 07:03:04 +0000 UTC" firstStartedPulling="2026-04-21 07:03:05.217104083 +0000 UTC m=+1.836438151" lastFinishedPulling="2026-04-21 07:03:23.502737995 +0000 UTC m=+20.122072066" observedRunningTime="2026-04-21 07:03:25.196135971 +0000 UTC m=+21.815470061" watchObservedRunningTime="2026-04-21 07:03:25.196325419 +0000 UTC m=+21.815659510"
Apr 21 07:03:25.238929 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:25.238834 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vrvfp" podStartSLOduration=2.892002439 podStartE2EDuration="21.238818324s" podCreationTimestamp="2026-04-21 07:03:04 +0000 UTC" firstStartedPulling="2026-04-21 07:03:05.227403482 +0000 UTC m=+1.846737550" lastFinishedPulling="2026-04-21 07:03:23.574219352 +0000 UTC m=+20.193553435" observedRunningTime="2026-04-21 07:03:25.238507706 +0000 UTC m=+21.857841797" watchObservedRunningTime="2026-04-21 07:03:25.238818324 +0000 UTC m=+21.858152413"
Apr 21 07:03:25.255580 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:25.255507 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-nq67t" podStartSLOduration=2.717449753 podStartE2EDuration="21.255490977s" podCreationTimestamp="2026-04-21 07:03:04 +0000 UTC" firstStartedPulling="2026-04-21 07:03:04.991655762 +0000 UTC m=+1.610989833" lastFinishedPulling="2026-04-21 07:03:23.529696986 +0000 UTC m=+20.149031057" observedRunningTime="2026-04-21 07:03:25.255159139 +0000 UTC m=+21.874493231" watchObservedRunningTime="2026-04-21 07:03:25.255490977 +0000 UTC m=+21.874825066"
Apr 21 07:03:25.275958 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:25.275899 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-dr8n8" podStartSLOduration=3.000470871 podStartE2EDuration="21.275884005s" podCreationTimestamp="2026-04-21 07:03:04 +0000 UTC" firstStartedPulling="2026-04-21 07:03:05.299959912 +0000 UTC m=+1.919293980" lastFinishedPulling="2026-04-21 07:03:23.575373045 +0000 UTC m=+20.194707114" observedRunningTime="2026-04-21 07:03:25.27552565 +0000 UTC m=+21.894859751" watchObservedRunningTime="2026-04-21 07:03:25.275884005 +0000 UTC m=+21.895218097"
Apr 21 07:03:25.497817 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:25.497781 2581 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 21 07:03:25.912715 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:25.912592 2581 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T07:03:25.497811411Z","UUID":"3a5d7751-40ff-4036-bb69-d5a5ab4f135a","Handler":null,"Name":"","Endpoint":""}
Apr 21 07:03:25.914941 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:25.914918 2581 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 21 07:03:25.914941 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:25.914946 2581 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 21 07:03:25.975185 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:25.975155 2581 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r4v6n"
Apr 21 07:03:25.975377 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:25.975289 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r4v6n" podUID="42b45a2d-c99c-40f2-97f6-2d31aff6854f"
Apr 21 07:03:26.182737 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:26.182653 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r" event={"ID":"49a3e211-6f6c-4501-878b-c01a12dfbbb1","Type":"ContainerStarted","Data":"5293b31edef5f103ea6951f6954d94e856bf4539dc036c3842a63c9e987fb135"}
Apr 21 07:03:26.184978 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:26.184943 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-163.ec2.internal" event={"ID":"3db10428280db186de36082b1aff4988","Type":"ContainerStarted","Data":"9da781fb84bf6da182b8f83d63717015788768d181ee088013a63a308e4e77f6"}
Apr 21 07:03:26.200958 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:26.200896 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-163.ec2.internal" podStartSLOduration=22.200876182 podStartE2EDuration="22.200876182s" podCreationTimestamp="2026-04-21 07:03:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:03:26.200401255 +0000 UTC m=+22.819735347" watchObservedRunningTime="2026-04-21 07:03:26.200876182 +0000 UTC m=+22.820210297"
Apr 21 07:03:26.975159 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:26.975072 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qpb2"
Apr 21 07:03:26.975311 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:26.975242 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qpb2" podUID="f00d904c-86da-4e00-801a-3bd1d7dbe5f4"
Apr 21 07:03:27.189909 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:27.189869 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r" event={"ID":"49a3e211-6f6c-4501-878b-c01a12dfbbb1","Type":"ContainerStarted","Data":"76df6ba0c3966389627ab0c0e68799400c4578985061ee919cdd1ad9b41ce34c"}
Apr 21 07:03:27.193700 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:27.193663 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" event={"ID":"ff68b29a-db87-4ff2-882d-9f1e312dd5ce","Type":"ContainerStarted","Data":"181939f04aaf77845736273a751e04ba1083fa3387467806783cc3c0029b60c6"}
Apr 21 07:03:27.206429 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:27.206375 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tw92r" podStartSLOduration=1.786046772 podStartE2EDuration="23.206361387s" podCreationTimestamp="2026-04-21 07:03:04 +0000 UTC" firstStartedPulling="2026-04-21 07:03:05.181440042 +0000 UTC m=+1.800774109" lastFinishedPulling="2026-04-21 07:03:26.601754657 +0000 UTC m=+23.221088724" observedRunningTime="2026-04-21 07:03:27.206198832 +0000 UTC m=+23.825532921" watchObservedRunningTime="2026-04-21 07:03:27.206361387 +0000 UTC m=+23.825695476"
Apr 21 07:03:27.899841 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:27.899799 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-nq67t"
Apr 21 07:03:27.900439 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:27.900411 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-nq67t"
Apr 21 07:03:27.974493 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:27.974455 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r4v6n"
Apr 21 07:03:27.974698 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:27.974604 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r4v6n" podUID="42b45a2d-c99c-40f2-97f6-2d31aff6854f"
Apr 21 07:03:28.196461 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:28.196371 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-nq67t"
Apr 21 07:03:28.200078 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:28.200045 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-nq67t"
Apr 21 07:03:28.974410 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:28.974364 2581 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qpb2" Apr 21 07:03:28.974609 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:28.974495 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qpb2" podUID="f00d904c-86da-4e00-801a-3bd1d7dbe5f4" Apr 21 07:03:29.974970 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:29.974779 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r4v6n" Apr 21 07:03:29.975623 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:29.975049 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r4v6n" podUID="42b45a2d-c99c-40f2-97f6-2d31aff6854f" Apr 21 07:03:30.202538 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:30.202500 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" event={"ID":"ff68b29a-db87-4ff2-882d-9f1e312dd5ce","Type":"ContainerStarted","Data":"1b2ace3eb4286a4073488220746cca21f5bf8cec18e3b3a97c9f9c13949a9b3f"} Apr 21 07:03:30.202846 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:30.202808 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:30.204284 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:30.204259 2581 generic.go:358] "Generic (PLEG): container finished" podID="d6e8c99f-04e0-4c02-b29b-c5d5e6e76763" containerID="e6e99308f5b7d166ea54b8884cd7699035f2fa66dfcf830ba5617d98d983b091" exitCode=0 Apr 21 07:03:30.204392 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:30.204347 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b76h9" event={"ID":"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763","Type":"ContainerDied","Data":"e6e99308f5b7d166ea54b8884cd7699035f2fa66dfcf830ba5617d98d983b091"} Apr 21 07:03:30.218853 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:30.218826 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:30.229282 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:30.229201 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" podStartSLOduration=7.618036529 podStartE2EDuration="26.229183952s" podCreationTimestamp="2026-04-21 07:03:04 +0000 UTC" firstStartedPulling="2026-04-21 07:03:05.170721241 +0000 UTC m=+1.790055309" lastFinishedPulling="2026-04-21 07:03:23.78186866 +0000 UTC m=+20.401202732" observedRunningTime="2026-04-21 
07:03:30.228745501 +0000 UTC m=+26.848079628" watchObservedRunningTime="2026-04-21 07:03:30.229183952 +0000 UTC m=+26.848518042" Apr 21 07:03:30.975046 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:30.975015 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qpb2" Apr 21 07:03:30.975522 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:30.975139 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qpb2" podUID="f00d904c-86da-4e00-801a-3bd1d7dbe5f4" Apr 21 07:03:31.187464 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:31.187425 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4qpb2"] Apr 21 07:03:31.188094 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:31.188053 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-r4v6n"] Apr 21 07:03:31.188225 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:31.188212 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r4v6n" Apr 21 07:03:31.188333 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:31.188313 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-r4v6n" podUID="42b45a2d-c99c-40f2-97f6-2d31aff6854f" Apr 21 07:03:31.208613 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:31.208489 2581 generic.go:358] "Generic (PLEG): container finished" podID="d6e8c99f-04e0-4c02-b29b-c5d5e6e76763" containerID="73fdcfd4fc455cd002bbf80557885e54c2b3754c6d609a2dc0f5285e29a11b96" exitCode=0 Apr 21 07:03:31.208773 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:31.208626 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qpb2" Apr 21 07:03:31.208773 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:31.208624 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b76h9" event={"ID":"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763","Type":"ContainerDied","Data":"73fdcfd4fc455cd002bbf80557885e54c2b3754c6d609a2dc0f5285e29a11b96"} Apr 21 07:03:31.208773 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:31.208740 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-4qpb2" podUID="f00d904c-86da-4e00-801a-3bd1d7dbe5f4" Apr 21 07:03:31.208926 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:31.208840 2581 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 07:03:31.209428 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:31.209405 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:31.226553 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:31.226524 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:32.212953 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:32.212912 2581 generic.go:358] "Generic (PLEG): container finished" podID="d6e8c99f-04e0-4c02-b29b-c5d5e6e76763" containerID="a0ec19018209ec97a55637d17ba3855ecc795494929b110f81708f5ddefb8d03" exitCode=0 Apr 21 07:03:32.213400 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:32.212986 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b76h9" event={"ID":"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763","Type":"ContainerDied","Data":"a0ec19018209ec97a55637d17ba3855ecc795494929b110f81708f5ddefb8d03"} Apr 21 07:03:32.213400 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:32.213212 2581 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 07:03:32.974804 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:32.974768 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r4v6n" Apr 21 07:03:32.974984 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:32.974766 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qpb2" Apr 21 07:03:32.974984 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:32.974935 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r4v6n" podUID="42b45a2d-c99c-40f2-97f6-2d31aff6854f" Apr 21 07:03:32.975097 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:32.975050 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qpb2" podUID="f00d904c-86da-4e00-801a-3bd1d7dbe5f4" Apr 21 07:03:33.215282 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:33.215201 2581 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 07:03:34.971037 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:34.970318 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:34.971037 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:34.970533 2581 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 07:03:34.975438 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:34.975244 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r4v6n" Apr 21 07:03:34.975438 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:34.975244 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qpb2" Apr 21 07:03:34.975438 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:34.975405 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-r4v6n" podUID="42b45a2d-c99c-40f2-97f6-2d31aff6854f" Apr 21 07:03:34.975438 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:34.975431 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-4qpb2" podUID="f00d904c-86da-4e00-801a-3bd1d7dbe5f4" Apr 21 07:03:34.991460 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:34.991249 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7cnmr" Apr 21 07:03:36.593612 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.593358 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42b45a2d-c99c-40f2-97f6-2d31aff6854f-metrics-certs\") pod \"network-metrics-daemon-r4v6n\" (UID: \"42b45a2d-c99c-40f2-97f6-2d31aff6854f\") " pod="openshift-multus/network-metrics-daemon-r4v6n" Apr 21 07:03:36.593996 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:36.593542 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:03:36.593996 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:36.593705 2581 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42b45a2d-c99c-40f2-97f6-2d31aff6854f-metrics-certs podName:42b45a2d-c99c-40f2-97f6-2d31aff6854f nodeName:}" failed. No retries permitted until 2026-04-21 07:04:08.593687664 +0000 UTC m=+65.213021731 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42b45a2d-c99c-40f2-97f6-2d31aff6854f-metrics-certs") pod "network-metrics-daemon-r4v6n" (UID: "42b45a2d-c99c-40f2-97f6-2d31aff6854f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:03:36.694475 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.694427 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-766ml\" (UniqueName: \"kubernetes.io/projected/f00d904c-86da-4e00-801a-3bd1d7dbe5f4-kube-api-access-766ml\") pod \"network-check-target-4qpb2\" (UID: \"f00d904c-86da-4e00-801a-3bd1d7dbe5f4\") " pod="openshift-network-diagnostics/network-check-target-4qpb2" Apr 21 07:03:36.694692 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:36.694640 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:03:36.694692 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:36.694668 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:03:36.694692 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:36.694683 2581 projected.go:194] Error preparing data for projected volume kube-api-access-766ml for pod openshift-network-diagnostics/network-check-target-4qpb2: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:03:36.694824 ip-10-0-137-163 
kubenswrapper[2581]: E0421 07:03:36.694741 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f00d904c-86da-4e00-801a-3bd1d7dbe5f4-kube-api-access-766ml podName:f00d904c-86da-4e00-801a-3bd1d7dbe5f4 nodeName:}" failed. No retries permitted until 2026-04-21 07:04:08.694725699 +0000 UTC m=+65.314059767 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-766ml" (UniqueName: "kubernetes.io/projected/f00d904c-86da-4e00-801a-3bd1d7dbe5f4-kube-api-access-766ml") pod "network-check-target-4qpb2" (UID: "f00d904c-86da-4e00-801a-3bd1d7dbe5f4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:03:36.757041 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.757007 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-163.ec2.internal" event="NodeReady" Apr 21 07:03:36.757203 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.757168 2581 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 07:03:36.816525 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.816481 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-xtdjm"] Apr 21 07:03:36.851537 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.851396 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-bdt5g"] Apr 21 07:03:36.851712 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.851557 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-xtdjm" Apr 21 07:03:36.856527 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.856328 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-p8pbf\"" Apr 21 07:03:36.856527 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.856326 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 07:03:36.856527 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.856325 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 07:03:36.876148 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.876097 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bdt5g"] Apr 21 07:03:36.876148 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.876141 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xtdjm"] Apr 21 07:03:36.876388 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.876232 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bdt5g" Apr 21 07:03:36.879583 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.879540 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 07:03:36.881020 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.880993 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-f8s28\"" Apr 21 07:03:36.881794 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.881768 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 07:03:36.882313 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.882293 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 07:03:36.974646 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.974614 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qpb2" Apr 21 07:03:36.974829 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.974619 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-r4v6n" Apr 21 07:03:36.980933 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.980893 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 07:03:36.980933 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.980893 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 07:03:36.981213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.980896 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 07:03:36.981213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.981081 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4vntj\"" Apr 21 07:03:36.981363 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.981345 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5rqg2\"" Apr 21 07:03:36.997549 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.997519 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2427cd84-1ecc-4868-adb1-7e6205d1a291-config-volume\") pod \"dns-default-xtdjm\" (UID: \"2427cd84-1ecc-4868-adb1-7e6205d1a291\") " pod="openshift-dns/dns-default-xtdjm" Apr 21 07:03:36.997737 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.997580 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2427cd84-1ecc-4868-adb1-7e6205d1a291-tmp-dir\") pod \"dns-default-xtdjm\" (UID: \"2427cd84-1ecc-4868-adb1-7e6205d1a291\") " pod="openshift-dns/dns-default-xtdjm" Apr 21 07:03:36.997737 
ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.997652 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2427cd84-1ecc-4868-adb1-7e6205d1a291-metrics-tls\") pod \"dns-default-xtdjm\" (UID: \"2427cd84-1ecc-4868-adb1-7e6205d1a291\") " pod="openshift-dns/dns-default-xtdjm" Apr 21 07:03:36.997737 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.997705 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b9zf\" (UniqueName: \"kubernetes.io/projected/a6868c94-bebf-4199-8e95-b97042cabdc1-kube-api-access-2b9zf\") pod \"ingress-canary-bdt5g\" (UID: \"a6868c94-bebf-4199-8e95-b97042cabdc1\") " pod="openshift-ingress-canary/ingress-canary-bdt5g" Apr 21 07:03:36.997737 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.997731 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6868c94-bebf-4199-8e95-b97042cabdc1-cert\") pod \"ingress-canary-bdt5g\" (UID: \"a6868c94-bebf-4199-8e95-b97042cabdc1\") " pod="openshift-ingress-canary/ingress-canary-bdt5g" Apr 21 07:03:36.998006 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:36.997791 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8v48\" (UniqueName: \"kubernetes.io/projected/2427cd84-1ecc-4868-adb1-7e6205d1a291-kube-api-access-g8v48\") pod \"dns-default-xtdjm\" (UID: \"2427cd84-1ecc-4868-adb1-7e6205d1a291\") " pod="openshift-dns/dns-default-xtdjm" Apr 21 07:03:37.098339 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:37.098298 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8v48\" (UniqueName: \"kubernetes.io/projected/2427cd84-1ecc-4868-adb1-7e6205d1a291-kube-api-access-g8v48\") pod \"dns-default-xtdjm\" (UID: 
\"2427cd84-1ecc-4868-adb1-7e6205d1a291\") " pod="openshift-dns/dns-default-xtdjm" Apr 21 07:03:37.098548 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:37.098377 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2427cd84-1ecc-4868-adb1-7e6205d1a291-config-volume\") pod \"dns-default-xtdjm\" (UID: \"2427cd84-1ecc-4868-adb1-7e6205d1a291\") " pod="openshift-dns/dns-default-xtdjm" Apr 21 07:03:37.098548 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:37.098404 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2427cd84-1ecc-4868-adb1-7e6205d1a291-tmp-dir\") pod \"dns-default-xtdjm\" (UID: \"2427cd84-1ecc-4868-adb1-7e6205d1a291\") " pod="openshift-dns/dns-default-xtdjm" Apr 21 07:03:37.098548 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:37.098435 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2427cd84-1ecc-4868-adb1-7e6205d1a291-metrics-tls\") pod \"dns-default-xtdjm\" (UID: \"2427cd84-1ecc-4868-adb1-7e6205d1a291\") " pod="openshift-dns/dns-default-xtdjm" Apr 21 07:03:37.098548 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:37.098464 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2b9zf\" (UniqueName: \"kubernetes.io/projected/a6868c94-bebf-4199-8e95-b97042cabdc1-kube-api-access-2b9zf\") pod \"ingress-canary-bdt5g\" (UID: \"a6868c94-bebf-4199-8e95-b97042cabdc1\") " pod="openshift-ingress-canary/ingress-canary-bdt5g" Apr 21 07:03:37.098548 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:37.098488 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6868c94-bebf-4199-8e95-b97042cabdc1-cert\") pod \"ingress-canary-bdt5g\" (UID: \"a6868c94-bebf-4199-8e95-b97042cabdc1\") " 
pod="openshift-ingress-canary/ingress-canary-bdt5g" Apr 21 07:03:37.098829 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:37.098637 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:03:37.098829 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:37.098695 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6868c94-bebf-4199-8e95-b97042cabdc1-cert podName:a6868c94-bebf-4199-8e95-b97042cabdc1 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:37.598675057 +0000 UTC m=+34.218009126 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a6868c94-bebf-4199-8e95-b97042cabdc1-cert") pod "ingress-canary-bdt5g" (UID: "a6868c94-bebf-4199-8e95-b97042cabdc1") : secret "canary-serving-cert" not found Apr 21 07:03:37.098829 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:37.098720 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:03:37.098829 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:37.098779 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2427cd84-1ecc-4868-adb1-7e6205d1a291-metrics-tls podName:2427cd84-1ecc-4868-adb1-7e6205d1a291 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:37.598761681 +0000 UTC m=+34.218095753 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2427cd84-1ecc-4868-adb1-7e6205d1a291-metrics-tls") pod "dns-default-xtdjm" (UID: "2427cd84-1ecc-4868-adb1-7e6205d1a291") : secret "dns-default-metrics-tls" not found
Apr 21 07:03:37.098829 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:37.098812 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2427cd84-1ecc-4868-adb1-7e6205d1a291-tmp-dir\") pod \"dns-default-xtdjm\" (UID: \"2427cd84-1ecc-4868-adb1-7e6205d1a291\") " pod="openshift-dns/dns-default-xtdjm"
Apr 21 07:03:37.099076 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:37.099059 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2427cd84-1ecc-4868-adb1-7e6205d1a291-config-volume\") pod \"dns-default-xtdjm\" (UID: \"2427cd84-1ecc-4868-adb1-7e6205d1a291\") " pod="openshift-dns/dns-default-xtdjm"
Apr 21 07:03:37.111362 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:37.111273 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8v48\" (UniqueName: \"kubernetes.io/projected/2427cd84-1ecc-4868-adb1-7e6205d1a291-kube-api-access-g8v48\") pod \"dns-default-xtdjm\" (UID: \"2427cd84-1ecc-4868-adb1-7e6205d1a291\") " pod="openshift-dns/dns-default-xtdjm"
Apr 21 07:03:37.111523 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:37.111402 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b9zf\" (UniqueName: \"kubernetes.io/projected/a6868c94-bebf-4199-8e95-b97042cabdc1-kube-api-access-2b9zf\") pod \"ingress-canary-bdt5g\" (UID: \"a6868c94-bebf-4199-8e95-b97042cabdc1\") " pod="openshift-ingress-canary/ingress-canary-bdt5g"
Apr 21 07:03:37.601995 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:37.601946 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2427cd84-1ecc-4868-adb1-7e6205d1a291-metrics-tls\") pod \"dns-default-xtdjm\" (UID: \"2427cd84-1ecc-4868-adb1-7e6205d1a291\") " pod="openshift-dns/dns-default-xtdjm"
Apr 21 07:03:37.601995 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:37.602005 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6868c94-bebf-4199-8e95-b97042cabdc1-cert\") pod \"ingress-canary-bdt5g\" (UID: \"a6868c94-bebf-4199-8e95-b97042cabdc1\") " pod="openshift-ingress-canary/ingress-canary-bdt5g"
Apr 21 07:03:37.602608 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:37.602172 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 07:03:37.602608 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:37.602246 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2427cd84-1ecc-4868-adb1-7e6205d1a291-metrics-tls podName:2427cd84-1ecc-4868-adb1-7e6205d1a291 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:38.602223852 +0000 UTC m=+35.221557938 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2427cd84-1ecc-4868-adb1-7e6205d1a291-metrics-tls") pod "dns-default-xtdjm" (UID: "2427cd84-1ecc-4868-adb1-7e6205d1a291") : secret "dns-default-metrics-tls" not found
Apr 21 07:03:37.602608 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:37.602172 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 07:03:37.602608 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:37.602329 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6868c94-bebf-4199-8e95-b97042cabdc1-cert podName:a6868c94-bebf-4199-8e95-b97042cabdc1 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:38.60231035 +0000 UTC m=+35.221644432 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a6868c94-bebf-4199-8e95-b97042cabdc1-cert") pod "ingress-canary-bdt5g" (UID: "a6868c94-bebf-4199-8e95-b97042cabdc1") : secret "canary-serving-cert" not found
Apr 21 07:03:38.612136 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:38.612085 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2427cd84-1ecc-4868-adb1-7e6205d1a291-metrics-tls\") pod \"dns-default-xtdjm\" (UID: \"2427cd84-1ecc-4868-adb1-7e6205d1a291\") " pod="openshift-dns/dns-default-xtdjm"
Apr 21 07:03:38.612136 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:38.612137 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6868c94-bebf-4199-8e95-b97042cabdc1-cert\") pod \"ingress-canary-bdt5g\" (UID: \"a6868c94-bebf-4199-8e95-b97042cabdc1\") " pod="openshift-ingress-canary/ingress-canary-bdt5g"
Apr 21 07:03:38.612726 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:38.612233 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 07:03:38.612726 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:38.612237 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 07:03:38.612726 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:38.612288 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6868c94-bebf-4199-8e95-b97042cabdc1-cert podName:a6868c94-bebf-4199-8e95-b97042cabdc1 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:40.612272792 +0000 UTC m=+37.231606860 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a6868c94-bebf-4199-8e95-b97042cabdc1-cert") pod "ingress-canary-bdt5g" (UID: "a6868c94-bebf-4199-8e95-b97042cabdc1") : secret "canary-serving-cert" not found
Apr 21 07:03:38.612726 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:38.612301 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2427cd84-1ecc-4868-adb1-7e6205d1a291-metrics-tls podName:2427cd84-1ecc-4868-adb1-7e6205d1a291 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:40.61229565 +0000 UTC m=+37.231629718 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2427cd84-1ecc-4868-adb1-7e6205d1a291-metrics-tls") pod "dns-default-xtdjm" (UID: "2427cd84-1ecc-4868-adb1-7e6205d1a291") : secret "dns-default-metrics-tls" not found
Apr 21 07:03:39.228984 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:39.228942 2581 generic.go:358] "Generic (PLEG): container finished" podID="d6e8c99f-04e0-4c02-b29b-c5d5e6e76763" containerID="6b3a8e9c3a3aca29f929e3b48137411e54c5f483c5fa6a1cf82527cccfb9b0e6" exitCode=0
Apr 21 07:03:39.229173 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:39.228996 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b76h9" event={"ID":"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763","Type":"ContainerDied","Data":"6b3a8e9c3a3aca29f929e3b48137411e54c5f483c5fa6a1cf82527cccfb9b0e6"}
Apr 21 07:03:39.758375 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:39.758336 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-t4h72"]
Apr 21 07:03:39.764277 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:39.764251 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-t4h72"
Apr 21 07:03:39.766777 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:39.766747 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 21 07:03:39.766927 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:39.766835 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-vr6wq\""
Apr 21 07:03:39.766927 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:39.766913 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 21 07:03:39.771925 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:39.771899 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-t4h72"]
Apr 21 07:03:39.920704 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:39.920663 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkxrw\" (UniqueName: \"kubernetes.io/projected/1911b9d6-1a67-4559-b684-2a2fc0ad29c0-kube-api-access-mkxrw\") pod \"migrator-74bb7799d9-t4h72\" (UID: \"1911b9d6-1a67-4559-b684-2a2fc0ad29c0\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-t4h72"
Apr 21 07:04:40.021080 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:40.020987 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mkxrw\" (UniqueName: \"kubernetes.io/projected/1911b9d6-1a67-4559-b684-2a2fc0ad29c0-kube-api-access-mkxrw\") pod \"migrator-74bb7799d9-t4h72\" (UID: \"1911b9d6-1a67-4559-b684-2a2fc0ad29c0\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-t4h72"
Apr 21 07:03:40.030777 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:40.030742 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkxrw\" (UniqueName: \"kubernetes.io/projected/1911b9d6-1a67-4559-b684-2a2fc0ad29c0-kube-api-access-mkxrw\") pod \"migrator-74bb7799d9-t4h72\" (UID: \"1911b9d6-1a67-4559-b684-2a2fc0ad29c0\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-t4h72"
Apr 21 07:03:40.074681 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:40.074642 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-t4h72"
Apr 21 07:03:40.234515 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:40.234182 2581 generic.go:358] "Generic (PLEG): container finished" podID="d6e8c99f-04e0-4c02-b29b-c5d5e6e76763" containerID="8eecd1f12ce114b473d0fec8201e2ef196dd3d049a5b2c20f8c74abc7a4c7e5e" exitCode=0
Apr 21 07:03:40.234515 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:40.234258 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b76h9" event={"ID":"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763","Type":"ContainerDied","Data":"8eecd1f12ce114b473d0fec8201e2ef196dd3d049a5b2c20f8c74abc7a4c7e5e"}
Apr 21 07:03:40.236393 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:40.235664 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-t4h72"]
Apr 21 07:03:40.240344 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:40.240317 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1911b9d6_1a67_4559_b684_2a2fc0ad29c0.slice/crio-1fd059ef91a74926eecdde3f7a5e384328951cbce7fca44e55f1abfee0b22c05 WatchSource:0}: Error finding container 1fd059ef91a74926eecdde3f7a5e384328951cbce7fca44e55f1abfee0b22c05: Status 404 returned error can't find the container with id 1fd059ef91a74926eecdde3f7a5e384328951cbce7fca44e55f1abfee0b22c05
Apr 21 07:03:40.625398 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:40.625363 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2427cd84-1ecc-4868-adb1-7e6205d1a291-metrics-tls\") pod \"dns-default-xtdjm\" (UID: \"2427cd84-1ecc-4868-adb1-7e6205d1a291\") " pod="openshift-dns/dns-default-xtdjm"
Apr 21 07:03:40.625628 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:40.625402 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6868c94-bebf-4199-8e95-b97042cabdc1-cert\") pod \"ingress-canary-bdt5g\" (UID: \"a6868c94-bebf-4199-8e95-b97042cabdc1\") " pod="openshift-ingress-canary/ingress-canary-bdt5g"
Apr 21 07:03:40.625628 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:40.625523 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 07:03:40.625628 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:40.625526 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 07:03:40.625628 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:40.625593 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6868c94-bebf-4199-8e95-b97042cabdc1-cert podName:a6868c94-bebf-4199-8e95-b97042cabdc1 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:44.625577407 +0000 UTC m=+41.244911488 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a6868c94-bebf-4199-8e95-b97042cabdc1-cert") pod "ingress-canary-bdt5g" (UID: "a6868c94-bebf-4199-8e95-b97042cabdc1") : secret "canary-serving-cert" not found
Apr 21 07:03:40.625628 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:40.625606 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2427cd84-1ecc-4868-adb1-7e6205d1a291-metrics-tls podName:2427cd84-1ecc-4868-adb1-7e6205d1a291 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:44.625600192 +0000 UTC m=+41.244934260 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2427cd84-1ecc-4868-adb1-7e6205d1a291-metrics-tls") pod "dns-default-xtdjm" (UID: "2427cd84-1ecc-4868-adb1-7e6205d1a291") : secret "dns-default-metrics-tls" not found
Apr 21 07:03:40.880713 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:40.880620 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ptn2z_21307e09-27b9-492e-ac26-b3d09e5794af/dns-node-resolver/0.log"
Apr 21 07:03:41.238130 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:41.238043 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-t4h72" event={"ID":"1911b9d6-1a67-4559-b684-2a2fc0ad29c0","Type":"ContainerStarted","Data":"1fd059ef91a74926eecdde3f7a5e384328951cbce7fca44e55f1abfee0b22c05"}
Apr 21 07:03:41.242017 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:41.241980 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b76h9" event={"ID":"d6e8c99f-04e0-4c02-b29b-c5d5e6e76763","Type":"ContainerStarted","Data":"e90458d4f58d3e2bf4871ed58e0cc6026a0f7419977c8d1de93a4dc3b1430962"}
Apr 21 07:03:41.268409 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:41.268345 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-b76h9" podStartSLOduration=4.302202188 podStartE2EDuration="37.268326654s" podCreationTimestamp="2026-04-21 07:03:04 +0000 UTC" firstStartedPulling="2026-04-21 07:03:05.250970371 +0000 UTC m=+1.870304439" lastFinishedPulling="2026-04-21 07:03:38.217094837 +0000 UTC m=+34.836428905" observedRunningTime="2026-04-21 07:03:41.267206747 +0000 UTC m=+37.886540838" watchObservedRunningTime="2026-04-21 07:03:41.268326654 +0000 UTC m=+37.887660744"
Apr 21 07:03:41.881750 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:41.881722 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vrvfp_df97fb0c-eb01-481b-ab26-0073456033cd/node-ca/0.log"
Apr 21 07:03:42.245607 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:42.245345 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-t4h72" event={"ID":"1911b9d6-1a67-4559-b684-2a2fc0ad29c0","Type":"ContainerStarted","Data":"a66a52706f1cb09bb70afdb631d80b373242aa284a769f41096a31fd95b70d10"}
Apr 21 07:03:42.245802 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:42.245622 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-t4h72" event={"ID":"1911b9d6-1a67-4559-b684-2a2fc0ad29c0","Type":"ContainerStarted","Data":"647ba4862ff605ffaabcb1bba814a3ba0f55b64823d4bc2fc862f7735f4d1a9b"}
Apr 21 07:03:42.274756 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:42.274689 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-t4h72" podStartSLOduration=1.6904214720000001 podStartE2EDuration="3.274673021s" podCreationTimestamp="2026-04-21 07:03:39 +0000 UTC" firstStartedPulling="2026-04-21 07:03:40.242079744 +0000 UTC m=+36.861413812" lastFinishedPulling="2026-04-21 07:03:41.826331288 +0000 UTC m=+38.445665361" observedRunningTime="2026-04-21 07:03:42.273868138 +0000 UTC m=+38.893202229" watchObservedRunningTime="2026-04-21 07:03:42.274673021 +0000 UTC m=+38.894007111"
Apr 21 07:03:44.658252 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:44.658206 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2427cd84-1ecc-4868-adb1-7e6205d1a291-metrics-tls\") pod \"dns-default-xtdjm\" (UID: \"2427cd84-1ecc-4868-adb1-7e6205d1a291\") " pod="openshift-dns/dns-default-xtdjm"
Apr 21 07:03:44.658252 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:44.658259 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6868c94-bebf-4199-8e95-b97042cabdc1-cert\") pod \"ingress-canary-bdt5g\" (UID: \"a6868c94-bebf-4199-8e95-b97042cabdc1\") " pod="openshift-ingress-canary/ingress-canary-bdt5g"
Apr 21 07:03:44.658765 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:44.658362 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 07:03:44.658765 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:44.658384 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 07:03:44.658765 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:44.658433 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6868c94-bebf-4199-8e95-b97042cabdc1-cert podName:a6868c94-bebf-4199-8e95-b97042cabdc1 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:52.658410309 +0000 UTC m=+49.277744389 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a6868c94-bebf-4199-8e95-b97042cabdc1-cert") pod "ingress-canary-bdt5g" (UID: "a6868c94-bebf-4199-8e95-b97042cabdc1") : secret "canary-serving-cert" not found
Apr 21 07:03:44.658765 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:03:44.658460 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2427cd84-1ecc-4868-adb1-7e6205d1a291-metrics-tls podName:2427cd84-1ecc-4868-adb1-7e6205d1a291 nodeName:}" failed. No retries permitted until 2026-04-21 07:03:52.658443087 +0000 UTC m=+49.277777158 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2427cd84-1ecc-4868-adb1-7e6205d1a291-metrics-tls") pod "dns-default-xtdjm" (UID: "2427cd84-1ecc-4868-adb1-7e6205d1a291") : secret "dns-default-metrics-tls" not found
Apr 21 07:03:52.717309 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:52.717261 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2427cd84-1ecc-4868-adb1-7e6205d1a291-metrics-tls\") pod \"dns-default-xtdjm\" (UID: \"2427cd84-1ecc-4868-adb1-7e6205d1a291\") " pod="openshift-dns/dns-default-xtdjm"
Apr 21 07:03:52.717861 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:52.717319 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6868c94-bebf-4199-8e95-b97042cabdc1-cert\") pod \"ingress-canary-bdt5g\" (UID: \"a6868c94-bebf-4199-8e95-b97042cabdc1\") " pod="openshift-ingress-canary/ingress-canary-bdt5g"
Apr 21 07:03:52.721475 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:52.721440 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2427cd84-1ecc-4868-adb1-7e6205d1a291-metrics-tls\") pod \"dns-default-xtdjm\" (UID: \"2427cd84-1ecc-4868-adb1-7e6205d1a291\") " pod="openshift-dns/dns-default-xtdjm"
Apr 21 07:03:52.721658 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:52.721635 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a6868c94-bebf-4199-8e95-b97042cabdc1-cert\") pod \"ingress-canary-bdt5g\" (UID: \"a6868c94-bebf-4199-8e95-b97042cabdc1\") " pod="openshift-ingress-canary/ingress-canary-bdt5g"
Apr 21 07:03:52.763630 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:52.763591 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xtdjm"
Apr 21 07:03:52.788625 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:52.788596 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bdt5g"
Apr 21 07:03:52.929219 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:52.929187 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xtdjm"]
Apr 21 07:03:52.964541 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:52.964509 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bdt5g"]
Apr 21 07:03:52.967601 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:03:52.967527 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6868c94_bebf_4199_8e95_b97042cabdc1.slice/crio-9dfdb29f70c9a6c9e49f05cd44f0a1a441f441d75d52b23fe2b6987a96367112 WatchSource:0}: Error finding container 9dfdb29f70c9a6c9e49f05cd44f0a1a441f441d75d52b23fe2b6987a96367112: Status 404 returned error can't find the container with id 9dfdb29f70c9a6c9e49f05cd44f0a1a441f441d75d52b23fe2b6987a96367112
Apr 21 07:03:53.269385 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:53.269294 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bdt5g" event={"ID":"a6868c94-bebf-4199-8e95-b97042cabdc1","Type":"ContainerStarted","Data":"9dfdb29f70c9a6c9e49f05cd44f0a1a441f441d75d52b23fe2b6987a96367112"}
Apr 21 07:03:53.270411 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:53.270381 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xtdjm" event={"ID":"2427cd84-1ecc-4868-adb1-7e6205d1a291","Type":"ContainerStarted","Data":"61a0cb94422202f9f42e1123c01162943109f16ee30b8a6be5b5417f079ff7ec"}
Apr 21 07:03:55.276284 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:55.276180 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xtdjm" event={"ID":"2427cd84-1ecc-4868-adb1-7e6205d1a291","Type":"ContainerStarted","Data":"799247e91b9adccb673d53917ef149540dd17b7c82e7457c107638681a9ee82d"}
Apr 21 07:03:55.276284 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:55.276216 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xtdjm" event={"ID":"2427cd84-1ecc-4868-adb1-7e6205d1a291","Type":"ContainerStarted","Data":"63a33e6b25627d5a55323791086c74ccaf391406cb1a3e65ea923085c660fc1d"}
Apr 21 07:03:55.276284 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:55.276257 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-xtdjm"
Apr 21 07:03:55.277496 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:55.277469 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bdt5g" event={"ID":"a6868c94-bebf-4199-8e95-b97042cabdc1","Type":"ContainerStarted","Data":"84a972065a9056a76e74530a3fdcc9dd3fec5a138045e3e1f124dee78e9874b3"}
Apr 21 07:03:55.293858 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:55.293795 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-xtdjm" podStartSLOduration=17.244498168 podStartE2EDuration="19.293775565s" podCreationTimestamp="2026-04-21 07:03:36 +0000 UTC" firstStartedPulling="2026-04-21 07:03:52.936500807 +0000 UTC m=+49.555834888" lastFinishedPulling="2026-04-21 07:03:54.985778201 +0000 UTC m=+51.605112285" observedRunningTime="2026-04-21 07:03:55.293722533 +0000 UTC m=+51.913056635" watchObservedRunningTime="2026-04-21 07:03:55.293775565 +0000 UTC m=+51.913109668"
Apr 21 07:03:55.309935 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:03:55.309868 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-bdt5g" podStartSLOduration=17.290836671 podStartE2EDuration="19.309852314s" podCreationTimestamp="2026-04-21 07:03:36 +0000 UTC" firstStartedPulling="2026-04-21 07:03:52.969445867 +0000 UTC m=+49.588779934" lastFinishedPulling="2026-04-21 07:03:54.988461506 +0000 UTC m=+51.607795577" observedRunningTime="2026-04-21 07:03:55.309733063 +0000 UTC m=+51.929067153" watchObservedRunningTime="2026-04-21 07:03:55.309852314 +0000 UTC m=+51.929186404"
Apr 21 07:04:02.560373 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.560308 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6g68p"]
Apr 21 07:04:02.563539 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.563517 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6g68p"
Apr 21 07:04:02.568041 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.568018 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 21 07:04:02.569218 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.569192 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 21 07:04:02.569345 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.569213 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 21 07:04:02.569345 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.569213 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-l89lk\""
Apr 21 07:04:02.569453 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.569216 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 21 07:04:02.576843 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.576812 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6g68p"]
Apr 21 07:04:02.640208 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.640168 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-ncs5j"]
Apr 21 07:04:02.644769 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.644742 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6887d699d4-whtf5"]
Apr 21 07:04:02.644919 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.644898 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-ncs5j"
Apr 21 07:04:02.646254 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.646233 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-765c64c998-s4lnn"]
Apr 21 07:04:02.646386 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.646370 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6887d699d4-whtf5"
Apr 21 07:04:02.648066 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.648044 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-765c64c998-s4lnn"
Apr 21 07:04:02.648919 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.648899 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 21 07:04:02.649023 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.648899 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 21 07:04:02.649023 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.649006 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-5b624\""
Apr 21 07:04:02.650720 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.650703 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 21 07:04:02.651323 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.651307 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 21 07:04:02.651409 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.651321 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 21 07:04:02.651527 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.651481 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 21 07:04:02.651745 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.651582 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-2rctr\""
Apr 21 07:04:02.652824 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.652795 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 21 07:04:02.653387 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.653369 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 21 07:04:02.653864 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.653846 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 21 07:04:02.653949 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.653878 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 21 07:04:02.654010 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.653988 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 21 07:04:02.674800 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.674772 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-ncs5j"]
Apr 21 07:04:02.675916 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.675903 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6887d699d4-whtf5"]
Apr 21 07:04:02.676910 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.676889 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-765c64c998-s4lnn"]
Apr 21 07:04:02.687153 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.687126 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/fba56b2f-24aa-46b9-b5c7-88cc67c2fb44-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6g68p\" (UID: \"fba56b2f-24aa-46b9-b5c7-88cc67c2fb44\") " pod="openshift-insights/insights-runtime-extractor-6g68p"
Apr 21 07:04:02.687302 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.687172 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/fba56b2f-24aa-46b9-b5c7-88cc67c2fb44-data-volume\") pod \"insights-runtime-extractor-6g68p\" (UID: \"fba56b2f-24aa-46b9-b5c7-88cc67c2fb44\") " pod="openshift-insights/insights-runtime-extractor-6g68p"
Apr 21 07:04:02.687302 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.687231 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/fba56b2f-24aa-46b9-b5c7-88cc67c2fb44-crio-socket\") pod \"insights-runtime-extractor-6g68p\" (UID: \"fba56b2f-24aa-46b9-b5c7-88cc67c2fb44\") " pod="openshift-insights/insights-runtime-extractor-6g68p"
Apr 21 07:04:02.687302 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.687277 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/fba56b2f-24aa-46b9-b5c7-88cc67c2fb44-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6g68p\" (UID: \"fba56b2f-24aa-46b9-b5c7-88cc67c2fb44\") " pod="openshift-insights/insights-runtime-extractor-6g68p"
Apr 21 07:04:02.687437 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.687328 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5x7t\" (UniqueName: \"kubernetes.io/projected/fba56b2f-24aa-46b9-b5c7-88cc67c2fb44-kube-api-access-g5x7t\") pod \"insights-runtime-extractor-6g68p\" (UID: \"fba56b2f-24aa-46b9-b5c7-88cc67c2fb44\") " pod="openshift-insights/insights-runtime-extractor-6g68p"
Apr 21 07:04:02.743097 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.743056 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5977bd9744-9cf64"]
Apr 21 07:04:02.745473 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.745449 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5977bd9744-9cf64"
Apr 21 07:04:02.748324 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.748297 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 21 07:04:02.748631 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.748615 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-wwtlv\""
Apr 21 07:04:02.749000 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.748986 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 21 07:04:02.749086 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.749071 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 21 07:04:02.754871 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.754849 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 21 07:04:02.758785 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.758758 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5977bd9744-9cf64"]
Apr 21 07:04:02.788522 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.788490 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/fba56b2f-24aa-46b9-b5c7-88cc67c2fb44-data-volume\") pod \"insights-runtime-extractor-6g68p\" (UID: \"fba56b2f-24aa-46b9-b5c7-88cc67c2fb44\") " pod="openshift-insights/insights-runtime-extractor-6g68p"
Apr 21 07:04:02.788522 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.788527 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d3cc54b9-cc90-420d-adae-f60121d771d4-console-config\") pod \"console-6887d699d4-whtf5\" (UID: \"d3cc54b9-cc90-420d-adae-f60121d771d4\") " pod="openshift-console/console-6887d699d4-whtf5"
Apr 21 07:04:02.788794 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.788551 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d3cc54b9-cc90-420d-adae-f60121d771d4-console-oauth-config\") pod \"console-6887d699d4-whtf5\" (UID: \"d3cc54b9-cc90-420d-adae-f60121d771d4\") " pod="openshift-console/console-6887d699d4-whtf5"
Apr 21 07:04:02.788794 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.788597 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d3cc54b9-cc90-420d-adae-f60121d771d4-service-ca\") pod \"console-6887d699d4-whtf5\" (UID: \"d3cc54b9-cc90-420d-adae-f60121d771d4\") " pod="openshift-console/console-6887d699d4-whtf5"
Apr 21 07:04:02.788794 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.788681 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/fba56b2f-24aa-46b9-b5c7-88cc67c2fb44-crio-socket\") pod \"insights-runtime-extractor-6g68p\" (UID: \"fba56b2f-24aa-46b9-b5c7-88cc67c2fb44\") " pod="openshift-insights/insights-runtime-extractor-6g68p"
Apr 21 07:04:02.788794 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.788700 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6frc\" (UniqueName: \"kubernetes.io/projected/6ce9ee09-1262-4278-8b0b-72dce2cc896a-kube-api-access-w6frc\") pod \"downloads-6bcc868b7-ncs5j\" (UID: \"6ce9ee09-1262-4278-8b0b-72dce2cc896a\") " pod="openshift-console/downloads-6bcc868b7-ncs5j"
Apr 21 07:04:02.788794 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.788716 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3cc54b9-cc90-420d-adae-f60121d771d4-console-serving-cert\") pod \"console-6887d699d4-whtf5\" (UID: \"d3cc54b9-cc90-420d-adae-f60121d771d4\") " pod="openshift-console/console-6887d699d4-whtf5"
Apr 21 07:04:02.788794 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.788735 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/fba56b2f-24aa-46b9-b5c7-88cc67c2fb44-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6g68p\" (UID: \"fba56b2f-24aa-46b9-b5c7-88cc67c2fb44\") " pod="openshift-insights/insights-runtime-extractor-6g68p"
Apr 21 07:04:02.788794 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.788758 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5x7t\" (UniqueName: \"kubernetes.io/projected/fba56b2f-24aa-46b9-b5c7-88cc67c2fb44-kube-api-access-g5x7t\") pod \"insights-runtime-extractor-6g68p\" (UID:
\"fba56b2f-24aa-46b9-b5c7-88cc67c2fb44\") " pod="openshift-insights/insights-runtime-extractor-6g68p" Apr 21 07:04:02.788794 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.788779 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d3cc54b9-cc90-420d-adae-f60121d771d4-oauth-serving-cert\") pod \"console-6887d699d4-whtf5\" (UID: \"d3cc54b9-cc90-420d-adae-f60121d771d4\") " pod="openshift-console/console-6887d699d4-whtf5" Apr 21 07:04:02.789082 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.788832 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6sqv\" (UniqueName: \"kubernetes.io/projected/d3cc54b9-cc90-420d-adae-f60121d771d4-kube-api-access-b6sqv\") pod \"console-6887d699d4-whtf5\" (UID: \"d3cc54b9-cc90-420d-adae-f60121d771d4\") " pod="openshift-console/console-6887d699d4-whtf5" Apr 21 07:04:02.789082 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.788885 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wrmt\" (UniqueName: \"kubernetes.io/projected/2372ef7d-8c3c-4eba-8da5-912ad24032da-kube-api-access-7wrmt\") pod \"klusterlet-addon-workmgr-765c64c998-s4lnn\" (UID: \"2372ef7d-8c3c-4eba-8da5-912ad24032da\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-765c64c998-s4lnn" Apr 21 07:04:02.789082 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.788835 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/fba56b2f-24aa-46b9-b5c7-88cc67c2fb44-crio-socket\") pod \"insights-runtime-extractor-6g68p\" (UID: \"fba56b2f-24aa-46b9-b5c7-88cc67c2fb44\") " pod="openshift-insights/insights-runtime-extractor-6g68p" Apr 21 07:04:02.789082 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.788905 2581 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/fba56b2f-24aa-46b9-b5c7-88cc67c2fb44-data-volume\") pod \"insights-runtime-extractor-6g68p\" (UID: \"fba56b2f-24aa-46b9-b5c7-88cc67c2fb44\") " pod="openshift-insights/insights-runtime-extractor-6g68p" Apr 21 07:04:02.789082 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.788927 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2372ef7d-8c3c-4eba-8da5-912ad24032da-tmp\") pod \"klusterlet-addon-workmgr-765c64c998-s4lnn\" (UID: \"2372ef7d-8c3c-4eba-8da5-912ad24032da\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-765c64c998-s4lnn" Apr 21 07:04:02.789082 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.788952 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/2372ef7d-8c3c-4eba-8da5-912ad24032da-klusterlet-config\") pod \"klusterlet-addon-workmgr-765c64c998-s4lnn\" (UID: \"2372ef7d-8c3c-4eba-8da5-912ad24032da\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-765c64c998-s4lnn" Apr 21 07:04:02.789082 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.788980 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/fba56b2f-24aa-46b9-b5c7-88cc67c2fb44-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6g68p\" (UID: \"fba56b2f-24aa-46b9-b5c7-88cc67c2fb44\") " pod="openshift-insights/insights-runtime-extractor-6g68p" Apr 21 07:04:02.789483 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.789464 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/fba56b2f-24aa-46b9-b5c7-88cc67c2fb44-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6g68p\" (UID: 
\"fba56b2f-24aa-46b9-b5c7-88cc67c2fb44\") " pod="openshift-insights/insights-runtime-extractor-6g68p" Apr 21 07:04:02.791209 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.791186 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/fba56b2f-24aa-46b9-b5c7-88cc67c2fb44-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6g68p\" (UID: \"fba56b2f-24aa-46b9-b5c7-88cc67c2fb44\") " pod="openshift-insights/insights-runtime-extractor-6g68p" Apr 21 07:04:02.799510 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.799473 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5x7t\" (UniqueName: \"kubernetes.io/projected/fba56b2f-24aa-46b9-b5c7-88cc67c2fb44-kube-api-access-g5x7t\") pod \"insights-runtime-extractor-6g68p\" (UID: \"fba56b2f-24aa-46b9-b5c7-88cc67c2fb44\") " pod="openshift-insights/insights-runtime-extractor-6g68p" Apr 21 07:04:02.873379 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.873348 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6g68p" Apr 21 07:04:02.889913 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.889879 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6sqv\" (UniqueName: \"kubernetes.io/projected/d3cc54b9-cc90-420d-adae-f60121d771d4-kube-api-access-b6sqv\") pod \"console-6887d699d4-whtf5\" (UID: \"d3cc54b9-cc90-420d-adae-f60121d771d4\") " pod="openshift-console/console-6887d699d4-whtf5" Apr 21 07:04:02.890092 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.889921 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e67323c8-e3cc-4745-b61d-27f2a2459601-installation-pull-secrets\") pod \"image-registry-5977bd9744-9cf64\" (UID: \"e67323c8-e3cc-4745-b61d-27f2a2459601\") " pod="openshift-image-registry/image-registry-5977bd9744-9cf64" Apr 21 07:04:02.890092 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.890026 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e67323c8-e3cc-4745-b61d-27f2a2459601-image-registry-private-configuration\") pod \"image-registry-5977bd9744-9cf64\" (UID: \"e67323c8-e3cc-4745-b61d-27f2a2459601\") " pod="openshift-image-registry/image-registry-5977bd9744-9cf64" Apr 21 07:04:02.890092 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.890064 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d3cc54b9-cc90-420d-adae-f60121d771d4-console-oauth-config\") pod \"console-6887d699d4-whtf5\" (UID: \"d3cc54b9-cc90-420d-adae-f60121d771d4\") " pod="openshift-console/console-6887d699d4-whtf5" Apr 21 07:04:02.890255 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.890102 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d3cc54b9-cc90-420d-adae-f60121d771d4-service-ca\") pod \"console-6887d699d4-whtf5\" (UID: \"d3cc54b9-cc90-420d-adae-f60121d771d4\") " pod="openshift-console/console-6887d699d4-whtf5" Apr 21 07:04:02.890255 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.890140 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kspr\" (UniqueName: \"kubernetes.io/projected/e67323c8-e3cc-4745-b61d-27f2a2459601-kube-api-access-7kspr\") pod \"image-registry-5977bd9744-9cf64\" (UID: \"e67323c8-e3cc-4745-b61d-27f2a2459601\") " pod="openshift-image-registry/image-registry-5977bd9744-9cf64" Apr 21 07:04:02.890255 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.890183 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w6frc\" (UniqueName: \"kubernetes.io/projected/6ce9ee09-1262-4278-8b0b-72dce2cc896a-kube-api-access-w6frc\") pod \"downloads-6bcc868b7-ncs5j\" (UID: \"6ce9ee09-1262-4278-8b0b-72dce2cc896a\") " pod="openshift-console/downloads-6bcc868b7-ncs5j" Apr 21 07:04:02.890255 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.890211 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3cc54b9-cc90-420d-adae-f60121d771d4-console-serving-cert\") pod \"console-6887d699d4-whtf5\" (UID: \"d3cc54b9-cc90-420d-adae-f60121d771d4\") " pod="openshift-console/console-6887d699d4-whtf5" Apr 21 07:04:02.890255 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.890235 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d3cc54b9-cc90-420d-adae-f60121d771d4-oauth-serving-cert\") pod \"console-6887d699d4-whtf5\" (UID: \"d3cc54b9-cc90-420d-adae-f60121d771d4\") " 
pod="openshift-console/console-6887d699d4-whtf5" Apr 21 07:04:02.890484 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.890263 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e67323c8-e3cc-4745-b61d-27f2a2459601-registry-certificates\") pod \"image-registry-5977bd9744-9cf64\" (UID: \"e67323c8-e3cc-4745-b61d-27f2a2459601\") " pod="openshift-image-registry/image-registry-5977bd9744-9cf64" Apr 21 07:04:02.890484 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.890292 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e67323c8-e3cc-4745-b61d-27f2a2459601-ca-trust-extracted\") pod \"image-registry-5977bd9744-9cf64\" (UID: \"e67323c8-e3cc-4745-b61d-27f2a2459601\") " pod="openshift-image-registry/image-registry-5977bd9744-9cf64" Apr 21 07:04:02.890484 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.890321 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e67323c8-e3cc-4745-b61d-27f2a2459601-bound-sa-token\") pod \"image-registry-5977bd9744-9cf64\" (UID: \"e67323c8-e3cc-4745-b61d-27f2a2459601\") " pod="openshift-image-registry/image-registry-5977bd9744-9cf64" Apr 21 07:04:02.890484 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.890399 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7wrmt\" (UniqueName: \"kubernetes.io/projected/2372ef7d-8c3c-4eba-8da5-912ad24032da-kube-api-access-7wrmt\") pod \"klusterlet-addon-workmgr-765c64c998-s4lnn\" (UID: \"2372ef7d-8c3c-4eba-8da5-912ad24032da\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-765c64c998-s4lnn" Apr 21 07:04:02.890484 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.890439 2581 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e67323c8-e3cc-4745-b61d-27f2a2459601-trusted-ca\") pod \"image-registry-5977bd9744-9cf64\" (UID: \"e67323c8-e3cc-4745-b61d-27f2a2459601\") " pod="openshift-image-registry/image-registry-5977bd9744-9cf64" Apr 21 07:04:02.890484 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.890475 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2372ef7d-8c3c-4eba-8da5-912ad24032da-tmp\") pod \"klusterlet-addon-workmgr-765c64c998-s4lnn\" (UID: \"2372ef7d-8c3c-4eba-8da5-912ad24032da\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-765c64c998-s4lnn" Apr 21 07:04:02.890862 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.890504 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/2372ef7d-8c3c-4eba-8da5-912ad24032da-klusterlet-config\") pod \"klusterlet-addon-workmgr-765c64c998-s4lnn\" (UID: \"2372ef7d-8c3c-4eba-8da5-912ad24032da\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-765c64c998-s4lnn" Apr 21 07:04:02.890862 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.890536 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d3cc54b9-cc90-420d-adae-f60121d771d4-console-config\") pod \"console-6887d699d4-whtf5\" (UID: \"d3cc54b9-cc90-420d-adae-f60121d771d4\") " pod="openshift-console/console-6887d699d4-whtf5" Apr 21 07:04:02.890862 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.890633 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e67323c8-e3cc-4745-b61d-27f2a2459601-registry-tls\") pod \"image-registry-5977bd9744-9cf64\" (UID: 
\"e67323c8-e3cc-4745-b61d-27f2a2459601\") " pod="openshift-image-registry/image-registry-5977bd9744-9cf64" Apr 21 07:04:02.890961 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.890916 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2372ef7d-8c3c-4eba-8da5-912ad24032da-tmp\") pod \"klusterlet-addon-workmgr-765c64c998-s4lnn\" (UID: \"2372ef7d-8c3c-4eba-8da5-912ad24032da\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-765c64c998-s4lnn" Apr 21 07:04:02.891060 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.891038 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d3cc54b9-cc90-420d-adae-f60121d771d4-oauth-serving-cert\") pod \"console-6887d699d4-whtf5\" (UID: \"d3cc54b9-cc90-420d-adae-f60121d771d4\") " pod="openshift-console/console-6887d699d4-whtf5" Apr 21 07:04:02.891126 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.891106 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d3cc54b9-cc90-420d-adae-f60121d771d4-service-ca\") pod \"console-6887d699d4-whtf5\" (UID: \"d3cc54b9-cc90-420d-adae-f60121d771d4\") " pod="openshift-console/console-6887d699d4-whtf5" Apr 21 07:04:02.891374 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.891353 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d3cc54b9-cc90-420d-adae-f60121d771d4-console-config\") pod \"console-6887d699d4-whtf5\" (UID: \"d3cc54b9-cc90-420d-adae-f60121d771d4\") " pod="openshift-console/console-6887d699d4-whtf5" Apr 21 07:04:02.893099 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.893069 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/d3cc54b9-cc90-420d-adae-f60121d771d4-console-oauth-config\") pod \"console-6887d699d4-whtf5\" (UID: \"d3cc54b9-cc90-420d-adae-f60121d771d4\") " pod="openshift-console/console-6887d699d4-whtf5" Apr 21 07:04:02.893579 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.893506 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/2372ef7d-8c3c-4eba-8da5-912ad24032da-klusterlet-config\") pod \"klusterlet-addon-workmgr-765c64c998-s4lnn\" (UID: \"2372ef7d-8c3c-4eba-8da5-912ad24032da\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-765c64c998-s4lnn" Apr 21 07:04:02.893945 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.893922 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3cc54b9-cc90-420d-adae-f60121d771d4-console-serving-cert\") pod \"console-6887d699d4-whtf5\" (UID: \"d3cc54b9-cc90-420d-adae-f60121d771d4\") " pod="openshift-console/console-6887d699d4-whtf5" Apr 21 07:04:02.900687 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.900652 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wrmt\" (UniqueName: \"kubernetes.io/projected/2372ef7d-8c3c-4eba-8da5-912ad24032da-kube-api-access-7wrmt\") pod \"klusterlet-addon-workmgr-765c64c998-s4lnn\" (UID: \"2372ef7d-8c3c-4eba-8da5-912ad24032da\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-765c64c998-s4lnn" Apr 21 07:04:02.900853 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.900837 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6sqv\" (UniqueName: \"kubernetes.io/projected/d3cc54b9-cc90-420d-adae-f60121d771d4-kube-api-access-b6sqv\") pod \"console-6887d699d4-whtf5\" (UID: \"d3cc54b9-cc90-420d-adae-f60121d771d4\") " pod="openshift-console/console-6887d699d4-whtf5" Apr 21 07:04:02.900992 ip-10-0-137-163 
kubenswrapper[2581]: I0421 07:04:02.900973 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6frc\" (UniqueName: \"kubernetes.io/projected/6ce9ee09-1262-4278-8b0b-72dce2cc896a-kube-api-access-w6frc\") pod \"downloads-6bcc868b7-ncs5j\" (UID: \"6ce9ee09-1262-4278-8b0b-72dce2cc896a\") " pod="openshift-console/downloads-6bcc868b7-ncs5j" Apr 21 07:04:02.956294 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.955880 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-ncs5j" Apr 21 07:04:02.961879 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.961774 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6887d699d4-whtf5" Apr 21 07:04:02.966707 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.966676 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-765c64c998-s4lnn" Apr 21 07:04:02.992207 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.992151 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e67323c8-e3cc-4745-b61d-27f2a2459601-image-registry-private-configuration\") pod \"image-registry-5977bd9744-9cf64\" (UID: \"e67323c8-e3cc-4745-b61d-27f2a2459601\") " pod="openshift-image-registry/image-registry-5977bd9744-9cf64" Apr 21 07:04:02.992393 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.992236 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7kspr\" (UniqueName: \"kubernetes.io/projected/e67323c8-e3cc-4745-b61d-27f2a2459601-kube-api-access-7kspr\") pod \"image-registry-5977bd9744-9cf64\" (UID: \"e67323c8-e3cc-4745-b61d-27f2a2459601\") " pod="openshift-image-registry/image-registry-5977bd9744-9cf64" Apr 21 07:04:02.992393 ip-10-0-137-163 
kubenswrapper[2581]: I0421 07:04:02.992369 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e67323c8-e3cc-4745-b61d-27f2a2459601-registry-certificates\") pod \"image-registry-5977bd9744-9cf64\" (UID: \"e67323c8-e3cc-4745-b61d-27f2a2459601\") " pod="openshift-image-registry/image-registry-5977bd9744-9cf64" Apr 21 07:04:02.992510 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.992403 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e67323c8-e3cc-4745-b61d-27f2a2459601-ca-trust-extracted\") pod \"image-registry-5977bd9744-9cf64\" (UID: \"e67323c8-e3cc-4745-b61d-27f2a2459601\") " pod="openshift-image-registry/image-registry-5977bd9744-9cf64" Apr 21 07:04:02.992510 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.992425 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e67323c8-e3cc-4745-b61d-27f2a2459601-bound-sa-token\") pod \"image-registry-5977bd9744-9cf64\" (UID: \"e67323c8-e3cc-4745-b61d-27f2a2459601\") " pod="openshift-image-registry/image-registry-5977bd9744-9cf64" Apr 21 07:04:02.992510 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.992458 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e67323c8-e3cc-4745-b61d-27f2a2459601-trusted-ca\") pod \"image-registry-5977bd9744-9cf64\" (UID: \"e67323c8-e3cc-4745-b61d-27f2a2459601\") " pod="openshift-image-registry/image-registry-5977bd9744-9cf64" Apr 21 07:04:02.992510 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.992486 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e67323c8-e3cc-4745-b61d-27f2a2459601-registry-tls\") pod 
\"image-registry-5977bd9744-9cf64\" (UID: \"e67323c8-e3cc-4745-b61d-27f2a2459601\") " pod="openshift-image-registry/image-registry-5977bd9744-9cf64" Apr 21 07:04:02.992724 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.992524 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e67323c8-e3cc-4745-b61d-27f2a2459601-installation-pull-secrets\") pod \"image-registry-5977bd9744-9cf64\" (UID: \"e67323c8-e3cc-4745-b61d-27f2a2459601\") " pod="openshift-image-registry/image-registry-5977bd9744-9cf64" Apr 21 07:04:02.994185 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.993202 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e67323c8-e3cc-4745-b61d-27f2a2459601-ca-trust-extracted\") pod \"image-registry-5977bd9744-9cf64\" (UID: \"e67323c8-e3cc-4745-b61d-27f2a2459601\") " pod="openshift-image-registry/image-registry-5977bd9744-9cf64" Apr 21 07:04:02.994185 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.993645 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e67323c8-e3cc-4745-b61d-27f2a2459601-trusted-ca\") pod \"image-registry-5977bd9744-9cf64\" (UID: \"e67323c8-e3cc-4745-b61d-27f2a2459601\") " pod="openshift-image-registry/image-registry-5977bd9744-9cf64" Apr 21 07:04:02.994185 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.994112 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e67323c8-e3cc-4745-b61d-27f2a2459601-registry-certificates\") pod \"image-registry-5977bd9744-9cf64\" (UID: \"e67323c8-e3cc-4745-b61d-27f2a2459601\") " pod="openshift-image-registry/image-registry-5977bd9744-9cf64" Apr 21 07:04:02.995708 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.995679 2581 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e67323c8-e3cc-4745-b61d-27f2a2459601-image-registry-private-configuration\") pod \"image-registry-5977bd9744-9cf64\" (UID: \"e67323c8-e3cc-4745-b61d-27f2a2459601\") " pod="openshift-image-registry/image-registry-5977bd9744-9cf64" Apr 21 07:04:02.995859 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.995840 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e67323c8-e3cc-4745-b61d-27f2a2459601-installation-pull-secrets\") pod \"image-registry-5977bd9744-9cf64\" (UID: \"e67323c8-e3cc-4745-b61d-27f2a2459601\") " pod="openshift-image-registry/image-registry-5977bd9744-9cf64" Apr 21 07:04:02.996287 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:02.996267 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e67323c8-e3cc-4745-b61d-27f2a2459601-registry-tls\") pod \"image-registry-5977bd9744-9cf64\" (UID: \"e67323c8-e3cc-4745-b61d-27f2a2459601\") " pod="openshift-image-registry/image-registry-5977bd9744-9cf64" Apr 21 07:04:03.006717 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:03.006662 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kspr\" (UniqueName: \"kubernetes.io/projected/e67323c8-e3cc-4745-b61d-27f2a2459601-kube-api-access-7kspr\") pod \"image-registry-5977bd9744-9cf64\" (UID: \"e67323c8-e3cc-4745-b61d-27f2a2459601\") " pod="openshift-image-registry/image-registry-5977bd9744-9cf64" Apr 21 07:04:03.007406 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:03.007384 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e67323c8-e3cc-4745-b61d-27f2a2459601-bound-sa-token\") pod \"image-registry-5977bd9744-9cf64\" (UID: \"e67323c8-e3cc-4745-b61d-27f2a2459601\") " 
pod="openshift-image-registry/image-registry-5977bd9744-9cf64"
Apr 21 07:04:03.028989 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:03.028930 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6g68p"]
Apr 21 07:04:03.034659 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:04:03.034588 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfba56b2f_24aa_46b9_b5c7_88cc67c2fb44.slice/crio-81a7c2050b688b66e029c955c7c1ba1b2a8745e86743cc5c3133549abe2338fc WatchSource:0}: Error finding container 81a7c2050b688b66e029c955c7c1ba1b2a8745e86743cc5c3133549abe2338fc: Status 404 returned error can't find the container with id 81a7c2050b688b66e029c955c7c1ba1b2a8745e86743cc5c3133549abe2338fc
Apr 21 07:04:03.055588 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:03.055404 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5977bd9744-9cf64"
Apr 21 07:04:03.135406 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:03.135308 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-ncs5j"]
Apr 21 07:04:03.210786 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:03.210753 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5977bd9744-9cf64"]
Apr 21 07:04:03.213482 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:04:03.213451 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode67323c8_e3cc_4745_b61d_27f2a2459601.slice/crio-be0d0dfe2b8a460a3f9d212c80c9be4b8431001917f2756f1901c697c3846e79 WatchSource:0}: Error finding container be0d0dfe2b8a460a3f9d212c80c9be4b8431001917f2756f1901c697c3846e79: Status 404 returned error can't find the container with id be0d0dfe2b8a460a3f9d212c80c9be4b8431001917f2756f1901c697c3846e79
Apr 21 07:04:03.295421 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:03.295385 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6g68p" event={"ID":"fba56b2f-24aa-46b9-b5c7-88cc67c2fb44","Type":"ContainerStarted","Data":"325249e7e00ba218166c4b0e6ef30c3131892530680043188484c64d0ce5aee1"}
Apr 21 07:04:03.295421 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:03.295428 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6g68p" event={"ID":"fba56b2f-24aa-46b9-b5c7-88cc67c2fb44","Type":"ContainerStarted","Data":"81a7c2050b688b66e029c955c7c1ba1b2a8745e86743cc5c3133549abe2338fc"}
Apr 21 07:04:03.296885 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:03.296819 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5977bd9744-9cf64" event={"ID":"e67323c8-e3cc-4745-b61d-27f2a2459601","Type":"ContainerStarted","Data":"42b374dc32f2d591de74ff2a272dc5a2d3925afb10d2029fdfd584650c4cb365"}
Apr 21 07:04:03.296885 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:03.296860 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5977bd9744-9cf64" event={"ID":"e67323c8-e3cc-4745-b61d-27f2a2459601","Type":"ContainerStarted","Data":"be0d0dfe2b8a460a3f9d212c80c9be4b8431001917f2756f1901c697c3846e79"}
Apr 21 07:04:03.297075 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:03.296961 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5977bd9744-9cf64"
Apr 21 07:04:03.297924 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:03.297885 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-ncs5j" event={"ID":"6ce9ee09-1262-4278-8b0b-72dce2cc896a","Type":"ContainerStarted","Data":"01d6299dec99996562ff47c840dab20e7f8d98e1892c6229fe64c6db900bcee8"}
Apr 21 07:04:03.319343 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:03.319272 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5977bd9744-9cf64" podStartSLOduration=1.3192559799999999 podStartE2EDuration="1.31925598s" podCreationTimestamp="2026-04-21 07:04:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:04:03.31762926 +0000 UTC m=+59.936963351" watchObservedRunningTime="2026-04-21 07:04:03.31925598 +0000 UTC m=+59.938590049"
Apr 21 07:04:03.345121 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:03.345089 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-765c64c998-s4lnn"]
Apr 21 07:04:03.348457 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:03.348421 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6887d699d4-whtf5"]
Apr 21 07:04:03.348632 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:04:03.347919 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2372ef7d_8c3c_4eba_8da5_912ad24032da.slice/crio-14a0bbba97ddadb395d4c955632d267abf394fa8520c889d10d048c52d17b02c WatchSource:0}: Error finding container 14a0bbba97ddadb395d4c955632d267abf394fa8520c889d10d048c52d17b02c: Status 404 returned error can't find the container with id 14a0bbba97ddadb395d4c955632d267abf394fa8520c889d10d048c52d17b02c
Apr 21 07:04:03.351928 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:04:03.351898 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3cc54b9_cc90_420d_adae_f60121d771d4.slice/crio-bbe36fa6f3708e101b3e5fd3961910cfd9d218ea0ca95fef4d3226c77b6c4dd8 WatchSource:0}: Error finding container bbe36fa6f3708e101b3e5fd3961910cfd9d218ea0ca95fef4d3226c77b6c4dd8: Status 404 returned error can't find the container with id bbe36fa6f3708e101b3e5fd3961910cfd9d218ea0ca95fef4d3226c77b6c4dd8
Apr 21 07:04:04.304198 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:04.304152 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6g68p" event={"ID":"fba56b2f-24aa-46b9-b5c7-88cc67c2fb44","Type":"ContainerStarted","Data":"d0a4383736eee9afce7a0307c9083ff0dd689ad7f88f8aa6995d7125f02ef128"}
Apr 21 07:04:04.305737 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:04.305707 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6887d699d4-whtf5" event={"ID":"d3cc54b9-cc90-420d-adae-f60121d771d4","Type":"ContainerStarted","Data":"bbe36fa6f3708e101b3e5fd3961910cfd9d218ea0ca95fef4d3226c77b6c4dd8"}
Apr 21 07:04:04.309624 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:04.309591 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-765c64c998-s4lnn" event={"ID":"2372ef7d-8c3c-4eba-8da5-912ad24032da","Type":"ContainerStarted","Data":"14a0bbba97ddadb395d4c955632d267abf394fa8520c889d10d048c52d17b02c"}
Apr 21 07:04:05.282288 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.282254 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-xtdjm"
Apr 21 07:04:05.685002 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.684966 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d546df6b8-p5485"]
Apr 21 07:04:05.687268 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.687242 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d546df6b8-p5485"
Apr 21 07:04:05.696786 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.696759 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 21 07:04:05.701883 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.701830 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d546df6b8-p5485"]
Apr 21 07:04:05.724344 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.724309 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8beae26a-560a-4d38-bc41-007709b4a3de-service-ca\") pod \"console-5d546df6b8-p5485\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") " pod="openshift-console/console-5d546df6b8-p5485"
Apr 21 07:04:05.724344 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.724356 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxsrs\" (UniqueName: \"kubernetes.io/projected/8beae26a-560a-4d38-bc41-007709b4a3de-kube-api-access-vxsrs\") pod \"console-5d546df6b8-p5485\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") " pod="openshift-console/console-5d546df6b8-p5485"
Apr 21 07:04:05.724631 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.724384 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8beae26a-560a-4d38-bc41-007709b4a3de-console-oauth-config\") pod \"console-5d546df6b8-p5485\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") " pod="openshift-console/console-5d546df6b8-p5485"
Apr 21 07:04:05.724631 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.724417 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8beae26a-560a-4d38-bc41-007709b4a3de-trusted-ca-bundle\") pod \"console-5d546df6b8-p5485\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") " pod="openshift-console/console-5d546df6b8-p5485"
Apr 21 07:04:05.724631 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.724438 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8beae26a-560a-4d38-bc41-007709b4a3de-console-config\") pod \"console-5d546df6b8-p5485\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") " pod="openshift-console/console-5d546df6b8-p5485"
Apr 21 07:04:05.724631 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.724510 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8beae26a-560a-4d38-bc41-007709b4a3de-console-serving-cert\") pod \"console-5d546df6b8-p5485\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") " pod="openshift-console/console-5d546df6b8-p5485"
Apr 21 07:04:05.724631 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.724531 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8beae26a-560a-4d38-bc41-007709b4a3de-oauth-serving-cert\") pod \"console-5d546df6b8-p5485\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") " pod="openshift-console/console-5d546df6b8-p5485"
Apr 21 07:04:05.825452 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.825408 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8beae26a-560a-4d38-bc41-007709b4a3de-console-serving-cert\") pod \"console-5d546df6b8-p5485\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") " pod="openshift-console/console-5d546df6b8-p5485"
Apr 21 07:04:05.825452 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.825458 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8beae26a-560a-4d38-bc41-007709b4a3de-oauth-serving-cert\") pod \"console-5d546df6b8-p5485\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") " pod="openshift-console/console-5d546df6b8-p5485"
Apr 21 07:04:05.825727 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.825505 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8beae26a-560a-4d38-bc41-007709b4a3de-service-ca\") pod \"console-5d546df6b8-p5485\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") " pod="openshift-console/console-5d546df6b8-p5485"
Apr 21 07:04:05.825727 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.825538 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vxsrs\" (UniqueName: \"kubernetes.io/projected/8beae26a-560a-4d38-bc41-007709b4a3de-kube-api-access-vxsrs\") pod \"console-5d546df6b8-p5485\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") " pod="openshift-console/console-5d546df6b8-p5485"
Apr 21 07:04:05.825727 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.825586 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8beae26a-560a-4d38-bc41-007709b4a3de-console-oauth-config\") pod \"console-5d546df6b8-p5485\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") " pod="openshift-console/console-5d546df6b8-p5485"
Apr 21 07:04:05.825727 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.825624 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8beae26a-560a-4d38-bc41-007709b4a3de-trusted-ca-bundle\") pod \"console-5d546df6b8-p5485\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") " pod="openshift-console/console-5d546df6b8-p5485"
Apr 21 07:04:05.825727 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.825658 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8beae26a-560a-4d38-bc41-007709b4a3de-console-config\") pod \"console-5d546df6b8-p5485\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") " pod="openshift-console/console-5d546df6b8-p5485"
Apr 21 07:04:05.827308 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.827267 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8beae26a-560a-4d38-bc41-007709b4a3de-service-ca\") pod \"console-5d546df6b8-p5485\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") " pod="openshift-console/console-5d546df6b8-p5485"
Apr 21 07:04:05.827444 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.827266 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8beae26a-560a-4d38-bc41-007709b4a3de-oauth-serving-cert\") pod \"console-5d546df6b8-p5485\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") " pod="openshift-console/console-5d546df6b8-p5485"
Apr 21 07:04:05.827444 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.827429 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8beae26a-560a-4d38-bc41-007709b4a3de-console-config\") pod \"console-5d546df6b8-p5485\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") " pod="openshift-console/console-5d546df6b8-p5485"
Apr 21 07:04:05.827740 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.827690 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8beae26a-560a-4d38-bc41-007709b4a3de-trusted-ca-bundle\") pod \"console-5d546df6b8-p5485\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") " pod="openshift-console/console-5d546df6b8-p5485"
Apr 21 07:04:05.829253 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.829207 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8beae26a-560a-4d38-bc41-007709b4a3de-console-serving-cert\") pod \"console-5d546df6b8-p5485\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") " pod="openshift-console/console-5d546df6b8-p5485"
Apr 21 07:04:05.830149 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.830121 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8beae26a-560a-4d38-bc41-007709b4a3de-console-oauth-config\") pod \"console-5d546df6b8-p5485\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") " pod="openshift-console/console-5d546df6b8-p5485"
Apr 21 07:04:05.835134 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:05.835109 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxsrs\" (UniqueName: \"kubernetes.io/projected/8beae26a-560a-4d38-bc41-007709b4a3de-kube-api-access-vxsrs\") pod \"console-5d546df6b8-p5485\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") " pod="openshift-console/console-5d546df6b8-p5485"
Apr 21 07:04:06.000623 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:06.000507 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d546df6b8-p5485"
Apr 21 07:04:06.775253 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:06.775216 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d546df6b8-p5485"]
Apr 21 07:04:06.778661 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:04:06.778627 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8beae26a_560a_4d38_bc41_007709b4a3de.slice/crio-b2a6024a7e20ac5b3069d2a741ea429c99dfccb0aa475f50aad44eedee6b3924 WatchSource:0}: Error finding container b2a6024a7e20ac5b3069d2a741ea429c99dfccb0aa475f50aad44eedee6b3924: Status 404 returned error can't find the container with id b2a6024a7e20ac5b3069d2a741ea429c99dfccb0aa475f50aad44eedee6b3924
Apr 21 07:04:07.322106 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:07.322069 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6g68p" event={"ID":"fba56b2f-24aa-46b9-b5c7-88cc67c2fb44","Type":"ContainerStarted","Data":"2f58f02a968abeb5633302db7a24dc4a90f240e5b150971babb0065f63a2d7d7"}
Apr 21 07:04:07.323880 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:07.323835 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6887d699d4-whtf5" event={"ID":"d3cc54b9-cc90-420d-adae-f60121d771d4","Type":"ContainerStarted","Data":"93e33d8f1c87232a8aab8fe07c946187fe268b3a7f3d15f2169a173d2ba71963"}
Apr 21 07:04:07.325311 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:07.325278 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d546df6b8-p5485" event={"ID":"8beae26a-560a-4d38-bc41-007709b4a3de","Type":"ContainerStarted","Data":"7bd668ea4866db49f15a3dd01e1910ce7182813ab0d0a5e5bbe91233b03c5cbc"}
Apr 21 07:04:07.325444 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:07.325313 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d546df6b8-p5485" event={"ID":"8beae26a-560a-4d38-bc41-007709b4a3de","Type":"ContainerStarted","Data":"b2a6024a7e20ac5b3069d2a741ea429c99dfccb0aa475f50aad44eedee6b3924"}
Apr 21 07:04:07.341667 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:07.341613 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6g68p" podStartSLOduration=1.8637511679999998 podStartE2EDuration="5.341596931s" podCreationTimestamp="2026-04-21 07:04:02 +0000 UTC" firstStartedPulling="2026-04-21 07:04:03.145478 +0000 UTC m=+59.764812071" lastFinishedPulling="2026-04-21 07:04:06.623323764 +0000 UTC m=+63.242657834" observedRunningTime="2026-04-21 07:04:07.341020485 +0000 UTC m=+63.960354575" watchObservedRunningTime="2026-04-21 07:04:07.341596931 +0000 UTC m=+63.960931015"
Apr 21 07:04:07.361005 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:07.360941 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d546df6b8-p5485" podStartSLOduration=2.360920791 podStartE2EDuration="2.360920791s" podCreationTimestamp="2026-04-21 07:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:04:07.36033818 +0000 UTC m=+63.979672269" watchObservedRunningTime="2026-04-21 07:04:07.360920791 +0000 UTC m=+63.980254882"
Apr 21 07:04:07.378295 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:07.378233 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6887d699d4-whtf5" podStartSLOduration=2.099930887 podStartE2EDuration="5.378212222s" podCreationTimestamp="2026-04-21 07:04:02 +0000 UTC" firstStartedPulling="2026-04-21 07:04:03.354199296 +0000 UTC m=+59.973533377" lastFinishedPulling="2026-04-21 07:04:06.632480631 +0000 UTC m=+63.251814712" observedRunningTime="2026-04-21 07:04:07.377336665 +0000 UTC m=+63.996670754" watchObservedRunningTime="2026-04-21 07:04:07.378212222 +0000 UTC m=+63.997546311"
Apr 21 07:04:08.646748 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:08.646704 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42b45a2d-c99c-40f2-97f6-2d31aff6854f-metrics-certs\") pod \"network-metrics-daemon-r4v6n\" (UID: \"42b45a2d-c99c-40f2-97f6-2d31aff6854f\") " pod="openshift-multus/network-metrics-daemon-r4v6n"
Apr 21 07:04:08.649531 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:08.649496 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 07:04:08.660434 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:08.660407 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42b45a2d-c99c-40f2-97f6-2d31aff6854f-metrics-certs\") pod \"network-metrics-daemon-r4v6n\" (UID: \"42b45a2d-c99c-40f2-97f6-2d31aff6854f\") " pod="openshift-multus/network-metrics-daemon-r4v6n"
Apr 21 07:04:08.748209 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:08.748171 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-766ml\" (UniqueName: \"kubernetes.io/projected/f00d904c-86da-4e00-801a-3bd1d7dbe5f4-kube-api-access-766ml\") pod \"network-check-target-4qpb2\" (UID: \"f00d904c-86da-4e00-801a-3bd1d7dbe5f4\") " pod="openshift-network-diagnostics/network-check-target-4qpb2"
Apr 21 07:04:08.751059 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:08.751017 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 07:04:08.761552 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:08.761521 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 07:04:08.772905 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:08.772866 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-766ml\" (UniqueName: \"kubernetes.io/projected/f00d904c-86da-4e00-801a-3bd1d7dbe5f4-kube-api-access-766ml\") pod \"network-check-target-4qpb2\" (UID: \"f00d904c-86da-4e00-801a-3bd1d7dbe5f4\") " pod="openshift-network-diagnostics/network-check-target-4qpb2"
Apr 21 07:04:08.788990 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:08.788908 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4vntj\""
Apr 21 07:04:08.794802 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:08.794773 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5rqg2\""
Apr 21 07:04:08.796916 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:08.796892 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-4qpb2"
Apr 21 07:04:08.802735 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:08.802701 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-r4v6n"
Apr 21 07:04:08.951880 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:08.951826 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-4qpb2"]
Apr 21 07:04:08.955122 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:04:08.955081 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf00d904c_86da_4e00_801a_3bd1d7dbe5f4.slice/crio-7b6c512edfcad677cf464ce3e1e6d45b89370024b5a4c3879f8194d1b127fa2d WatchSource:0}: Error finding container 7b6c512edfcad677cf464ce3e1e6d45b89370024b5a4c3879f8194d1b127fa2d: Status 404 returned error can't find the container with id 7b6c512edfcad677cf464ce3e1e6d45b89370024b5a4c3879f8194d1b127fa2d
Apr 21 07:04:08.973191 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:08.973162 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-r4v6n"]
Apr 21 07:04:08.988584 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:04:08.988518 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42b45a2d_c99c_40f2_97f6_2d31aff6854f.slice/crio-d56e284cf154bb0f7eee4e92681c48cf5a33e5d105e93b99650a09da8b3e1201 WatchSource:0}: Error finding container d56e284cf154bb0f7eee4e92681c48cf5a33e5d105e93b99650a09da8b3e1201: Status 404 returned error can't find the container with id d56e284cf154bb0f7eee4e92681c48cf5a33e5d105e93b99650a09da8b3e1201
Apr 21 07:04:09.332206 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:09.332158 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-r4v6n" event={"ID":"42b45a2d-c99c-40f2-97f6-2d31aff6854f","Type":"ContainerStarted","Data":"d56e284cf154bb0f7eee4e92681c48cf5a33e5d105e93b99650a09da8b3e1201"}
Apr 21 07:04:09.333351 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:09.333325 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4qpb2" event={"ID":"f00d904c-86da-4e00-801a-3bd1d7dbe5f4","Type":"ContainerStarted","Data":"7b6c512edfcad677cf464ce3e1e6d45b89370024b5a4c3879f8194d1b127fa2d"}
Apr 21 07:04:11.341532 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:11.341485 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-r4v6n" event={"ID":"42b45a2d-c99c-40f2-97f6-2d31aff6854f","Type":"ContainerStarted","Data":"7c9e6858a09e20154476025530cb9b0b2d6cbd6cfc8cf777899be5ad6bdcc83f"}
Apr 21 07:04:11.341532 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:11.341540 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-r4v6n" event={"ID":"42b45a2d-c99c-40f2-97f6-2d31aff6854f","Type":"ContainerStarted","Data":"712e68c1556052fe00681c0556d0674396907b8f0b0b7b5f3dc7b8da190b689a"}
Apr 21 07:04:11.359495 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:11.359429 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-r4v6n" podStartSLOduration=65.669323736 podStartE2EDuration="1m7.359407114s" podCreationTimestamp="2026-04-21 07:03:04 +0000 UTC" firstStartedPulling="2026-04-21 07:04:08.990925014 +0000 UTC m=+65.610259093" lastFinishedPulling="2026-04-21 07:04:10.681008387 +0000 UTC m=+67.300342471" observedRunningTime="2026-04-21 07:04:11.357380889 +0000 UTC m=+67.976715003" watchObservedRunningTime="2026-04-21 07:04:11.359407114 +0000 UTC m=+67.978741205"
Apr 21 07:04:12.962479 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:12.962438 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6887d699d4-whtf5"
Apr 21 07:04:12.962985 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:12.962490 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6887d699d4-whtf5"
Apr 21 07:04:12.968694 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:12.968665 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6887d699d4-whtf5"
Apr 21 07:04:13.349488 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:13.349428 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-4qpb2" event={"ID":"f00d904c-86da-4e00-801a-3bd1d7dbe5f4","Type":"ContainerStarted","Data":"2841d96f951aaa2c5276420cc33e9904f9f92d7dde3e2a0ba2d34ad3bcc2f52d"}
Apr 21 07:04:13.354077 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:13.354049 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6887d699d4-whtf5"
Apr 21 07:04:13.376836 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:13.376770 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-4qpb2" podStartSLOduration=66.007938585 podStartE2EDuration="1m9.376752563s" podCreationTimestamp="2026-04-21 07:03:04 +0000 UTC" firstStartedPulling="2026-04-21 07:04:08.957946889 +0000 UTC m=+65.577280971" lastFinishedPulling="2026-04-21 07:04:12.326760874 +0000 UTC m=+68.946094949" observedRunningTime="2026-04-21 07:04:13.375629142 +0000 UTC m=+69.994963232" watchObservedRunningTime="2026-04-21 07:04:13.376752563 +0000 UTC m=+69.996086652"
Apr 21 07:04:14.352640 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:14.352604 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-4qpb2"
Apr 21 07:04:14.929741 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:14.929699 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-zf5cc"]
Apr 21 07:04:14.934781 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:14.934749 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zf5cc"
Apr 21 07:04:14.937850 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:14.937817 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 21 07:04:14.938003 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:14.937849 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 21 07:04:14.938003 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:14.937938 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 21 07:04:14.938003 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:14.937958 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 21 07:04:14.938716 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:14.938698 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 21 07:04:14.938952 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:14.938934 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-p7kbz\""
Apr 21 07:04:14.945161 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:14.945137 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-zf5cc"]
Apr 21 07:04:14.947179 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:14.947154 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zxf4k"]
Apr 21 07:04:14.950597 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:14.950552 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-5dkkb"]
Apr 21 07:04:14.950759 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:14.950739 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zxf4k"
Apr 21 07:04:14.954201 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:14.953721 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 21 07:04:14.954201 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:14.953762 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-j5sr4\""
Apr 21 07:04:14.954201 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:14.954001 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 21 07:04:14.954201 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:14.954004 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 21 07:04:14.954467 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:14.954250 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-5dkkb"
Apr 21 07:04:14.958166 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:14.957614 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-mfxt4\""
Apr 21 07:04:14.958166 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:14.957879 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 21 07:04:14.958166 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:14.958011 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 21 07:04:14.958401 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:14.958303 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 21 07:04:14.973404 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:14.973371 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-5dkkb"]
Apr 21 07:04:15.002528 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.002487 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/15a09b94-ea25-4ffe-8eaf-9ed2025b01a6-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-zf5cc\" (UID: \"15a09b94-ea25-4ffe-8eaf-9ed2025b01a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zf5cc"
Apr 21 07:04:15.002528 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.002522 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28ntt\" (UniqueName: \"kubernetes.io/projected/968c02a1-912f-4c82-8093-ef9cc71fdca3-kube-api-access-28ntt\") pod \"kube-state-metrics-69db897b98-5dkkb\" (UID: \"968c02a1-912f-4c82-8093-ef9cc71fdca3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dkkb"
Apr 21 07:04:15.002777 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.002544 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-node-exporter-textfile\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k"
Apr 21 07:04:15.002777 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.002629 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b72bh\" (UniqueName: \"kubernetes.io/projected/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-kube-api-access-b72bh\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k"
Apr 21 07:04:15.002777 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.002717 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/15a09b94-ea25-4ffe-8eaf-9ed2025b01a6-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-zf5cc\" (UID: \"15a09b94-ea25-4ffe-8eaf-9ed2025b01a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zf5cc"
Apr 21 07:04:15.002777 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.002751 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-node-exporter-wtmp\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k"
Apr 21 07:04:15.002939 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.002791 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-root\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k"
Apr 21 07:04:15.002939 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.002816 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j42g\" (UniqueName: \"kubernetes.io/projected/15a09b94-ea25-4ffe-8eaf-9ed2025b01a6-kube-api-access-2j42g\") pod \"openshift-state-metrics-9d44df66c-zf5cc\" (UID: \"15a09b94-ea25-4ffe-8eaf-9ed2025b01a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zf5cc"
Apr 21 07:04:15.002939 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.002851 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/968c02a1-912f-4c82-8093-ef9cc71fdca3-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-5dkkb\" (UID: \"968c02a1-912f-4c82-8093-ef9cc71fdca3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dkkb"
Apr 21 07:04:15.002939 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.002895 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-node-exporter-accelerators-collector-config\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k"
Apr 21 07:04:15.003135 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.002945 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-sys\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k"
Apr 21 07:04:15.003135 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.002963 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15a09b94-ea25-4ffe-8eaf-9ed2025b01a6-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-zf5cc\" (UID: \"15a09b94-ea25-4ffe-8eaf-9ed2025b01a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zf5cc"
Apr 21 07:04:15.003135 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.002995 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/968c02a1-912f-4c82-8093-ef9cc71fdca3-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-5dkkb\" (UID: \"968c02a1-912f-4c82-8093-ef9cc71fdca3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dkkb"
Apr 21 07:04:15.003135 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.003020 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/968c02a1-912f-4c82-8093-ef9cc71fdca3-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-5dkkb\" (UID: \"968c02a1-912f-4c82-8093-ef9cc71fdca3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dkkb"
Apr 21 07:04:15.003135 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.003045 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName:
\"kubernetes.io/secret/968c02a1-912f-4c82-8093-ef9cc71fdca3-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-5dkkb\" (UID: \"968c02a1-912f-4c82-8093-ef9cc71fdca3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dkkb" Apr 21 07:04:15.003135 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.003083 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-metrics-client-ca\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k" Apr 21 07:04:15.003135 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.003124 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k" Apr 21 07:04:15.003490 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.003156 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-node-exporter-tls\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k" Apr 21 07:04:15.003490 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.003192 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/968c02a1-912f-4c82-8093-ef9cc71fdca3-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-5dkkb\" (UID: \"968c02a1-912f-4c82-8093-ef9cc71fdca3\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-5dkkb" Apr 21 07:04:15.103956 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.103909 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/968c02a1-912f-4c82-8093-ef9cc71fdca3-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-5dkkb\" (UID: \"968c02a1-912f-4c82-8093-ef9cc71fdca3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dkkb" Apr 21 07:04:15.104161 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.103995 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/15a09b94-ea25-4ffe-8eaf-9ed2025b01a6-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-zf5cc\" (UID: \"15a09b94-ea25-4ffe-8eaf-9ed2025b01a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zf5cc" Apr 21 07:04:15.104161 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.104027 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28ntt\" (UniqueName: \"kubernetes.io/projected/968c02a1-912f-4c82-8093-ef9cc71fdca3-kube-api-access-28ntt\") pod \"kube-state-metrics-69db897b98-5dkkb\" (UID: \"968c02a1-912f-4c82-8093-ef9cc71fdca3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dkkb" Apr 21 07:04:15.104161 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.104056 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-node-exporter-textfile\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k" Apr 21 07:04:15.104161 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.104091 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-b72bh\" (UniqueName: \"kubernetes.io/projected/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-kube-api-access-b72bh\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k" Apr 21 07:04:15.104161 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.104122 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/15a09b94-ea25-4ffe-8eaf-9ed2025b01a6-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-zf5cc\" (UID: \"15a09b94-ea25-4ffe-8eaf-9ed2025b01a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zf5cc" Apr 21 07:04:15.104161 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.104149 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-node-exporter-wtmp\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k" Apr 21 07:04:15.104474 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.104287 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-node-exporter-wtmp\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k" Apr 21 07:04:15.104474 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.104324 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-root\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k" Apr 21 07:04:15.104474 ip-10-0-137-163 
kubenswrapper[2581]: I0421 07:04:15.104366 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2j42g\" (UniqueName: \"kubernetes.io/projected/15a09b94-ea25-4ffe-8eaf-9ed2025b01a6-kube-api-access-2j42g\") pod \"openshift-state-metrics-9d44df66c-zf5cc\" (UID: \"15a09b94-ea25-4ffe-8eaf-9ed2025b01a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zf5cc" Apr 21 07:04:15.104474 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.104398 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/968c02a1-912f-4c82-8093-ef9cc71fdca3-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-5dkkb\" (UID: \"968c02a1-912f-4c82-8093-ef9cc71fdca3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dkkb" Apr 21 07:04:15.104474 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.104430 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-node-exporter-textfile\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k" Apr 21 07:04:15.106040 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.104481 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-root\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k" Apr 21 07:04:15.106040 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.104432 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-node-exporter-accelerators-collector-config\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k" Apr 21 07:04:15.106040 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.104691 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-sys\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k" Apr 21 07:04:15.106040 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.104726 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15a09b94-ea25-4ffe-8eaf-9ed2025b01a6-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-zf5cc\" (UID: \"15a09b94-ea25-4ffe-8eaf-9ed2025b01a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zf5cc" Apr 21 07:04:15.106040 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.104765 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/968c02a1-912f-4c82-8093-ef9cc71fdca3-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-5dkkb\" (UID: \"968c02a1-912f-4c82-8093-ef9cc71fdca3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dkkb" Apr 21 07:04:15.106040 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.104784 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/968c02a1-912f-4c82-8093-ef9cc71fdca3-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-5dkkb\" (UID: \"968c02a1-912f-4c82-8093-ef9cc71fdca3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dkkb" Apr 21 07:04:15.106040 ip-10-0-137-163 
kubenswrapper[2581]: I0421 07:04:15.104793 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/968c02a1-912f-4c82-8093-ef9cc71fdca3-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-5dkkb\" (UID: \"968c02a1-912f-4c82-8093-ef9cc71fdca3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dkkb" Apr 21 07:04:15.106040 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.104805 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-sys\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k" Apr 21 07:04:15.106040 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.104820 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/968c02a1-912f-4c82-8093-ef9cc71fdca3-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-5dkkb\" (UID: \"968c02a1-912f-4c82-8093-ef9cc71fdca3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dkkb" Apr 21 07:04:15.106040 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.104868 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-metrics-client-ca\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k" Apr 21 07:04:15.106040 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.104899 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-node-exporter-kube-rbac-proxy-config\") pod 
\"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k" Apr 21 07:04:15.106040 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.104943 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-node-exporter-tls\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k" Apr 21 07:04:15.106040 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.105136 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/968c02a1-912f-4c82-8093-ef9cc71fdca3-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-5dkkb\" (UID: \"968c02a1-912f-4c82-8093-ef9cc71fdca3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dkkb" Apr 21 07:04:15.106040 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.104978 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-node-exporter-accelerators-collector-config\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k" Apr 21 07:04:15.106040 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.105180 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/968c02a1-912f-4c82-8093-ef9cc71fdca3-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-5dkkb\" (UID: \"968c02a1-912f-4c82-8093-ef9cc71fdca3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dkkb" Apr 21 07:04:15.106815 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.106093 
2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-metrics-client-ca\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k" Apr 21 07:04:15.106815 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.106229 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15a09b94-ea25-4ffe-8eaf-9ed2025b01a6-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-zf5cc\" (UID: \"15a09b94-ea25-4ffe-8eaf-9ed2025b01a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zf5cc" Apr 21 07:04:15.108694 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.108662 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/15a09b94-ea25-4ffe-8eaf-9ed2025b01a6-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-zf5cc\" (UID: \"15a09b94-ea25-4ffe-8eaf-9ed2025b01a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zf5cc" Apr 21 07:04:15.108882 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.108754 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/15a09b94-ea25-4ffe-8eaf-9ed2025b01a6-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-zf5cc\" (UID: \"15a09b94-ea25-4ffe-8eaf-9ed2025b01a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zf5cc" Apr 21 07:04:15.108959 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.108900 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/968c02a1-912f-4c82-8093-ef9cc71fdca3-kube-state-metrics-tls\") pod 
\"kube-state-metrics-69db897b98-5dkkb\" (UID: \"968c02a1-912f-4c82-8093-ef9cc71fdca3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dkkb" Apr 21 07:04:15.109014 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.108977 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-node-exporter-tls\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k" Apr 21 07:04:15.109200 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.109172 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/968c02a1-912f-4c82-8093-ef9cc71fdca3-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-5dkkb\" (UID: \"968c02a1-912f-4c82-8093-ef9cc71fdca3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dkkb" Apr 21 07:04:15.109487 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.109466 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k" Apr 21 07:04:15.113177 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.113156 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b72bh\" (UniqueName: \"kubernetes.io/projected/93a318c1-6dc6-41e2-8ce6-10df3a949d4c-kube-api-access-b72bh\") pod \"node-exporter-zxf4k\" (UID: \"93a318c1-6dc6-41e2-8ce6-10df3a949d4c\") " pod="openshift-monitoring/node-exporter-zxf4k" Apr 21 07:04:15.113177 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.113179 2581 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-2j42g\" (UniqueName: \"kubernetes.io/projected/15a09b94-ea25-4ffe-8eaf-9ed2025b01a6-kube-api-access-2j42g\") pod \"openshift-state-metrics-9d44df66c-zf5cc\" (UID: \"15a09b94-ea25-4ffe-8eaf-9ed2025b01a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zf5cc" Apr 21 07:04:15.113721 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.113671 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-28ntt\" (UniqueName: \"kubernetes.io/projected/968c02a1-912f-4c82-8093-ef9cc71fdca3-kube-api-access-28ntt\") pod \"kube-state-metrics-69db897b98-5dkkb\" (UID: \"968c02a1-912f-4c82-8093-ef9cc71fdca3\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-5dkkb" Apr 21 07:04:15.247628 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.247524 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zf5cc" Apr 21 07:04:15.264189 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.264153 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zxf4k" Apr 21 07:04:15.271385 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:15.270940 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-5dkkb" Apr 21 07:04:16.001197 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.001156 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d546df6b8-p5485" Apr 21 07:04:16.001705 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.001288 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5d546df6b8-p5485" Apr 21 07:04:16.007720 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.007690 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d546df6b8-p5485" Apr 21 07:04:16.009628 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.009601 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 07:04:16.015471 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.015446 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.019930 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.019909 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 07:04:16.020257 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.020236 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 07:04:16.020399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.020317 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 07:04:16.020399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.020353 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 21 07:04:16.020399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.020369 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 07:04:16.020581 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.020422 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-nj9r5\"" Apr 21 07:04:16.020581 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.020508 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 21 07:04:16.020716 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.020645 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 07:04:16.020716 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.020702 2581 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 21 07:04:16.020820 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.020770 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 07:04:16.034946 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.034911 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 07:04:16.115486 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.115450 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/796f028f-061e-4efc-93ed-97f5cd3a0802-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.115683 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.115505 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck8j7\" (UniqueName: \"kubernetes.io/projected/796f028f-061e-4efc-93ed-97f5cd3a0802-kube-api-access-ck8j7\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.115683 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.115666 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/796f028f-061e-4efc-93ed-97f5cd3a0802-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.115790 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.115710 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/796f028f-061e-4efc-93ed-97f5cd3a0802-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.115790 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.115740 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/796f028f-061e-4efc-93ed-97f5cd3a0802-config-volume\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.115869 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.115807 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/796f028f-061e-4efc-93ed-97f5cd3a0802-tls-assets\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.115869 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.115858 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/796f028f-061e-4efc-93ed-97f5cd3a0802-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.115961 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.115884 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/796f028f-061e-4efc-93ed-97f5cd3a0802-config-out\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.115961 
ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.115926 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/796f028f-061e-4efc-93ed-97f5cd3a0802-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.116056 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.115979 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/796f028f-061e-4efc-93ed-97f5cd3a0802-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.116056 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.116016 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/796f028f-061e-4efc-93ed-97f5cd3a0802-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.116152 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.116053 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/796f028f-061e-4efc-93ed-97f5cd3a0802-web-config\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.116152 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.116079 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/796f028f-061e-4efc-93ed-97f5cd3a0802-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.217363 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.217311 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/796f028f-061e-4efc-93ed-97f5cd3a0802-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.217363 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.217364 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/796f028f-061e-4efc-93ed-97f5cd3a0802-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.217657 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.217394 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/796f028f-061e-4efc-93ed-97f5cd3a0802-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.217657 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.217426 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/796f028f-061e-4efc-93ed-97f5cd3a0802-web-config\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.217657 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.217449 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/796f028f-061e-4efc-93ed-97f5cd3a0802-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.217657 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.217494 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/796f028f-061e-4efc-93ed-97f5cd3a0802-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.217657 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.217527 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ck8j7\" (UniqueName: \"kubernetes.io/projected/796f028f-061e-4efc-93ed-97f5cd3a0802-kube-api-access-ck8j7\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.217657 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.217599 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/796f028f-061e-4efc-93ed-97f5cd3a0802-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.217657 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:04:16.217619 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/796f028f-061e-4efc-93ed-97f5cd3a0802-alertmanager-trusted-ca-bundle podName:796f028f-061e-4efc-93ed-97f5cd3a0802 nodeName:}" failed. No retries permitted until 2026-04-21 07:04:16.717592921 +0000 UTC m=+73.336926993 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/796f028f-061e-4efc-93ed-97f5cd3a0802-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "796f028f-061e-4efc-93ed-97f5cd3a0802") : configmap references non-existent config key: ca-bundle.crt Apr 21 07:04:16.218000 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.217667 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/796f028f-061e-4efc-93ed-97f5cd3a0802-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.218000 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.217698 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/796f028f-061e-4efc-93ed-97f5cd3a0802-config-volume\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.218000 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.217749 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/796f028f-061e-4efc-93ed-97f5cd3a0802-tls-assets\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.218000 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.217788 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/796f028f-061e-4efc-93ed-97f5cd3a0802-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 
07:04:16.218000 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.217819 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/796f028f-061e-4efc-93ed-97f5cd3a0802-config-out\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.219051 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.218580 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/796f028f-061e-4efc-93ed-97f5cd3a0802-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.219051 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.218853 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/796f028f-061e-4efc-93ed-97f5cd3a0802-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.221742 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.221716 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/796f028f-061e-4efc-93ed-97f5cd3a0802-tls-assets\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.221984 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.221947 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/796f028f-061e-4efc-93ed-97f5cd3a0802-config-out\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.222235 ip-10-0-137-163 
kubenswrapper[2581]: I0421 07:04:16.222216 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/796f028f-061e-4efc-93ed-97f5cd3a0802-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.222404 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.222350 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/796f028f-061e-4efc-93ed-97f5cd3a0802-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.222633 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.222609 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/796f028f-061e-4efc-93ed-97f5cd3a0802-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.222737 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.222630 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/796f028f-061e-4efc-93ed-97f5cd3a0802-web-config\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.223947 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.223921 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/796f028f-061e-4efc-93ed-97f5cd3a0802-config-volume\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.224083 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.224061 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/796f028f-061e-4efc-93ed-97f5cd3a0802-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.225243 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.225218 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/796f028f-061e-4efc-93ed-97f5cd3a0802-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.229712 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.229686 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck8j7\" (UniqueName: \"kubernetes.io/projected/796f028f-061e-4efc-93ed-97f5cd3a0802-kube-api-access-ck8j7\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.364968 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.364931 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d546df6b8-p5485" Apr 21 07:04:16.413146 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.413109 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6887d699d4-whtf5"] Apr 21 07:04:16.723208 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.723114 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/796f028f-061e-4efc-93ed-97f5cd3a0802-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.724049 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.724022 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/796f028f-061e-4efc-93ed-97f5cd3a0802-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"796f028f-061e-4efc-93ed-97f5cd3a0802\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:16.926637 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:16.926595 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:04:17.157053 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.157009 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk"] Apr 21 07:04:17.163214 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.163180 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:17.167211 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.167175 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 21 07:04:17.167398 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.167372 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-fc00j5n6nb35g\"" Apr 21 07:04:17.167743 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.167719 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 21 07:04:17.167825 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.167751 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 21 07:04:17.167825 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.167785 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 21 07:04:17.167987 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.167970 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 21 07:04:17.168230 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.168210 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-bz4gk\"" Apr 21 07:04:17.186444 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.186408 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk"] Apr 21 07:04:17.228324 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.228276 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-8dcbc7c47-wc8fk\" (UID: \"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719\") " pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:17.228324 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.228328 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4b4mt\" (UniqueName: \"kubernetes.io/projected/ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719-kube-api-access-4b4mt\") pod \"thanos-querier-8dcbc7c47-wc8fk\" (UID: \"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719\") " pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:17.228598 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.228395 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-8dcbc7c47-wc8fk\" (UID: \"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719\") " pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:17.228598 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.228448 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-8dcbc7c47-wc8fk\" (UID: \"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719\") " pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:17.228598 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.228500 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-8dcbc7c47-wc8fk\" (UID: \"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719\") " pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:17.228598 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.228529 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719-secret-thanos-querier-tls\") pod \"thanos-querier-8dcbc7c47-wc8fk\" (UID: \"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719\") " pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:17.228598 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.228584 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719-metrics-client-ca\") pod \"thanos-querier-8dcbc7c47-wc8fk\" (UID: \"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719\") " pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:17.228853 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.228638 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719-secret-grpc-tls\") pod \"thanos-querier-8dcbc7c47-wc8fk\" (UID: \"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719\") " pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:17.329802 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.329759 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-8dcbc7c47-wc8fk\" (UID: \"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719\") " pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:17.329802 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.329810 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-8dcbc7c47-wc8fk\" (UID: \"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719\") " pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:17.330044 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.329865 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-8dcbc7c47-wc8fk\" (UID: \"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719\") " pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:17.330044 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.329907 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719-secret-thanos-querier-tls\") pod \"thanos-querier-8dcbc7c47-wc8fk\" (UID: \"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719\") " pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:17.330044 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.329939 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719-metrics-client-ca\") pod \"thanos-querier-8dcbc7c47-wc8fk\" (UID: 
\"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719\") " pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:17.330044 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.329955 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719-secret-grpc-tls\") pod \"thanos-querier-8dcbc7c47-wc8fk\" (UID: \"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719\") " pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:17.330044 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.329983 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-8dcbc7c47-wc8fk\" (UID: \"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719\") " pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:17.330294 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.330024 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4b4mt\" (UniqueName: \"kubernetes.io/projected/ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719-kube-api-access-4b4mt\") pod \"thanos-querier-8dcbc7c47-wc8fk\" (UID: \"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719\") " pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:17.331144 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.331112 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719-metrics-client-ca\") pod \"thanos-querier-8dcbc7c47-wc8fk\" (UID: \"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719\") " pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:17.333214 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.333186 2581 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-8dcbc7c47-wc8fk\" (UID: \"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719\") " pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:17.333336 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.333262 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-8dcbc7c47-wc8fk\" (UID: \"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719\") " pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:17.333391 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.333346 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-8dcbc7c47-wc8fk\" (UID: \"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719\") " pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:17.333391 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.333367 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-8dcbc7c47-wc8fk\" (UID: \"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719\") " pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:17.333745 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.333724 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: 
\"kubernetes.io/secret/ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719-secret-thanos-querier-tls\") pod \"thanos-querier-8dcbc7c47-wc8fk\" (UID: \"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719\") " pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:17.334326 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.334306 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719-secret-grpc-tls\") pod \"thanos-querier-8dcbc7c47-wc8fk\" (UID: \"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719\") " pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:17.339938 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.339907 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4b4mt\" (UniqueName: \"kubernetes.io/projected/ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719-kube-api-access-4b4mt\") pod \"thanos-querier-8dcbc7c47-wc8fk\" (UID: \"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719\") " pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:17.476422 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:17.476321 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:19.359168 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.359129 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-59bf57b49c-nsmxm"] Apr 21 07:04:19.364416 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.364389 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm" Apr 21 07:04:19.367141 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.367115 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 21 07:04:19.368527 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.368454 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 07:04:19.368527 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.368508 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 21 07:04:19.368740 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.368530 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-6o1oh527uq8gk\"" Apr 21 07:04:19.368740 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.368692 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-vhmbs\"" Apr 21 07:04:19.368874 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.368752 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 21 07:04:19.374439 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.373709 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-59bf57b49c-nsmxm"] Apr 21 07:04:19.450735 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.450671 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/363b86d0-bdf4-44a2-96f5-80c829a4f375-secret-metrics-server-client-certs\") pod \"metrics-server-59bf57b49c-nsmxm\" (UID: 
\"363b86d0-bdf4-44a2-96f5-80c829a4f375\") " pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm"
Apr 21 07:04:19.450735 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.450726 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/363b86d0-bdf4-44a2-96f5-80c829a4f375-metrics-server-audit-profiles\") pod \"metrics-server-59bf57b49c-nsmxm\" (UID: \"363b86d0-bdf4-44a2-96f5-80c829a4f375\") " pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm"
Apr 21 07:04:19.450988 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.450749 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkln7\" (UniqueName: \"kubernetes.io/projected/363b86d0-bdf4-44a2-96f5-80c829a4f375-kube-api-access-wkln7\") pod \"metrics-server-59bf57b49c-nsmxm\" (UID: \"363b86d0-bdf4-44a2-96f5-80c829a4f375\") " pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm"
Apr 21 07:04:19.450988 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.450826 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/363b86d0-bdf4-44a2-96f5-80c829a4f375-audit-log\") pod \"metrics-server-59bf57b49c-nsmxm\" (UID: \"363b86d0-bdf4-44a2-96f5-80c829a4f375\") " pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm"
Apr 21 07:04:19.450988 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.450879 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/363b86d0-bdf4-44a2-96f5-80c829a4f375-client-ca-bundle\") pod \"metrics-server-59bf57b49c-nsmxm\" (UID: \"363b86d0-bdf4-44a2-96f5-80c829a4f375\") " pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm"
Apr 21 07:04:19.450988 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.450914 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/363b86d0-bdf4-44a2-96f5-80c829a4f375-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-59bf57b49c-nsmxm\" (UID: \"363b86d0-bdf4-44a2-96f5-80c829a4f375\") " pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm"
Apr 21 07:04:19.450988 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.450953 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/363b86d0-bdf4-44a2-96f5-80c829a4f375-secret-metrics-server-tls\") pod \"metrics-server-59bf57b49c-nsmxm\" (UID: \"363b86d0-bdf4-44a2-96f5-80c829a4f375\") " pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm"
Apr 21 07:04:19.551707 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.551666 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/363b86d0-bdf4-44a2-96f5-80c829a4f375-audit-log\") pod \"metrics-server-59bf57b49c-nsmxm\" (UID: \"363b86d0-bdf4-44a2-96f5-80c829a4f375\") " pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm"
Apr 21 07:04:19.551880 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.551741 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/363b86d0-bdf4-44a2-96f5-80c829a4f375-client-ca-bundle\") pod \"metrics-server-59bf57b49c-nsmxm\" (UID: \"363b86d0-bdf4-44a2-96f5-80c829a4f375\") " pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm"
Apr 21 07:04:19.551880 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.551771 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/363b86d0-bdf4-44a2-96f5-80c829a4f375-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-59bf57b49c-nsmxm\" (UID: \"363b86d0-bdf4-44a2-96f5-80c829a4f375\") " pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm"
Apr 21 07:04:19.551880 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.551801 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/363b86d0-bdf4-44a2-96f5-80c829a4f375-secret-metrics-server-tls\") pod \"metrics-server-59bf57b49c-nsmxm\" (UID: \"363b86d0-bdf4-44a2-96f5-80c829a4f375\") " pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm"
Apr 21 07:04:19.551880 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.551852 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/363b86d0-bdf4-44a2-96f5-80c829a4f375-secret-metrics-server-client-certs\") pod \"metrics-server-59bf57b49c-nsmxm\" (UID: \"363b86d0-bdf4-44a2-96f5-80c829a4f375\") " pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm"
Apr 21 07:04:19.552092 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.551882 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/363b86d0-bdf4-44a2-96f5-80c829a4f375-metrics-server-audit-profiles\") pod \"metrics-server-59bf57b49c-nsmxm\" (UID: \"363b86d0-bdf4-44a2-96f5-80c829a4f375\") " pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm"
Apr 21 07:04:19.552092 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.551913 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkln7\" (UniqueName: \"kubernetes.io/projected/363b86d0-bdf4-44a2-96f5-80c829a4f375-kube-api-access-wkln7\") pod \"metrics-server-59bf57b49c-nsmxm\" (UID: \"363b86d0-bdf4-44a2-96f5-80c829a4f375\") " pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm"
Apr 21 07:04:19.552171 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.552145 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/363b86d0-bdf4-44a2-96f5-80c829a4f375-audit-log\") pod \"metrics-server-59bf57b49c-nsmxm\" (UID: \"363b86d0-bdf4-44a2-96f5-80c829a4f375\") " pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm"
Apr 21 07:04:19.553175 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.553109 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/363b86d0-bdf4-44a2-96f5-80c829a4f375-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-59bf57b49c-nsmxm\" (UID: \"363b86d0-bdf4-44a2-96f5-80c829a4f375\") " pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm"
Apr 21 07:04:19.553518 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.553489 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/363b86d0-bdf4-44a2-96f5-80c829a4f375-metrics-server-audit-profiles\") pod \"metrics-server-59bf57b49c-nsmxm\" (UID: \"363b86d0-bdf4-44a2-96f5-80c829a4f375\") " pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm"
Apr 21 07:04:19.555690 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.555665 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/363b86d0-bdf4-44a2-96f5-80c829a4f375-secret-metrics-server-client-certs\") pod \"metrics-server-59bf57b49c-nsmxm\" (UID: \"363b86d0-bdf4-44a2-96f5-80c829a4f375\") " pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm"
Apr 21 07:04:19.555817 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.555793 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/363b86d0-bdf4-44a2-96f5-80c829a4f375-secret-metrics-server-tls\") pod \"metrics-server-59bf57b49c-nsmxm\" (UID: \"363b86d0-bdf4-44a2-96f5-80c829a4f375\") " pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm"
Apr 21 07:04:19.557375 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.557351 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/363b86d0-bdf4-44a2-96f5-80c829a4f375-client-ca-bundle\") pod \"metrics-server-59bf57b49c-nsmxm\" (UID: \"363b86d0-bdf4-44a2-96f5-80c829a4f375\") " pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm"
Apr 21 07:04:19.561091 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.561066 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkln7\" (UniqueName: \"kubernetes.io/projected/363b86d0-bdf4-44a2-96f5-80c829a4f375-kube-api-access-wkln7\") pod \"metrics-server-59bf57b49c-nsmxm\" (UID: \"363b86d0-bdf4-44a2-96f5-80c829a4f375\") " pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm"
Apr 21 07:04:19.676882 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:19.676776 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm"
Apr 21 07:04:20.105133 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:20.105097 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-zf5cc"]
Apr 21 07:04:20.120810 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:04:20.120775 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15a09b94_ea25_4ffe_8eaf_9ed2025b01a6.slice/crio-3eae84b0b359f1813748daf9165512453422853602203edab07efc595e3483ce WatchSource:0}: Error finding container 3eae84b0b359f1813748daf9165512453422853602203edab07efc595e3483ce: Status 404 returned error can't find the container with id 3eae84b0b359f1813748daf9165512453422853602203edab07efc595e3483ce
Apr 21 07:04:20.350767 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:20.350479 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-59bf57b49c-nsmxm"]
Apr 21 07:04:20.355390 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:04:20.355349 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod363b86d0_bdf4_44a2_96f5_80c829a4f375.slice/crio-d83659c7c5a36255a3bca830e21ce6dc4a33f3106a63624adc5186d1b86af527 WatchSource:0}: Error finding container d83659c7c5a36255a3bca830e21ce6dc4a33f3106a63624adc5186d1b86af527: Status 404 returned error can't find the container with id d83659c7c5a36255a3bca830e21ce6dc4a33f3106a63624adc5186d1b86af527
Apr 21 07:04:20.358108 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:20.358036 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-5dkkb"]
Apr 21 07:04:20.364835 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:04:20.364800 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod968c02a1_912f_4c82_8093_ef9cc71fdca3.slice/crio-82f560f41c46f0465a61caa9db29dab4a00e47c1dc065cc737bf4f8c1adb27cd WatchSource:0}: Error finding container 82f560f41c46f0465a61caa9db29dab4a00e47c1dc065cc737bf4f8c1adb27cd: Status 404 returned error can't find the container with id 82f560f41c46f0465a61caa9db29dab4a00e47c1dc065cc737bf4f8c1adb27cd
Apr 21 07:04:20.385269 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:20.384453 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk"]
Apr 21 07:04:20.389608 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:20.389539 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 21 07:04:20.389861 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:20.389721 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zxf4k" event={"ID":"93a318c1-6dc6-41e2-8ce6-10df3a949d4c","Type":"ContainerStarted","Data":"3d8aa7959b919e5151dce16b5505bc99af410f47a1d52278937a5e7c2228cc4c"}
Apr 21 07:04:20.391930 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:20.391866 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm" event={"ID":"363b86d0-bdf4-44a2-96f5-80c829a4f375","Type":"ContainerStarted","Data":"d83659c7c5a36255a3bca830e21ce6dc4a33f3106a63624adc5186d1b86af527"}
Apr 21 07:04:20.395253 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:20.395154 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zf5cc" event={"ID":"15a09b94-ea25-4ffe-8eaf-9ed2025b01a6","Type":"ContainerStarted","Data":"51f9a37a1a2c6e79d88e7f0e0caa0b8af80339c90b5947d5fd296771dbfbdcad"}
Apr 21 07:04:20.395253 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:20.395193 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zf5cc" event={"ID":"15a09b94-ea25-4ffe-8eaf-9ed2025b01a6","Type":"ContainerStarted","Data":"56817964a744bfa1ca39760c48f5c2606bcc1d6eba000ae6c4824993cccd981b"}
Apr 21 07:04:20.395253 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:20.395218 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zf5cc" event={"ID":"15a09b94-ea25-4ffe-8eaf-9ed2025b01a6","Type":"ContainerStarted","Data":"3eae84b0b359f1813748daf9165512453422853602203edab07efc595e3483ce"}
Apr 21 07:04:20.401181 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:20.400613 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-ncs5j" event={"ID":"6ce9ee09-1262-4278-8b0b-72dce2cc896a","Type":"ContainerStarted","Data":"72c7ac7fd4bbaf31f5e936670e174cbb724e1e6c285a2661a9690d58bfaebe08"}
Apr 21 07:04:20.401529 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:20.401479 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-ncs5j"
Apr 21 07:04:20.403356 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:20.403326 2581 patch_prober.go:28] interesting pod/downloads-6bcc868b7-ncs5j container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.134.0.9:8080/\": dial tcp 10.134.0.9:8080: connect: connection refused" start-of-body=
Apr 21 07:04:20.403608 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:20.403522 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-ncs5j" podUID="6ce9ee09-1262-4278-8b0b-72dce2cc896a" containerName="download-server" probeResult="failure" output="Get \"http://10.134.0.9:8080/\": dial tcp 10.134.0.9:8080: connect: connection refused"
Apr 21 07:04:20.403762 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:20.403740 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-5dkkb" event={"ID":"968c02a1-912f-4c82-8093-ef9cc71fdca3","Type":"ContainerStarted","Data":"82f560f41c46f0465a61caa9db29dab4a00e47c1dc065cc737bf4f8c1adb27cd"}
Apr 21 07:04:20.423371 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:20.423283 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-ncs5j" podStartSLOduration=1.529246689 podStartE2EDuration="18.423261917s" podCreationTimestamp="2026-04-21 07:04:02 +0000 UTC" firstStartedPulling="2026-04-21 07:04:03.137711607 +0000 UTC m=+59.757045674" lastFinishedPulling="2026-04-21 07:04:20.031726818 +0000 UTC m=+76.651060902" observedRunningTime="2026-04-21 07:04:20.420268321 +0000 UTC m=+77.039602414" watchObservedRunningTime="2026-04-21 07:04:20.423261917 +0000 UTC m=+77.042596003"
Apr 21 07:04:21.218896 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.218856 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 21 07:04:21.244934 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.244899 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 21 07:04:21.246681 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.245132 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.248598 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.248272 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-b8vqai6so0siv\""
Apr 21 07:04:21.249229 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.248799 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 21 07:04:21.249229 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.248866 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 21 07:04:21.249229 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.248799 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-mq6d9\""
Apr 21 07:04:21.249229 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.249077 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 21 07:04:21.249229 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.249212 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 21 07:04:21.250082 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.249705 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 21 07:04:21.250082 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.249814 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 21 07:04:21.250082 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.249915 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 21 07:04:21.250282 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.250208 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 21 07:04:21.250535 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.250425 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 21 07:04:21.250535 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.250444 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 21 07:04:21.255607 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.254804 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 21 07:04:21.257976 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.257709 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 21 07:04:21.373088 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.372201 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.373088 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.372260 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.373088 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.372292 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-config\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.373088 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.372348 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.373088 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.372377 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.373088 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.372437 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.373088 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.372468 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.373088 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.372498 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-config-out\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.373088 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.372533 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.373088 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.372558 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.373088 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.372610 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.373088 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.372647 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.373088 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.372683 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds6t2\" (UniqueName: \"kubernetes.io/projected/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-kube-api-access-ds6t2\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.373088 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.372712 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-web-config\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.373088 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.372752 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.373088 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.372778 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.374296 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.372834 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.374296 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.372870 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.414668 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.414625 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"796f028f-061e-4efc-93ed-97f5cd3a0802","Type":"ContainerStarted","Data":"8e6ab42699e2e1193d25d715c7653c35289fbaa00084f0e20516fb46fee1e384"}
Apr 21 07:04:21.418360 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.417543 2581 generic.go:358] "Generic (PLEG): container finished" podID="93a318c1-6dc6-41e2-8ce6-10df3a949d4c" containerID="cc96b875c34c691715b332148f98f7d82aab068b202f28ad31fa43533f69c7eb" exitCode=0
Apr 21 07:04:21.418360 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.417777 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zxf4k" event={"ID":"93a318c1-6dc6-41e2-8ce6-10df3a949d4c","Type":"ContainerDied","Data":"cc96b875c34c691715b332148f98f7d82aab068b202f28ad31fa43533f69c7eb"}
Apr 21 07:04:21.422618 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.422517 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" event={"ID":"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719","Type":"ContainerStarted","Data":"fc715c22d3f50eab56508eb6c6fbf085973848fca24b02577445976c44349f2c"}
Apr 21 07:04:21.434511 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.434469 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-ncs5j"
Apr 21 07:04:21.473736 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.473652 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.473736 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.473711 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.473962 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.473750 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.473962 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.473778 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-config\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.473962 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.473815 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.473962 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.473841 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.473962 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.473953 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.474196 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.473979 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.474196 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.474024 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-config-out\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.474196 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.474053 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.474196 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.474078 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.474196 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.474114 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.474196 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.474153 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.474631 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.474197 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ds6t2\" (UniqueName: \"kubernetes.io/projected/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-kube-api-access-ds6t2\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.474631 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.474227 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-web-config\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.474631 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.474271 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.474631 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.474296 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.474631 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.474368 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.474631 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.474485 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.484525 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.475654 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.484525 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.475915 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.484525 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.477421 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.484525 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.479857 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.484525 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.481926 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.484525 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.483197 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.484525 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.484375 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.485261 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.484864 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:04:21.489455 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.487885 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-web-config\") pod \"prometheus-k8s-0\"
(UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:04:21.489455 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.489010 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:04:21.489925 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.489879 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-config\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:04:21.491163 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.490799 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:04:21.496542 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.496482 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds6t2\" (UniqueName: \"kubernetes.io/projected/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-kube-api-access-ds6t2\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:04:21.503536 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.503465 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:04:21.504072 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.503746 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:04:21.506585 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.506515 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:04:21.508808 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.508747 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/148a6d34-cf46-4ad1-b017-ed5bba1d35a0-config-out\") pod \"prometheus-k8s-0\" (UID: \"148a6d34-cf46-4ad1-b017-ed5bba1d35a0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:04:21.589417 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:21.589048 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:04:24.316143 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:24.315418 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5977bd9744-9cf64" Apr 21 07:04:28.632826 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:28.627094 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 07:04:29.452460 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:29.452414 2581 generic.go:358] "Generic (PLEG): container finished" podID="796f028f-061e-4efc-93ed-97f5cd3a0802" containerID="d8ef23dd0e7c2195afbd1c158b809fe1f25973d4e4662314975a3b8d1e29a30b" exitCode=0 Apr 21 07:04:29.452708 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:29.452506 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"796f028f-061e-4efc-93ed-97f5cd3a0802","Type":"ContainerDied","Data":"d8ef23dd0e7c2195afbd1c158b809fe1f25973d4e4662314975a3b8d1e29a30b"} Apr 21 07:04:29.459314 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:29.459254 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zf5cc" event={"ID":"15a09b94-ea25-4ffe-8eaf-9ed2025b01a6","Type":"ContainerStarted","Data":"8831ef31bb53f31a9572466d1094bc66f438046c7fb44a6181a3d39e07b03197"} Apr 21 07:04:29.462995 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:29.462954 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zxf4k" event={"ID":"93a318c1-6dc6-41e2-8ce6-10df3a949d4c","Type":"ContainerStarted","Data":"df874d0a9957d9404b46b4897d824601b31e94f9c34d6944afafe9c2982f48ba"} Apr 21 07:04:29.463131 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:29.463002 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zxf4k" 
event={"ID":"93a318c1-6dc6-41e2-8ce6-10df3a949d4c","Type":"ContainerStarted","Data":"cee9582c01ca75061109f7fdf9999b1e363b808dd43166da4fbd38ba83cbca16"} Apr 21 07:04:29.465259 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:29.465225 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm" event={"ID":"363b86d0-bdf4-44a2-96f5-80c829a4f375","Type":"ContainerStarted","Data":"9f91863e417fdb272944097b9e644feeb9d927c643ad5b97e4a06a68474b067f"} Apr 21 07:04:29.467161 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:29.467134 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-765c64c998-s4lnn" event={"ID":"2372ef7d-8c3c-4eba-8da5-912ad24032da","Type":"ContainerStarted","Data":"caffb7a5980ed7e62b904eb3645c748897ee200ddd1f1f17b4c78190cda3e575"} Apr 21 07:04:29.468875 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:29.467540 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-765c64c998-s4lnn" Apr 21 07:04:29.470025 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:29.469996 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-5dkkb" event={"ID":"968c02a1-912f-4c82-8093-ef9cc71fdca3","Type":"ContainerStarted","Data":"300ff5102d7f80fe6ed0619ded7b05e0c62973c6460b8f5d3f1580ae647058c2"} Apr 21 07:04:29.470277 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:29.470244 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-5dkkb" event={"ID":"968c02a1-912f-4c82-8093-ef9cc71fdca3","Type":"ContainerStarted","Data":"27c2f3adf0de29eabd6fd4bbd0179813898318286f26ef69ca8fe09ec27fdd2b"} Apr 21 07:04:29.470435 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:29.470412 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-69db897b98-5dkkb" event={"ID":"968c02a1-912f-4c82-8093-ef9cc71fdca3","Type":"ContainerStarted","Data":"3d290cf74cf41b86f9590348d823c8607d9fea93869ce85c6399bcf3ea17ef38"} Apr 21 07:04:29.470553 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:29.470491 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-765c64c998-s4lnn" Apr 21 07:04:29.472537 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:29.472513 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" event={"ID":"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719","Type":"ContainerStarted","Data":"e266466ac3b5a0eeaf72a07334567cdf2a807afc755a37b205504e7382149bdb"} Apr 21 07:04:29.472682 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:29.472545 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" event={"ID":"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719","Type":"ContainerStarted","Data":"883e71144f6e6cf01472ba4ce31b7c0070cad66d3547583ec6cdb930f34f0734"} Apr 21 07:04:29.472682 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:29.472578 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" event={"ID":"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719","Type":"ContainerStarted","Data":"7164335fc292fcc0874cf45488f84b644c41fe4828074363d83ea5c139311b8d"} Apr 21 07:04:29.474267 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:29.474240 2581 generic.go:358] "Generic (PLEG): container finished" podID="148a6d34-cf46-4ad1-b017-ed5bba1d35a0" containerID="7635c754b6c6c78d95753c04d6e0477570af6c60ff0afcb56be9753d38349422" exitCode=0 Apr 21 07:04:29.474387 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:29.474281 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"148a6d34-cf46-4ad1-b017-ed5bba1d35a0","Type":"ContainerDied","Data":"7635c754b6c6c78d95753c04d6e0477570af6c60ff0afcb56be9753d38349422"} Apr 21 07:04:29.474387 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:29.474307 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"148a6d34-cf46-4ad1-b017-ed5bba1d35a0","Type":"ContainerStarted","Data":"240a8d2c1c13370bdc9b3382d02db1a958ffe9227fa14ea561f2531540940617"} Apr 21 07:04:29.520421 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:29.520352 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-5dkkb" podStartSLOduration=7.436753813 podStartE2EDuration="15.5203297s" podCreationTimestamp="2026-04-21 07:04:14 +0000 UTC" firstStartedPulling="2026-04-21 07:04:20.3680117 +0000 UTC m=+76.987345774" lastFinishedPulling="2026-04-21 07:04:28.451587589 +0000 UTC m=+85.070921661" observedRunningTime="2026-04-21 07:04:29.518962773 +0000 UTC m=+86.138296889" watchObservedRunningTime="2026-04-21 07:04:29.5203297 +0000 UTC m=+86.139663792" Apr 21 07:04:29.521501 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:29.521455 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zf5cc" podStartSLOduration=7.37994863 podStartE2EDuration="15.521442758s" podCreationTimestamp="2026-04-21 07:04:14 +0000 UTC" firstStartedPulling="2026-04-21 07:04:20.308365895 +0000 UTC m=+76.927699972" lastFinishedPulling="2026-04-21 07:04:28.449860015 +0000 UTC m=+85.069194100" observedRunningTime="2026-04-21 07:04:29.501848047 +0000 UTC m=+86.121182152" watchObservedRunningTime="2026-04-21 07:04:29.521442758 +0000 UTC m=+86.140776848" Apr 21 07:04:29.536421 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:29.536362 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm" podStartSLOduration=2.444702779 podStartE2EDuration="10.53634588s" podCreationTimestamp="2026-04-21 07:04:19 +0000 UTC" firstStartedPulling="2026-04-21 07:04:20.359797734 +0000 UTC m=+76.979131801" lastFinishedPulling="2026-04-21 07:04:28.451440821 +0000 UTC m=+85.070774902" observedRunningTime="2026-04-21 07:04:29.535674944 +0000 UTC m=+86.155009050" watchObservedRunningTime="2026-04-21 07:04:29.53634588 +0000 UTC m=+86.155679970" Apr 21 07:04:29.589468 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:29.589405 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zxf4k" podStartSLOduration=14.555713973 podStartE2EDuration="15.589384335s" podCreationTimestamp="2026-04-21 07:04:14 +0000 UTC" firstStartedPulling="2026-04-21 07:04:19.94737053 +0000 UTC m=+76.566704598" lastFinishedPulling="2026-04-21 07:04:20.98104089 +0000 UTC m=+77.600374960" observedRunningTime="2026-04-21 07:04:29.587111953 +0000 UTC m=+86.206446047" watchObservedRunningTime="2026-04-21 07:04:29.589384335 +0000 UTC m=+86.208718427" Apr 21 07:04:29.603902 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:29.603836 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-765c64c998-s4lnn" podStartSLOduration=2.481416149 podStartE2EDuration="27.60381262s" podCreationTimestamp="2026-04-21 07:04:02 +0000 UTC" firstStartedPulling="2026-04-21 07:04:03.349996159 +0000 UTC m=+59.969330242" lastFinishedPulling="2026-04-21 07:04:28.472392626 +0000 UTC m=+85.091726713" observedRunningTime="2026-04-21 07:04:29.602834146 +0000 UTC m=+86.222168257" watchObservedRunningTime="2026-04-21 07:04:29.60381262 +0000 UTC m=+86.223146713" Apr 21 07:04:31.488910 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:31.488867 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" event={"ID":"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719","Type":"ContainerStarted","Data":"3d55e328181d26834533a050fd880c1c5a2f2ccc9d33ca63bcedef44fe32e2c5"} Apr 21 07:04:31.488910 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:31.488913 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" event={"ID":"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719","Type":"ContainerStarted","Data":"ecc2fe6d797d6d256a10c419763e2a654cce3cadc4b12e4813b78c2422450c2d"} Apr 21 07:04:31.489369 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:31.488924 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" event={"ID":"ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719","Type":"ContainerStarted","Data":"921d8aeea6fa0aa2ad7ae590e3f2a0b134c7df9c277906ba8a9f287d81d80ce0"} Apr 21 07:04:31.518464 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:31.518403 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" podStartSLOduration=4.4179865320000005 podStartE2EDuration="14.518386114s" podCreationTimestamp="2026-04-21 07:04:17 +0000 UTC" firstStartedPulling="2026-04-21 07:04:20.384110765 +0000 UTC m=+77.003444846" lastFinishedPulling="2026-04-21 07:04:30.484510348 +0000 UTC m=+87.103844428" observedRunningTime="2026-04-21 07:04:31.516475179 +0000 UTC m=+88.135809299" watchObservedRunningTime="2026-04-21 07:04:31.518386114 +0000 UTC m=+88.137720203" Apr 21 07:04:32.063643 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:32.063599 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d546df6b8-p5485"] Apr 21 07:04:32.492680 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:32.492644 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:33.497861 
ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:33.497772 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"148a6d34-cf46-4ad1-b017-ed5bba1d35a0","Type":"ContainerStarted","Data":"054e4b3ecaeb7ce0001891653d0ea2788d09ed7ed0e7e5c91f222b7573b2c17e"} Apr 21 07:04:33.497861 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:33.497819 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"148a6d34-cf46-4ad1-b017-ed5bba1d35a0","Type":"ContainerStarted","Data":"ed809305e9b8385676b8d5beeb0db7ce1a86648c02b7877fd5c5c8b3eda05745"} Apr 21 07:04:33.497861 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:33.497829 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"148a6d34-cf46-4ad1-b017-ed5bba1d35a0","Type":"ContainerStarted","Data":"44d720440470c927e2ca936fe1e260ea6212b724a455be59d5c31a740e875635"} Apr 21 07:04:33.500292 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:33.500260 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"796f028f-061e-4efc-93ed-97f5cd3a0802","Type":"ContainerStarted","Data":"fa5e84772ee76eb10145df3e29fa106bc3e65ae14bf84d15c773dc9337a9636a"} Apr 21 07:04:33.500426 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:33.500302 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"796f028f-061e-4efc-93ed-97f5cd3a0802","Type":"ContainerStarted","Data":"4b5e0ceacdd356504be05a5cdda11b163f04b3254271e449e80dd4ad265dc76f"} Apr 21 07:04:33.500426 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:33.500316 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"796f028f-061e-4efc-93ed-97f5cd3a0802","Type":"ContainerStarted","Data":"13c3c19c8db4adc078398541ea6663bbb981ce55fb7418833fe585fb3a0fe4e8"} Apr 21 
07:04:33.511760 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:33.511719 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-8dcbc7c47-wc8fk" Apr 21 07:04:34.506135 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:34.506095 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"148a6d34-cf46-4ad1-b017-ed5bba1d35a0","Type":"ContainerStarted","Data":"874d5bef9215da99f3c12e778f8e6bdf4a46da9e9c0deb8e2d9f27393a571f37"} Apr 21 07:04:34.506135 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:34.506141 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"148a6d34-cf46-4ad1-b017-ed5bba1d35a0","Type":"ContainerStarted","Data":"f665a49f87976827e5755e3eec0ecab2d976337c8781957e1667b2c5eff1e53a"} Apr 21 07:04:34.506767 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:34.506158 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"148a6d34-cf46-4ad1-b017-ed5bba1d35a0","Type":"ContainerStarted","Data":"378f87eec98edc9e295ebc78325af33c86b50ace499024415d8b7f6c3ab5a72d"} Apr 21 07:04:34.509384 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:34.509352 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"796f028f-061e-4efc-93ed-97f5cd3a0802","Type":"ContainerStarted","Data":"37b3f196da81e88c7520030fb403dfc712c3129b924ad46d71d57e78ae6650b8"} Apr 21 07:04:34.509527 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:34.509393 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"796f028f-061e-4efc-93ed-97f5cd3a0802","Type":"ContainerStarted","Data":"14f2949b542686d17682447764f5fc9b25a54c26add9505b0da7640a4cd76fd7"} Apr 21 07:04:34.509527 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:34.509407 2581 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"796f028f-061e-4efc-93ed-97f5cd3a0802","Type":"ContainerStarted","Data":"80a66caa1cd5483505ca1d07f1f829d235ed026fbe5487b52afa944618c5a6b9"} Apr 21 07:04:34.552034 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:34.551957 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=10.017095137 podStartE2EDuration="13.551938312s" podCreationTimestamp="2026-04-21 07:04:21 +0000 UTC" firstStartedPulling="2026-04-21 07:04:29.475777273 +0000 UTC m=+86.095111341" lastFinishedPulling="2026-04-21 07:04:33.010620443 +0000 UTC m=+89.629954516" observedRunningTime="2026-04-21 07:04:34.549594152 +0000 UTC m=+91.168928242" watchObservedRunningTime="2026-04-21 07:04:34.551938312 +0000 UTC m=+91.171272403" Apr 21 07:04:34.577872 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:34.577813 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=6.961696231 podStartE2EDuration="19.577795455s" podCreationTimestamp="2026-04-21 07:04:15 +0000 UTC" firstStartedPulling="2026-04-21 07:04:20.388265533 +0000 UTC m=+77.007599615" lastFinishedPulling="2026-04-21 07:04:33.004364768 +0000 UTC m=+89.623698839" observedRunningTime="2026-04-21 07:04:34.57564928 +0000 UTC m=+91.194983371" watchObservedRunningTime="2026-04-21 07:04:34.577795455 +0000 UTC m=+91.197129579" Apr 21 07:04:36.591036 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:36.591000 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:04:39.677239 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:39.677200 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm" Apr 21 07:04:39.677673 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:39.677295 2581 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm" Apr 21 07:04:41.437205 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:41.437136 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6887d699d4-whtf5" podUID="d3cc54b9-cc90-420d-adae-f60121d771d4" containerName="console" containerID="cri-o://93e33d8f1c87232a8aab8fe07c946187fe268b3a7f3d15f2169a173d2ba71963" gracePeriod=15 Apr 21 07:04:41.726609 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:41.726584 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6887d699d4-whtf5_d3cc54b9-cc90-420d-adae-f60121d771d4/console/0.log" Apr 21 07:04:41.726751 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:41.726661 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6887d699d4-whtf5" Apr 21 07:04:41.798011 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:41.797971 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6sqv\" (UniqueName: \"kubernetes.io/projected/d3cc54b9-cc90-420d-adae-f60121d771d4-kube-api-access-b6sqv\") pod \"d3cc54b9-cc90-420d-adae-f60121d771d4\" (UID: \"d3cc54b9-cc90-420d-adae-f60121d771d4\") " Apr 21 07:04:41.798011 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:41.798015 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3cc54b9-cc90-420d-adae-f60121d771d4-console-serving-cert\") pod \"d3cc54b9-cc90-420d-adae-f60121d771d4\" (UID: \"d3cc54b9-cc90-420d-adae-f60121d771d4\") " Apr 21 07:04:41.798259 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:41.798057 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/d3cc54b9-cc90-420d-adae-f60121d771d4-oauth-serving-cert\") pod \"d3cc54b9-cc90-420d-adae-f60121d771d4\" (UID: \"d3cc54b9-cc90-420d-adae-f60121d771d4\") " Apr 21 07:04:41.798259 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:41.798086 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d3cc54b9-cc90-420d-adae-f60121d771d4-console-config\") pod \"d3cc54b9-cc90-420d-adae-f60121d771d4\" (UID: \"d3cc54b9-cc90-420d-adae-f60121d771d4\") " Apr 21 07:04:41.798259 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:41.798111 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d3cc54b9-cc90-420d-adae-f60121d771d4-service-ca\") pod \"d3cc54b9-cc90-420d-adae-f60121d771d4\" (UID: \"d3cc54b9-cc90-420d-adae-f60121d771d4\") " Apr 21 07:04:41.798259 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:41.798137 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d3cc54b9-cc90-420d-adae-f60121d771d4-console-oauth-config\") pod \"d3cc54b9-cc90-420d-adae-f60121d771d4\" (UID: \"d3cc54b9-cc90-420d-adae-f60121d771d4\") " Apr 21 07:04:41.798636 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:41.798552 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3cc54b9-cc90-420d-adae-f60121d771d4-console-config" (OuterVolumeSpecName: "console-config") pod "d3cc54b9-cc90-420d-adae-f60121d771d4" (UID: "d3cc54b9-cc90-420d-adae-f60121d771d4"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 07:04:41.798636 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:41.798590 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3cc54b9-cc90-420d-adae-f60121d771d4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d3cc54b9-cc90-420d-adae-f60121d771d4" (UID: "d3cc54b9-cc90-420d-adae-f60121d771d4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 07:04:41.798636 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:41.798601 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3cc54b9-cc90-420d-adae-f60121d771d4-service-ca" (OuterVolumeSpecName: "service-ca") pod "d3cc54b9-cc90-420d-adae-f60121d771d4" (UID: "d3cc54b9-cc90-420d-adae-f60121d771d4"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 07:04:41.800646 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:41.800621 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3cc54b9-cc90-420d-adae-f60121d771d4-kube-api-access-b6sqv" (OuterVolumeSpecName: "kube-api-access-b6sqv") pod "d3cc54b9-cc90-420d-adae-f60121d771d4" (UID: "d3cc54b9-cc90-420d-adae-f60121d771d4"). InnerVolumeSpecName "kube-api-access-b6sqv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 07:04:41.800717 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:41.800683 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3cc54b9-cc90-420d-adae-f60121d771d4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d3cc54b9-cc90-420d-adae-f60121d771d4" (UID: "d3cc54b9-cc90-420d-adae-f60121d771d4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 07:04:41.800757 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:41.800736 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3cc54b9-cc90-420d-adae-f60121d771d4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d3cc54b9-cc90-420d-adae-f60121d771d4" (UID: "d3cc54b9-cc90-420d-adae-f60121d771d4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 07:04:41.899575 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:41.899537 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b6sqv\" (UniqueName: \"kubernetes.io/projected/d3cc54b9-cc90-420d-adae-f60121d771d4-kube-api-access-b6sqv\") on node \"ip-10-0-137-163.ec2.internal\" DevicePath \"\""
Apr 21 07:04:41.899725 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:41.899598 2581 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d3cc54b9-cc90-420d-adae-f60121d771d4-console-serving-cert\") on node \"ip-10-0-137-163.ec2.internal\" DevicePath \"\""
Apr 21 07:04:41.899725 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:41.899610 2581 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d3cc54b9-cc90-420d-adae-f60121d771d4-oauth-serving-cert\") on node \"ip-10-0-137-163.ec2.internal\" DevicePath \"\""
Apr 21 07:04:41.899725 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:41.899621 2581 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d3cc54b9-cc90-420d-adae-f60121d771d4-console-config\") on node \"ip-10-0-137-163.ec2.internal\" DevicePath \"\""
Apr 21 07:04:41.899725 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:41.899630 2581 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d3cc54b9-cc90-420d-adae-f60121d771d4-service-ca\") on node \"ip-10-0-137-163.ec2.internal\" DevicePath \"\""
Apr 21 07:04:41.899725 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:41.899639 2581 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d3cc54b9-cc90-420d-adae-f60121d771d4-console-oauth-config\") on node \"ip-10-0-137-163.ec2.internal\" DevicePath \"\""
Apr 21 07:04:42.537955 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:42.537920 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6887d699d4-whtf5_d3cc54b9-cc90-420d-adae-f60121d771d4/console/0.log"
Apr 21 07:04:42.538404 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:42.537966 2581 generic.go:358] "Generic (PLEG): container finished" podID="d3cc54b9-cc90-420d-adae-f60121d771d4" containerID="93e33d8f1c87232a8aab8fe07c946187fe268b3a7f3d15f2169a173d2ba71963" exitCode=2
Apr 21 07:04:42.538404 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:42.538043 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6887d699d4-whtf5"
Apr 21 07:04:42.538404 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:42.538049 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6887d699d4-whtf5" event={"ID":"d3cc54b9-cc90-420d-adae-f60121d771d4","Type":"ContainerDied","Data":"93e33d8f1c87232a8aab8fe07c946187fe268b3a7f3d15f2169a173d2ba71963"}
Apr 21 07:04:42.538404 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:42.538095 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6887d699d4-whtf5" event={"ID":"d3cc54b9-cc90-420d-adae-f60121d771d4","Type":"ContainerDied","Data":"bbe36fa6f3708e101b3e5fd3961910cfd9d218ea0ca95fef4d3226c77b6c4dd8"}
Apr 21 07:04:42.538404 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:42.538112 2581 scope.go:117] "RemoveContainer" containerID="93e33d8f1c87232a8aab8fe07c946187fe268b3a7f3d15f2169a173d2ba71963"
Apr 21 07:04:42.548379 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:42.548359 2581 scope.go:117] "RemoveContainer" containerID="93e33d8f1c87232a8aab8fe07c946187fe268b3a7f3d15f2169a173d2ba71963"
Apr 21 07:04:42.548710 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:04:42.548688 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93e33d8f1c87232a8aab8fe07c946187fe268b3a7f3d15f2169a173d2ba71963\": container with ID starting with 93e33d8f1c87232a8aab8fe07c946187fe268b3a7f3d15f2169a173d2ba71963 not found: ID does not exist" containerID="93e33d8f1c87232a8aab8fe07c946187fe268b3a7f3d15f2169a173d2ba71963"
Apr 21 07:04:42.548764 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:42.548728 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93e33d8f1c87232a8aab8fe07c946187fe268b3a7f3d15f2169a173d2ba71963"} err="failed to get container status \"93e33d8f1c87232a8aab8fe07c946187fe268b3a7f3d15f2169a173d2ba71963\": rpc error: code = NotFound desc = could not find container \"93e33d8f1c87232a8aab8fe07c946187fe268b3a7f3d15f2169a173d2ba71963\": container with ID starting with 93e33d8f1c87232a8aab8fe07c946187fe268b3a7f3d15f2169a173d2ba71963 not found: ID does not exist"
Apr 21 07:04:42.556653 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:42.556626 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6887d699d4-whtf5"]
Apr 21 07:04:42.560303 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:42.560277 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6887d699d4-whtf5"]
Apr 21 07:04:43.982687 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:43.982651 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3cc54b9-cc90-420d-adae-f60121d771d4" path="/var/lib/kubelet/pods/d3cc54b9-cc90-420d-adae-f60121d771d4/volumes"
Apr 21 07:04:45.359236 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:45.359202 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-4qpb2"
Apr 21 07:04:57.087800 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.087732 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5d546df6b8-p5485" podUID="8beae26a-560a-4d38-bc41-007709b4a3de" containerName="console" containerID="cri-o://7bd668ea4866db49f15a3dd01e1910ce7182813ab0d0a5e5bbe91233b03c5cbc" gracePeriod=15
Apr 21 07:04:57.372446 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.372421 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d546df6b8-p5485_8beae26a-560a-4d38-bc41-007709b4a3de/console/0.log"
Apr 21 07:04:57.372594 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.372490 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d546df6b8-p5485"
Apr 21 07:04:57.442233 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.442197 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8beae26a-560a-4d38-bc41-007709b4a3de-trusted-ca-bundle\") pod \"8beae26a-560a-4d38-bc41-007709b4a3de\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") "
Apr 21 07:04:57.442233 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.442239 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8beae26a-560a-4d38-bc41-007709b4a3de-service-ca\") pod \"8beae26a-560a-4d38-bc41-007709b4a3de\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") "
Apr 21 07:04:57.442460 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.442264 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8beae26a-560a-4d38-bc41-007709b4a3de-console-serving-cert\") pod \"8beae26a-560a-4d38-bc41-007709b4a3de\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") "
Apr 21 07:04:57.442460 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.442299 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8beae26a-560a-4d38-bc41-007709b4a3de-console-config\") pod \"8beae26a-560a-4d38-bc41-007709b4a3de\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") "
Apr 21 07:04:57.442460 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.442346 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8beae26a-560a-4d38-bc41-007709b4a3de-console-oauth-config\") pod \"8beae26a-560a-4d38-bc41-007709b4a3de\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") "
Apr 21 07:04:57.442460 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.442385 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8beae26a-560a-4d38-bc41-007709b4a3de-oauth-serving-cert\") pod \"8beae26a-560a-4d38-bc41-007709b4a3de\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") "
Apr 21 07:04:57.442460 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.442421 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxsrs\" (UniqueName: \"kubernetes.io/projected/8beae26a-560a-4d38-bc41-007709b4a3de-kube-api-access-vxsrs\") pod \"8beae26a-560a-4d38-bc41-007709b4a3de\" (UID: \"8beae26a-560a-4d38-bc41-007709b4a3de\") "
Apr 21 07:04:57.442723 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.442662 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8beae26a-560a-4d38-bc41-007709b4a3de-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8beae26a-560a-4d38-bc41-007709b4a3de" (UID: "8beae26a-560a-4d38-bc41-007709b4a3de"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 07:04:57.442781 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.442726 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8beae26a-560a-4d38-bc41-007709b4a3de-service-ca" (OuterVolumeSpecName: "service-ca") pod "8beae26a-560a-4d38-bc41-007709b4a3de" (UID: "8beae26a-560a-4d38-bc41-007709b4a3de"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 07:04:57.442781 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.442774 2581 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8beae26a-560a-4d38-bc41-007709b4a3de-trusted-ca-bundle\") on node \"ip-10-0-137-163.ec2.internal\" DevicePath \"\""
Apr 21 07:04:57.442895 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.442859 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8beae26a-560a-4d38-bc41-007709b4a3de-console-config" (OuterVolumeSpecName: "console-config") pod "8beae26a-560a-4d38-bc41-007709b4a3de" (UID: "8beae26a-560a-4d38-bc41-007709b4a3de"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 07:04:57.443009 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.442980 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8beae26a-560a-4d38-bc41-007709b4a3de-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8beae26a-560a-4d38-bc41-007709b4a3de" (UID: "8beae26a-560a-4d38-bc41-007709b4a3de"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 07:04:57.444896 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.444853 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8beae26a-560a-4d38-bc41-007709b4a3de-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8beae26a-560a-4d38-bc41-007709b4a3de" (UID: "8beae26a-560a-4d38-bc41-007709b4a3de"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 07:04:57.445021 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.444896 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8beae26a-560a-4d38-bc41-007709b4a3de-kube-api-access-vxsrs" (OuterVolumeSpecName: "kube-api-access-vxsrs") pod "8beae26a-560a-4d38-bc41-007709b4a3de" (UID: "8beae26a-560a-4d38-bc41-007709b4a3de"). InnerVolumeSpecName "kube-api-access-vxsrs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 07:04:57.445021 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.444952 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8beae26a-560a-4d38-bc41-007709b4a3de-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8beae26a-560a-4d38-bc41-007709b4a3de" (UID: "8beae26a-560a-4d38-bc41-007709b4a3de"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 07:04:57.543618 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.543547 2581 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8beae26a-560a-4d38-bc41-007709b4a3de-console-config\") on node \"ip-10-0-137-163.ec2.internal\" DevicePath \"\""
Apr 21 07:04:57.543618 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.543609 2581 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8beae26a-560a-4d38-bc41-007709b4a3de-console-oauth-config\") on node \"ip-10-0-137-163.ec2.internal\" DevicePath \"\""
Apr 21 07:04:57.543618 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.543623 2581 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8beae26a-560a-4d38-bc41-007709b4a3de-oauth-serving-cert\") on node \"ip-10-0-137-163.ec2.internal\" DevicePath \"\""
Apr 21 07:04:57.543857 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.543637 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vxsrs\" (UniqueName: \"kubernetes.io/projected/8beae26a-560a-4d38-bc41-007709b4a3de-kube-api-access-vxsrs\") on node \"ip-10-0-137-163.ec2.internal\" DevicePath \"\""
Apr 21 07:04:57.543857 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.543650 2581 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8beae26a-560a-4d38-bc41-007709b4a3de-service-ca\") on node \"ip-10-0-137-163.ec2.internal\" DevicePath \"\""
Apr 21 07:04:57.543857 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.543660 2581 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8beae26a-560a-4d38-bc41-007709b4a3de-console-serving-cert\") on node \"ip-10-0-137-163.ec2.internal\" DevicePath \"\""
Apr 21 07:04:57.585456 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.585424 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d546df6b8-p5485_8beae26a-560a-4d38-bc41-007709b4a3de/console/0.log"
Apr 21 07:04:57.585662 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.585464 2581 generic.go:358] "Generic (PLEG): container finished" podID="8beae26a-560a-4d38-bc41-007709b4a3de" containerID="7bd668ea4866db49f15a3dd01e1910ce7182813ab0d0a5e5bbe91233b03c5cbc" exitCode=2
Apr 21 07:04:57.585662 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.585500 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d546df6b8-p5485" event={"ID":"8beae26a-560a-4d38-bc41-007709b4a3de","Type":"ContainerDied","Data":"7bd668ea4866db49f15a3dd01e1910ce7182813ab0d0a5e5bbe91233b03c5cbc"}
Apr 21 07:04:57.585662 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.585542 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d546df6b8-p5485"
Apr 21 07:04:57.585662 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.585550 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d546df6b8-p5485" event={"ID":"8beae26a-560a-4d38-bc41-007709b4a3de","Type":"ContainerDied","Data":"b2a6024a7e20ac5b3069d2a741ea429c99dfccb0aa475f50aad44eedee6b3924"}
Apr 21 07:04:57.585662 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.585590 2581 scope.go:117] "RemoveContainer" containerID="7bd668ea4866db49f15a3dd01e1910ce7182813ab0d0a5e5bbe91233b03c5cbc"
Apr 21 07:04:57.594619 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.594599 2581 scope.go:117] "RemoveContainer" containerID="7bd668ea4866db49f15a3dd01e1910ce7182813ab0d0a5e5bbe91233b03c5cbc"
Apr 21 07:04:57.594942 ip-10-0-137-163 kubenswrapper[2581]: E0421 07:04:57.594926 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bd668ea4866db49f15a3dd01e1910ce7182813ab0d0a5e5bbe91233b03c5cbc\": container with ID starting with 7bd668ea4866db49f15a3dd01e1910ce7182813ab0d0a5e5bbe91233b03c5cbc not found: ID does not exist" containerID="7bd668ea4866db49f15a3dd01e1910ce7182813ab0d0a5e5bbe91233b03c5cbc"
Apr 21 07:04:57.595001 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.594953 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bd668ea4866db49f15a3dd01e1910ce7182813ab0d0a5e5bbe91233b03c5cbc"} err="failed to get container status \"7bd668ea4866db49f15a3dd01e1910ce7182813ab0d0a5e5bbe91233b03c5cbc\": rpc error: code = NotFound desc = could not find container \"7bd668ea4866db49f15a3dd01e1910ce7182813ab0d0a5e5bbe91233b03c5cbc\": container with ID starting with 7bd668ea4866db49f15a3dd01e1910ce7182813ab0d0a5e5bbe91233b03c5cbc not found: ID does not exist"
Apr 21 07:04:57.607963 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.607891 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d546df6b8-p5485"]
Apr 21 07:04:57.616695 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.616669 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5d546df6b8-p5485"]
Apr 21 07:04:57.979541 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:57.979455 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8beae26a-560a-4d38-bc41-007709b4a3de" path="/var/lib/kubelet/pods/8beae26a-560a-4d38-bc41-007709b4a3de/volumes"
Apr 21 07:04:59.683298 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:59.683263 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm"
Apr 21 07:04:59.688032 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:04:59.688010 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-59bf57b49c-nsmxm"
Apr 21 07:05:21.590652 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:21.590602 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:05:21.611149 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:21.611116 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:05:21.680457 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:21.680430 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:05:39.364875 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.364836 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-7c546994b4-zr458"]
Apr 21 07:05:39.365353 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.365168 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3cc54b9-cc90-420d-adae-f60121d771d4" containerName="console"
Apr 21 07:05:39.365353 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.365178 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3cc54b9-cc90-420d-adae-f60121d771d4" containerName="console"
Apr 21 07:05:39.365353 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.365193 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8beae26a-560a-4d38-bc41-007709b4a3de" containerName="console"
Apr 21 07:05:39.365353 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.365199 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="8beae26a-560a-4d38-bc41-007709b4a3de" containerName="console"
Apr 21 07:05:39.365353 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.365258 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="8beae26a-560a-4d38-bc41-007709b4a3de" containerName="console"
Apr 21 07:05:39.365353 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.365271 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="d3cc54b9-cc90-420d-adae-f60121d771d4" containerName="console"
Apr 21 07:05:39.369860 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.369841 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.373152 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.373122 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-g5s2z\""
Apr 21 07:05:39.373286 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.373163 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 21 07:05:39.373336 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.373320 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 21 07:05:39.373445 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.373427 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 21 07:05:39.373671 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.373652 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 21 07:05:39.373811 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.373789 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 21 07:05:39.375044 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.375011 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7c546994b4-zr458"]
Apr 21 07:05:39.378601 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.378559 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 21 07:05:39.419168 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.419125 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11748718-2f6b-488b-ac46-e9be66ad5213-metrics-client-ca\") pod \"telemeter-client-7c546994b4-zr458\" (UID: \"11748718-2f6b-488b-ac46-e9be66ad5213\") " pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.419363 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.419178 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q62qc\" (UniqueName: \"kubernetes.io/projected/11748718-2f6b-488b-ac46-e9be66ad5213-kube-api-access-q62qc\") pod \"telemeter-client-7c546994b4-zr458\" (UID: \"11748718-2f6b-488b-ac46-e9be66ad5213\") " pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.419363 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.419233 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/11748718-2f6b-488b-ac46-e9be66ad5213-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7c546994b4-zr458\" (UID: \"11748718-2f6b-488b-ac46-e9be66ad5213\") " pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.419363 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.419258 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/11748718-2f6b-488b-ac46-e9be66ad5213-telemeter-client-tls\") pod \"telemeter-client-7c546994b4-zr458\" (UID: \"11748718-2f6b-488b-ac46-e9be66ad5213\") " pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.419465 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.419359 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11748718-2f6b-488b-ac46-e9be66ad5213-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7c546994b4-zr458\" (UID: \"11748718-2f6b-488b-ac46-e9be66ad5213\") " pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.419465 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.419396 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/11748718-2f6b-488b-ac46-e9be66ad5213-secret-telemeter-client\") pod \"telemeter-client-7c546994b4-zr458\" (UID: \"11748718-2f6b-488b-ac46-e9be66ad5213\") " pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.419465 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.419418 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11748718-2f6b-488b-ac46-e9be66ad5213-serving-certs-ca-bundle\") pod \"telemeter-client-7c546994b4-zr458\" (UID: \"11748718-2f6b-488b-ac46-e9be66ad5213\") " pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.419465 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.419443 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/11748718-2f6b-488b-ac46-e9be66ad5213-federate-client-tls\") pod \"telemeter-client-7c546994b4-zr458\" (UID: \"11748718-2f6b-488b-ac46-e9be66ad5213\") " pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.520426 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.520370 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/11748718-2f6b-488b-ac46-e9be66ad5213-telemeter-client-tls\") pod \"telemeter-client-7c546994b4-zr458\" (UID: \"11748718-2f6b-488b-ac46-e9be66ad5213\") " pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.520759 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.520464 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11748718-2f6b-488b-ac46-e9be66ad5213-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7c546994b4-zr458\" (UID: \"11748718-2f6b-488b-ac46-e9be66ad5213\") " pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.520759 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.520514 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/11748718-2f6b-488b-ac46-e9be66ad5213-secret-telemeter-client\") pod \"telemeter-client-7c546994b4-zr458\" (UID: \"11748718-2f6b-488b-ac46-e9be66ad5213\") " pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.520759 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.520540 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11748718-2f6b-488b-ac46-e9be66ad5213-serving-certs-ca-bundle\") pod \"telemeter-client-7c546994b4-zr458\" (UID: \"11748718-2f6b-488b-ac46-e9be66ad5213\") " pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.520759 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.520604 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/11748718-2f6b-488b-ac46-e9be66ad5213-federate-client-tls\") pod \"telemeter-client-7c546994b4-zr458\" (UID: \"11748718-2f6b-488b-ac46-e9be66ad5213\") " pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.520759 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.520644 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11748718-2f6b-488b-ac46-e9be66ad5213-metrics-client-ca\") pod \"telemeter-client-7c546994b4-zr458\" (UID: \"11748718-2f6b-488b-ac46-e9be66ad5213\") " pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.520759 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.520691 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q62qc\" (UniqueName: \"kubernetes.io/projected/11748718-2f6b-488b-ac46-e9be66ad5213-kube-api-access-q62qc\") pod \"telemeter-client-7c546994b4-zr458\" (UID: \"11748718-2f6b-488b-ac46-e9be66ad5213\") " pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.521079 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.520775 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/11748718-2f6b-488b-ac46-e9be66ad5213-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7c546994b4-zr458\" (UID: \"11748718-2f6b-488b-ac46-e9be66ad5213\") " pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.521721 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.521680 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11748718-2f6b-488b-ac46-e9be66ad5213-serving-certs-ca-bundle\") pod \"telemeter-client-7c546994b4-zr458\" (UID: \"11748718-2f6b-488b-ac46-e9be66ad5213\") " pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.521847 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.521680 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11748718-2f6b-488b-ac46-e9be66ad5213-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7c546994b4-zr458\" (UID: \"11748718-2f6b-488b-ac46-e9be66ad5213\") " pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.521847 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.521745 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11748718-2f6b-488b-ac46-e9be66ad5213-metrics-client-ca\") pod \"telemeter-client-7c546994b4-zr458\" (UID: \"11748718-2f6b-488b-ac46-e9be66ad5213\") " pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.524120 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.524096 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/11748718-2f6b-488b-ac46-e9be66ad5213-telemeter-client-tls\") pod \"telemeter-client-7c546994b4-zr458\" (UID: \"11748718-2f6b-488b-ac46-e9be66ad5213\") " pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.524243 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.524147 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/11748718-2f6b-488b-ac46-e9be66ad5213-federate-client-tls\") pod \"telemeter-client-7c546994b4-zr458\" (UID: \"11748718-2f6b-488b-ac46-e9be66ad5213\") " pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.524286 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.524264 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/11748718-2f6b-488b-ac46-e9be66ad5213-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7c546994b4-zr458\" (UID: \"11748718-2f6b-488b-ac46-e9be66ad5213\") " pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.524321 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.524307 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/11748718-2f6b-488b-ac46-e9be66ad5213-secret-telemeter-client\") pod \"telemeter-client-7c546994b4-zr458\" (UID: \"11748718-2f6b-488b-ac46-e9be66ad5213\") " pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.529348 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.529316 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q62qc\" (UniqueName: \"kubernetes.io/projected/11748718-2f6b-488b-ac46-e9be66ad5213-kube-api-access-q62qc\") pod \"telemeter-client-7c546994b4-zr458\" (UID: \"11748718-2f6b-488b-ac46-e9be66ad5213\") " pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.681471 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.681384 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7c546994b4-zr458"
Apr 21 07:05:39.816116 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:39.816080 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7c546994b4-zr458"]
Apr 21 07:05:39.819912 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:05:39.819874 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11748718_2f6b_488b_ac46_e9be66ad5213.slice/crio-9fcdd6f9ee6adc4552d7344b109d1574c43ec56f1bde5277a90c7dc56f674659 WatchSource:0}: Error finding container 9fcdd6f9ee6adc4552d7344b109d1574c43ec56f1bde5277a90c7dc56f674659: Status 404 returned error can't find the container with id 9fcdd6f9ee6adc4552d7344b109d1574c43ec56f1bde5277a90c7dc56f674659
Apr 21 07:05:40.731617 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:40.731553 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7c546994b4-zr458" event={"ID":"11748718-2f6b-488b-ac46-e9be66ad5213","Type":"ContainerStarted","Data":"9fcdd6f9ee6adc4552d7344b109d1574c43ec56f1bde5277a90c7dc56f674659"}
Apr 21 07:05:42.740238 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:42.740196 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7c546994b4-zr458" event={"ID":"11748718-2f6b-488b-ac46-e9be66ad5213","Type":"ContainerStarted","Data":"dd6e19f66e3891696d0db075cd6d0a93a7b2989dcb0e3109dc55e9d2012641e4"}
Apr 21 07:05:42.740238 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:42.740241 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7c546994b4-zr458" event={"ID":"11748718-2f6b-488b-ac46-e9be66ad5213","Type":"ContainerStarted","Data":"4e6fc9981d82bf9d00c74d00407dbd53f1656400fa6d0090d81f6c914b38e485"}
Apr 21 07:05:42.740707 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:42.740255 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7c546994b4-zr458" event={"ID":"11748718-2f6b-488b-ac46-e9be66ad5213","Type":"ContainerStarted","Data":"90c01b57d8a818e6aa0ea7c51e49a8872c1998184333c2a0f557976a97b5b791"}
Apr 21 07:05:42.777390 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:05:42.777325 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-7c546994b4-zr458" podStartSLOduration=1.777085998 podStartE2EDuration="3.777306116s" podCreationTimestamp="2026-04-21 07:05:39 +0000 UTC" firstStartedPulling="2026-04-21 07:05:39.821705502 +0000 UTC m=+156.441039573" lastFinishedPulling="2026-04-21 07:05:41.821925623 +0000 UTC m=+158.441259691" observedRunningTime="2026-04-21 07:05:42.777103827 +0000 UTC m=+159.396437919" watchObservedRunningTime="2026-04-21 07:05:42.777306116 +0000 UTC m=+159.396640218"
Apr 21 07:06:46.009903 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:06:46.009860 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-wqmnx"]
Apr 21 07:06:46.013456 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:06:46.013430 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wqmnx"
Apr 21 07:06:46.016384 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:06:46.016360 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 21 07:06:46.023096 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:06:46.023068 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wqmnx"]
Apr 21 07:06:46.120337 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:06:46.120304 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dca5cb9a-85e7-469d-aacc-8d12c2e84795-kubelet-config\") pod \"global-pull-secret-syncer-wqmnx\" (UID: \"dca5cb9a-85e7-469d-aacc-8d12c2e84795\") " pod="kube-system/global-pull-secret-syncer-wqmnx"
Apr 21 07:06:46.120529 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:06:46.120372 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dca5cb9a-85e7-469d-aacc-8d12c2e84795-original-pull-secret\") pod \"global-pull-secret-syncer-wqmnx\" (UID: \"dca5cb9a-85e7-469d-aacc-8d12c2e84795\") " pod="kube-system/global-pull-secret-syncer-wqmnx"
Apr 21 07:06:46.120529 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:06:46.120415 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dca5cb9a-85e7-469d-aacc-8d12c2e84795-dbus\") pod \"global-pull-secret-syncer-wqmnx\" (UID: \"dca5cb9a-85e7-469d-aacc-8d12c2e84795\") " pod="kube-system/global-pull-secret-syncer-wqmnx"
Apr 21 07:06:46.221736 ip-10-0-137-163
kubenswrapper[2581]: I0421 07:06:46.221679 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dca5cb9a-85e7-469d-aacc-8d12c2e84795-original-pull-secret\") pod \"global-pull-secret-syncer-wqmnx\" (UID: \"dca5cb9a-85e7-469d-aacc-8d12c2e84795\") " pod="kube-system/global-pull-secret-syncer-wqmnx" Apr 21 07:06:46.221736 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:06:46.221744 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dca5cb9a-85e7-469d-aacc-8d12c2e84795-dbus\") pod \"global-pull-secret-syncer-wqmnx\" (UID: \"dca5cb9a-85e7-469d-aacc-8d12c2e84795\") " pod="kube-system/global-pull-secret-syncer-wqmnx" Apr 21 07:06:46.221966 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:06:46.221818 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dca5cb9a-85e7-469d-aacc-8d12c2e84795-kubelet-config\") pod \"global-pull-secret-syncer-wqmnx\" (UID: \"dca5cb9a-85e7-469d-aacc-8d12c2e84795\") " pod="kube-system/global-pull-secret-syncer-wqmnx" Apr 21 07:06:46.221966 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:06:46.221902 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/dca5cb9a-85e7-469d-aacc-8d12c2e84795-kubelet-config\") pod \"global-pull-secret-syncer-wqmnx\" (UID: \"dca5cb9a-85e7-469d-aacc-8d12c2e84795\") " pod="kube-system/global-pull-secret-syncer-wqmnx" Apr 21 07:06:46.221966 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:06:46.221962 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/dca5cb9a-85e7-469d-aacc-8d12c2e84795-dbus\") pod \"global-pull-secret-syncer-wqmnx\" (UID: \"dca5cb9a-85e7-469d-aacc-8d12c2e84795\") " pod="kube-system/global-pull-secret-syncer-wqmnx" Apr 
21 07:06:46.224228 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:06:46.224197 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/dca5cb9a-85e7-469d-aacc-8d12c2e84795-original-pull-secret\") pod \"global-pull-secret-syncer-wqmnx\" (UID: \"dca5cb9a-85e7-469d-aacc-8d12c2e84795\") " pod="kube-system/global-pull-secret-syncer-wqmnx" Apr 21 07:06:46.323865 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:06:46.323768 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wqmnx" Apr 21 07:06:46.450475 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:06:46.450442 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wqmnx"] Apr 21 07:06:46.453954 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:06:46.453923 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddca5cb9a_85e7_469d_aacc_8d12c2e84795.slice/crio-edd17e3dc138ddb33c676bf3a39c4bff6f7602e9a25e3f89c47938153dbc0c3d WatchSource:0}: Error finding container edd17e3dc138ddb33c676bf3a39c4bff6f7602e9a25e3f89c47938153dbc0c3d: Status 404 returned error can't find the container with id edd17e3dc138ddb33c676bf3a39c4bff6f7602e9a25e3f89c47938153dbc0c3d Apr 21 07:06:46.940415 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:06:46.940376 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wqmnx" event={"ID":"dca5cb9a-85e7-469d-aacc-8d12c2e84795","Type":"ContainerStarted","Data":"edd17e3dc138ddb33c676bf3a39c4bff6f7602e9a25e3f89c47938153dbc0c3d"} Apr 21 07:06:50.955295 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:06:50.955255 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wqmnx" 
event={"ID":"dca5cb9a-85e7-469d-aacc-8d12c2e84795","Type":"ContainerStarted","Data":"f7dfb34f502f4ffc38e11cd01f9cf1296939da7b22bea7c1ccd22a973d5c50b4"} Apr 21 07:06:50.972975 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:06:50.972915 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-wqmnx" podStartSLOduration=2.204500538 podStartE2EDuration="5.97289968s" podCreationTimestamp="2026-04-21 07:06:45 +0000 UTC" firstStartedPulling="2026-04-21 07:06:46.455932008 +0000 UTC m=+223.075266076" lastFinishedPulling="2026-04-21 07:06:50.22433115 +0000 UTC m=+226.843665218" observedRunningTime="2026-04-21 07:06:50.971299839 +0000 UTC m=+227.590633942" watchObservedRunningTime="2026-04-21 07:06:50.97289968 +0000 UTC m=+227.592233791" Apr 21 07:08:03.880626 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:08:03.880588 2581 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 07:08:50.602331 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:08:50.602241 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-7c5547bb65-gp899"] Apr 21 07:08:50.605682 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:08:50.605660 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-7c5547bb65-gp899" Apr 21 07:08:50.608345 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:08:50.608308 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-webhook-cert\"" Apr 21 07:08:50.608345 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:08:50.608327 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-controller-manager-dockercfg-4ct9d\"" Apr 21 07:08:50.609451 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:08:50.609431 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 21 07:08:50.609523 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:08:50.609478 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 21 07:08:50.609523 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:08:50.609486 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kubeflow-trainer-config\"" Apr 21 07:08:50.618793 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:08:50.618766 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-7c5547bb65-gp899"] Apr 21 07:08:50.645936 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:08:50.645890 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fztx\" (UniqueName: \"kubernetes.io/projected/920b464c-0448-4341-83f7-0b9c2bd12356-kube-api-access-2fztx\") pod \"kubeflow-trainer-controller-manager-7c5547bb65-gp899\" (UID: \"920b464c-0448-4341-83f7-0b9c2bd12356\") " pod="opendatahub/kubeflow-trainer-controller-manager-7c5547bb65-gp899" Apr 21 07:08:50.646132 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:08:50.645948 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/920b464c-0448-4341-83f7-0b9c2bd12356-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-7c5547bb65-gp899\" (UID: \"920b464c-0448-4341-83f7-0b9c2bd12356\") " pod="opendatahub/kubeflow-trainer-controller-manager-7c5547bb65-gp899" Apr 21 07:08:50.646132 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:08:50.646073 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/920b464c-0448-4341-83f7-0b9c2bd12356-cert\") pod \"kubeflow-trainer-controller-manager-7c5547bb65-gp899\" (UID: \"920b464c-0448-4341-83f7-0b9c2bd12356\") " pod="opendatahub/kubeflow-trainer-controller-manager-7c5547bb65-gp899" Apr 21 07:08:50.747312 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:08:50.747264 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/920b464c-0448-4341-83f7-0b9c2bd12356-cert\") pod \"kubeflow-trainer-controller-manager-7c5547bb65-gp899\" (UID: \"920b464c-0448-4341-83f7-0b9c2bd12356\") " pod="opendatahub/kubeflow-trainer-controller-manager-7c5547bb65-gp899" Apr 21 07:08:50.747532 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:08:50.747356 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fztx\" (UniqueName: \"kubernetes.io/projected/920b464c-0448-4341-83f7-0b9c2bd12356-kube-api-access-2fztx\") pod \"kubeflow-trainer-controller-manager-7c5547bb65-gp899\" (UID: \"920b464c-0448-4341-83f7-0b9c2bd12356\") " pod="opendatahub/kubeflow-trainer-controller-manager-7c5547bb65-gp899" Apr 21 07:08:50.747532 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:08:50.747394 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubeflow-trainer-config\" (UniqueName: 
\"kubernetes.io/configmap/920b464c-0448-4341-83f7-0b9c2bd12356-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-7c5547bb65-gp899\" (UID: \"920b464c-0448-4341-83f7-0b9c2bd12356\") " pod="opendatahub/kubeflow-trainer-controller-manager-7c5547bb65-gp899" Apr 21 07:08:50.748083 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:08:50.748056 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/920b464c-0448-4341-83f7-0b9c2bd12356-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-7c5547bb65-gp899\" (UID: \"920b464c-0448-4341-83f7-0b9c2bd12356\") " pod="opendatahub/kubeflow-trainer-controller-manager-7c5547bb65-gp899" Apr 21 07:08:50.749928 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:08:50.749895 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/920b464c-0448-4341-83f7-0b9c2bd12356-cert\") pod \"kubeflow-trainer-controller-manager-7c5547bb65-gp899\" (UID: \"920b464c-0448-4341-83f7-0b9c2bd12356\") " pod="opendatahub/kubeflow-trainer-controller-manager-7c5547bb65-gp899" Apr 21 07:08:50.759802 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:08:50.759767 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fztx\" (UniqueName: \"kubernetes.io/projected/920b464c-0448-4341-83f7-0b9c2bd12356-kube-api-access-2fztx\") pod \"kubeflow-trainer-controller-manager-7c5547bb65-gp899\" (UID: \"920b464c-0448-4341-83f7-0b9c2bd12356\") " pod="opendatahub/kubeflow-trainer-controller-manager-7c5547bb65-gp899" Apr 21 07:08:50.928448 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:08:50.928351 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-7c5547bb65-gp899" Apr 21 07:08:51.057657 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:08:51.057623 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-7c5547bb65-gp899"] Apr 21 07:08:51.061409 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:08:51.061378 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod920b464c_0448_4341_83f7_0b9c2bd12356.slice/crio-dede25fd5b7a073770a2dcff39525a263113543a97e9b67258ed8ee60e141d07 WatchSource:0}: Error finding container dede25fd5b7a073770a2dcff39525a263113543a97e9b67258ed8ee60e141d07: Status 404 returned error can't find the container with id dede25fd5b7a073770a2dcff39525a263113543a97e9b67258ed8ee60e141d07 Apr 21 07:08:51.063177 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:08:51.063159 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 07:08:51.321936 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:08:51.321842 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-7c5547bb65-gp899" event={"ID":"920b464c-0448-4341-83f7-0b9c2bd12356","Type":"ContainerStarted","Data":"dede25fd5b7a073770a2dcff39525a263113543a97e9b67258ed8ee60e141d07"} Apr 21 07:08:54.333241 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:08:54.333204 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-7c5547bb65-gp899" event={"ID":"920b464c-0448-4341-83f7-0b9c2bd12356","Type":"ContainerStarted","Data":"a4c0a37b75178c7d6a52e1d0ba6bc35ef84d80a48c1bae1db482667a9fb59b3f"} Apr 21 07:08:54.333671 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:08:54.333332 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kubeflow-trainer-controller-manager-7c5547bb65-gp899" 
Apr 21 07:08:54.357197 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:08:54.357127 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kubeflow-trainer-controller-manager-7c5547bb65-gp899" podStartSLOduration=2.064948774 podStartE2EDuration="4.357105774s" podCreationTimestamp="2026-04-21 07:08:50 +0000 UTC" firstStartedPulling="2026-04-21 07:08:51.063286105 +0000 UTC m=+347.682620174" lastFinishedPulling="2026-04-21 07:08:53.355443106 +0000 UTC m=+349.974777174" observedRunningTime="2026-04-21 07:08:54.354779249 +0000 UTC m=+350.974113339" watchObservedRunningTime="2026-04-21 07:08:54.357105774 +0000 UTC m=+350.976439865" Apr 21 07:09:10.341896 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:09:10.341853 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kubeflow-trainer-controller-manager-7c5547bb65-gp899" Apr 21 07:10:45.074027 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:10:45.073987 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-ncrpp/progression-enabled-node-0-0-zjh47"] Apr 21 07:10:45.076411 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:10:45.076379 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-ncrpp/progression-enabled-node-0-0-zjh47" Apr 21 07:10:45.079606 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:10:45.079584 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-ncrpp\"/\"openshift-service-ca.crt\"" Apr 21 07:10:45.079753 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:10:45.079626 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-ncrpp\"/\"kube-root-ca.crt\"" Apr 21 07:10:45.079753 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:10:45.079626 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-ncrpp\"/\"default-dockercfg-lkwr7\"" Apr 21 07:10:45.095442 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:10:45.095406 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-ncrpp/progression-enabled-node-0-0-zjh47"] Apr 21 07:10:45.198177 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:10:45.198136 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnrlc\" (UniqueName: \"kubernetes.io/projected/acbd1863-f026-42e1-820e-7d26e599e564-kube-api-access-hnrlc\") pod \"progression-enabled-node-0-0-zjh47\" (UID: \"acbd1863-f026-42e1-820e-7d26e599e564\") " pod="rhai-e2e-progression-ncrpp/progression-enabled-node-0-0-zjh47" Apr 21 07:10:45.299430 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:10:45.299359 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hnrlc\" (UniqueName: \"kubernetes.io/projected/acbd1863-f026-42e1-820e-7d26e599e564-kube-api-access-hnrlc\") pod \"progression-enabled-node-0-0-zjh47\" (UID: \"acbd1863-f026-42e1-820e-7d26e599e564\") " pod="rhai-e2e-progression-ncrpp/progression-enabled-node-0-0-zjh47" Apr 21 07:10:45.309214 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:10:45.309178 2581 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hnrlc\" (UniqueName: \"kubernetes.io/projected/acbd1863-f026-42e1-820e-7d26e599e564-kube-api-access-hnrlc\") pod \"progression-enabled-node-0-0-zjh47\" (UID: \"acbd1863-f026-42e1-820e-7d26e599e564\") " pod="rhai-e2e-progression-ncrpp/progression-enabled-node-0-0-zjh47" Apr 21 07:10:45.387714 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:10:45.387671 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-ncrpp/progression-enabled-node-0-0-zjh47" Apr 21 07:10:45.516128 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:10:45.516082 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-ncrpp/progression-enabled-node-0-0-zjh47"] Apr 21 07:10:45.519035 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:10:45.518991 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacbd1863_f026_42e1_820e_7d26e599e564.slice/crio-27f88e4aea923121fcf7d480ccb3a51caef1696f3c951b11e0c623c284cfe6c9 WatchSource:0}: Error finding container 27f88e4aea923121fcf7d480ccb3a51caef1696f3c951b11e0c623c284cfe6c9: Status 404 returned error can't find the container with id 27f88e4aea923121fcf7d480ccb3a51caef1696f3c951b11e0c623c284cfe6c9 Apr 21 07:10:45.680037 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:10:45.679937 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-ncrpp/progression-enabled-node-0-0-zjh47" event={"ID":"acbd1863-f026-42e1-820e-7d26e599e564","Type":"ContainerStarted","Data":"27f88e4aea923121fcf7d480ccb3a51caef1696f3c951b11e0c623c284cfe6c9"} Apr 21 07:12:49.102913 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:12:49.102874 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-ncrpp/progression-enabled-node-0-0-zjh47" 
event={"ID":"acbd1863-f026-42e1-820e-7d26e599e564","Type":"ContainerStarted","Data":"5a2d30192c51cae6d955d9bcab0e8807447fd5456759a3ba0622812a841eacdc"} Apr 21 07:12:49.103441 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:12:49.103022 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-ncrpp/progression-enabled-node-0-0-zjh47" Apr 21 07:12:49.123332 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:12:49.123256 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-ncrpp/progression-enabled-node-0-0-zjh47" podStartSLOduration=0.754154991 podStartE2EDuration="2m4.123235087s" podCreationTimestamp="2026-04-21 07:10:45 +0000 UTC" firstStartedPulling="2026-04-21 07:10:45.521210766 +0000 UTC m=+462.140544835" lastFinishedPulling="2026-04-21 07:12:48.890290861 +0000 UTC m=+585.509624931" observedRunningTime="2026-04-21 07:12:49.12005655 +0000 UTC m=+585.739390641" watchObservedRunningTime="2026-04-21 07:12:49.123235087 +0000 UTC m=+585.742569180" Apr 21 07:12:50.105634 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:12:50.105594 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-ncrpp/progression-enabled-node-0-0-zjh47" podUID="acbd1863-f026-42e1-820e-7d26e599e564" containerName="node" probeResult="failure" output="Get \"http://10.134.0.23:28080/metrics\": dial tcp 10.134.0.23:28080: connect: connection refused" Apr 21 07:12:50.107997 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:12:50.107960 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-ncrpp/progression-enabled-node-0-0-zjh47" podUID="acbd1863-f026-42e1-820e-7d26e599e564" containerName="node" probeResult="failure" output="Get \"http://10.134.0.23:28080/metrics\": dial tcp 10.134.0.23:28080: connect: connection refused" Apr 21 07:12:51.109962 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:12:51.109929 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="rhai-e2e-progression-ncrpp/progression-enabled-node-0-0-zjh47" Apr 21 07:13:12.251830 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:12.251777 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-ncrpp/progression-enabled-node-0-0-zjh47" podUID="acbd1863-f026-42e1-820e-7d26e599e564" containerName="node" probeResult="failure" output="Get \"http://10.134.0.23:28080/metrics\": read tcp 10.134.0.2:40436->10.134.0.23:28080: read: connection reset by peer" Apr 21 07:13:13.108523 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:13.108473 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-ncrpp/progression-enabled-node-0-0-zjh47" podUID="acbd1863-f026-42e1-820e-7d26e599e564" containerName="node" probeResult="failure" output="Get \"http://10.134.0.23:28080/metrics\": dial tcp 10.134.0.23:28080: connect: connection refused" Apr 21 07:13:13.108741 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:13.108702 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-ncrpp/progression-enabled-node-0-0-zjh47" Apr 21 07:13:13.109274 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:13.109242 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-ncrpp/progression-enabled-node-0-0-zjh47" podUID="acbd1863-f026-42e1-820e-7d26e599e564" containerName="node" probeResult="failure" output="Get \"http://10.134.0.23:28080/metrics\": dial tcp 10.134.0.23:28080: connect: connection refused" Apr 21 07:13:13.179184 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:13.179144 2581 generic.go:358] "Generic (PLEG): container finished" podID="acbd1863-f026-42e1-820e-7d26e599e564" containerID="5a2d30192c51cae6d955d9bcab0e8807447fd5456759a3ba0622812a841eacdc" exitCode=0 Apr 21 07:13:13.179364 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:13.179201 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="rhai-e2e-progression-ncrpp/progression-enabled-node-0-0-zjh47" event={"ID":"acbd1863-f026-42e1-820e-7d26e599e564","Type":"ContainerDied","Data":"5a2d30192c51cae6d955d9bcab0e8807447fd5456759a3ba0622812a841eacdc"} Apr 21 07:13:14.309867 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:14.309843 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-ncrpp/progression-enabled-node-0-0-zjh47" Apr 21 07:13:14.421100 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:14.421065 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnrlc\" (UniqueName: \"kubernetes.io/projected/acbd1863-f026-42e1-820e-7d26e599e564-kube-api-access-hnrlc\") pod \"acbd1863-f026-42e1-820e-7d26e599e564\" (UID: \"acbd1863-f026-42e1-820e-7d26e599e564\") " Apr 21 07:13:14.423482 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:14.423452 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acbd1863-f026-42e1-820e-7d26e599e564-kube-api-access-hnrlc" (OuterVolumeSpecName: "kube-api-access-hnrlc") pod "acbd1863-f026-42e1-820e-7d26e599e564" (UID: "acbd1863-f026-42e1-820e-7d26e599e564"). InnerVolumeSpecName "kube-api-access-hnrlc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 07:13:14.522332 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:14.522234 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hnrlc\" (UniqueName: \"kubernetes.io/projected/acbd1863-f026-42e1-820e-7d26e599e564-kube-api-access-hnrlc\") on node \"ip-10-0-137-163.ec2.internal\" DevicePath \"\"" Apr 21 07:13:15.186591 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:15.186526 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-ncrpp/progression-enabled-node-0-0-zjh47" event={"ID":"acbd1863-f026-42e1-820e-7d26e599e564","Type":"ContainerDied","Data":"27f88e4aea923121fcf7d480ccb3a51caef1696f3c951b11e0c623c284cfe6c9"} Apr 21 07:13:15.186591 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:15.186591 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27f88e4aea923121fcf7d480ccb3a51caef1696f3c951b11e0c623c284cfe6c9" Apr 21 07:13:15.186806 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:15.186535 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-ncrpp/progression-enabled-node-0-0-zjh47" Apr 21 07:13:15.549481 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:15.549393 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-ncrpp/progression-disabled-node-0-0-r8w6g"] Apr 21 07:13:15.549855 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:15.549763 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="acbd1863-f026-42e1-820e-7d26e599e564" containerName="node" Apr 21 07:13:15.549855 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:15.549775 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="acbd1863-f026-42e1-820e-7d26e599e564" containerName="node" Apr 21 07:13:15.549855 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:15.549844 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="acbd1863-f026-42e1-820e-7d26e599e564" containerName="node" Apr 21 07:13:15.570756 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:15.570716 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-ncrpp/progression-disabled-node-0-0-r8w6g"] Apr 21 07:13:15.570920 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:15.570861 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="rhai-e2e-progression-ncrpp/progression-disabled-node-0-0-r8w6g"
Apr 21 07:13:15.573451 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:15.573425 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-ncrpp\"/\"openshift-service-ca.crt\""
Apr 21 07:13:15.573617 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:15.573495 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-ncrpp\"/\"default-dockercfg-lkwr7\""
Apr 21 07:13:15.573617 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:15.573523 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-ncrpp\"/\"kube-root-ca.crt\""
Apr 21 07:13:15.631748 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:15.631712 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcz8v\" (UniqueName: \"kubernetes.io/projected/d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1-kube-api-access-hcz8v\") pod \"progression-disabled-node-0-0-r8w6g\" (UID: \"d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1\") " pod="rhai-e2e-progression-ncrpp/progression-disabled-node-0-0-r8w6g"
Apr 21 07:13:15.732272 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:15.732233 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hcz8v\" (UniqueName: \"kubernetes.io/projected/d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1-kube-api-access-hcz8v\") pod \"progression-disabled-node-0-0-r8w6g\" (UID: \"d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1\") " pod="rhai-e2e-progression-ncrpp/progression-disabled-node-0-0-r8w6g"
Apr 21 07:13:15.741346 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:15.741309 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcz8v\" (UniqueName: \"kubernetes.io/projected/d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1-kube-api-access-hcz8v\") pod \"progression-disabled-node-0-0-r8w6g\" (UID: \"d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1\") " pod="rhai-e2e-progression-ncrpp/progression-disabled-node-0-0-r8w6g"
Apr 21 07:13:15.880934 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:15.880896 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-ncrpp/progression-disabled-node-0-0-r8w6g"
Apr 21 07:13:16.010177 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:16.010148 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-ncrpp/progression-disabled-node-0-0-r8w6g"]
Apr 21 07:13:16.012727 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:13:16.012696 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd26b7c14_10bb_44dc_9bf5_3704e0d9ceb1.slice/crio-ebe68f9a99221c2ac1b61dd3ba328d9e9068acf6ab7579d583194f08c61bbc2e WatchSource:0}: Error finding container ebe68f9a99221c2ac1b61dd3ba328d9e9068acf6ab7579d583194f08c61bbc2e: Status 404 returned error can't find the container with id ebe68f9a99221c2ac1b61dd3ba328d9e9068acf6ab7579d583194f08c61bbc2e
Apr 21 07:13:16.194918 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:16.194822 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-ncrpp/progression-disabled-node-0-0-r8w6g" event={"ID":"d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1","Type":"ContainerStarted","Data":"1bce2ae70ee6b61df57ad3772159aee3668fa5ac2a1b08efde7fdadde0e11935"}
Apr 21 07:13:16.194918 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:16.194869 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-ncrpp/progression-disabled-node-0-0-r8w6g"
Apr 21 07:13:16.194918 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:16.194886 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-ncrpp/progression-disabled-node-0-0-r8w6g" event={"ID":"d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1","Type":"ContainerStarted","Data":"ebe68f9a99221c2ac1b61dd3ba328d9e9068acf6ab7579d583194f08c61bbc2e"}
Apr 21 07:13:16.211434 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:16.211368 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-ncrpp/progression-disabled-node-0-0-r8w6g" podStartSLOduration=1.2113508849999999 podStartE2EDuration="1.211350885s" podCreationTimestamp="2026-04-21 07:13:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:13:16.209035215 +0000 UTC m=+612.828369304" watchObservedRunningTime="2026-04-21 07:13:16.211350885 +0000 UTC m=+612.830684974"
Apr 21 07:13:18.201416 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:18.201383 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-ncrpp/progression-disabled-node-0-0-r8w6g"
Apr 21 07:13:39.350831 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:39.350781 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-ncrpp/progression-disabled-node-0-0-r8w6g" podUID="d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1" containerName="node" probeResult="failure" output="Get \"http://10.134.0.24:28080/metrics\": read tcp 10.134.0.2:60142->10.134.0.24:28080: read: connection reset by peer"
Apr 21 07:13:40.200015 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:40.199966 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-ncrpp/progression-disabled-node-0-0-r8w6g" podUID="d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1" containerName="node" probeResult="failure" output="Get \"http://10.134.0.24:28080/metrics\": dial tcp 10.134.0.24:28080: connect: connection refused"
Apr 21 07:13:40.200239 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:40.200092 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-ncrpp/progression-disabled-node-0-0-r8w6g"
Apr 21 07:13:40.200659 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:40.200635 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-ncrpp/progression-disabled-node-0-0-r8w6g" podUID="d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1" containerName="node" probeResult="failure" output="Get \"http://10.134.0.24:28080/metrics\": dial tcp 10.134.0.24:28080: connect: connection refused"
Apr 21 07:13:40.276707 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:40.276671 2581 generic.go:358] "Generic (PLEG): container finished" podID="d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1" containerID="1bce2ae70ee6b61df57ad3772159aee3668fa5ac2a1b08efde7fdadde0e11935" exitCode=0
Apr 21 07:13:40.276876 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:40.276742 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-ncrpp/progression-disabled-node-0-0-r8w6g" event={"ID":"d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1","Type":"ContainerDied","Data":"1bce2ae70ee6b61df57ad3772159aee3668fa5ac2a1b08efde7fdadde0e11935"}
Apr 21 07:13:41.403455 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:41.403430 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-ncrpp/progression-disabled-node-0-0-r8w6g"
Apr 21 07:13:41.476827 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:41.476777 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcz8v\" (UniqueName: \"kubernetes.io/projected/d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1-kube-api-access-hcz8v\") pod \"d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1\" (UID: \"d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1\") "
Apr 21 07:13:41.479056 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:41.479024 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1-kube-api-access-hcz8v" (OuterVolumeSpecName: "kube-api-access-hcz8v") pod "d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1" (UID: "d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1"). InnerVolumeSpecName "kube-api-access-hcz8v". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 07:13:41.577733 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:41.577643 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hcz8v\" (UniqueName: \"kubernetes.io/projected/d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1-kube-api-access-hcz8v\") on node \"ip-10-0-137-163.ec2.internal\" DevicePath \"\""
Apr 21 07:13:42.284608 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:42.284494 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-ncrpp/progression-disabled-node-0-0-r8w6g"
Apr 21 07:13:42.284608 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:42.284494 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-ncrpp/progression-disabled-node-0-0-r8w6g" event={"ID":"d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1","Type":"ContainerDied","Data":"ebe68f9a99221c2ac1b61dd3ba328d9e9068acf6ab7579d583194f08c61bbc2e"}
Apr 21 07:13:42.284820 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:42.284616 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebe68f9a99221c2ac1b61dd3ba328d9e9068acf6ab7579d583194f08c61bbc2e"
Apr 21 07:13:50.607156 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:50.607108 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-ncrpp/progression-invalid-node-0-0-nqk2r"]
Apr 21 07:13:50.607730 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:50.607709 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1" containerName="node"
Apr 21 07:13:50.607807 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:50.607733 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1" containerName="node"
Apr 21 07:13:50.607866 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:50.607821 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1" containerName="node"
Apr 21 07:13:50.610907 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:50.610884 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-ncrpp/progression-invalid-node-0-0-nqk2r"
Apr 21 07:13:50.613769 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:50.613744 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-ncrpp\"/\"default-dockercfg-lkwr7\""
Apr 21 07:13:50.614188 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:50.614168 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-ncrpp\"/\"openshift-service-ca.crt\""
Apr 21 07:13:50.614711 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:50.614697 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-ncrpp\"/\"kube-root-ca.crt\""
Apr 21 07:13:50.620678 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:50.620647 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-ncrpp/progression-invalid-node-0-0-nqk2r"]
Apr 21 07:13:50.654498 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:50.654462 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb2xx\" (UniqueName: \"kubernetes.io/projected/7b725272-26d7-4fa7-94e0-a82625e401cb-kube-api-access-lb2xx\") pod \"progression-invalid-node-0-0-nqk2r\" (UID: \"7b725272-26d7-4fa7-94e0-a82625e401cb\") " pod="rhai-e2e-progression-ncrpp/progression-invalid-node-0-0-nqk2r"
Apr 21 07:13:50.755945 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:50.755886 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lb2xx\" (UniqueName: \"kubernetes.io/projected/7b725272-26d7-4fa7-94e0-a82625e401cb-kube-api-access-lb2xx\") pod \"progression-invalid-node-0-0-nqk2r\" (UID: \"7b725272-26d7-4fa7-94e0-a82625e401cb\") " pod="rhai-e2e-progression-ncrpp/progression-invalid-node-0-0-nqk2r"
Apr 21 07:13:50.765251 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:50.765222 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb2xx\" (UniqueName: \"kubernetes.io/projected/7b725272-26d7-4fa7-94e0-a82625e401cb-kube-api-access-lb2xx\") pod \"progression-invalid-node-0-0-nqk2r\" (UID: \"7b725272-26d7-4fa7-94e0-a82625e401cb\") " pod="rhai-e2e-progression-ncrpp/progression-invalid-node-0-0-nqk2r"
Apr 21 07:13:50.920964 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:50.920867 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-ncrpp/progression-invalid-node-0-0-nqk2r"
Apr 21 07:13:51.054877 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:51.054849 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-ncrpp/progression-invalid-node-0-0-nqk2r"]
Apr 21 07:13:51.057463 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:13:51.057434 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b725272_26d7_4fa7_94e0_a82625e401cb.slice/crio-5b75078747a1a0f9aba08f5aaff82f4fd46d02a1d8482ae34bb685d44b73cc8f WatchSource:0}: Error finding container 5b75078747a1a0f9aba08f5aaff82f4fd46d02a1d8482ae34bb685d44b73cc8f: Status 404 returned error can't find the container with id 5b75078747a1a0f9aba08f5aaff82f4fd46d02a1d8482ae34bb685d44b73cc8f
Apr 21 07:13:51.316589 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:51.316483 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-ncrpp/progression-invalid-node-0-0-nqk2r" event={"ID":"7b725272-26d7-4fa7-94e0-a82625e401cb","Type":"ContainerStarted","Data":"8522b6eb8de45ae29c456fcedbc18be2785219d918fe1983931924abb2b8d949"}
Apr 21 07:13:51.316589 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:51.316525 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-ncrpp/progression-invalid-node-0-0-nqk2r" event={"ID":"7b725272-26d7-4fa7-94e0-a82625e401cb","Type":"ContainerStarted","Data":"5b75078747a1a0f9aba08f5aaff82f4fd46d02a1d8482ae34bb685d44b73cc8f"}
Apr 21 07:13:51.316771 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:51.316604 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="rhai-e2e-progression-ncrpp/progression-invalid-node-0-0-nqk2r"
Apr 21 07:13:51.340540 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:51.340488 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-ncrpp/progression-invalid-node-0-0-nqk2r" podStartSLOduration=1.340468159 podStartE2EDuration="1.340468159s" podCreationTimestamp="2026-04-21 07:13:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:13:51.339816023 +0000 UTC m=+647.959150123" watchObservedRunningTime="2026-04-21 07:13:51.340468159 +0000 UTC m=+647.959802251"
Apr 21 07:13:53.324067 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:13:53.324040 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="rhai-e2e-progression-ncrpp/progression-invalid-node-0-0-nqk2r"
Apr 21 07:14:14.321550 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:14:14.321503 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="rhai-e2e-progression-ncrpp/progression-invalid-node-0-0-nqk2r" podUID="7b725272-26d7-4fa7-94e0-a82625e401cb" containerName="node" probeResult="failure" output="Get \"http://10.134.0.25:28080/metrics\": dial tcp 10.134.0.25:28080: connect: connection refused"
Apr 21 07:14:14.408394 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:14:14.408359 2581 generic.go:358] "Generic (PLEG): container finished" podID="7b725272-26d7-4fa7-94e0-a82625e401cb" containerID="8522b6eb8de45ae29c456fcedbc18be2785219d918fe1983931924abb2b8d949" exitCode=0
Apr 21 07:14:14.408603 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:14:14.408438 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-ncrpp/progression-invalid-node-0-0-nqk2r" event={"ID":"7b725272-26d7-4fa7-94e0-a82625e401cb","Type":"ContainerDied","Data":"8522b6eb8de45ae29c456fcedbc18be2785219d918fe1983931924abb2b8d949"}
Apr 21 07:14:15.548381 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:14:15.548356 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-ncrpp/progression-invalid-node-0-0-nqk2r"
Apr 21 07:14:15.593257 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:14:15.593219 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb2xx\" (UniqueName: \"kubernetes.io/projected/7b725272-26d7-4fa7-94e0-a82625e401cb-kube-api-access-lb2xx\") pod \"7b725272-26d7-4fa7-94e0-a82625e401cb\" (UID: \"7b725272-26d7-4fa7-94e0-a82625e401cb\") "
Apr 21 07:14:15.595615 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:14:15.595587 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b725272-26d7-4fa7-94e0-a82625e401cb-kube-api-access-lb2xx" (OuterVolumeSpecName: "kube-api-access-lb2xx") pod "7b725272-26d7-4fa7-94e0-a82625e401cb" (UID: "7b725272-26d7-4fa7-94e0-a82625e401cb"). InnerVolumeSpecName "kube-api-access-lb2xx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 07:14:15.694265 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:14:15.694162 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lb2xx\" (UniqueName: \"kubernetes.io/projected/7b725272-26d7-4fa7-94e0-a82625e401cb-kube-api-access-lb2xx\") on node \"ip-10-0-137-163.ec2.internal\" DevicePath \"\""
Apr 21 07:14:16.417231 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:14:16.417191 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-ncrpp/progression-invalid-node-0-0-nqk2r" event={"ID":"7b725272-26d7-4fa7-94e0-a82625e401cb","Type":"ContainerDied","Data":"5b75078747a1a0f9aba08f5aaff82f4fd46d02a1d8482ae34bb685d44b73cc8f"}
Apr 21 07:14:16.417231 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:14:16.417223 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-ncrpp/progression-invalid-node-0-0-nqk2r"
Apr 21 07:14:16.417231 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:14:16.417233 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b75078747a1a0f9aba08f5aaff82f4fd46d02a1d8482ae34bb685d44b73cc8f"
Apr 21 07:16:10.270827 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:10.270781 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["rhai-e2e-progression-ncrpp/progression-no-metrics-node-0-0-7vqvg"]
Apr 21 07:16:10.271365 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:10.271322 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b725272-26d7-4fa7-94e0-a82625e401cb" containerName="node"
Apr 21 07:16:10.271365 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:10.271344 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b725272-26d7-4fa7-94e0-a82625e401cb" containerName="node"
Apr 21 07:16:10.271496 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:10.271447 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b725272-26d7-4fa7-94e0-a82625e401cb" containerName="node"
Apr 21 07:16:10.274667 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:10.274642 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-ncrpp/progression-no-metrics-node-0-0-7vqvg"
Apr 21 07:16:10.277322 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:10.277290 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-ncrpp\"/\"openshift-service-ca.crt\""
Apr 21 07:16:10.277443 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:10.277402 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"rhai-e2e-progression-ncrpp\"/\"default-dockercfg-lkwr7\""
Apr 21 07:16:10.277443 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:10.277414 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"rhai-e2e-progression-ncrpp\"/\"kube-root-ca.crt\""
Apr 21 07:16:10.284201 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:10.284173 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-ncrpp/progression-no-metrics-node-0-0-7vqvg"]
Apr 21 07:16:10.420283 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:10.420250 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpqn2\" (UniqueName: \"kubernetes.io/projected/4e8a69cf-0e0b-4046-b101-effcc11d072a-kube-api-access-rpqn2\") pod \"progression-no-metrics-node-0-0-7vqvg\" (UID: \"4e8a69cf-0e0b-4046-b101-effcc11d072a\") " pod="rhai-e2e-progression-ncrpp/progression-no-metrics-node-0-0-7vqvg"
Apr 21 07:16:10.521519 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:10.521427 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpqn2\" (UniqueName: \"kubernetes.io/projected/4e8a69cf-0e0b-4046-b101-effcc11d072a-kube-api-access-rpqn2\") pod \"progression-no-metrics-node-0-0-7vqvg\" (UID: \"4e8a69cf-0e0b-4046-b101-effcc11d072a\") " pod="rhai-e2e-progression-ncrpp/progression-no-metrics-node-0-0-7vqvg"
Apr 21 07:16:10.530222 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:10.530193 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpqn2\" (UniqueName: \"kubernetes.io/projected/4e8a69cf-0e0b-4046-b101-effcc11d072a-kube-api-access-rpqn2\") pod \"progression-no-metrics-node-0-0-7vqvg\" (UID: \"4e8a69cf-0e0b-4046-b101-effcc11d072a\") " pod="rhai-e2e-progression-ncrpp/progression-no-metrics-node-0-0-7vqvg"
Apr 21 07:16:10.584694 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:10.584651 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-ncrpp/progression-no-metrics-node-0-0-7vqvg"
Apr 21 07:16:10.712702 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:10.712616 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["rhai-e2e-progression-ncrpp/progression-no-metrics-node-0-0-7vqvg"]
Apr 21 07:16:10.715657 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:16:10.715625 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e8a69cf_0e0b_4046_b101_effcc11d072a.slice/crio-62fb28cff963ead35ba86ad9905420215f51f5073b2053f03955332b018e8672 WatchSource:0}: Error finding container 62fb28cff963ead35ba86ad9905420215f51f5073b2053f03955332b018e8672: Status 404 returned error can't find the container with id 62fb28cff963ead35ba86ad9905420215f51f5073b2053f03955332b018e8672
Apr 21 07:16:10.718101 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:10.718084 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 07:16:10.813261 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:10.813219 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-ncrpp/progression-no-metrics-node-0-0-7vqvg" event={"ID":"4e8a69cf-0e0b-4046-b101-effcc11d072a","Type":"ContainerStarted","Data":"55f7b203e593cbadf250f42c062ac5006c71fe8520b23e68717c8af56b697f0a"}
Apr 21 07:16:10.813261 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:10.813268 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-ncrpp/progression-no-metrics-node-0-0-7vqvg" event={"ID":"4e8a69cf-0e0b-4046-b101-effcc11d072a","Type":"ContainerStarted","Data":"62fb28cff963ead35ba86ad9905420215f51f5073b2053f03955332b018e8672"}
Apr 21 07:16:10.831044 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:10.830980 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="rhai-e2e-progression-ncrpp/progression-no-metrics-node-0-0-7vqvg" podStartSLOduration=0.830958751 podStartE2EDuration="830.958751ms" podCreationTimestamp="2026-04-21 07:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:16:10.829401934 +0000 UTC m=+787.448736034" watchObservedRunningTime="2026-04-21 07:16:10.830958751 +0000 UTC m=+787.450292842"
Apr 21 07:16:15.831832 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:15.831799 2581 generic.go:358] "Generic (PLEG): container finished" podID="4e8a69cf-0e0b-4046-b101-effcc11d072a" containerID="55f7b203e593cbadf250f42c062ac5006c71fe8520b23e68717c8af56b697f0a" exitCode=0
Apr 21 07:16:15.831832 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:15.831840 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-ncrpp/progression-no-metrics-node-0-0-7vqvg" event={"ID":"4e8a69cf-0e0b-4046-b101-effcc11d072a","Type":"ContainerDied","Data":"55f7b203e593cbadf250f42c062ac5006c71fe8520b23e68717c8af56b697f0a"}
Apr 21 07:16:16.967721 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:16.967694 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-ncrpp/progression-no-metrics-node-0-0-7vqvg"
Apr 21 07:16:17.083039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:17.083004 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpqn2\" (UniqueName: \"kubernetes.io/projected/4e8a69cf-0e0b-4046-b101-effcc11d072a-kube-api-access-rpqn2\") pod \"4e8a69cf-0e0b-4046-b101-effcc11d072a\" (UID: \"4e8a69cf-0e0b-4046-b101-effcc11d072a\") "
Apr 21 07:16:17.085367 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:17.085292 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e8a69cf-0e0b-4046-b101-effcc11d072a-kube-api-access-rpqn2" (OuterVolumeSpecName: "kube-api-access-rpqn2") pod "4e8a69cf-0e0b-4046-b101-effcc11d072a" (UID: "4e8a69cf-0e0b-4046-b101-effcc11d072a"). InnerVolumeSpecName "kube-api-access-rpqn2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 07:16:17.184135 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:17.184091 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rpqn2\" (UniqueName: \"kubernetes.io/projected/4e8a69cf-0e0b-4046-b101-effcc11d072a-kube-api-access-rpqn2\") on node \"ip-10-0-137-163.ec2.internal\" DevicePath \"\""
Apr 21 07:16:17.840668 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:17.840630 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="rhai-e2e-progression-ncrpp/progression-no-metrics-node-0-0-7vqvg" event={"ID":"4e8a69cf-0e0b-4046-b101-effcc11d072a","Type":"ContainerDied","Data":"62fb28cff963ead35ba86ad9905420215f51f5073b2053f03955332b018e8672"}
Apr 21 07:16:17.840668 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:17.840672 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62fb28cff963ead35ba86ad9905420215f51f5073b2053f03955332b018e8672"
Apr 21 07:16:17.840895 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:17.840674 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="rhai-e2e-progression-ncrpp/progression-no-metrics-node-0-0-7vqvg"
Apr 21 07:16:27.237000 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:27.236955 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-ncrpp/progression-disabled-node-0-0-r8w6g"]
Apr 21 07:16:27.240329 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:27.240296 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-ncrpp/progression-disabled-node-0-0-r8w6g"]
Apr 21 07:16:27.245587 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:27.245543 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-ncrpp/progression-enabled-node-0-0-zjh47"]
Apr 21 07:16:27.249180 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:27.249142 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-ncrpp/progression-enabled-node-0-0-zjh47"]
Apr 21 07:16:27.254264 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:27.254232 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-ncrpp/progression-invalid-node-0-0-nqk2r"]
Apr 21 07:16:27.257399 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:27.257373 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-ncrpp/progression-invalid-node-0-0-nqk2r"]
Apr 21 07:16:27.275815 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:27.275768 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["rhai-e2e-progression-ncrpp/progression-no-metrics-node-0-0-7vqvg"]
Apr 21 07:16:27.277668 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:27.277644 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["rhai-e2e-progression-ncrpp/progression-no-metrics-node-0-0-7vqvg"]
Apr 21 07:16:27.979323 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:27.979274 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e8a69cf-0e0b-4046-b101-effcc11d072a" path="/var/lib/kubelet/pods/4e8a69cf-0e0b-4046-b101-effcc11d072a/volumes"
Apr 21 07:16:27.979737 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:27.979715 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b725272-26d7-4fa7-94e0-a82625e401cb" path="/var/lib/kubelet/pods/7b725272-26d7-4fa7-94e0-a82625e401cb/volumes"
Apr 21 07:16:27.980065 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:27.980048 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acbd1863-f026-42e1-820e-7d26e599e564" path="/var/lib/kubelet/pods/acbd1863-f026-42e1-820e-7d26e599e564/volumes"
Apr 21 07:16:27.980374 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:27.980358 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1" path="/var/lib/kubelet/pods/d26b7c14-10bb-44dc-9bf5-3704e0d9ceb1/volumes"
Apr 21 07:16:38.934048 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:38.934012 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-7c5547bb65-gp899_920b464c-0448-4341-83f7-0b9c2bd12356/manager/0.log"
Apr 21 07:16:39.377196 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:39.377163 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-7c5547bb65-gp899_920b464c-0448-4341-83f7-0b9c2bd12356/manager/0.log"
Apr 21 07:16:39.830740 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:16:39.830643 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-7c5547bb65-gp899_920b464c-0448-4341-83f7-0b9c2bd12356/manager/0.log"
Apr 21 07:17:16.356858 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:16.356808 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nj2kz/must-gather-2m5zc"]
Apr 21 07:17:16.359607 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:16.357724 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e8a69cf-0e0b-4046-b101-effcc11d072a" containerName="node"
Apr 21 07:17:16.359607 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:16.357779 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e8a69cf-0e0b-4046-b101-effcc11d072a" containerName="node"
Apr 21 07:17:16.359607 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:16.358064 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e8a69cf-0e0b-4046-b101-effcc11d072a" containerName="node"
Apr 21 07:17:16.363668 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:16.363637 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nj2kz/must-gather-2m5zc"
Apr 21 07:17:16.366344 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:16.366299 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nj2kz\"/\"openshift-service-ca.crt\""
Apr 21 07:17:16.366494 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:16.366364 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-nj2kz\"/\"kube-root-ca.crt\""
Apr 21 07:17:16.366494 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:16.366422 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-nj2kz\"/\"default-dockercfg-rwzml\""
Apr 21 07:17:16.369904 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:16.369880 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nj2kz/must-gather-2m5zc"]
Apr 21 07:17:16.420361 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:16.420321 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qt5r\" (UniqueName: \"kubernetes.io/projected/57afbb96-0031-4929-87fc-7ddb00b6a052-kube-api-access-6qt5r\") pod \"must-gather-2m5zc\" (UID: \"57afbb96-0031-4929-87fc-7ddb00b6a052\") " pod="openshift-must-gather-nj2kz/must-gather-2m5zc"
Apr 21 07:17:16.420526 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:16.420379 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/57afbb96-0031-4929-87fc-7ddb00b6a052-must-gather-output\") pod \"must-gather-2m5zc\" (UID: \"57afbb96-0031-4929-87fc-7ddb00b6a052\") " pod="openshift-must-gather-nj2kz/must-gather-2m5zc"
Apr 21 07:17:16.521345 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:16.521303 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6qt5r\" (UniqueName: \"kubernetes.io/projected/57afbb96-0031-4929-87fc-7ddb00b6a052-kube-api-access-6qt5r\") pod \"must-gather-2m5zc\" (UID: \"57afbb96-0031-4929-87fc-7ddb00b6a052\") " pod="openshift-must-gather-nj2kz/must-gather-2m5zc"
Apr 21 07:17:16.521519 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:16.521359 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/57afbb96-0031-4929-87fc-7ddb00b6a052-must-gather-output\") pod \"must-gather-2m5zc\" (UID: \"57afbb96-0031-4929-87fc-7ddb00b6a052\") " pod="openshift-must-gather-nj2kz/must-gather-2m5zc"
Apr 21 07:17:16.521700 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:16.521685 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/57afbb96-0031-4929-87fc-7ddb00b6a052-must-gather-output\") pod \"must-gather-2m5zc\" (UID: \"57afbb96-0031-4929-87fc-7ddb00b6a052\") " pod="openshift-must-gather-nj2kz/must-gather-2m5zc"
Apr 21 07:17:16.529710 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:16.529680 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qt5r\" (UniqueName: \"kubernetes.io/projected/57afbb96-0031-4929-87fc-7ddb00b6a052-kube-api-access-6qt5r\") pod \"must-gather-2m5zc\" (UID: \"57afbb96-0031-4929-87fc-7ddb00b6a052\") " pod="openshift-must-gather-nj2kz/must-gather-2m5zc"
Apr 21 07:17:16.674243 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:16.674162 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nj2kz/must-gather-2m5zc"
Apr 21 07:17:16.806270 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:16.806249 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nj2kz/must-gather-2m5zc"]
Apr 21 07:17:16.809027 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:17:16.808991 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57afbb96_0031_4929_87fc_7ddb00b6a052.slice/crio-38dd5842e2281801bdcf6f6b69ba63e239625a595c2abf64c60661f4ba20d796 WatchSource:0}: Error finding container 38dd5842e2281801bdcf6f6b69ba63e239625a595c2abf64c60661f4ba20d796: Status 404 returned error can't find the container with id 38dd5842e2281801bdcf6f6b69ba63e239625a595c2abf64c60661f4ba20d796
Apr 21 07:17:17.050937 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:17.050846 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nj2kz/must-gather-2m5zc" event={"ID":"57afbb96-0031-4929-87fc-7ddb00b6a052","Type":"ContainerStarted","Data":"38dd5842e2281801bdcf6f6b69ba63e239625a595c2abf64c60661f4ba20d796"}
Apr 21 07:17:18.057588 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:18.057525 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nj2kz/must-gather-2m5zc" event={"ID":"57afbb96-0031-4929-87fc-7ddb00b6a052","Type":"ContainerStarted","Data":"2f08efb29499c4e56a915bc2c7926c70f5a3a0b05a8fb9592ea3bec4aec21f6a"}
Apr 21 07:17:18.058054 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:18.057596 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nj2kz/must-gather-2m5zc" event={"ID":"57afbb96-0031-4929-87fc-7ddb00b6a052","Type":"ContainerStarted","Data":"5b9e1dedadfc2135bec4164db9b6342e1c5700f7f4620eb5cd30e9bafd9f1d94"}
Apr 21 07:17:18.074755 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:18.074620 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nj2kz/must-gather-2m5zc" podStartSLOduration=1.166524935 podStartE2EDuration="2.07459783s" podCreationTimestamp="2026-04-21 07:17:16 +0000 UTC" firstStartedPulling="2026-04-21 07:17:16.810835249 +0000 UTC m=+853.430169316" lastFinishedPulling="2026-04-21 07:17:17.718908142 +0000 UTC m=+854.338242211" observedRunningTime="2026-04-21 07:17:18.071658782 +0000 UTC m=+854.690992873" watchObservedRunningTime="2026-04-21 07:17:18.07459783 +0000 UTC m=+854.693931921"
Apr 21 07:17:19.153029 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:19.152991 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-wqmnx_dca5cb9a-85e7-469d-aacc-8d12c2e84795/global-pull-secret-syncer/0.log"
Apr 21 07:17:19.215638 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:19.215608 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-nq67t_140fdd65-e7b7-4a63-bcd0-c990e87edf65/konnectivity-agent/0.log"
Apr 21 07:17:19.261397 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:19.261365 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-163.ec2.internal_bb1b7fe43ed1fe0c8375d218f68d3580/haproxy/0.log"
Apr 21 07:17:22.090689 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:22.090657 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_796f028f-061e-4efc-93ed-97f5cd3a0802/alertmanager/0.log"
Apr 21 07:17:22.119146 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:22.118996 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_796f028f-061e-4efc-93ed-97f5cd3a0802/config-reloader/0.log"
Apr 21 07:17:22.147261 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:22.147074 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_796f028f-061e-4efc-93ed-97f5cd3a0802/kube-rbac-proxy-web/0.log"
Apr 21 07:17:22.176157 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:22.175975 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_796f028f-061e-4efc-93ed-97f5cd3a0802/kube-rbac-proxy/0.log"
Apr 21 07:17:22.204166 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:22.204106 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_796f028f-061e-4efc-93ed-97f5cd3a0802/kube-rbac-proxy-metric/0.log"
Apr 21 07:17:22.243051 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:22.243020 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_796f028f-061e-4efc-93ed-97f5cd3a0802/prom-label-proxy/0.log"
Apr 21 07:17:22.277274 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:22.277248 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_796f028f-061e-4efc-93ed-97f5cd3a0802/init-config-reloader/0.log"
Apr 21 07:17:22.366008 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:22.365857 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-5dkkb_968c02a1-912f-4c82-8093-ef9cc71fdca3/kube-state-metrics/0.log"
Apr 21 07:17:22.396254 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:22.396210 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-5dkkb_968c02a1-912f-4c82-8093-ef9cc71fdca3/kube-rbac-proxy-main/0.log"
Apr 21 07:17:22.427529 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:22.427499 2581
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-5dkkb_968c02a1-912f-4c82-8093-ef9cc71fdca3/kube-rbac-proxy-self/0.log" Apr 21 07:17:22.464761 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:22.464731 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-59bf57b49c-nsmxm_363b86d0-bdf4-44a2-96f5-80c829a4f375/metrics-server/0.log" Apr 21 07:17:22.718482 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:22.718443 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zxf4k_93a318c1-6dc6-41e2-8ce6-10df3a949d4c/node-exporter/0.log" Apr 21 07:17:22.742052 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:22.742019 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zxf4k_93a318c1-6dc6-41e2-8ce6-10df3a949d4c/kube-rbac-proxy/0.log" Apr 21 07:17:22.767707 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:22.767603 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zxf4k_93a318c1-6dc6-41e2-8ce6-10df3a949d4c/init-textfile/0.log" Apr 21 07:17:22.797458 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:22.797416 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-zf5cc_15a09b94-ea25-4ffe-8eaf-9ed2025b01a6/kube-rbac-proxy-main/0.log" Apr 21 07:17:22.824302 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:22.824266 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-zf5cc_15a09b94-ea25-4ffe-8eaf-9ed2025b01a6/kube-rbac-proxy-self/0.log" Apr 21 07:17:22.854130 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:22.854091 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-zf5cc_15a09b94-ea25-4ffe-8eaf-9ed2025b01a6/openshift-state-metrics/0.log" Apr 
21 07:17:22.892763 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:22.892728 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_148a6d34-cf46-4ad1-b017-ed5bba1d35a0/prometheus/0.log" Apr 21 07:17:22.917151 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:22.917062 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_148a6d34-cf46-4ad1-b017-ed5bba1d35a0/config-reloader/0.log" Apr 21 07:17:22.943038 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:22.942994 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_148a6d34-cf46-4ad1-b017-ed5bba1d35a0/thanos-sidecar/0.log" Apr 21 07:17:22.970224 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:22.970196 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_148a6d34-cf46-4ad1-b017-ed5bba1d35a0/kube-rbac-proxy-web/0.log" Apr 21 07:17:22.994118 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:22.994089 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_148a6d34-cf46-4ad1-b017-ed5bba1d35a0/kube-rbac-proxy/0.log" Apr 21 07:17:23.019289 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:23.019243 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_148a6d34-cf46-4ad1-b017-ed5bba1d35a0/kube-rbac-proxy-thanos/0.log" Apr 21 07:17:23.045167 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:23.045140 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_148a6d34-cf46-4ad1-b017-ed5bba1d35a0/init-config-reloader/0.log" Apr 21 07:17:23.157007 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:23.156922 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7c546994b4-zr458_11748718-2f6b-488b-ac46-e9be66ad5213/telemeter-client/0.log" Apr 21 07:17:23.180494 
ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:23.180420 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7c546994b4-zr458_11748718-2f6b-488b-ac46-e9be66ad5213/reload/0.log" Apr 21 07:17:23.201990 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:23.201958 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7c546994b4-zr458_11748718-2f6b-488b-ac46-e9be66ad5213/kube-rbac-proxy/0.log" Apr 21 07:17:23.238157 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:23.238117 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8dcbc7c47-wc8fk_ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719/thanos-query/0.log" Apr 21 07:17:23.261366 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:23.261329 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8dcbc7c47-wc8fk_ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719/kube-rbac-proxy-web/0.log" Apr 21 07:17:23.285782 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:23.285751 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8dcbc7c47-wc8fk_ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719/kube-rbac-proxy/0.log" Apr 21 07:17:23.308114 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:23.308084 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8dcbc7c47-wc8fk_ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719/prom-label-proxy/0.log" Apr 21 07:17:23.330975 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:23.330948 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-8dcbc7c47-wc8fk_ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719/kube-rbac-proxy-rules/0.log" Apr 21 07:17:23.352926 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:23.352865 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-8dcbc7c47-wc8fk_ad0bcf8e-d7f5-45d0-a43e-1ffd4c0eb719/kube-rbac-proxy-metrics/0.log" Apr 21 07:17:25.434309 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:25.434270 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-ncs5j_6ce9ee09-1262-4278-8b0b-72dce2cc896a/download-server/0.log" Apr 21 07:17:26.423114 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:26.423077 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-nj2kz/perf-node-gather-daemonset-vqcf5"] Apr 21 07:17:26.428023 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:26.427987 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-vqcf5" Apr 21 07:17:26.436755 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:26.436722 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nj2kz/perf-node-gather-daemonset-vqcf5"] Apr 21 07:17:26.517632 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:26.517557 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85-lib-modules\") pod \"perf-node-gather-daemonset-vqcf5\" (UID: \"fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-vqcf5" Apr 21 07:17:26.517833 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:26.517735 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qj2h\" (UniqueName: \"kubernetes.io/projected/fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85-kube-api-access-8qj2h\") pod \"perf-node-gather-daemonset-vqcf5\" (UID: \"fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-vqcf5" Apr 21 07:17:26.517944 ip-10-0-137-163 kubenswrapper[2581]: 
I0421 07:17:26.517838 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85-proc\") pod \"perf-node-gather-daemonset-vqcf5\" (UID: \"fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-vqcf5" Apr 21 07:17:26.517944 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:26.517859 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85-podres\") pod \"perf-node-gather-daemonset-vqcf5\" (UID: \"fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-vqcf5" Apr 21 07:17:26.517944 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:26.517940 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85-sys\") pod \"perf-node-gather-daemonset-vqcf5\" (UID: \"fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-vqcf5" Apr 21 07:17:26.580306 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:26.580254 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xtdjm_2427cd84-1ecc-4868-adb1-7e6205d1a291/dns/0.log" Apr 21 07:17:26.601632 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:26.601599 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xtdjm_2427cd84-1ecc-4868-adb1-7e6205d1a291/kube-rbac-proxy/0.log" Apr 21 07:17:26.619039 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:26.619007 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85-sys\") pod \"perf-node-gather-daemonset-vqcf5\" 
(UID: \"fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-vqcf5" Apr 21 07:17:26.619213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:26.619049 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85-lib-modules\") pod \"perf-node-gather-daemonset-vqcf5\" (UID: \"fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-vqcf5" Apr 21 07:17:26.619213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:26.619093 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8qj2h\" (UniqueName: \"kubernetes.io/projected/fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85-kube-api-access-8qj2h\") pod \"perf-node-gather-daemonset-vqcf5\" (UID: \"fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-vqcf5" Apr 21 07:17:26.619213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:26.619121 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85-sys\") pod \"perf-node-gather-daemonset-vqcf5\" (UID: \"fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-vqcf5" Apr 21 07:17:26.619213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:26.619171 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85-proc\") pod \"perf-node-gather-daemonset-vqcf5\" (UID: \"fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-vqcf5" Apr 21 07:17:26.619213 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:26.619189 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85-podres\") pod \"perf-node-gather-daemonset-vqcf5\" (UID: \"fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-vqcf5" Apr 21 07:17:26.619428 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:26.619278 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85-proc\") pod \"perf-node-gather-daemonset-vqcf5\" (UID: \"fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-vqcf5" Apr 21 07:17:26.619428 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:26.619276 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85-lib-modules\") pod \"perf-node-gather-daemonset-vqcf5\" (UID: \"fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-vqcf5" Apr 21 07:17:26.619428 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:26.619328 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85-podres\") pod \"perf-node-gather-daemonset-vqcf5\" (UID: \"fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-vqcf5" Apr 21 07:17:26.627425 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:26.627380 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qj2h\" (UniqueName: \"kubernetes.io/projected/fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85-kube-api-access-8qj2h\") pod \"perf-node-gather-daemonset-vqcf5\" (UID: \"fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85\") " pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-vqcf5" Apr 21 07:17:26.646779 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:26.646741 2581 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ptn2z_21307e09-27b9-492e-ac26-b3d09e5794af/dns-node-resolver/0.log" Apr 21 07:17:26.742847 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:26.742749 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-vqcf5" Apr 21 07:17:26.893594 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:26.893548 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-nj2kz/perf-node-gather-daemonset-vqcf5"] Apr 21 07:17:26.895844 ip-10-0-137-163 kubenswrapper[2581]: W0421 07:17:26.895809 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podfe4c85fe_87ac_44d5_ad4e_f2bd0ec97b85.slice/crio-d646c31b434137a246ea9ea1293e7dcbb2ec85f2115f8c33b38f374b96063e35 WatchSource:0}: Error finding container d646c31b434137a246ea9ea1293e7dcbb2ec85f2115f8c33b38f374b96063e35: Status 404 returned error can't find the container with id d646c31b434137a246ea9ea1293e7dcbb2ec85f2115f8c33b38f374b96063e35 Apr 21 07:17:27.083000 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:27.082963 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-5977bd9744-9cf64_e67323c8-e3cc-4745-b61d-27f2a2459601/registry/0.log" Apr 21 07:17:27.100135 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:27.100101 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-vqcf5" event={"ID":"fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85","Type":"ContainerStarted","Data":"db112ea256ec97b81bc8441520e18a2d6a1ffce35fb72ef52266545f8b0bb396"} Apr 21 07:17:27.100135 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:27.100142 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-vqcf5" 
event={"ID":"fe4c85fe-87ac-44d5-ad4e-f2bd0ec97b85","Type":"ContainerStarted","Data":"d646c31b434137a246ea9ea1293e7dcbb2ec85f2115f8c33b38f374b96063e35"} Apr 21 07:17:27.100360 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:27.100211 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-vqcf5" Apr 21 07:17:27.119611 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:27.119532 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-vqcf5" podStartSLOduration=1.119515808 podStartE2EDuration="1.119515808s" podCreationTimestamp="2026-04-21 07:17:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:17:27.118957399 +0000 UTC m=+863.738291503" watchObservedRunningTime="2026-04-21 07:17:27.119515808 +0000 UTC m=+863.738849968" Apr 21 07:17:27.136331 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:27.136294 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vrvfp_df97fb0c-eb01-481b-ab26-0073456033cd/node-ca/0.log" Apr 21 07:17:28.301029 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:28.300992 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-bdt5g_a6868c94-bebf-4199-8e95-b97042cabdc1/serve-healthcheck-canary/0.log" Apr 21 07:17:28.713770 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:28.713738 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6g68p_fba56b2f-24aa-46b9-b5c7-88cc67c2fb44/kube-rbac-proxy/0.log" Apr 21 07:17:28.736127 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:28.736093 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6g68p_fba56b2f-24aa-46b9-b5c7-88cc67c2fb44/exporter/0.log" Apr 
21 07:17:28.761026 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:28.760998 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6g68p_fba56b2f-24aa-46b9-b5c7-88cc67c2fb44/extractor/0.log" Apr 21 07:17:33.116640 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:33.116607 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-nj2kz/perf-node-gather-daemonset-vqcf5" Apr 21 07:17:33.802032 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:33.802000 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-t4h72_1911b9d6-1a67-4559-b684-2a2fc0ad29c0/migrator/0.log" Apr 21 07:17:33.823854 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:33.823824 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-t4h72_1911b9d6-1a67-4559-b684-2a2fc0ad29c0/graceful-termination/0.log" Apr 21 07:17:35.124688 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:35.124645 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b76h9_d6e8c99f-04e0-4c02-b29b-c5d5e6e76763/kube-multus-additional-cni-plugins/0.log" Apr 21 07:17:35.149844 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:35.149753 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b76h9_d6e8c99f-04e0-4c02-b29b-c5d5e6e76763/egress-router-binary-copy/0.log" Apr 21 07:17:35.173698 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:35.173658 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b76h9_d6e8c99f-04e0-4c02-b29b-c5d5e6e76763/cni-plugins/0.log" Apr 21 07:17:35.204589 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:35.204539 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b76h9_d6e8c99f-04e0-4c02-b29b-c5d5e6e76763/bond-cni-plugin/0.log" Apr 21 07:17:35.232947 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:35.232920 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b76h9_d6e8c99f-04e0-4c02-b29b-c5d5e6e76763/routeoverride-cni/0.log" Apr 21 07:17:35.262259 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:35.262225 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b76h9_d6e8c99f-04e0-4c02-b29b-c5d5e6e76763/whereabouts-cni-bincopy/0.log" Apr 21 07:17:35.287304 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:35.287269 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b76h9_d6e8c99f-04e0-4c02-b29b-c5d5e6e76763/whereabouts-cni/0.log" Apr 21 07:17:35.684527 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:35.684489 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mhtw2_d553e50a-31de-42be-99de-2bc791bca6e2/kube-multus/0.log" Apr 21 07:17:35.805006 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:35.804883 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-r4v6n_42b45a2d-c99c-40f2-97f6-2d31aff6854f/network-metrics-daemon/0.log" Apr 21 07:17:35.828575 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:35.828530 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-r4v6n_42b45a2d-c99c-40f2-97f6-2d31aff6854f/kube-rbac-proxy/0.log" Apr 21 07:17:36.621816 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:36.621788 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7cnmr_ff68b29a-db87-4ff2-882d-9f1e312dd5ce/ovn-controller/0.log" Apr 21 07:17:36.646277 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:36.646250 2581 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7cnmr_ff68b29a-db87-4ff2-882d-9f1e312dd5ce/ovn-acl-logging/0.log" Apr 21 07:17:36.666159 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:36.666133 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7cnmr_ff68b29a-db87-4ff2-882d-9f1e312dd5ce/kube-rbac-proxy-node/0.log" Apr 21 07:17:36.690302 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:36.690268 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7cnmr_ff68b29a-db87-4ff2-882d-9f1e312dd5ce/kube-rbac-proxy-ovn-metrics/0.log" Apr 21 07:17:36.735069 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:36.735034 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7cnmr_ff68b29a-db87-4ff2-882d-9f1e312dd5ce/northd/0.log" Apr 21 07:17:36.759344 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:36.759304 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7cnmr_ff68b29a-db87-4ff2-882d-9f1e312dd5ce/nbdb/0.log" Apr 21 07:17:36.786121 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:36.786089 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7cnmr_ff68b29a-db87-4ff2-882d-9f1e312dd5ce/sbdb/0.log" Apr 21 07:17:36.906177 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:36.906096 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7cnmr_ff68b29a-db87-4ff2-882d-9f1e312dd5ce/ovnkube-controller/0.log" Apr 21 07:17:38.457659 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:38.457630 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-4qpb2_f00d904c-86da-4e00-801a-3bd1d7dbe5f4/network-check-target-container/0.log" Apr 21 07:17:39.421886 ip-10-0-137-163 kubenswrapper[2581]: I0421 
07:17:39.421847 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-dr8n8_0a0d5c34-e09f-40bc-8eec-7f880a3de770/iptables-alerter/0.log" Apr 21 07:17:40.091649 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:40.091599 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-4jl8x_86aeed30-b464-4b2f-a813-f3fb4f3e9998/tuned/0.log" Apr 21 07:17:43.637574 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:43.637543 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-tw92r_49a3e211-6f6c-4501-878b-c01a12dfbbb1/csi-driver/0.log" Apr 21 07:17:43.661391 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:43.661352 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-tw92r_49a3e211-6f6c-4501-878b-c01a12dfbbb1/csi-node-driver-registrar/0.log" Apr 21 07:17:43.687222 ip-10-0-137-163 kubenswrapper[2581]: I0421 07:17:43.687178 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-tw92r_49a3e211-6f6c-4501-878b-c01a12dfbbb1/csi-liveness-probe/0.log"