Apr 22 16:18:47.223018 ip-10-0-141-251 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 16:18:47.223031 ip-10-0-141-251 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 16:18:47.223037 ip-10-0-141-251 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 16:18:47.223460 ip-10-0-141-251 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 16:18:57.398586 ip-10-0-141-251 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 16:18:57.398603 ip-10-0-141-251 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot d9b828fb41c94d9abc28c3cea48662c0 --
Apr 22 16:21:35.958028 ip-10-0-141-251 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 16:21:36.365969 ip-10-0-141-251 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 16:21:36.365969 ip-10-0-141-251 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 16:21:36.365969 ip-10-0-141-251 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 16:21:36.365969 ip-10-0-141-251 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 16:21:36.365969 ip-10-0-141-251 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 16:21:36.367640 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.367546 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 16:21:36.371987 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.371964 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 16:21:36.371987 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.371985 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 16:21:36.371987 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.371990 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 16:21:36.371987 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.371994 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 16:21:36.372165 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.371997 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 16:21:36.372165 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372000 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 16:21:36.372165 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372003 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 16:21:36.372165 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372006 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 16:21:36.372165 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372009 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 16:21:36.372165 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372012 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 16:21:36.372165 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372015 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 16:21:36.372165 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372017 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 16:21:36.372165 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372020 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 16:21:36.372165 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372023 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 16:21:36.372165 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372025 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 16:21:36.372165 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372028 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 16:21:36.372165 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372031 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 16:21:36.372165 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372033 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 16:21:36.372165 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372036 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 16:21:36.372165 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372039 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 16:21:36.372165 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372042 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 16:21:36.372165 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372047 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 16:21:36.372165 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372050 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 16:21:36.372165 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372052 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 16:21:36.372707 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372054 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 16:21:36.372707 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372057 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 16:21:36.372707 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372060 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 16:21:36.372707 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372062 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 16:21:36.372707 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372064 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 16:21:36.372707 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372067 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 16:21:36.372707 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372070 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 16:21:36.372707 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372072 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 16:21:36.372707 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372074 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 16:21:36.372707 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372077 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 16:21:36.372707 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372079 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 16:21:36.372707 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372082 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 16:21:36.372707 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372084 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 16:21:36.372707 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372088 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 16:21:36.372707 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372090 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 16:21:36.372707 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372093 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 16:21:36.372707 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372096 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 16:21:36.372707 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372098 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 16:21:36.372707 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372101 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 16:21:36.373267 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372103 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 16:21:36.373267 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372105 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 16:21:36.373267 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372109 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 16:21:36.373267 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372111 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 16:21:36.373267 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372114 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 16:21:36.373267 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372117 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 16:21:36.373267 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372120 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 16:21:36.373267 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372122 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 16:21:36.373267 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372126 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 16:21:36.373267 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372129 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 16:21:36.373267 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372131 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 16:21:36.373267 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372134 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 16:21:36.373267 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372137 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 16:21:36.373267 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372139 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 16:21:36.373267 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372142 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 16:21:36.373267 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372144 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 16:21:36.373267 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372147 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 16:21:36.373267 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372149 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 16:21:36.373267 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372152 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 16:21:36.373267 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372155 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 16:21:36.373845 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372157 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 16:21:36.373845 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372160 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 16:21:36.373845 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372162 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 16:21:36.373845 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372165 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 16:21:36.373845 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372167 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 16:21:36.373845 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372170 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 16:21:36.373845 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372174 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 16:21:36.373845 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372183 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 16:21:36.373845 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372187 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 16:21:36.373845 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372190 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 16:21:36.373845 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372193 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 16:21:36.373845 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372195 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 16:21:36.373845 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372198 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 16:21:36.373845 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372200 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 16:21:36.373845 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372203 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 16:21:36.373845 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372206 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 16:21:36.373845 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372209 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 16:21:36.373845 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372211 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 16:21:36.373845 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372214 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 16:21:36.373845 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372216 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 16:21:36.374398 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372219 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 16:21:36.374398 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372222 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 16:21:36.374398 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372224 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 16:21:36.374398 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372604 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 16:21:36.374398 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372609 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 16:21:36.374398 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372612 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 16:21:36.374398 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372615 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 16:21:36.374398 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372618 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 16:21:36.374398 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372620 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 16:21:36.374398 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372623 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 16:21:36.374398 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372626 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 16:21:36.374398 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372628 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 16:21:36.374398 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372631 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 16:21:36.374398 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372634 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 16:21:36.374398 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372636 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 16:21:36.374398 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372638 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 16:21:36.374398 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372641 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 16:21:36.374398 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372643 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 16:21:36.374398 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372647 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 16:21:36.374398 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372650 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 16:21:36.374947 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372653 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 16:21:36.374947 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372655 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 16:21:36.374947 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372658 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 16:21:36.374947 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372660 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 16:21:36.374947 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372663 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 16:21:36.374947 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372666 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 16:21:36.374947 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372668 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 16:21:36.374947 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372671 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 16:21:36.374947 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372674 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 16:21:36.374947 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372676 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 16:21:36.374947 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372681 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 16:21:36.374947 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372684 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 16:21:36.374947 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372687 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 16:21:36.374947 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372690 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 16:21:36.374947 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372693 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 16:21:36.374947 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372695 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 16:21:36.374947 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372698 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 16:21:36.374947 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372701 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 16:21:36.374947 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372704 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 16:21:36.375498 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372707 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 16:21:36.375498 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372710 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 16:21:36.375498 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372712 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 16:21:36.375498 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372715 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 16:21:36.375498 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372718 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 16:21:36.375498 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372721 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 16:21:36.375498 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372724 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 16:21:36.375498 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372726 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 16:21:36.375498 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372729 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 16:21:36.375498 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372732 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 16:21:36.375498 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372735 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 16:21:36.375498 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372738 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 16:21:36.375498 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372741 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 16:21:36.375498 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372743 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 16:21:36.375498 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372746 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 16:21:36.375498 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372748 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 16:21:36.375498 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372765 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 16:21:36.375498 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372768 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 16:21:36.375498 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372771 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 16:21:36.375498 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372774 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 16:21:36.376059 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372776 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 16:21:36.376059 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372779 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 16:21:36.376059 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372782 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 16:21:36.376059 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372785 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 16:21:36.376059 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372787 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 16:21:36.376059 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372790 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 16:21:36.376059 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372793 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 16:21:36.376059 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372795 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 16:21:36.376059 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372797 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 16:21:36.376059 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372800 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 16:21:36.376059 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372802 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 16:21:36.376059 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372805 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 16:21:36.376059 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372808 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 16:21:36.376059 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372810 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 16:21:36.376059 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372813 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 16:21:36.376059 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372815 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 16:21:36.376059 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372818 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 16:21:36.376059 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372820 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 16:21:36.376059 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372825 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 16:21:36.376513 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372829 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 16:21:36.376513 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372832 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 16:21:36.376513 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372834 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 16:21:36.376513 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372837 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 16:21:36.376513 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372840 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 16:21:36.376513 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372843 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 16:21:36.376513 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372845 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 16:21:36.376513 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372848 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 16:21:36.376513 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372851 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 16:21:36.376513 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372853 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 16:21:36.376513 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.372856 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 16:21:36.376513 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373463 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 16:21:36.376513 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373472 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 16:21:36.376513 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373480 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 16:21:36.376513 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373485 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 16:21:36.376513 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373490 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 16:21:36.376513 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373494 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 16:21:36.376513 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373498 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 16:21:36.376513 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373503 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 16:21:36.376513 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373506 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 16:21:36.376513 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373509 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 16:21:36.377060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373512 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 16:21:36.377060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373515 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 16:21:36.377060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373519 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 16:21:36.377060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373522 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 22 16:21:36.377060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373525 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 16:21:36.377060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373528 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 22 16:21:36.377060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373531 2573 flags.go:64] FLAG: --cloud-config=""
Apr 22 16:21:36.377060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373534 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 16:21:36.377060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373537 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 16:21:36.377060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373542 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 22 16:21:36.377060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373544 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 16:21:36.377060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373548 2573 flags.go:64] FLAG: --config-dir=""
Apr 22 16:21:36.377060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373551 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 16:21:36.377060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373554 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 16:21:36.377060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373563 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 16:21:36.377060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373566 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 16:21:36.377060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373570 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 16:21:36.377060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373573 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 16:21:36.377060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373576 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 16:21:36.377060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373579 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 16:21:36.377060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373582 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 16:21:36.377060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373585 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 16:21:36.377060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373588 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 16:21:36.377060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373592 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 16:21:36.377060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373595 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 16:21:36.377648 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373598 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 16:21:36.377648 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373601 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 16:21:36.377648 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373605 2573 flags.go:64] FLAG: --enable-server="true"
Apr 22 16:21:36.377648 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373608 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 16:21:36.377648 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373613 2573 flags.go:64] FLAG: --event-burst="100"
Apr 22 16:21:36.377648 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373616 2573 flags.go:64] FLAG: --event-qps="50"
Apr 22 16:21:36.377648 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373619 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 16:21:36.377648
ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373622 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 22 16:21:36.377648 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373625 2573 flags.go:64] FLAG: --eviction-hard="" Apr 22 16:21:36.377648 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373629 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 22 16:21:36.377648 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373632 2573 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 22 16:21:36.377648 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373635 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 22 16:21:36.377648 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373638 2573 flags.go:64] FLAG: --eviction-soft="" Apr 22 16:21:36.377648 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373641 2573 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 22 16:21:36.377648 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373644 2573 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 16:21:36.377648 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373647 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 16:21:36.377648 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373653 2573 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 16:21:36.377648 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373657 2573 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 16:21:36.377648 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373660 2573 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 16:21:36.377648 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373662 2573 flags.go:64] FLAG: --feature-gates="" Apr 22 16:21:36.377648 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373667 2573 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 16:21:36.377648 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373670 2573 flags.go:64] FLAG: 
--global-housekeeping-interval="1m0s" Apr 22 16:21:36.377648 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373673 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 16:21:36.377648 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373677 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 16:21:36.377648 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373680 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 22 16:21:36.377648 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373683 2573 flags.go:64] FLAG: --help="false" Apr 22 16:21:36.378298 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373686 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-141-251.ec2.internal" Apr 22 16:21:36.378298 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373689 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 16:21:36.378298 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373692 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 16:21:36.378298 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373695 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 16:21:36.378298 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373698 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 16:21:36.378298 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373701 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 16:21:36.378298 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373704 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 16:21:36.378298 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373707 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 16:21:36.378298 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373710 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 16:21:36.378298 ip-10-0-141-251 
kubenswrapper[2573]: I0422 16:21:36.373720 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 16:21:36.378298 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373723 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 16:21:36.378298 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373726 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 16:21:36.378298 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373729 2573 flags.go:64] FLAG: --kube-reserved="" Apr 22 16:21:36.378298 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373732 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 16:21:36.378298 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373735 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 16:21:36.378298 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373738 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 16:21:36.378298 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373740 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 16:21:36.378298 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373743 2573 flags.go:64] FLAG: --lock-file="" Apr 22 16:21:36.378298 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373746 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 16:21:36.378298 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373749 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 16:21:36.378298 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373765 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 16:21:36.378298 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373770 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 16:21:36.378298 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373775 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 16:21:36.378867 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373778 2573 flags.go:64] FLAG: 
--log-text-split-stream="false" Apr 22 16:21:36.378867 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373781 2573 flags.go:64] FLAG: --logging-format="text" Apr 22 16:21:36.378867 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373784 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 16:21:36.378867 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373787 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 16:21:36.378867 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373790 2573 flags.go:64] FLAG: --manifest-url="" Apr 22 16:21:36.378867 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373793 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 22 16:21:36.378867 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373798 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 16:21:36.378867 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373801 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 16:21:36.378867 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373805 2573 flags.go:64] FLAG: --max-pods="110" Apr 22 16:21:36.378867 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373807 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 16:21:36.378867 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373810 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 16:21:36.378867 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373813 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 16:21:36.378867 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373816 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 16:21:36.378867 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373819 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 16:21:36.378867 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373822 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 16:21:36.378867 ip-10-0-141-251 
kubenswrapper[2573]: I0422 16:21:36.373825 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 16:21:36.378867 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373833 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 16:21:36.378867 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373836 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 16:21:36.378867 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373838 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 16:21:36.378867 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373847 2573 flags.go:64] FLAG: --pod-cidr="" Apr 22 16:21:36.378867 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373851 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 16:21:36.378867 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373856 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 16:21:36.378867 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373860 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 16:21:36.378867 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373863 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 22 16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373866 2573 flags.go:64] FLAG: --port="10250" Apr 22 16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373869 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373872 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0918a752900ae1a14" Apr 22 16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373875 2573 flags.go:64] FLAG: --qos-reserved="" Apr 22 16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373878 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 22 
16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373881 2573 flags.go:64] FLAG: --register-node="true" Apr 22 16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373885 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 22 16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373887 2573 flags.go:64] FLAG: --register-with-taints="" Apr 22 16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373892 2573 flags.go:64] FLAG: --registry-burst="10" Apr 22 16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373895 2573 flags.go:64] FLAG: --registry-qps="5" Apr 22 16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373898 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 22 16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373900 2573 flags.go:64] FLAG: --reserved-memory="" Apr 22 16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373905 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373908 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373911 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373913 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373916 2573 flags.go:64] FLAG: --runonce="false" Apr 22 16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373919 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373922 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373925 2573 flags.go:64] FLAG: --seccomp-default="false" 
Apr 22 16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373928 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373931 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373934 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373937 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373940 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 16:21:36.379437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373942 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 16:21:36.380075 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373945 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 16:21:36.380075 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373948 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 16:21:36.380075 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373951 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 16:21:36.380075 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373954 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 16:21:36.380075 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373957 2573 flags.go:64] FLAG: --system-cgroups="" Apr 22 16:21:36.380075 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373960 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 16:21:36.380075 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373966 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 16:21:36.380075 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373969 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 22 16:21:36.380075 ip-10-0-141-251 
kubenswrapper[2573]: I0422 16:21:36.373971 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 16:21:36.380075 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373975 2573 flags.go:64] FLAG: --tls-min-version="" Apr 22 16:21:36.380075 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373977 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 16:21:36.380075 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373980 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 16:21:36.380075 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373983 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 16:21:36.380075 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373986 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 16:21:36.380075 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373989 2573 flags.go:64] FLAG: --v="2" Apr 22 16:21:36.380075 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.373996 2573 flags.go:64] FLAG: --version="false" Apr 22 16:21:36.380075 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.374000 2573 flags.go:64] FLAG: --vmodule="" Apr 22 16:21:36.380075 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.374004 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 16:21:36.380075 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.374008 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 16:21:36.380075 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374105 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 16:21:36.380075 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374108 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 16:21:36.380075 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374112 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 16:21:36.380075 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374114 2573 
feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 16:21:36.380075 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374118 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 16:21:36.380645 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374120 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 16:21:36.380645 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374123 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 16:21:36.380645 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374125 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 16:21:36.380645 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374128 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 16:21:36.380645 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374130 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 16:21:36.380645 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374133 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 16:21:36.380645 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374135 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 16:21:36.380645 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374138 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 16:21:36.380645 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374140 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 16:21:36.380645 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374143 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 16:21:36.380645 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374146 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 16:21:36.380645 ip-10-0-141-251 kubenswrapper[2573]: W0422 
16:21:36.374152 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 16:21:36.380645 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374156 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 16:21:36.380645 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374159 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 16:21:36.380645 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374161 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 16:21:36.380645 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374164 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 16:21:36.380645 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374167 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 16:21:36.380645 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374169 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 16:21:36.380645 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374172 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 16:21:36.380645 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374174 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 16:21:36.381193 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374177 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 16:21:36.381193 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374179 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 16:21:36.381193 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374182 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 16:21:36.381193 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374185 2573 feature_gate.go:328] unrecognized feature gate: 
BootcNodeManagement Apr 22 16:21:36.381193 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374188 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 16:21:36.381193 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374190 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 16:21:36.381193 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374194 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 16:21:36.381193 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374197 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 16:21:36.381193 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374200 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 16:21:36.381193 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374202 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 16:21:36.381193 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374205 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 16:21:36.381193 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374207 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 16:21:36.381193 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374210 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 16:21:36.381193 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374212 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 16:21:36.381193 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374215 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 16:21:36.381193 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374217 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 16:21:36.381193 ip-10-0-141-251 
kubenswrapper[2573]: W0422 16:21:36.374220 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 16:21:36.381193 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374222 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 16:21:36.381193 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374225 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 16:21:36.381656 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374228 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 16:21:36.381656 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374230 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 16:21:36.381656 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374233 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 16:21:36.381656 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374235 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 16:21:36.381656 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374238 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 16:21:36.381656 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374241 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 16:21:36.381656 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374243 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 16:21:36.381656 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374247 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 16:21:36.381656 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374251 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 16:21:36.381656 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374254 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 16:21:36.381656 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374257 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 16:21:36.381656 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374260 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 16:21:36.381656 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374262 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 16:21:36.381656 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374265 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 16:21:36.381656 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374268 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 16:21:36.381656 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374271 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 16:21:36.381656 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374274 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 16:21:36.381656 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374276 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 16:21:36.381656 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374279 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 16:21:36.382149 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374282 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 16:21:36.382149 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374285 
2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 16:21:36.382149 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374287 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 16:21:36.382149 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374290 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 16:21:36.382149 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374292 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 16:21:36.382149 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374295 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 16:21:36.382149 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374297 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 16:21:36.382149 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374299 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 16:21:36.382149 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374302 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 16:21:36.382149 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374304 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 16:21:36.382149 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374307 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 16:21:36.382149 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374309 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 16:21:36.382149 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374312 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 16:21:36.382149 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374315 2573 feature_gate.go:328] unrecognized feature gate: 
NoRegistryClusterOperations Apr 22 16:21:36.382149 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374317 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 16:21:36.382149 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374320 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 16:21:36.382149 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374322 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 16:21:36.382149 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374325 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 16:21:36.382149 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374327 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 22 16:21:36.382149 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374330 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 16:21:36.382728 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374333 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 16:21:36.382728 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374335 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 16:21:36.382728 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.374337 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 16:21:36.382728 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.375144 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true 
VolumeAttributesClass:false]} Apr 22 16:21:36.382728 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.382414 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 16:21:36.382728 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.382429 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 16:21:36.382728 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382490 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 16:21:36.382728 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382495 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 16:21:36.382728 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382498 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 16:21:36.382728 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382501 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 16:21:36.382728 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382504 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 16:21:36.382728 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382507 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 16:21:36.382728 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382509 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 16:21:36.382728 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382512 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 16:21:36.382728 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382515 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 16:21:36.382728 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382518 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 16:21:36.383152 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382522 2573 feature_gate.go:349] 
Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 16:21:36.383152 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382526 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 16:21:36.383152 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382529 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 16:21:36.383152 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382532 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 16:21:36.383152 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382535 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 16:21:36.383152 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382537 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 16:21:36.383152 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382540 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 16:21:36.383152 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382543 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 16:21:36.383152 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382545 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 16:21:36.383152 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382548 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 16:21:36.383152 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382550 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 16:21:36.383152 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382553 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 16:21:36.383152 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382555 2573 feature_gate.go:328] unrecognized feature 
gate: NewOLMOwnSingleNamespace Apr 22 16:21:36.383152 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382558 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 16:21:36.383152 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382560 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 16:21:36.383152 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382563 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 16:21:36.383152 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382566 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 16:21:36.383152 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382569 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 16:21:36.383152 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382571 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 16:21:36.383612 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382574 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 16:21:36.383612 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382576 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 16:21:36.383612 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382580 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 16:21:36.383612 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382584 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 16:21:36.383612 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382586 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 16:21:36.383612 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382589 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 16:21:36.383612 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382591 2573 feature_gate.go:328] 
unrecognized feature gate: MetricsCollectionProfiles Apr 22 16:21:36.383612 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382594 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 16:21:36.383612 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382596 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 16:21:36.383612 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382599 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 16:21:36.383612 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382601 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 16:21:36.383612 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382604 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 16:21:36.383612 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382606 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 16:21:36.383612 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382609 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 16:21:36.383612 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382611 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 16:21:36.383612 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382614 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 16:21:36.383612 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382616 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 16:21:36.383612 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382619 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 16:21:36.383612 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382621 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 16:21:36.383612 ip-10-0-141-251 
kubenswrapper[2573]: W0422 16:21:36.382624 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 16:21:36.384112 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382626 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 16:21:36.384112 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382629 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 16:21:36.384112 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382632 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 16:21:36.384112 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382634 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 16:21:36.384112 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382637 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 16:21:36.384112 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382639 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 16:21:36.384112 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382642 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 16:21:36.384112 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382644 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 16:21:36.384112 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382647 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 16:21:36.384112 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382649 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 16:21:36.384112 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382652 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 16:21:36.384112 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382654 2573 feature_gate.go:328] unrecognized feature gate: 
AWSDedicatedHosts Apr 22 16:21:36.384112 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382658 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 16:21:36.384112 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382662 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 16:21:36.384112 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382665 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 16:21:36.384112 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382669 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 16:21:36.384112 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382672 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 16:21:36.384112 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382675 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 16:21:36.384112 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382677 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 16:21:36.384112 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382680 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 16:21:36.384587 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382683 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 16:21:36.384587 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382686 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 16:21:36.384587 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382688 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 16:21:36.384587 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382691 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 16:21:36.384587 ip-10-0-141-251 
kubenswrapper[2573]: W0422 16:21:36.382694 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 16:21:36.384587 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382696 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 16:21:36.384587 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382699 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 22 16:21:36.384587 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382701 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 16:21:36.384587 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382704 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 16:21:36.384587 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382706 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 16:21:36.384587 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382709 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 16:21:36.384587 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382711 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 16:21:36.384587 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382714 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 16:21:36.384587 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382716 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 16:21:36.384587 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382719 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 16:21:36.384587 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382722 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 16:21:36.384587 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382724 2573 
feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 16:21:36.385046 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.382729 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 16:21:36.385046 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382856 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 16:21:36.385046 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382862 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 16:21:36.385046 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382865 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 16:21:36.385046 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382868 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 16:21:36.385046 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382872 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 16:21:36.385046 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382874 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 16:21:36.385046 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382877 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 16:21:36.385046 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382881 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 16:21:36.385046 ip-10-0-141-251 kubenswrapper[2573]: 
W0422 16:21:36.382883 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 16:21:36.385046 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382886 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 16:21:36.385046 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382890 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 16:21:36.385046 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382892 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 16:21:36.385046 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382895 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 16:21:36.385046 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382897 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 16:21:36.385418 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382901 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 16:21:36.385418 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382913 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 16:21:36.385418 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382916 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 16:21:36.385418 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382919 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 16:21:36.385418 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382921 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 16:21:36.385418 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382923 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 16:21:36.385418 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382926 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 16:21:36.385418 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382929 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 16:21:36.385418 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382932 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 16:21:36.385418 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382934 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 16:21:36.385418 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382937 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 16:21:36.385418 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382939 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 16:21:36.385418 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382941 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 16:21:36.385418 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382944 2573 
feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 16:21:36.385418 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382946 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 16:21:36.385418 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382949 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 16:21:36.385418 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382952 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 16:21:36.385418 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382954 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 16:21:36.385418 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382956 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 16:21:36.385418 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382959 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 16:21:36.385934 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382961 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 16:21:36.385934 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382964 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 16:21:36.385934 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382966 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 16:21:36.385934 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382969 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 16:21:36.385934 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382972 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 16:21:36.385934 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382974 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 16:21:36.385934 ip-10-0-141-251 
kubenswrapper[2573]: W0422 16:21:36.382977 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 16:21:36.385934 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382980 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 16:21:36.385934 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382982 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 16:21:36.385934 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382986 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 16:21:36.385934 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382988 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 16:21:36.385934 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382991 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 16:21:36.385934 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382993 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 16:21:36.385934 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382995 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 16:21:36.385934 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.382998 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 16:21:36.385934 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383000 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 16:21:36.385934 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383003 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 16:21:36.385934 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383005 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 16:21:36.385934 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383008 2573 feature_gate.go:328] unrecognized feature gate: 
RouteAdvertisements Apr 22 16:21:36.385934 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383010 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 16:21:36.386414 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383013 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 16:21:36.386414 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383016 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 16:21:36.386414 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383020 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 16:21:36.386414 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383023 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 16:21:36.386414 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383026 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 16:21:36.386414 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383028 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 16:21:36.386414 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383031 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 16:21:36.386414 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383033 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 16:21:36.386414 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383036 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 16:21:36.386414 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383038 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 16:21:36.386414 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383041 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 
16:21:36.386414 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383043 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 16:21:36.386414 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383046 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 16:21:36.386414 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383048 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 16:21:36.386414 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383050 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 16:21:36.386414 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383053 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 22 16:21:36.386414 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383055 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 16:21:36.386414 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383058 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 16:21:36.386414 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383061 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 16:21:36.386976 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383063 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 16:21:36.386976 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383066 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 16:21:36.386976 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383068 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 16:21:36.386976 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383072 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 16:21:36.386976 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383074 2573 
feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 16:21:36.386976 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383077 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 16:21:36.386976 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383079 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 16:21:36.386976 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383082 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 16:21:36.386976 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383085 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 16:21:36.386976 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383087 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 16:21:36.386976 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383090 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 16:21:36.386976 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383092 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 16:21:36.386976 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:36.383094 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 16:21:36.386976 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.383099 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 16:21:36.386976 ip-10-0-141-251 kubenswrapper[2573]: I0422 
16:21:36.383777 2573 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 16:21:36.387346 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.387232 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 16:21:36.388040 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.388028 2573 server.go:1019] "Starting client certificate rotation"
Apr 22 16:21:36.388141 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.388125 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 16:21:36.388170 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.388159 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 16:21:36.410973 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.410958 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 16:21:36.415451 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.415426 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 16:21:36.429699 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.429676 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 22 16:21:36.435397 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.435381 2573 log.go:25] "Validated CRI v1 image API"
Apr 22 16:21:36.437247 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.437228 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 16:21:36.439492 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.439473 2573 fs.go:135] Filesystem UUIDs: map[07a49862-e362-45e5-9461-c03911c177f8:/dev/nvme0n1p4 71ce01ec-7a62-4504-86af-92368574465a:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 22 16:21:36.439561 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.439491 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 16:21:36.444477 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.444371 2573 manager.go:217] Machine: {Timestamp:2026-04-22 16:21:36.443242078 +0000 UTC m=+0.378556294 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3096469 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2423621a125b5699031816ea2c9172 SystemUUID:ec242362-1a12-5b56-9903-1816ea2c9172 BootID:d9b828fb-41c9-4d9a-bc28-c3cea48662c0 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:14:86:93:6a:e1 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:14:86:93:6a:e1 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ca:0a:60:42:48:a4 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 16:21:36.444477 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.444472 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 16:21:36.444584 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.444548 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 16:21:36.445528 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.445511 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 16:21:36.445833 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.445807 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 16:21:36.445974 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.445836 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-251.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Perc
entage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 16:21:36.446016 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.445984 2573 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 16:21:36.446016 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.445996 2573 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 16:21:36.446016 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.446013 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 16:21:36.447536 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.447524 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 16:21:36.448993 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.448983 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 22 16:21:36.449096 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.449087 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 16:21:36.451978 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.451969 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 22 16:21:36.452017 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.451982 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 16:21:36.452017 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.451997 2573 file.go:69] "Watching path" 
path="/etc/kubernetes/manifests" Apr 22 16:21:36.452017 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.452006 2573 kubelet.go:397] "Adding apiserver pod source" Apr 22 16:21:36.452017 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.452015 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 16:21:36.453112 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.453100 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 16:21:36.453151 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.453117 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 16:21:36.455723 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.455707 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 16:21:36.457112 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.457099 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 16:21:36.459071 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.459046 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 16:21:36.459071 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.459073 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 16:21:36.459226 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.459084 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 16:21:36.459226 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.459094 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 16:21:36.459226 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.459103 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 
16:21:36.459226 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.459111 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 16:21:36.459226 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.459120 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 16:21:36.459226 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.459130 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 16:21:36.459226 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.459141 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 16:21:36.459226 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.459149 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 16:21:36.459226 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.459158 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 16:21:36.459226 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.459170 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 16:21:36.459974 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.459961 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 16:21:36.459974 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.459975 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 16:21:36.460564 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.460545 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5h98v" Apr 22 16:21:36.463806 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.463792 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 16:21:36.463872 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.463829 2573 server.go:1295] "Started kubelet" Apr 22 16:21:36.463926 ip-10-0-141-251 kubenswrapper[2573]: 
I0422 16:21:36.463904 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 16:21:36.464081 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.464015 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 16:21:36.464125 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.464114 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 16:21:36.464624 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.464605 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-251.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 16:21:36.464624 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:36.464599 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-251.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 16:21:36.464637 ip-10-0-141-251 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 16:21:36.464837 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:36.464734 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 16:21:36.466851 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.466834 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 22 16:21:36.468050 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.468035 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 16:21:36.471047 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.471028 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5h98v" Apr 22 16:21:36.475630 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:36.474412 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-251.ec2.internal.18a8ba4dd608b75b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-251.ec2.internal,UID:ip-10-0-141-251.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-251.ec2.internal,},FirstTimestamp:2026-04-22 16:21:36.463804251 +0000 UTC m=+0.399118467,LastTimestamp:2026-04-22 16:21:36.463804251 +0000 UTC m=+0.399118467,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-251.ec2.internal,}" Apr 22 16:21:36.477947 ip-10-0-141-251 kubenswrapper[2573]: I0422 
16:21:36.477930 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 16:21:36.478456 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.478438 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 16:21:36.478588 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:36.478561 2573 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 16:21:36.479245 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.479219 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 16:21:36.479245 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.479236 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 16:21:36.479377 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.479217 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 16:21:36.479377 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.479340 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 22 16:21:36.479377 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.479349 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 22 16:21:36.479377 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.479359 2573 factory.go:55] Registering systemd factory Apr 22 16:21:36.479377 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.479377 2573 factory.go:223] Registration of the systemd container factory successfully Apr 22 16:21:36.479614 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:36.479380 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-251.ec2.internal\" not found" Apr 22 16:21:36.479614 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.479578 2573 factory.go:153] Registering CRI-O factory Apr 22 16:21:36.479614 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.479589 2573 
factory.go:223] Registration of the crio container factory successfully Apr 22 16:21:36.479769 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.479641 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 16:21:36.479769 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.479660 2573 factory.go:103] Registering Raw factory Apr 22 16:21:36.479769 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.479683 2573 manager.go:1196] Started watching for new ooms in manager Apr 22 16:21:36.480957 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.480939 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 16:21:36.480957 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.480949 2573 manager.go:319] Starting recovery of all containers Apr 22 16:21:36.482904 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:36.482887 2573 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-141-251.ec2.internal\" not found" node="ip-10-0-141-251.ec2.internal" Apr 22 16:21:36.491341 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.491220 2573 manager.go:324] Recovery completed Apr 22 16:21:36.495101 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.495087 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 16:21:36.497497 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.497471 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-251.ec2.internal" event="NodeHasSufficientMemory" Apr 22 16:21:36.497543 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.497510 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-251.ec2.internal" 
event="NodeHasNoDiskPressure" Apr 22 16:21:36.497543 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.497520 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-251.ec2.internal" event="NodeHasSufficientPID" Apr 22 16:21:36.497959 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.497947 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 16:21:36.498040 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.497960 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 16:21:36.498040 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.497980 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 22 16:21:36.503142 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.503128 2573 policy_none.go:49] "None policy: Start" Apr 22 16:21:36.503223 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.503146 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 16:21:36.503223 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.503159 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 22 16:21:36.534323 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.534303 2573 manager.go:341] "Starting Device Plugin manager" Apr 22 16:21:36.540537 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:36.534339 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 16:21:36.540537 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.534355 2573 server.go:85] "Starting device plugin registration server" Apr 22 16:21:36.540537 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.534568 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 16:21:36.540537 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.534578 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 16:21:36.540537 ip-10-0-141-251 kubenswrapper[2573]: 
I0422 16:21:36.534686 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 16:21:36.540537 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.534805 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 16:21:36.540537 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.534817 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 16:21:36.540537 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:36.535350 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 16:21:36.540537 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:36.535387 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-251.ec2.internal\" not found" Apr 22 16:21:36.635319 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.635257 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 16:21:36.636221 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.636205 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-251.ec2.internal" event="NodeHasSufficientMemory" Apr 22 16:21:36.636318 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.636232 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-251.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 16:21:36.636318 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.636242 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-251.ec2.internal" event="NodeHasSufficientPID" Apr 22 16:21:36.636318 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.636269 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-251.ec2.internal" Apr 22 16:21:36.642083 ip-10-0-141-251 kubenswrapper[2573]: I0422 
16:21:36.642066 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-251.ec2.internal" Apr 22 16:21:36.642163 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:36.642089 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-251.ec2.internal\": node \"ip-10-0-141-251.ec2.internal\" not found" Apr 22 16:21:36.656907 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:36.656886 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-251.ec2.internal\" not found" Apr 22 16:21:36.757307 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:36.757283 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-251.ec2.internal\" not found" Apr 22 16:21:36.857744 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:36.857713 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-251.ec2.internal\" not found" Apr 22 16:21:36.894773 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.894686 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 16:21:36.895960 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.895945 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 16:21:36.896021 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.895972 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 16:21:36.896021 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.895992 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 22 16:21:36.896021 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.895998 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 16:21:36.896166 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:36.896035 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 16:21:36.902251 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.902233 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 16:21:36.958634 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:36.958611 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-251.ec2.internal\" not found" Apr 22 16:21:36.997084 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.997055 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-251.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-251.ec2.internal"] Apr 22 16:21:36.997143 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.997128 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 16:21:36.998653 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.998637 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-251.ec2.internal" event="NodeHasSufficientMemory" Apr 22 16:21:36.998720 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.998666 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-251.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 16:21:36.998720 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:36.998679 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-251.ec2.internal" event="NodeHasSufficientPID" Apr 22 16:21:37.000077 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.000064 2573 kubelet_node_status.go:413] "Setting node 
annotation to enable volume controller attach/detach" Apr 22 16:21:37.000201 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.000187 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-251.ec2.internal" Apr 22 16:21:37.000240 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.000218 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 16:21:37.000764 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.000737 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-251.ec2.internal" event="NodeHasSufficientMemory" Apr 22 16:21:37.000828 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.000744 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-251.ec2.internal" event="NodeHasSufficientMemory" Apr 22 16:21:37.000828 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.000785 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-251.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 16:21:37.000828 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.000787 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-251.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 16:21:37.000828 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.000794 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-251.ec2.internal" event="NodeHasSufficientPID" Apr 22 16:21:37.000828 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.000801 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-251.ec2.internal" event="NodeHasSufficientPID" Apr 22 16:21:37.002098 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.002083 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-251.ec2.internal" Apr 22 16:21:37.002176 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.002105 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 16:21:37.002714 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.002699 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-251.ec2.internal" event="NodeHasSufficientMemory" Apr 22 16:21:37.002808 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.002722 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-251.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 16:21:37.002808 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.002735 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-251.ec2.internal" event="NodeHasSufficientPID" Apr 22 16:21:37.027336 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:37.027316 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-251.ec2.internal\" not found" node="ip-10-0-141-251.ec2.internal" Apr 22 16:21:37.031594 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:37.031580 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-251.ec2.internal\" not found" node="ip-10-0-141-251.ec2.internal" Apr 22 16:21:37.059125 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:37.059106 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-251.ec2.internal\" not found" Apr 22 16:21:37.082781 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.082746 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0c1aad3db1097ad1e69a197a09060d7e-config\") pod 
\"kube-apiserver-proxy-ip-10-0-141-251.ec2.internal\" (UID: \"0c1aad3db1097ad1e69a197a09060d7e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-251.ec2.internal" Apr 22 16:21:37.082864 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.082787 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ad598d75d06ec2ecc31bfa284ccec4f8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-251.ec2.internal\" (UID: \"ad598d75d06ec2ecc31bfa284ccec4f8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-251.ec2.internal" Apr 22 16:21:37.082864 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.082816 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ad598d75d06ec2ecc31bfa284ccec4f8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-251.ec2.internal\" (UID: \"ad598d75d06ec2ecc31bfa284ccec4f8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-251.ec2.internal" Apr 22 16:21:37.159907 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:37.159856 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-251.ec2.internal\" not found" Apr 22 16:21:37.183273 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.183249 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ad598d75d06ec2ecc31bfa284ccec4f8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-251.ec2.internal\" (UID: \"ad598d75d06ec2ecc31bfa284ccec4f8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-251.ec2.internal" Apr 22 16:21:37.183337 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.183285 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/0c1aad3db1097ad1e69a197a09060d7e-config\") pod \"kube-apiserver-proxy-ip-10-0-141-251.ec2.internal\" (UID: \"0c1aad3db1097ad1e69a197a09060d7e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-251.ec2.internal"
Apr 22 16:21:37.183337 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.183303 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ad598d75d06ec2ecc31bfa284ccec4f8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-251.ec2.internal\" (UID: \"ad598d75d06ec2ecc31bfa284ccec4f8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-251.ec2.internal"
Apr 22 16:21:37.183422 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.183346 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ad598d75d06ec2ecc31bfa284ccec4f8-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-251.ec2.internal\" (UID: \"ad598d75d06ec2ecc31bfa284ccec4f8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-251.ec2.internal"
Apr 22 16:21:37.183422 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.183351 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ad598d75d06ec2ecc31bfa284ccec4f8-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-251.ec2.internal\" (UID: \"ad598d75d06ec2ecc31bfa284ccec4f8\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-251.ec2.internal"
Apr 22 16:21:37.183422 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.183372 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0c1aad3db1097ad1e69a197a09060d7e-config\") pod \"kube-apiserver-proxy-ip-10-0-141-251.ec2.internal\" (UID: \"0c1aad3db1097ad1e69a197a09060d7e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-251.ec2.internal"
Apr 22 16:21:37.260403 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:37.260377 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-251.ec2.internal\" not found"
Apr 22 16:21:37.294990 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.294967 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 16:21:37.329800 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.329776 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-251.ec2.internal"
Apr 22 16:21:37.334277 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.334263 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-251.ec2.internal"
Apr 22 16:21:37.360785 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:37.360762 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-251.ec2.internal\" not found"
Apr 22 16:21:37.388277 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.388257 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 16:21:37.388650 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.388361 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 16:21:37.388650 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.388399 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 16:21:37.388650 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.388406 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 16:21:37.461900 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:37.461850 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-251.ec2.internal\" not found"
Apr 22 16:21:37.474131 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.474102 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 16:16:36 +0000 UTC" deadline="2027-11-14 04:27:47.765287489 +0000 UTC"
Apr 22 16:21:37.474131 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.474123 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13692h6m10.291166833s"
Apr 22 16:21:37.478187 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.478172 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 16:21:37.488676 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.488654 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 16:21:37.511444 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.511428 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-9f6dp"
Apr 22 16:21:37.518875 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.518854 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-9f6dp"
Apr 22 16:21:37.562112 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:37.562091 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-251.ec2.internal\" not found"
Apr 22 16:21:37.662619 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:37.662594 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-251.ec2.internal\" not found"
Apr 22 16:21:37.763103 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:37.763086 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-251.ec2.internal\" not found"
Apr 22 16:21:37.846185 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:37.846148 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad598d75d06ec2ecc31bfa284ccec4f8.slice/crio-0ff05455fb733d487a10ae928f4d73bc0a929454f58f8f995c966a47e03a068e WatchSource:0}: Error finding container 0ff05455fb733d487a10ae928f4d73bc0a929454f58f8f995c966a47e03a068e: Status 404 returned error can't find the container with id 0ff05455fb733d487a10ae928f4d73bc0a929454f58f8f995c966a47e03a068e
Apr 22 16:21:37.852102 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.852087 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 16:21:37.863193 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:37.863174 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-251.ec2.internal\" not found"
Apr 22 16:21:37.878896 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:37.878877 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c1aad3db1097ad1e69a197a09060d7e.slice/crio-7371671f169c3ccdbf45d4131e492722e512693a0d46deed4df928d8ea677452 WatchSource:0}: Error finding container 7371671f169c3ccdbf45d4131e492722e512693a0d46deed4df928d8ea677452: Status 404 returned error can't find the container with id 7371671f169c3ccdbf45d4131e492722e512693a0d46deed4df928d8ea677452
Apr 22 16:21:37.898912 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.898873 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-251.ec2.internal" event={"ID":"0c1aad3db1097ad1e69a197a09060d7e","Type":"ContainerStarted","Data":"7371671f169c3ccdbf45d4131e492722e512693a0d46deed4df928d8ea677452"}
Apr 22 16:21:37.899744 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:37.899725 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-251.ec2.internal" event={"ID":"ad598d75d06ec2ecc31bfa284ccec4f8","Type":"ContainerStarted","Data":"0ff05455fb733d487a10ae928f4d73bc0a929454f58f8f995c966a47e03a068e"}
Apr 22 16:21:37.964177 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:37.964156 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-251.ec2.internal\" not found"
Apr 22 16:21:38.035622 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.035577 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 16:21:38.079021 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.078994 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-251.ec2.internal"
Apr 22 16:21:38.089996 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.089981 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 16:21:38.090780 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.090769 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-251.ec2.internal"
Apr 22 16:21:38.098583 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.098570 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 16:21:38.296998 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.296934 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 16:21:38.453166 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.453139 2573 apiserver.go:52] "Watching apiserver"
Apr 22 16:21:38.459562 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.459381 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 16:21:38.462269 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.462059 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb","openshift-dns/node-resolver-ljkt5","openshift-image-registry/node-ca-zgppz","openshift-multus/multus-additional-cni-plugins-gh4hm","openshift-multus/network-metrics-daemon-cwt8x","openshift-network-diagnostics/network-check-target-2r4t4","openshift-cluster-node-tuning-operator/tuned-5ml9p","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-251.ec2.internal","openshift-multus/multus-k6bwx","openshift-network-operator/iptables-alerter-c98f2","openshift-ovn-kubernetes/ovnkube-node-2mskd","kube-system/konnectivity-agent-q9qkz","kube-system/kube-apiserver-proxy-ip-10-0-141-251.ec2.internal"]
Apr 22 16:21:38.464675 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.464649 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ljkt5"
Apr 22 16:21:38.466127 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.466104 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zgppz"
Apr 22 16:21:38.466222 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.466141 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.468462 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.467272 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gh4hm"
Apr 22 16:21:38.468462 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.468296 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 16:21:38.471602 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.470695 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2r4t4"
Apr 22 16:21:38.471602 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:38.470930 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2r4t4" podUID="95147dbd-9393-4f07-9051-3461d90ddadb"
Apr 22 16:21:38.471602 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.471503 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 16:21:38.472111 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.472092 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 16:21:38.475799 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.472393 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 16:21:38.475799 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.472612 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 16:21:38.475799 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.473199 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-d5dct\""
Apr 22 16:21:38.475799 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.473262 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb"
Apr 22 16:21:38.475799 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.473480 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 16:21:38.475799 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.473965 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-bqrwn\""
Apr 22 16:21:38.475799 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.474322 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 16:21:38.475799 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.474637 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-tx5sg\""
Apr 22 16:21:38.475799 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.474880 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 16:21:38.475799 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.475172 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 16:21:38.475799 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.475469 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 16:21:38.475799 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.475741 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.476596 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.476580 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 16:21:38.477844 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.477825 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 16:21:38.480066 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.480047 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-7g8r8\""
Apr 22 16:21:38.480477 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.480293 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wzh6z\""
Apr 22 16:21:38.480477 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.480465 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 16:21:38.480906 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.480723 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 16:21:38.482023 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.482005 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8j2dt\""
Apr 22 16:21:38.482311 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.482293 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 16:21:38.482525 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.482510 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 16:21:38.485342 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.484917 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cwt8x"
Apr 22 16:21:38.485342 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:38.484978 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cwt8x" podUID="e324836e-ef75-432e-978a-639279d2702e"
Apr 22 16:21:38.486455 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.486427 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-c98f2"
Apr 22 16:21:38.486606 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.486587 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2mskd"
Apr 22 16:21:38.488235 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.488219 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-q9qkz"
Apr 22 16:21:38.488454 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.488433 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 16:21:38.488553 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.488534 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-r28mx\""
Apr 22 16:21:38.488964 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.488745 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 16:21:38.489845 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.489210 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 16:21:38.489845 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.489417 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-h4nm5\""
Apr 22 16:21:38.489845 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.489481 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 16:21:38.489845 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.489539 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 16:21:38.489845 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.489725 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 16:21:38.492518 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.490338 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 16:21:38.492518 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.490690 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 16:21:38.492518 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.490834 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-x4zlt\""
Apr 22 16:21:38.492518 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.490692 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 16:21:38.492518 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491077 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-host-var-lib-kubelet\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.492518 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491103 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-var-lib-kubelet\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.492518 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491136 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c9801573-c6f5-4b2d-a5ea-6b5b53cf411b-cnibin\") pod \"multus-additional-cni-plugins-gh4hm\" (UID: \"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b\") " pod="openshift-multus/multus-additional-cni-plugins-gh4hm"
Apr 22 16:21:38.492518 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491159 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c9801573-c6f5-4b2d-a5ea-6b5b53cf411b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gh4hm\" (UID: \"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b\") " pod="openshift-multus/multus-additional-cni-plugins-gh4hm"
Apr 22 16:21:38.492518 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491188 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-host-run-k8s-cni-cncf-io\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.492518 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491205 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 16:21:38.492518 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491211 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn64q\" (UniqueName: \"kubernetes.io/projected/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-kube-api-access-rn64q\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.492518 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491248 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c9801573-c6f5-4b2d-a5ea-6b5b53cf411b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gh4hm\" (UID: \"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b\") " pod="openshift-multus/multus-additional-cni-plugins-gh4hm"
Apr 22 16:21:38.492518 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491278 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/53214233-7ecc-46c2-acc3-f4fdcd12bddb-socket-dir\") pod \"aws-ebs-csi-driver-node-zcdlb\" (UID: \"53214233-7ecc-46c2-acc3-f4fdcd12bddb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb"
Apr 22 16:21:38.492518 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491317 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/53214233-7ecc-46c2-acc3-f4fdcd12bddb-registration-dir\") pod \"aws-ebs-csi-driver-node-zcdlb\" (UID: \"53214233-7ecc-46c2-acc3-f4fdcd12bddb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb"
Apr 22 16:21:38.492518 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491343 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85nq4\" (UniqueName: \"kubernetes.io/projected/d9d485c9-31be-4577-affa-18e7f5a7b2bf-kube-api-access-85nq4\") pod \"iptables-alerter-c98f2\" (UID: \"d9d485c9-31be-4577-affa-18e7f5a7b2bf\") " pod="openshift-network-operator/iptables-alerter-c98f2"
Apr 22 16:21:38.492518 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491366 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-etc-sysconfig\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.492518 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491389 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4bm9\" (UniqueName: \"kubernetes.io/projected/e324836e-ef75-432e-978a-639279d2702e-kube-api-access-j4bm9\") pod \"network-metrics-daemon-cwt8x\" (UID: \"e324836e-ef75-432e-978a-639279d2702e\") " pod="openshift-multus/network-metrics-daemon-cwt8x"
Apr 22 16:21:38.492518 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491410 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 16:21:38.492518 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491453 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/53214233-7ecc-46c2-acc3-f4fdcd12bddb-device-dir\") pod \"aws-ebs-csi-driver-node-zcdlb\" (UID: \"53214233-7ecc-46c2-acc3-f4fdcd12bddb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb"
Apr 22 16:21:38.493421 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491498 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-hostroot\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.493421 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491530 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmxxj\" (UniqueName: \"kubernetes.io/projected/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-kube-api-access-nmxxj\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.493421 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491553 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-etc-sysctl-conf\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.493421 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491586 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-host\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.493421 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491616 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-tmp\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.493421 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491650 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-multus-cni-dir\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.493421 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491683 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-host-run-multus-certs\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.493421 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491707 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-sys\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.493421 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491730 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c9801573-c6f5-4b2d-a5ea-6b5b53cf411b-system-cni-dir\") pod \"multus-additional-cni-plugins-gh4hm\" (UID: \"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b\") " pod="openshift-multus/multus-additional-cni-plugins-gh4hm"
Apr 22 16:21:38.493421 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491781 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcs58\" (UniqueName: \"kubernetes.io/projected/ca6eb44a-3f8d-408b-ae0d-0ef553dc08d8-kube-api-access-dcs58\") pod \"node-ca-zgppz\" (UID: \"ca6eb44a-3f8d-408b-ae0d-0ef553dc08d8\") " pod="openshift-image-registry/node-ca-zgppz"
Apr 22 16:21:38.493421 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491815 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-multus-conf-dir\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.493421 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491839 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d9d485c9-31be-4577-affa-18e7f5a7b2bf-iptables-alerter-script\") pod \"iptables-alerter-c98f2\" (UID: \"d9d485c9-31be-4577-affa-18e7f5a7b2bf\") " pod="openshift-network-operator/iptables-alerter-c98f2"
Apr 22 16:21:38.493421 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491871 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-cnibin\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.493421 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491913 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-os-release\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.493421 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491946 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-cni-binary-copy\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.493421 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.491982 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c9801573-c6f5-4b2d-a5ea-6b5b53cf411b-cni-binary-copy\") pod \"multus-additional-cni-plugins-gh4hm\" (UID: \"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b\") " pod="openshift-multus/multus-additional-cni-plugins-gh4hm"
Apr 22 16:21:38.493421 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492005 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ca6eb44a-3f8d-408b-ae0d-0ef553dc08d8-serviceca\") pod \"node-ca-zgppz\" (UID: \"ca6eb44a-3f8d-408b-ae0d-0ef553dc08d8\") " pod="openshift-image-registry/node-ca-zgppz"
Apr 22 16:21:38.494329 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492042 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-multus-daemon-config\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.494329 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492067 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-etc-sysctl-d\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.494329 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492089 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-run\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.494329 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492113 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/53214233-7ecc-46c2-acc3-f4fdcd12bddb-etc-selinux\") pod \"aws-ebs-csi-driver-node-zcdlb\" (UID: \"53214233-7ecc-46c2-acc3-f4fdcd12bddb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb"
Apr 22 16:21:38.494329 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492135 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/53214233-7ecc-46c2-acc3-f4fdcd12bddb-sys-fs\") pod \"aws-ebs-csi-driver-node-zcdlb\" (UID: \"53214233-7ecc-46c2-acc3-f4fdcd12bddb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb"
Apr 22 16:21:38.494329 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492186 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5sx9\" (UniqueName: \"kubernetes.io/projected/53214233-7ecc-46c2-acc3-f4fdcd12bddb-kube-api-access-l5sx9\") pod \"aws-ebs-csi-driver-node-zcdlb\" (UID: \"53214233-7ecc-46c2-acc3-f4fdcd12bddb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb"
Apr 22 16:21:38.494329 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492214 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-host-var-lib-cni-multus\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.494329 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492240 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtblp\" (UniqueName: \"kubernetes.io/projected/c9801573-c6f5-4b2d-a5ea-6b5b53cf411b-kube-api-access-dtblp\") pod \"multus-additional-cni-plugins-gh4hm\" (UID: \"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b\") " pod="openshift-multus/multus-additional-cni-plugins-gh4hm"
Apr 22 16:21:38.494329 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492267 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1f2b5ca6-7540-4d0e-88b4-b34788bdeb77-hosts-file\") pod \"node-resolver-ljkt5\" (UID: \"1f2b5ca6-7540-4d0e-88b4-b34788bdeb77\") " pod="openshift-dns/node-resolver-ljkt5"
Apr 22 16:21:38.494329 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492317 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-etc-tuned\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.494329 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492342 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q76d\" (UniqueName: \"kubernetes.io/projected/95147dbd-9393-4f07-9051-3461d90ddadb-kube-api-access-6q76d\") pod \"network-check-target-2r4t4\" (UID: \"95147dbd-9393-4f07-9051-3461d90ddadb\") " pod="openshift-network-diagnostics/network-check-target-2r4t4"
Apr 22 16:21:38.494329 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492365 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53214233-7ecc-46c2-acc3-f4fdcd12bddb-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zcdlb\" (UID: \"53214233-7ecc-46c2-acc3-f4fdcd12bddb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb"
Apr 22 16:21:38.494329 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492388 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-multus-socket-dir-parent\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.494329 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492438 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-host-var-lib-cni-bin\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") "
pod="openshift-multus/multus-k6bwx" Apr 22 16:21:38.494329 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492467 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs\") pod \"network-metrics-daemon-cwt8x\" (UID: \"e324836e-ef75-432e-978a-639279d2702e\") " pod="openshift-multus/network-metrics-daemon-cwt8x" Apr 22 16:21:38.494329 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492489 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1f2b5ca6-7540-4d0e-88b4-b34788bdeb77-tmp-dir\") pod \"node-resolver-ljkt5\" (UID: \"1f2b5ca6-7540-4d0e-88b4-b34788bdeb77\") " pod="openshift-dns/node-resolver-ljkt5" Apr 22 16:21:38.495078 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492513 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vkcl\" (UniqueName: \"kubernetes.io/projected/1f2b5ca6-7540-4d0e-88b4-b34788bdeb77-kube-api-access-4vkcl\") pod \"node-resolver-ljkt5\" (UID: \"1f2b5ca6-7540-4d0e-88b4-b34788bdeb77\") " pod="openshift-dns/node-resolver-ljkt5" Apr 22 16:21:38.495078 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492537 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ca6eb44a-3f8d-408b-ae0d-0ef553dc08d8-host\") pod \"node-ca-zgppz\" (UID: \"ca6eb44a-3f8d-408b-ae0d-0ef553dc08d8\") " pod="openshift-image-registry/node-ca-zgppz" Apr 22 16:21:38.495078 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492556 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-system-cni-dir\") pod \"multus-k6bwx\" 
(UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx" Apr 22 16:21:38.495078 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492580 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-etc-modprobe-d\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p" Apr 22 16:21:38.495078 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492603 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-etc-systemd\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p" Apr 22 16:21:38.495078 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492627 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c9801573-c6f5-4b2d-a5ea-6b5b53cf411b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gh4hm\" (UID: \"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b\") " pod="openshift-multus/multus-additional-cni-plugins-gh4hm" Apr 22 16:21:38.495078 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492669 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-etc-kubernetes\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx" Apr 22 16:21:38.495078 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492704 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-etc-kubernetes\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p" Apr 22 16:21:38.495078 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492730 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-lib-modules\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p" Apr 22 16:21:38.495078 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492769 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d9d485c9-31be-4577-affa-18e7f5a7b2bf-host-slash\") pod \"iptables-alerter-c98f2\" (UID: \"d9d485c9-31be-4577-affa-18e7f5a7b2bf\") " pod="openshift-network-operator/iptables-alerter-c98f2" Apr 22 16:21:38.495078 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492820 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-host-run-netns\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx" Apr 22 16:21:38.495078 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.492965 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c9801573-c6f5-4b2d-a5ea-6b5b53cf411b-os-release\") pod \"multus-additional-cni-plugins-gh4hm\" (UID: \"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b\") " pod="openshift-multus/multus-additional-cni-plugins-gh4hm" Apr 22 16:21:38.520553 ip-10-0-141-251 kubenswrapper[2573]: I0422 
16:21:38.520527 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 16:16:37 +0000 UTC" deadline="2027-12-01 02:47:29.81877058 +0000 UTC" Apr 22 16:21:38.520633 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.520554 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14098h25m51.298219938s" Apr 22 16:21:38.580130 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.580094 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 16:21:38.593676 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.593650 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-host-kubelet\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.593814 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.593692 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-ovn-node-metrics-cert\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.593814 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.593722 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-etc-tuned\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p" Apr 22 16:21:38.593814 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.593749 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6q76d\" (UniqueName: \"kubernetes.io/projected/95147dbd-9393-4f07-9051-3461d90ddadb-kube-api-access-6q76d\") pod \"network-check-target-2r4t4\" (UID: \"95147dbd-9393-4f07-9051-3461d90ddadb\") " pod="openshift-network-diagnostics/network-check-target-2r4t4" Apr 22 16:21:38.593814 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.593790 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53214233-7ecc-46c2-acc3-f4fdcd12bddb-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zcdlb\" (UID: \"53214233-7ecc-46c2-acc3-f4fdcd12bddb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb" Apr 22 16:21:38.594127 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.593815 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-multus-socket-dir-parent\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx" Apr 22 16:21:38.594127 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.593839 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-host-var-lib-cni-bin\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx" Apr 22 16:21:38.594127 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.593863 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs\") pod \"network-metrics-daemon-cwt8x\" (UID: \"e324836e-ef75-432e-978a-639279d2702e\") " pod="openshift-multus/network-metrics-daemon-cwt8x" Apr 22 16:21:38.594127 ip-10-0-141-251 
kubenswrapper[2573]: I0422 16:21:38.593893 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1f2b5ca6-7540-4d0e-88b4-b34788bdeb77-tmp-dir\") pod \"node-resolver-ljkt5\" (UID: \"1f2b5ca6-7540-4d0e-88b4-b34788bdeb77\") " pod="openshift-dns/node-resolver-ljkt5" Apr 22 16:21:38.594127 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.593919 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vkcl\" (UniqueName: \"kubernetes.io/projected/1f2b5ca6-7540-4d0e-88b4-b34788bdeb77-kube-api-access-4vkcl\") pod \"node-resolver-ljkt5\" (UID: \"1f2b5ca6-7540-4d0e-88b4-b34788bdeb77\") " pod="openshift-dns/node-resolver-ljkt5" Apr 22 16:21:38.594127 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.593943 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ca6eb44a-3f8d-408b-ae0d-0ef553dc08d8-host\") pod \"node-ca-zgppz\" (UID: \"ca6eb44a-3f8d-408b-ae0d-0ef553dc08d8\") " pod="openshift-image-registry/node-ca-zgppz" Apr 22 16:21:38.594127 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.593968 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-systemd-units\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.594127 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.593990 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-run-ovn\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.594127 ip-10-0-141-251 
kubenswrapper[2573]: I0422 16:21:38.594014 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-system-cni-dir\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx" Apr 22 16:21:38.594127 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594039 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-etc-modprobe-d\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p" Apr 22 16:21:38.594127 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594064 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-etc-systemd\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p" Apr 22 16:21:38.594127 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594096 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c9801573-c6f5-4b2d-a5ea-6b5b53cf411b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gh4hm\" (UID: \"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b\") " pod="openshift-multus/multus-additional-cni-plugins-gh4hm" Apr 22 16:21:38.594127 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594123 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-etc-kubernetes\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx" Apr 22 
16:21:38.594703 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594146 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-etc-kubernetes\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p" Apr 22 16:21:38.594703 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594177 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-lib-modules\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p" Apr 22 16:21:38.594703 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594202 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d9d485c9-31be-4577-affa-18e7f5a7b2bf-host-slash\") pod \"iptables-alerter-c98f2\" (UID: \"d9d485c9-31be-4577-affa-18e7f5a7b2bf\") " pod="openshift-network-operator/iptables-alerter-c98f2" Apr 22 16:21:38.594703 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594230 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-host-run-netns\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.594703 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594256 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-ovnkube-script-lib\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.594703 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594282 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-host-run-netns\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx" Apr 22 16:21:38.594703 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594331 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c9801573-c6f5-4b2d-a5ea-6b5b53cf411b-os-release\") pod \"multus-additional-cni-plugins-gh4hm\" (UID: \"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b\") " pod="openshift-multus/multus-additional-cni-plugins-gh4hm" Apr 22 16:21:38.594703 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594360 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-run-systemd\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.594703 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594388 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb8gk\" (UniqueName: \"kubernetes.io/projected/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-kube-api-access-cb8gk\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.594703 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594415 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-host-var-lib-kubelet\") pod 
\"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx" Apr 22 16:21:38.594703 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594441 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-var-lib-kubelet\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p" Apr 22 16:21:38.594703 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594466 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c9801573-c6f5-4b2d-a5ea-6b5b53cf411b-cnibin\") pod \"multus-additional-cni-plugins-gh4hm\" (UID: \"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b\") " pod="openshift-multus/multus-additional-cni-plugins-gh4hm" Apr 22 16:21:38.594703 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594472 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1f2b5ca6-7540-4d0e-88b4-b34788bdeb77-tmp-dir\") pod \"node-resolver-ljkt5\" (UID: \"1f2b5ca6-7540-4d0e-88b4-b34788bdeb77\") " pod="openshift-dns/node-resolver-ljkt5" Apr 22 16:21:38.594703 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594490 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c9801573-c6f5-4b2d-a5ea-6b5b53cf411b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gh4hm\" (UID: \"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b\") " pod="openshift-multus/multus-additional-cni-plugins-gh4hm" Apr 22 16:21:38.594703 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594525 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-host-cni-netd\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.594703 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594551 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-ovnkube-config\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.594703 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594583 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-host-run-k8s-cni-cncf-io\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx" Apr 22 16:21:38.595460 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594609 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rn64q\" (UniqueName: \"kubernetes.io/projected/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-kube-api-access-rn64q\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx" Apr 22 16:21:38.595460 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594639 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c9801573-c6f5-4b2d-a5ea-6b5b53cf411b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gh4hm\" (UID: \"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b\") " pod="openshift-multus/multus-additional-cni-plugins-gh4hm" Apr 22 16:21:38.595460 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594667 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/53214233-7ecc-46c2-acc3-f4fdcd12bddb-socket-dir\") pod \"aws-ebs-csi-driver-node-zcdlb\" (UID: \"53214233-7ecc-46c2-acc3-f4fdcd12bddb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb" Apr 22 16:21:38.595460 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594692 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/53214233-7ecc-46c2-acc3-f4fdcd12bddb-registration-dir\") pod \"aws-ebs-csi-driver-node-zcdlb\" (UID: \"53214233-7ecc-46c2-acc3-f4fdcd12bddb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb" Apr 22 16:21:38.595460 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594718 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85nq4\" (UniqueName: \"kubernetes.io/projected/d9d485c9-31be-4577-affa-18e7f5a7b2bf-kube-api-access-85nq4\") pod \"iptables-alerter-c98f2\" (UID: \"d9d485c9-31be-4577-affa-18e7f5a7b2bf\") " pod="openshift-network-operator/iptables-alerter-c98f2" Apr 22 16:21:38.595460 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594744 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-host-slash\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.595460 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594781 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-etc-sysconfig\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p" Apr 22 
16:21:38.595460 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594798 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4bm9\" (UniqueName: \"kubernetes.io/projected/e324836e-ef75-432e-978a-639279d2702e-kube-api-access-j4bm9\") pod \"network-metrics-daemon-cwt8x\" (UID: \"e324836e-ef75-432e-978a-639279d2702e\") " pod="openshift-multus/network-metrics-daemon-cwt8x" Apr 22 16:21:38.595460 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594818 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/53214233-7ecc-46c2-acc3-f4fdcd12bddb-device-dir\") pod \"aws-ebs-csi-driver-node-zcdlb\" (UID: \"53214233-7ecc-46c2-acc3-f4fdcd12bddb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb" Apr 22 16:21:38.595460 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594817 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 16:21:38.595460 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594834 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-node-log\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd"
Apr 22 16:21:38.595460 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594849 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-hostroot\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.595460 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594864 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmxxj\" (UniqueName: \"kubernetes.io/projected/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-kube-api-access-nmxxj\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.595460 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594881 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-etc-sysctl-conf\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.595460 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594895 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-host\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.595460 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594909 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-tmp\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.595460 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594925 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-env-overrides\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd"
Apr 22 16:21:38.596306 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594940 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-multus-cni-dir\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.596306 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594957 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-host-run-multus-certs\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.596306 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594971 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-sys\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.596306 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.594993 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c9801573-c6f5-4b2d-a5ea-6b5b53cf411b-system-cni-dir\") pod \"multus-additional-cni-plugins-gh4hm\" (UID: \"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b\") " pod="openshift-multus/multus-additional-cni-plugins-gh4hm"
Apr 22 16:21:38.596306 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595010 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dcs58\" (UniqueName: \"kubernetes.io/projected/ca6eb44a-3f8d-408b-ae0d-0ef553dc08d8-kube-api-access-dcs58\") pod \"node-ca-zgppz\" (UID: \"ca6eb44a-3f8d-408b-ae0d-0ef553dc08d8\") " pod="openshift-image-registry/node-ca-zgppz"
Apr 22 16:21:38.596306 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595025 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-run-openvswitch\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd"
Apr 22 16:21:38.596306 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595040 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-host-cni-bin\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd"
Apr 22 16:21:38.596306 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595057 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd"
Apr 22 16:21:38.596306 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595075 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-multus-conf-dir\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.596306 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595107 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d9d485c9-31be-4577-affa-18e7f5a7b2bf-iptables-alerter-script\") pod \"iptables-alerter-c98f2\" (UID: \"d9d485c9-31be-4577-affa-18e7f5a7b2bf\") " pod="openshift-network-operator/iptables-alerter-c98f2"
Apr 22 16:21:38.596306 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595135 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-log-socket\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd"
Apr 22 16:21:38.596306 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595160 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-etc-openvswitch\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd"
Apr 22 16:21:38.596306 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595195 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-cnibin\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.596306 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595223 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-os-release\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.596306 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595249 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-cni-binary-copy\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.596306 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595284 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c9801573-c6f5-4b2d-a5ea-6b5b53cf411b-cni-binary-copy\") pod \"multus-additional-cni-plugins-gh4hm\" (UID: \"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b\") " pod="openshift-multus/multus-additional-cni-plugins-gh4hm"
Apr 22 16:21:38.596306 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595311 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ca6eb44a-3f8d-408b-ae0d-0ef553dc08d8-serviceca\") pod \"node-ca-zgppz\" (UID: \"ca6eb44a-3f8d-408b-ae0d-0ef553dc08d8\") " pod="openshift-image-registry/node-ca-zgppz"
Apr 22 16:21:38.597074 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595338 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-var-lib-openvswitch\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd"
Apr 22 16:21:38.597074 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595365 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-host-run-ovn-kubernetes\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd"
Apr 22 16:21:38.597074 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595394 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-multus-daemon-config\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.597074 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595418 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-etc-sysctl-d\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.597074 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595434 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-system-cni-dir\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.597074 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595441 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-run\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.597074 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595488 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-run\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.597074 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595487 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/53214233-7ecc-46c2-acc3-f4fdcd12bddb-etc-selinux\") pod \"aws-ebs-csi-driver-node-zcdlb\" (UID: \"53214233-7ecc-46c2-acc3-f4fdcd12bddb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb"
Apr 22 16:21:38.597074 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595519 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/53214233-7ecc-46c2-acc3-f4fdcd12bddb-sys-fs\") pod \"aws-ebs-csi-driver-node-zcdlb\" (UID: \"53214233-7ecc-46c2-acc3-f4fdcd12bddb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb"
Apr 22 16:21:38.597074 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595549 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5sx9\" (UniqueName: \"kubernetes.io/projected/53214233-7ecc-46c2-acc3-f4fdcd12bddb-kube-api-access-l5sx9\") pod \"aws-ebs-csi-driver-node-zcdlb\" (UID: \"53214233-7ecc-46c2-acc3-f4fdcd12bddb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb"
Apr 22 16:21:38.597074 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595576 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/751cb35b-96a4-4016-bb77-4c55bff2e4d6-agent-certs\") pod \"konnectivity-agent-q9qkz\" (UID: \"751cb35b-96a4-4016-bb77-4c55bff2e4d6\") " pod="kube-system/konnectivity-agent-q9qkz"
Apr 22 16:21:38.597074 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595581 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/53214233-7ecc-46c2-acc3-f4fdcd12bddb-etc-selinux\") pod \"aws-ebs-csi-driver-node-zcdlb\" (UID: \"53214233-7ecc-46c2-acc3-f4fdcd12bddb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb"
Apr 22 16:21:38.597074 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595599 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-host-var-lib-cni-multus\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.597074 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595622 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dtblp\" (UniqueName: \"kubernetes.io/projected/c9801573-c6f5-4b2d-a5ea-6b5b53cf411b-kube-api-access-dtblp\") pod \"multus-additional-cni-plugins-gh4hm\" (UID: \"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b\") " pod="openshift-multus/multus-additional-cni-plugins-gh4hm"
Apr 22 16:21:38.597074 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595644 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1f2b5ca6-7540-4d0e-88b4-b34788bdeb77-hosts-file\") pod \"node-resolver-ljkt5\" (UID: \"1f2b5ca6-7540-4d0e-88b4-b34788bdeb77\") " pod="openshift-dns/node-resolver-ljkt5"
Apr 22 16:21:38.597074 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595677 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/751cb35b-96a4-4016-bb77-4c55bff2e4d6-konnectivity-ca\") pod \"konnectivity-agent-q9qkz\" (UID: \"751cb35b-96a4-4016-bb77-4c55bff2e4d6\") " pod="kube-system/konnectivity-agent-q9qkz"
Apr 22 16:21:38.597074 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595686 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-etc-modprobe-d\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.597858 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595746 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-etc-kubernetes\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.597858 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595813 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-etc-kubernetes\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.597858 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595912 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-lib-modules\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.597858 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.595959 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d9d485c9-31be-4577-affa-18e7f5a7b2bf-host-slash\") pod \"iptables-alerter-c98f2\" (UID: \"d9d485c9-31be-4577-affa-18e7f5a7b2bf\") " pod="openshift-network-operator/iptables-alerter-c98f2"
Apr 22 16:21:38.597858 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.596023 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-host-run-netns\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.597858 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.596080 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c9801573-c6f5-4b2d-a5ea-6b5b53cf411b-os-release\") pod \"multus-additional-cni-plugins-gh4hm\" (UID: \"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b\") " pod="openshift-multus/multus-additional-cni-plugins-gh4hm"
Apr 22 16:21:38.597858 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.596143 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-host-var-lib-kubelet\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.597858 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.596193 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-var-lib-kubelet\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.597858 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.596232 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c9801573-c6f5-4b2d-a5ea-6b5b53cf411b-cnibin\") pod \"multus-additional-cni-plugins-gh4hm\" (UID: \"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b\") " pod="openshift-multus/multus-additional-cni-plugins-gh4hm"
Apr 22 16:21:38.597858 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.596238 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c9801573-c6f5-4b2d-a5ea-6b5b53cf411b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-gh4hm\" (UID: \"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b\") " pod="openshift-multus/multus-additional-cni-plugins-gh4hm"
Apr 22 16:21:38.597858 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.596373 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c9801573-c6f5-4b2d-a5ea-6b5b53cf411b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gh4hm\" (UID: \"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b\") " pod="openshift-multus/multus-additional-cni-plugins-gh4hm"
Apr 22 16:21:38.597858 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.596427 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-host-run-k8s-cni-cncf-io\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.597858 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.597037 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c9801573-c6f5-4b2d-a5ea-6b5b53cf411b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gh4hm\" (UID: \"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b\") " pod="openshift-multus/multus-additional-cni-plugins-gh4hm"
Apr 22 16:21:38.597858 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.597048 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ca6eb44a-3f8d-408b-ae0d-0ef553dc08d8-host\") pod \"node-ca-zgppz\" (UID: \"ca6eb44a-3f8d-408b-ae0d-0ef553dc08d8\") " pod="openshift-image-registry/node-ca-zgppz"
Apr 22 16:21:38.597858 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.597174 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-etc-sysctl-d\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.597858 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.597220 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-multus-socket-dir-parent\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.597858 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.597272 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/53214233-7ecc-46c2-acc3-f4fdcd12bddb-sys-fs\") pod \"aws-ebs-csi-driver-node-zcdlb\" (UID: \"53214233-7ecc-46c2-acc3-f4fdcd12bddb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb"
Apr 22 16:21:38.598624 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.597313 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/53214233-7ecc-46c2-acc3-f4fdcd12bddb-kubelet-dir\") pod \"aws-ebs-csi-driver-node-zcdlb\" (UID: \"53214233-7ecc-46c2-acc3-f4fdcd12bddb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb"
Apr 22 16:21:38.598624 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.597367 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-host\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.598624 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.597417 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-host-var-lib-cni-multus\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.598624 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.597662 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1f2b5ca6-7540-4d0e-88b4-b34788bdeb77-hosts-file\") pod \"node-resolver-ljkt5\" (UID: \"1f2b5ca6-7540-4d0e-88b4-b34788bdeb77\") " pod="openshift-dns/node-resolver-ljkt5"
Apr 22 16:21:38.598624 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.597721 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-etc-systemd\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.598624 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.597774 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-multus-conf-dir\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.598624 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.597865 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/53214233-7ecc-46c2-acc3-f4fdcd12bddb-socket-dir\") pod \"aws-ebs-csi-driver-node-zcdlb\" (UID: \"53214233-7ecc-46c2-acc3-f4fdcd12bddb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb"
Apr 22 16:21:38.598624 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.597907 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/53214233-7ecc-46c2-acc3-f4fdcd12bddb-registration-dir\") pod \"aws-ebs-csi-driver-node-zcdlb\" (UID: \"53214233-7ecc-46c2-acc3-f4fdcd12bddb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb"
Apr 22 16:21:38.598624 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.598135 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-etc-sysconfig\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.598624 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.598176 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/53214233-7ecc-46c2-acc3-f4fdcd12bddb-device-dir\") pod \"aws-ebs-csi-driver-node-zcdlb\" (UID: \"53214233-7ecc-46c2-acc3-f4fdcd12bddb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb"
Apr 22 16:21:38.599095 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.598704 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-multus-daemon-config\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.599095 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.598784 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-multus-cni-dir\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.599095 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.598821 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-host-run-multus-certs\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.599095 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.598861 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-sys\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.599095 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.598892 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c9801573-c6f5-4b2d-a5ea-6b5b53cf411b-system-cni-dir\") pod \"multus-additional-cni-plugins-gh4hm\" (UID: \"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b\") " pod="openshift-multus/multus-additional-cni-plugins-gh4hm"
Apr 22 16:21:38.599095 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.598896 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d9d485c9-31be-4577-affa-18e7f5a7b2bf-iptables-alerter-script\") pod \"iptables-alerter-c98f2\" (UID: \"d9d485c9-31be-4577-affa-18e7f5a7b2bf\") " pod="openshift-network-operator/iptables-alerter-c98f2"
Apr 22 16:21:38.599095 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.598983 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-hostroot\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.599428 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.599134 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-cnibin\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.599428 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.599211 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-os-release\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.599428 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.599302 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-etc-sysctl-conf\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.599428 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.599347 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-host-var-lib-cni-bin\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.599721 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.599699 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-cni-binary-copy\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.600389 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.600139 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c9801573-c6f5-4b2d-a5ea-6b5b53cf411b-cni-binary-copy\") pod \"multus-additional-cni-plugins-gh4hm\" (UID: \"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b\") " pod="openshift-multus/multus-additional-cni-plugins-gh4hm"
Apr 22 16:21:38.600389 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.600267 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ca6eb44a-3f8d-408b-ae0d-0ef553dc08d8-serviceca\") pod \"node-ca-zgppz\" (UID: \"ca6eb44a-3f8d-408b-ae0d-0ef553dc08d8\") " pod="openshift-image-registry/node-ca-zgppz"
Apr 22 16:21:38.600389 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.600307 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-tmp\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.600670 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:38.600412 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 16:21:38.600670 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:38.600487 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs podName:e324836e-ef75-432e-978a-639279d2702e nodeName:}" failed. No retries permitted until 2026-04-22 16:21:39.100465312 +0000 UTC m=+3.035779518 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs") pod "network-metrics-daemon-cwt8x" (UID: "e324836e-ef75-432e-978a-639279d2702e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 16:21:38.601930 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.601910 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-etc-tuned\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.602671 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:38.602650 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 16:21:38.602671 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:38.602676 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 16:21:38.602825 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:38.602715 2573 projected.go:194] Error preparing data for projected volume kube-api-access-6q76d for pod openshift-network-diagnostics/network-check-target-2r4t4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 16:21:38.602825 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:38.602806 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95147dbd-9393-4f07-9051-3461d90ddadb-kube-api-access-6q76d podName:95147dbd-9393-4f07-9051-3461d90ddadb nodeName:}" failed. No retries permitted until 2026-04-22 16:21:39.102787171 +0000 UTC m=+3.038101398 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-6q76d" (UniqueName: "kubernetes.io/projected/95147dbd-9393-4f07-9051-3461d90ddadb-kube-api-access-6q76d") pod "network-check-target-2r4t4" (UID: "95147dbd-9393-4f07-9051-3461d90ddadb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 16:21:38.606240 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.606216 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcs58\" (UniqueName: \"kubernetes.io/projected/ca6eb44a-3f8d-408b-ae0d-0ef553dc08d8-kube-api-access-dcs58\") pod \"node-ca-zgppz\" (UID: \"ca6eb44a-3f8d-408b-ae0d-0ef553dc08d8\") " pod="openshift-image-registry/node-ca-zgppz"
Apr 22 16:21:38.606942 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.606916 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn64q\" (UniqueName: \"kubernetes.io/projected/bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927-kube-api-access-rn64q\") pod \"multus-k6bwx\" (UID: \"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927\") " pod="openshift-multus/multus-k6bwx"
Apr 22 16:21:38.607031 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.606950 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vkcl\" (UniqueName: \"kubernetes.io/projected/1f2b5ca6-7540-4d0e-88b4-b34788bdeb77-kube-api-access-4vkcl\") pod \"node-resolver-ljkt5\" (UID: \"1f2b5ca6-7540-4d0e-88b4-b34788bdeb77\") " pod="openshift-dns/node-resolver-ljkt5"
Apr 22 16:21:38.607031 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.606973 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtblp\" (UniqueName: \"kubernetes.io/projected/c9801573-c6f5-4b2d-a5ea-6b5b53cf411b-kube-api-access-dtblp\") pod \"multus-additional-cni-plugins-gh4hm\" (UID: \"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b\") " pod="openshift-multus/multus-additional-cni-plugins-gh4hm"
Apr 22 16:21:38.607478 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.607456 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4bm9\" (UniqueName: \"kubernetes.io/projected/e324836e-ef75-432e-978a-639279d2702e-kube-api-access-j4bm9\") pod \"network-metrics-daemon-cwt8x\" (UID: \"e324836e-ef75-432e-978a-639279d2702e\") " pod="openshift-multus/network-metrics-daemon-cwt8x"
Apr 22 16:21:38.607612 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.607590 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5sx9\" (UniqueName: \"kubernetes.io/projected/53214233-7ecc-46c2-acc3-f4fdcd12bddb-kube-api-access-l5sx9\") pod \"aws-ebs-csi-driver-node-zcdlb\" (UID: \"53214233-7ecc-46c2-acc3-f4fdcd12bddb\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb"
Apr 22 16:21:38.608509 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.608465 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85nq4\" (UniqueName: \"kubernetes.io/projected/d9d485c9-31be-4577-affa-18e7f5a7b2bf-kube-api-access-85nq4\") pod \"iptables-alerter-c98f2\" (UID: \"d9d485c9-31be-4577-affa-18e7f5a7b2bf\") " pod="openshift-network-operator/iptables-alerter-c98f2"
Apr 22 16:21:38.609280 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.609255 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmxxj\" (UniqueName: \"kubernetes.io/projected/cc7a8537-a4ed-4ce3-aeda-9026e46f114f-kube-api-access-nmxxj\") pod \"tuned-5ml9p\" (UID: \"cc7a8537-a4ed-4ce3-aeda-9026e46f114f\") " pod="openshift-cluster-node-tuning-operator/tuned-5ml9p"
Apr 22 16:21:38.696416 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696383 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-systemd-units\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd"
Apr 22 16:21:38.696560 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696423 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-run-ovn\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd"
Apr 22 16:21:38.696560 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696453 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-host-run-netns\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd"
Apr 22 16:21:38.696560 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696478 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-ovnkube-script-lib\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd"
Apr 22 16:21:38.696560 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696481 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-systemd-units\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 
16:21:38.696560 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696504 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-run-systemd\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.696560 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696528 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cb8gk\" (UniqueName: \"kubernetes.io/projected/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-kube-api-access-cb8gk\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.696560 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696533 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-host-run-netns\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.696929 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696572 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-host-cni-netd\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.696929 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696583 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-run-ovn\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 
16:21:38.696929 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696584 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-run-systemd\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.696929 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696611 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-ovnkube-config\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.696929 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696621 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-host-cni-netd\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.696929 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696645 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-host-slash\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.696929 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696672 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-node-log\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.696929 
ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696700 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-env-overrides\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.696929 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696727 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-run-openvswitch\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.696929 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696739 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-node-log\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.696929 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696741 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-host-slash\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.696929 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696803 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-run-openvswitch\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.696929 ip-10-0-141-251 
kubenswrapper[2573]: I0422 16:21:38.696805 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-host-cni-bin\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.696929 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696750 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-host-cni-bin\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.696929 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696850 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.696929 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696897 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-log-socket\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.696929 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696911 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.697739 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696923 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-etc-openvswitch\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.697739 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696947 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-log-socket\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.697739 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696953 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-var-lib-openvswitch\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.697739 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696986 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-host-run-ovn-kubernetes\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.697739 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.697001 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-etc-openvswitch\") pod \"ovnkube-node-2mskd\" (UID: 
\"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.697739 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.697014 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-host-run-ovn-kubernetes\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.697739 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.697019 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/751cb35b-96a4-4016-bb77-4c55bff2e4d6-agent-certs\") pod \"konnectivity-agent-q9qkz\" (UID: \"751cb35b-96a4-4016-bb77-4c55bff2e4d6\") " pod="kube-system/konnectivity-agent-q9qkz" Apr 22 16:21:38.697739 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.696988 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-var-lib-openvswitch\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.697739 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.697047 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/751cb35b-96a4-4016-bb77-4c55bff2e4d6-konnectivity-ca\") pod \"konnectivity-agent-q9qkz\" (UID: \"751cb35b-96a4-4016-bb77-4c55bff2e4d6\") " pod="kube-system/konnectivity-agent-q9qkz" Apr 22 16:21:38.697739 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.697070 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-host-kubelet\") pod 
\"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.697739 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.697083 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-ovnkube-script-lib\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.697739 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.697093 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-ovn-node-metrics-cert\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.697739 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.697115 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-env-overrides\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.697739 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.697135 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-ovnkube-config\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.697739 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.697129 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-host-kubelet\") pod 
\"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.697739 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.697533 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/751cb35b-96a4-4016-bb77-4c55bff2e4d6-konnectivity-ca\") pod \"konnectivity-agent-q9qkz\" (UID: \"751cb35b-96a4-4016-bb77-4c55bff2e4d6\") " pod="kube-system/konnectivity-agent-q9qkz" Apr 22 16:21:38.699599 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.699576 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-ovn-node-metrics-cert\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.699935 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.699912 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/751cb35b-96a4-4016-bb77-4c55bff2e4d6-agent-certs\") pod \"konnectivity-agent-q9qkz\" (UID: \"751cb35b-96a4-4016-bb77-4c55bff2e4d6\") " pod="kube-system/konnectivity-agent-q9qkz" Apr 22 16:21:38.703779 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.703745 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb8gk\" (UniqueName: \"kubernetes.io/projected/239cf2fe-c977-430f-bf19-3a0e5dbd5f8c-kube-api-access-cb8gk\") pod \"ovnkube-node-2mskd\" (UID: \"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c\") " pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.748577 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.748543 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 16:21:38.789264 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.789234 
2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ljkt5" Apr 22 16:21:38.797949 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.797923 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-zgppz" Apr 22 16:21:38.806987 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.806937 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5ml9p" Apr 22 16:21:38.813557 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.813539 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gh4hm" Apr 22 16:21:38.821111 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.821091 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb" Apr 22 16:21:38.830709 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.830691 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-k6bwx" Apr 22 16:21:38.837363 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.837337 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-c98f2" Apr 22 16:21:38.846960 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.846943 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:21:38.851464 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:38.851445 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-q9qkz" Apr 22 16:21:39.200965 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:39.200880 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6q76d\" (UniqueName: \"kubernetes.io/projected/95147dbd-9393-4f07-9051-3461d90ddadb-kube-api-access-6q76d\") pod \"network-check-target-2r4t4\" (UID: \"95147dbd-9393-4f07-9051-3461d90ddadb\") " pod="openshift-network-diagnostics/network-check-target-2r4t4" Apr 22 16:21:39.200965 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:39.200928 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs\") pod \"network-metrics-daemon-cwt8x\" (UID: \"e324836e-ef75-432e-978a-639279d2702e\") " pod="openshift-multus/network-metrics-daemon-cwt8x" Apr 22 16:21:39.201168 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:39.201048 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:21:39.201168 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:39.201049 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 16:21:39.201168 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:39.201074 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 16:21:39.201168 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:39.201086 2573 projected.go:194] Error preparing data for projected volume kube-api-access-6q76d for pod openshift-network-diagnostics/network-check-target-2r4t4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 16:21:39.201168 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:39.201148 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs podName:e324836e-ef75-432e-978a-639279d2702e nodeName:}" failed. No retries permitted until 2026-04-22 16:21:40.201091407 +0000 UTC m=+4.136405617 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs") pod "network-metrics-daemon-cwt8x" (UID: "e324836e-ef75-432e-978a-639279d2702e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:21:39.201168 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:39.201168 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95147dbd-9393-4f07-9051-3461d90ddadb-kube-api-access-6q76d podName:95147dbd-9393-4f07-9051-3461d90ddadb nodeName:}" failed. No retries permitted until 2026-04-22 16:21:40.20115735 +0000 UTC m=+4.136471560 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-6q76d" (UniqueName: "kubernetes.io/projected/95147dbd-9393-4f07-9051-3461d90ddadb-kube-api-access-6q76d") pod "network-check-target-2r4t4" (UID: "95147dbd-9393-4f07-9051-3461d90ddadb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 16:21:39.520798 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:39.520696 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 16:16:37 +0000 UTC" deadline="2028-01-15 02:54:42.271459824 +0000 UTC" Apr 22 16:21:39.520798 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:39.520732 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15178h33m2.750731467s" Apr 22 16:21:39.620314 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:39.620176 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod239cf2fe_c977_430f_bf19_3a0e5dbd5f8c.slice/crio-e7a7f955c708c183e6c57b6d4cca069770cc6d69164b26b90027dcfd5bdf6faf WatchSource:0}: Error finding container e7a7f955c708c183e6c57b6d4cca069770cc6d69164b26b90027dcfd5bdf6faf: Status 404 returned error can't find the container with id e7a7f955c708c183e6c57b6d4cca069770cc6d69164b26b90027dcfd5bdf6faf Apr 22 16:21:39.622034 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:39.621917 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbeeecf5_1dc0_40b2_bd6f_5a62c3da9927.slice/crio-d0af34214d04b9545a9062e4f159600be6756913c769d60cb8acb0bfd99c8cb2 WatchSource:0}: Error finding container d0af34214d04b9545a9062e4f159600be6756913c769d60cb8acb0bfd99c8cb2: Status 404 returned error can't find the container with id 
d0af34214d04b9545a9062e4f159600be6756913c769d60cb8acb0bfd99c8cb2 Apr 22 16:21:39.626084 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:39.626059 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f2b5ca6_7540_4d0e_88b4_b34788bdeb77.slice/crio-2bb8116b12a2f93414d096c65f7895491b41203182c27131b435d9f7620a5237 WatchSource:0}: Error finding container 2bb8116b12a2f93414d096c65f7895491b41203182c27131b435d9f7620a5237: Status 404 returned error can't find the container with id 2bb8116b12a2f93414d096c65f7895491b41203182c27131b435d9f7620a5237 Apr 22 16:21:39.626851 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:39.626828 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc7a8537_a4ed_4ce3_aeda_9026e46f114f.slice/crio-3fe72eedf393f0599d7b1c295c6f7f1f30d88db72be9328a0c6aaed50e8cb96d WatchSource:0}: Error finding container 3fe72eedf393f0599d7b1c295c6f7f1f30d88db72be9328a0c6aaed50e8cb96d: Status 404 returned error can't find the container with id 3fe72eedf393f0599d7b1c295c6f7f1f30d88db72be9328a0c6aaed50e8cb96d Apr 22 16:21:39.627847 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:39.627823 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca6eb44a_3f8d_408b_ae0d_0ef553dc08d8.slice/crio-2741063f61b2d4cb0db4dbc74e22af7620e827062f144afe4eaf10e85778e1a6 WatchSource:0}: Error finding container 2741063f61b2d4cb0db4dbc74e22af7620e827062f144afe4eaf10e85778e1a6: Status 404 returned error can't find the container with id 2741063f61b2d4cb0db4dbc74e22af7620e827062f144afe4eaf10e85778e1a6 Apr 22 16:21:39.628943 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:39.628918 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9801573_c6f5_4b2d_a5ea_6b5b53cf411b.slice/crio-89e2116562ef6f81add7a57ca918b65f10346cd514c1d2709f99335b7f3ec2c7 WatchSource:0}: Error finding container 89e2116562ef6f81add7a57ca918b65f10346cd514c1d2709f99335b7f3ec2c7: Status 404 returned error can't find the container with id 89e2116562ef6f81add7a57ca918b65f10346cd514c1d2709f99335b7f3ec2c7
Apr 22 16:21:39.629925 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:39.629490 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9d485c9_31be_4577_affa_18e7f5a7b2bf.slice/crio-b1ba92b73bc0d73daefcb25069051a8d297ef0e72bc7cb1277a115cec2bca8c9 WatchSource:0}: Error finding container b1ba92b73bc0d73daefcb25069051a8d297ef0e72bc7cb1277a115cec2bca8c9: Status 404 returned error can't find the container with id b1ba92b73bc0d73daefcb25069051a8d297ef0e72bc7cb1277a115cec2bca8c9
Apr 22 16:21:39.632026 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:39.631667 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53214233_7ecc_46c2_acc3_f4fdcd12bddb.slice/crio-83f130851523bf774469c3f2b76d5c9ced785cfa7f827650e7b54f8e97f0b3b2 WatchSource:0}: Error finding container 83f130851523bf774469c3f2b76d5c9ced785cfa7f827650e7b54f8e97f0b3b2: Status 404 returned error can't find the container with id 83f130851523bf774469c3f2b76d5c9ced785cfa7f827650e7b54f8e97f0b3b2
Apr 22 16:21:39.633008 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:21:39.632319 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod751cb35b_96a4_4016_bb77_4c55bff2e4d6.slice/crio-6e45c25a4a18e34926d632a67e6a036c273acb948873bc430fcf8d396d38d4cc WatchSource:0}: Error finding container 6e45c25a4a18e34926d632a67e6a036c273acb948873bc430fcf8d396d38d4cc: Status 404 returned error can't find the container with id 6e45c25a4a18e34926d632a67e6a036c273acb948873bc430fcf8d396d38d4cc
Apr 22 16:21:39.817501 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:39.817316 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 16:21:39.896374 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:39.896350 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2r4t4"
Apr 22 16:21:39.896494 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:39.896354 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cwt8x"
Apr 22 16:21:39.896494 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:39.896442 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2r4t4" podUID="95147dbd-9393-4f07-9051-3461d90ddadb"
Apr 22 16:21:39.896607 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:39.896547 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cwt8x" podUID="e324836e-ef75-432e-978a-639279d2702e"
Apr 22 16:21:39.903443 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:39.903420 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zgppz" event={"ID":"ca6eb44a-3f8d-408b-ae0d-0ef553dc08d8","Type":"ContainerStarted","Data":"2741063f61b2d4cb0db4dbc74e22af7620e827062f144afe4eaf10e85778e1a6"}
Apr 22 16:21:39.904322 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:39.904301 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ljkt5" event={"ID":"1f2b5ca6-7540-4d0e-88b4-b34788bdeb77","Type":"ContainerStarted","Data":"2bb8116b12a2f93414d096c65f7895491b41203182c27131b435d9f7620a5237"}
Apr 22 16:21:39.905258 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:39.905239 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-k6bwx" event={"ID":"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927","Type":"ContainerStarted","Data":"d0af34214d04b9545a9062e4f159600be6756913c769d60cb8acb0bfd99c8cb2"}
Apr 22 16:21:39.906620 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:39.906598 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-251.ec2.internal" event={"ID":"0c1aad3db1097ad1e69a197a09060d7e","Type":"ContainerStarted","Data":"0154cda8d8d9524373197c371382880500ebeb3b6110de129212aba101e16af4"}
Apr 22 16:21:39.907494 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:39.907472 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gh4hm" event={"ID":"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b","Type":"ContainerStarted","Data":"89e2116562ef6f81add7a57ca918b65f10346cd514c1d2709f99335b7f3ec2c7"}
Apr 22 16:21:39.908302 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:39.908284 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5ml9p" event={"ID":"cc7a8537-a4ed-4ce3-aeda-9026e46f114f","Type":"ContainerStarted","Data":"3fe72eedf393f0599d7b1c295c6f7f1f30d88db72be9328a0c6aaed50e8cb96d"}
Apr 22 16:21:39.909733 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:39.909711 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" event={"ID":"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c","Type":"ContainerStarted","Data":"e7a7f955c708c183e6c57b6d4cca069770cc6d69164b26b90027dcfd5bdf6faf"}
Apr 22 16:21:39.910740 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:39.910715 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-q9qkz" event={"ID":"751cb35b-96a4-4016-bb77-4c55bff2e4d6","Type":"ContainerStarted","Data":"6e45c25a4a18e34926d632a67e6a036c273acb948873bc430fcf8d396d38d4cc"}
Apr 22 16:21:39.911652 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:39.911633 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-c98f2" event={"ID":"d9d485c9-31be-4577-affa-18e7f5a7b2bf","Type":"ContainerStarted","Data":"b1ba92b73bc0d73daefcb25069051a8d297ef0e72bc7cb1277a115cec2bca8c9"}
Apr 22 16:21:39.912492 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:39.912473 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb" event={"ID":"53214233-7ecc-46c2-acc3-f4fdcd12bddb","Type":"ContainerStarted","Data":"83f130851523bf774469c3f2b76d5c9ced785cfa7f827650e7b54f8e97f0b3b2"}
Apr 22 16:21:39.924795 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:39.924726 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-251.ec2.internal" podStartSLOduration=1.924716476 podStartE2EDuration="1.924716476s" podCreationTimestamp="2026-04-22 16:21:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:21:39.924428396 +0000 UTC m=+3.859742621" watchObservedRunningTime="2026-04-22 16:21:39.924716476 +0000 UTC m=+3.860030700"
Apr 22 16:21:40.206916 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:40.206845 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6q76d\" (UniqueName: \"kubernetes.io/projected/95147dbd-9393-4f07-9051-3461d90ddadb-kube-api-access-6q76d\") pod \"network-check-target-2r4t4\" (UID: \"95147dbd-9393-4f07-9051-3461d90ddadb\") " pod="openshift-network-diagnostics/network-check-target-2r4t4"
Apr 22 16:21:40.206916 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:40.206885 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs\") pod \"network-metrics-daemon-cwt8x\" (UID: \"e324836e-ef75-432e-978a-639279d2702e\") " pod="openshift-multus/network-metrics-daemon-cwt8x"
Apr 22 16:21:40.207119 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:40.207006 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 16:21:40.207119 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:40.207022 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 16:21:40.207119 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:40.207048 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 16:21:40.207119 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:40.207063 2573 projected.go:194] Error preparing data for projected volume kube-api-access-6q76d for pod openshift-network-diagnostics/network-check-target-2r4t4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 16:21:40.207119 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:40.207073 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs podName:e324836e-ef75-432e-978a-639279d2702e nodeName:}" failed. No retries permitted until 2026-04-22 16:21:42.207055599 +0000 UTC m=+6.142369802 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs") pod "network-metrics-daemon-cwt8x" (UID: "e324836e-ef75-432e-978a-639279d2702e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 16:21:40.207119 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:40.207121 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95147dbd-9393-4f07-9051-3461d90ddadb-kube-api-access-6q76d podName:95147dbd-9393-4f07-9051-3461d90ddadb nodeName:}" failed. No retries permitted until 2026-04-22 16:21:42.207102131 +0000 UTC m=+6.142416339 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-6q76d" (UniqueName: "kubernetes.io/projected/95147dbd-9393-4f07-9051-3461d90ddadb-kube-api-access-6q76d") pod "network-check-target-2r4t4" (UID: "95147dbd-9393-4f07-9051-3461d90ddadb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 16:21:40.933666 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:40.933580 2573 generic.go:358] "Generic (PLEG): container finished" podID="ad598d75d06ec2ecc31bfa284ccec4f8" containerID="20f7c843820c49c4443441e3469340611cb77fd75870baed424052b8fd749914" exitCode=0
Apr 22 16:21:40.934143 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:40.933690 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-251.ec2.internal" event={"ID":"ad598d75d06ec2ecc31bfa284ccec4f8","Type":"ContainerDied","Data":"20f7c843820c49c4443441e3469340611cb77fd75870baed424052b8fd749914"}
Apr 22 16:21:41.085493 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:41.084942 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-7455g"]
Apr 22 16:21:41.087190 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:41.086780 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:21:41.087190 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:41.086860 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7455g" podUID="f07a62bd-5116-4f09-94bd-5cf21c3a890b"
Apr 22 16:21:41.115400 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:41.115208 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f07a62bd-5116-4f09-94bd-5cf21c3a890b-original-pull-secret\") pod \"global-pull-secret-syncer-7455g\" (UID: \"f07a62bd-5116-4f09-94bd-5cf21c3a890b\") " pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:21:41.115400 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:41.115272 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f07a62bd-5116-4f09-94bd-5cf21c3a890b-dbus\") pod \"global-pull-secret-syncer-7455g\" (UID: \"f07a62bd-5116-4f09-94bd-5cf21c3a890b\") " pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:21:41.115400 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:41.115326 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f07a62bd-5116-4f09-94bd-5cf21c3a890b-kubelet-config\") pod \"global-pull-secret-syncer-7455g\" (UID: \"f07a62bd-5116-4f09-94bd-5cf21c3a890b\") " pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:21:41.225816 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:41.221419 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f07a62bd-5116-4f09-94bd-5cf21c3a890b-dbus\") pod \"global-pull-secret-syncer-7455g\" (UID: \"f07a62bd-5116-4f09-94bd-5cf21c3a890b\") " pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:21:41.225816 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:41.221508 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f07a62bd-5116-4f09-94bd-5cf21c3a890b-kubelet-config\") pod \"global-pull-secret-syncer-7455g\" (UID: \"f07a62bd-5116-4f09-94bd-5cf21c3a890b\") " pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:21:41.225816 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:41.221556 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f07a62bd-5116-4f09-94bd-5cf21c3a890b-original-pull-secret\") pod \"global-pull-secret-syncer-7455g\" (UID: \"f07a62bd-5116-4f09-94bd-5cf21c3a890b\") " pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:21:41.225816 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:41.221684 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 16:21:41.225816 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:41.221748 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f07a62bd-5116-4f09-94bd-5cf21c3a890b-original-pull-secret podName:f07a62bd-5116-4f09-94bd-5cf21c3a890b nodeName:}" failed. No retries permitted until 2026-04-22 16:21:41.721730268 +0000 UTC m=+5.657044476 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f07a62bd-5116-4f09-94bd-5cf21c3a890b-original-pull-secret") pod "global-pull-secret-syncer-7455g" (UID: "f07a62bd-5116-4f09-94bd-5cf21c3a890b") : object "kube-system"/"original-pull-secret" not registered
Apr 22 16:21:41.225816 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:41.221913 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f07a62bd-5116-4f09-94bd-5cf21c3a890b-dbus\") pod \"global-pull-secret-syncer-7455g\" (UID: \"f07a62bd-5116-4f09-94bd-5cf21c3a890b\") " pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:21:41.225816 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:41.221981 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f07a62bd-5116-4f09-94bd-5cf21c3a890b-kubelet-config\") pod \"global-pull-secret-syncer-7455g\" (UID: \"f07a62bd-5116-4f09-94bd-5cf21c3a890b\") " pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:21:41.726108 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:41.726073 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f07a62bd-5116-4f09-94bd-5cf21c3a890b-original-pull-secret\") pod \"global-pull-secret-syncer-7455g\" (UID: \"f07a62bd-5116-4f09-94bd-5cf21c3a890b\") " pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:21:41.726286 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:41.726256 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 16:21:41.726347 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:41.726315 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f07a62bd-5116-4f09-94bd-5cf21c3a890b-original-pull-secret podName:f07a62bd-5116-4f09-94bd-5cf21c3a890b nodeName:}" failed. No retries permitted until 2026-04-22 16:21:42.726297526 +0000 UTC m=+6.661611736 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f07a62bd-5116-4f09-94bd-5cf21c3a890b-original-pull-secret") pod "global-pull-secret-syncer-7455g" (UID: "f07a62bd-5116-4f09-94bd-5cf21c3a890b") : object "kube-system"/"original-pull-secret" not registered
Apr 22 16:21:41.897073 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:41.897040 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cwt8x"
Apr 22 16:21:41.897247 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:41.897184 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cwt8x" podUID="e324836e-ef75-432e-978a-639279d2702e"
Apr 22 16:21:41.897675 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:41.897544 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2r4t4"
Apr 22 16:21:41.897675 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:41.897632 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2r4t4" podUID="95147dbd-9393-4f07-9051-3461d90ddadb"
Apr 22 16:21:41.939857 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:41.939448 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-251.ec2.internal" event={"ID":"ad598d75d06ec2ecc31bfa284ccec4f8","Type":"ContainerStarted","Data":"2d845b7bf0649cca74f218e7730ce12350c86b3a9c85c22ef9d552afc0d211c6"}
Apr 22 16:21:42.229081 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:42.228473 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6q76d\" (UniqueName: \"kubernetes.io/projected/95147dbd-9393-4f07-9051-3461d90ddadb-kube-api-access-6q76d\") pod \"network-check-target-2r4t4\" (UID: \"95147dbd-9393-4f07-9051-3461d90ddadb\") " pod="openshift-network-diagnostics/network-check-target-2r4t4"
Apr 22 16:21:42.229081 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:42.228507 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs\") pod \"network-metrics-daemon-cwt8x\" (UID: \"e324836e-ef75-432e-978a-639279d2702e\") " pod="openshift-multus/network-metrics-daemon-cwt8x"
Apr 22 16:21:42.229081 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:42.228626 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 16:21:42.229081 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:42.228667 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs podName:e324836e-ef75-432e-978a-639279d2702e nodeName:}" failed. No retries permitted until 2026-04-22 16:21:46.228654564 +0000 UTC m=+10.163968767 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs") pod "network-metrics-daemon-cwt8x" (UID: "e324836e-ef75-432e-978a-639279d2702e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 16:21:42.229081 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:42.228993 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 16:21:42.229081 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:42.229007 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 16:21:42.229081 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:42.229015 2573 projected.go:194] Error preparing data for projected volume kube-api-access-6q76d for pod openshift-network-diagnostics/network-check-target-2r4t4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 16:21:42.229081 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:42.229052 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95147dbd-9393-4f07-9051-3461d90ddadb-kube-api-access-6q76d podName:95147dbd-9393-4f07-9051-3461d90ddadb nodeName:}" failed. No retries permitted until 2026-04-22 16:21:46.229037025 +0000 UTC m=+10.164351228 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-6q76d" (UniqueName: "kubernetes.io/projected/95147dbd-9393-4f07-9051-3461d90ddadb-kube-api-access-6q76d") pod "network-check-target-2r4t4" (UID: "95147dbd-9393-4f07-9051-3461d90ddadb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 16:21:42.732539 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:42.732503 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f07a62bd-5116-4f09-94bd-5cf21c3a890b-original-pull-secret\") pod \"global-pull-secret-syncer-7455g\" (UID: \"f07a62bd-5116-4f09-94bd-5cf21c3a890b\") " pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:21:42.732711 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:42.732694 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 16:21:42.732800 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:42.732775 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f07a62bd-5116-4f09-94bd-5cf21c3a890b-original-pull-secret podName:f07a62bd-5116-4f09-94bd-5cf21c3a890b nodeName:}" failed. No retries permitted until 2026-04-22 16:21:44.732741428 +0000 UTC m=+8.668055648 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f07a62bd-5116-4f09-94bd-5cf21c3a890b-original-pull-secret") pod "global-pull-secret-syncer-7455g" (UID: "f07a62bd-5116-4f09-94bd-5cf21c3a890b") : object "kube-system"/"original-pull-secret" not registered
Apr 22 16:21:42.900150 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:42.900099 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:21:42.900324 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:42.900233 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7455g" podUID="f07a62bd-5116-4f09-94bd-5cf21c3a890b"
Apr 22 16:21:43.896675 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:43.896646 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2r4t4"
Apr 22 16:21:43.897121 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:43.896794 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2r4t4" podUID="95147dbd-9393-4f07-9051-3461d90ddadb"
Apr 22 16:21:43.897121 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:43.896856 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cwt8x"
Apr 22 16:21:43.897121 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:43.896983 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cwt8x" podUID="e324836e-ef75-432e-978a-639279d2702e"
Apr 22 16:21:44.748766 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:44.748718 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f07a62bd-5116-4f09-94bd-5cf21c3a890b-original-pull-secret\") pod \"global-pull-secret-syncer-7455g\" (UID: \"f07a62bd-5116-4f09-94bd-5cf21c3a890b\") " pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:21:44.748943 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:44.748887 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 16:21:44.749008 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:44.748969 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f07a62bd-5116-4f09-94bd-5cf21c3a890b-original-pull-secret podName:f07a62bd-5116-4f09-94bd-5cf21c3a890b nodeName:}" failed. No retries permitted until 2026-04-22 16:21:48.748949554 +0000 UTC m=+12.684263770 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f07a62bd-5116-4f09-94bd-5cf21c3a890b-original-pull-secret") pod "global-pull-secret-syncer-7455g" (UID: "f07a62bd-5116-4f09-94bd-5cf21c3a890b") : object "kube-system"/"original-pull-secret" not registered
Apr 22 16:21:44.900048 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:44.900019 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:21:44.900494 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:44.900139 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7455g" podUID="f07a62bd-5116-4f09-94bd-5cf21c3a890b"
Apr 22 16:21:45.897371 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:45.896709 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2r4t4"
Apr 22 16:21:45.897371 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:45.896847 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2r4t4" podUID="95147dbd-9393-4f07-9051-3461d90ddadb"
Apr 22 16:21:45.897371 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:45.897223 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cwt8x"
Apr 22 16:21:45.897371 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:45.897324 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cwt8x" podUID="e324836e-ef75-432e-978a-639279d2702e"
Apr 22 16:21:46.259804 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:46.259699 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6q76d\" (UniqueName: \"kubernetes.io/projected/95147dbd-9393-4f07-9051-3461d90ddadb-kube-api-access-6q76d\") pod \"network-check-target-2r4t4\" (UID: \"95147dbd-9393-4f07-9051-3461d90ddadb\") " pod="openshift-network-diagnostics/network-check-target-2r4t4"
Apr 22 16:21:46.259804 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:46.259767 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs\") pod \"network-metrics-daemon-cwt8x\" (UID: \"e324836e-ef75-432e-978a-639279d2702e\") " pod="openshift-multus/network-metrics-daemon-cwt8x"
Apr 22 16:21:46.260302 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:46.259917 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 16:21:46.260302 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:46.259978 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs podName:e324836e-ef75-432e-978a-639279d2702e nodeName:}" failed. No retries permitted until 2026-04-22 16:21:54.259960201 +0000 UTC m=+18.195274431 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs") pod "network-metrics-daemon-cwt8x" (UID: "e324836e-ef75-432e-978a-639279d2702e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 16:21:46.260302 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:46.260226 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 16:21:46.260302 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:46.260250 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 16:21:46.260302 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:46.260264 2573 projected.go:194] Error preparing data for projected volume kube-api-access-6q76d for pod openshift-network-diagnostics/network-check-target-2r4t4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 16:21:46.260565 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:46.260323 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95147dbd-9393-4f07-9051-3461d90ddadb-kube-api-access-6q76d podName:95147dbd-9393-4f07-9051-3461d90ddadb nodeName:}" failed. No retries permitted until 2026-04-22 16:21:54.260306212 +0000 UTC m=+18.195620431 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-6q76d" (UniqueName: "kubernetes.io/projected/95147dbd-9393-4f07-9051-3461d90ddadb-kube-api-access-6q76d") pod "network-check-target-2r4t4" (UID: "95147dbd-9393-4f07-9051-3461d90ddadb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 16:21:46.898009 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:46.897975 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:21:46.898189 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:46.898102 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7455g" podUID="f07a62bd-5116-4f09-94bd-5cf21c3a890b"
Apr 22 16:21:47.896557 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:47.896522 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cwt8x"
Apr 22 16:21:47.897055 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:47.896716 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cwt8x" podUID="e324836e-ef75-432e-978a-639279d2702e"
Apr 22 16:21:47.897055 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:47.896767 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2r4t4"
Apr 22 16:21:47.897055 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:47.896877 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2r4t4" podUID="95147dbd-9393-4f07-9051-3461d90ddadb"
Apr 22 16:21:48.778810 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:48.778624 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f07a62bd-5116-4f09-94bd-5cf21c3a890b-original-pull-secret\") pod \"global-pull-secret-syncer-7455g\" (UID: \"f07a62bd-5116-4f09-94bd-5cf21c3a890b\") " pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:21:48.778810 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:48.778772 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 16:21:48.779005 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:48.778850 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f07a62bd-5116-4f09-94bd-5cf21c3a890b-original-pull-secret podName:f07a62bd-5116-4f09-94bd-5cf21c3a890b nodeName:}" failed. No retries permitted until 2026-04-22 16:21:56.778833525 +0000 UTC m=+20.714147744 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f07a62bd-5116-4f09-94bd-5cf21c3a890b-original-pull-secret") pod "global-pull-secret-syncer-7455g" (UID: "f07a62bd-5116-4f09-94bd-5cf21c3a890b") : object "kube-system"/"original-pull-secret" not registered
Apr 22 16:21:48.899553 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:48.899524 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:21:48.900019 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:48.899651 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7455g" podUID="f07a62bd-5116-4f09-94bd-5cf21c3a890b"
Apr 22 16:21:49.896243 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:49.896210 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2r4t4"
Apr 22 16:21:49.896408 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:49.896210 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cwt8x"
Apr 22 16:21:49.896408 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:49.896364 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2r4t4" podUID="95147dbd-9393-4f07-9051-3461d90ddadb"
Apr 22 16:21:49.896497 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:49.896473 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cwt8x" podUID="e324836e-ef75-432e-978a-639279d2702e"
Apr 22 16:21:50.896586 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:50.896555 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:21:50.897070 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:50.896668 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7455g" podUID="f07a62bd-5116-4f09-94bd-5cf21c3a890b"
Apr 22 16:21:51.896534 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:51.896510 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cwt8x"
Apr 22 16:21:51.896671 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:51.896510 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2r4t4" Apr 22 16:21:51.896671 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:51.896609 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cwt8x" podUID="e324836e-ef75-432e-978a-639279d2702e" Apr 22 16:21:51.896973 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:51.896676 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2r4t4" podUID="95147dbd-9393-4f07-9051-3461d90ddadb" Apr 22 16:21:52.897297 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:52.897257 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7455g" Apr 22 16:21:52.897733 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:52.897405 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7455g" podUID="f07a62bd-5116-4f09-94bd-5cf21c3a890b" Apr 22 16:21:53.897144 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:53.897106 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cwt8x" Apr 22 16:21:53.897357 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:53.897107 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2r4t4" Apr 22 16:21:53.897357 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:53.897259 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cwt8x" podUID="e324836e-ef75-432e-978a-639279d2702e" Apr 22 16:21:53.897724 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:53.897390 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2r4t4" podUID="95147dbd-9393-4f07-9051-3461d90ddadb" Apr 22 16:21:54.323990 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:54.323946 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6q76d\" (UniqueName: \"kubernetes.io/projected/95147dbd-9393-4f07-9051-3461d90ddadb-kube-api-access-6q76d\") pod \"network-check-target-2r4t4\" (UID: \"95147dbd-9393-4f07-9051-3461d90ddadb\") " pod="openshift-network-diagnostics/network-check-target-2r4t4" Apr 22 16:21:54.324173 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:54.324002 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs\") pod \"network-metrics-daemon-cwt8x\" (UID: \"e324836e-ef75-432e-978a-639279d2702e\") " pod="openshift-multus/network-metrics-daemon-cwt8x" Apr 22 16:21:54.324173 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:54.324134 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 16:21:54.324173 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:54.324156 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 16:21:54.324173 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:54.324156 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:21:54.324173 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:54.324166 2573 projected.go:194] Error preparing data for projected volume kube-api-access-6q76d for pod openshift-network-diagnostics/network-check-target-2r4t4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 16:21:54.324433 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:54.324235 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs podName:e324836e-ef75-432e-978a-639279d2702e nodeName:}" failed. No retries permitted until 2026-04-22 16:22:10.324212371 +0000 UTC m=+34.259526597 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs") pod "network-metrics-daemon-cwt8x" (UID: "e324836e-ef75-432e-978a-639279d2702e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:21:54.324433 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:54.324258 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/95147dbd-9393-4f07-9051-3461d90ddadb-kube-api-access-6q76d podName:95147dbd-9393-4f07-9051-3461d90ddadb nodeName:}" failed. No retries permitted until 2026-04-22 16:22:10.324248617 +0000 UTC m=+34.259562820 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-6q76d" (UniqueName: "kubernetes.io/projected/95147dbd-9393-4f07-9051-3461d90ddadb-kube-api-access-6q76d") pod "network-check-target-2r4t4" (UID: "95147dbd-9393-4f07-9051-3461d90ddadb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 16:21:54.896918 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:54.896884 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7455g" Apr 22 16:21:54.897096 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:54.897016 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7455g" podUID="f07a62bd-5116-4f09-94bd-5cf21c3a890b" Apr 22 16:21:55.896649 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:55.896614 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cwt8x" Apr 22 16:21:55.897124 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:55.896615 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2r4t4" Apr 22 16:21:55.897124 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:55.896743 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cwt8x" podUID="e324836e-ef75-432e-978a-639279d2702e" Apr 22 16:21:55.897124 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:55.896820 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2r4t4" podUID="95147dbd-9393-4f07-9051-3461d90ddadb" Apr 22 16:21:56.845382 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:56.845140 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f07a62bd-5116-4f09-94bd-5cf21c3a890b-original-pull-secret\") pod \"global-pull-secret-syncer-7455g\" (UID: \"f07a62bd-5116-4f09-94bd-5cf21c3a890b\") " pod="kube-system/global-pull-secret-syncer-7455g" Apr 22 16:21:56.845491 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:56.845424 2573 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 16:21:56.845548 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:56.845501 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f07a62bd-5116-4f09-94bd-5cf21c3a890b-original-pull-secret podName:f07a62bd-5116-4f09-94bd-5cf21c3a890b nodeName:}" failed. No retries permitted until 2026-04-22 16:22:12.845481488 +0000 UTC m=+36.780795690 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f07a62bd-5116-4f09-94bd-5cf21c3a890b-original-pull-secret") pod "global-pull-secret-syncer-7455g" (UID: "f07a62bd-5116-4f09-94bd-5cf21c3a890b") : object "kube-system"/"original-pull-secret" not registered Apr 22 16:21:56.898343 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:56.898314 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7455g" Apr 22 16:21:56.898726 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:56.898436 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7455g" podUID="f07a62bd-5116-4f09-94bd-5cf21c3a890b" Apr 22 16:21:56.961926 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:56.961834 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb" event={"ID":"53214233-7ecc-46c2-acc3-f4fdcd12bddb","Type":"ContainerStarted","Data":"4e2f77f3fcf5fa3dc534b8fc8251313788030e45cf745aaff179d6c230ec4b60"} Apr 22 16:21:56.963061 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:56.963039 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ljkt5" event={"ID":"1f2b5ca6-7540-4d0e-88b4-b34788bdeb77","Type":"ContainerStarted","Data":"be595b6569827b700554860ea549183c10e867db656b5b0995da9986e05c450b"} Apr 22 16:21:56.964438 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:56.964400 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-k6bwx" event={"ID":"bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927","Type":"ContainerStarted","Data":"5fdd13bf6bd24931a46507969948f2935c788fde7f1f192628f071aa96e6d20e"} Apr 22 16:21:56.965741 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:56.965719 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gh4hm" event={"ID":"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b","Type":"ContainerStarted","Data":"879735008ca838cdf09f78a909ab3d92e48a36eac816b60f74d65dbacc74a272"} Apr 22 16:21:56.966955 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:56.966934 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5ml9p" event={"ID":"cc7a8537-a4ed-4ce3-aeda-9026e46f114f","Type":"ContainerStarted","Data":"851aef8659c90d193241f1de04b5cca5b5849d44544d9612804836dbf92307c7"} Apr 22 16:21:56.981151 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:56.981110 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-251.ec2.internal" podStartSLOduration=18.981100557 podStartE2EDuration="18.981100557s" podCreationTimestamp="2026-04-22 16:21:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:21:41.955217762 +0000 UTC m=+5.890531989" watchObservedRunningTime="2026-04-22 16:21:56.981100557 +0000 UTC m=+20.916414782" Apr 22 16:21:56.981315 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:56.981170 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-k6bwx" podStartSLOduration=3.958080491 podStartE2EDuration="20.981166847s" podCreationTimestamp="2026-04-22 16:21:36 +0000 UTC" firstStartedPulling="2026-04-22 16:21:39.624258899 +0000 UTC m=+3.559573126" lastFinishedPulling="2026-04-22 16:21:56.647345275 +0000 UTC m=+20.582659482" observedRunningTime="2026-04-22 16:21:56.980898034 +0000 UTC m=+20.916212260" watchObservedRunningTime="2026-04-22 16:21:56.981166847 +0000 UTC m=+20.916481072" Apr 22 16:21:56.996765 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:56.996716 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-5ml9p" podStartSLOduration=3.985371511 podStartE2EDuration="20.996702688s" podCreationTimestamp="2026-04-22 16:21:36 +0000 UTC" firstStartedPulling="2026-04-22 16:21:39.628487881 +0000 UTC m=+3.563802091" lastFinishedPulling="2026-04-22 16:21:56.639819064 +0000 UTC 
m=+20.575133268" observedRunningTime="2026-04-22 16:21:56.996209463 +0000 UTC m=+20.931523699" watchObservedRunningTime="2026-04-22 16:21:56.996702688 +0000 UTC m=+20.932016912" Apr 22 16:21:57.861296 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:57.861156 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 16:21:57.897146 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:57.897120 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2r4t4" Apr 22 16:21:57.897146 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:57.897134 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cwt8x" Apr 22 16:21:57.897262 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:57.897201 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2r4t4" podUID="95147dbd-9393-4f07-9051-3461d90ddadb" Apr 22 16:21:57.897262 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:57.897258 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cwt8x" podUID="e324836e-ef75-432e-978a-639279d2702e" Apr 22 16:21:57.971932 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:57.971911 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-q9qkz" event={"ID":"751cb35b-96a4-4016-bb77-4c55bff2e4d6","Type":"ContainerStarted","Data":"7edc62c830ed7c9e9e7a489cdc66155cc784c60e1a51d5570209d9b299e1d811"} Apr 22 16:21:57.973486 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:57.973467 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb" event={"ID":"53214233-7ecc-46c2-acc3-f4fdcd12bddb","Type":"ContainerStarted","Data":"9a82e2c56854430b3fa2fbf396b39384cfd6e435b623780d1fc01d595b77a3e2"} Apr 22 16:21:57.974669 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:57.974650 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-zgppz" event={"ID":"ca6eb44a-3f8d-408b-ae0d-0ef553dc08d8","Type":"ContainerStarted","Data":"e2686c5a521171af438f705eeb789c7cc50da0c209edfed5e910af928c5ed4b0"} Apr 22 16:21:57.978600 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:57.978578 2573 generic.go:358] "Generic (PLEG): container finished" podID="c9801573-c6f5-4b2d-a5ea-6b5b53cf411b" containerID="879735008ca838cdf09f78a909ab3d92e48a36eac816b60f74d65dbacc74a272" exitCode=0 Apr 22 16:21:57.978681 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:57.978639 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gh4hm" event={"ID":"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b","Type":"ContainerDied","Data":"879735008ca838cdf09f78a909ab3d92e48a36eac816b60f74d65dbacc74a272"} Apr 22 16:21:57.981299 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:57.981274 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" 
event={"ID":"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c","Type":"ContainerStarted","Data":"c5c3005ef16d8aa52c69fe911d584c621cdd77cd57850263a580a13b6efb4bd9"} Apr 22 16:21:57.981299 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:57.981306 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" event={"ID":"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c","Type":"ContainerStarted","Data":"bf9b2e54694d8d13d7e7985f745369a93313bcefa008f575682e37ba3540fda3"} Apr 22 16:21:57.981439 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:57.981320 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" event={"ID":"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c","Type":"ContainerStarted","Data":"9119a70dbb2b8ba408e07b3a05f6e31cb65b39f47a4b9ea9d373699aa5820dd5"} Apr 22 16:21:57.981439 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:57.981333 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" event={"ID":"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c","Type":"ContainerStarted","Data":"5c8fa241281e01bcdbb92771e653a03db262dd3eeddd943f5bcc042f93b42d6f"} Apr 22 16:21:57.981439 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:57.981342 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" event={"ID":"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c","Type":"ContainerStarted","Data":"126a50734d9f9d53f4accb15c76f4238640c81672639e4e857f77f581967432c"} Apr 22 16:21:57.981439 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:57.981350 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" event={"ID":"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c","Type":"ContainerStarted","Data":"138aa3f947d5e1c1bbf8685dffbade4dbf82f57c87b3731aec6691ad08fc4650"} Apr 22 16:21:57.985854 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:57.985822 2573 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kube-system/konnectivity-agent-q9qkz" podStartSLOduration=13.108744477 podStartE2EDuration="21.9858122s" podCreationTimestamp="2026-04-22 16:21:36 +0000 UTC" firstStartedPulling="2026-04-22 16:21:39.634741581 +0000 UTC m=+3.570055784" lastFinishedPulling="2026-04-22 16:21:48.5118093 +0000 UTC m=+12.447123507" observedRunningTime="2026-04-22 16:21:57.984827369 +0000 UTC m=+21.920141591" watchObservedRunningTime="2026-04-22 16:21:57.9858122 +0000 UTC m=+21.921126427" Apr 22 16:21:57.996775 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:57.996730 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-ljkt5" podStartSLOduration=4.98657311 podStartE2EDuration="21.99671931s" podCreationTimestamp="2026-04-22 16:21:36 +0000 UTC" firstStartedPulling="2026-04-22 16:21:39.627778041 +0000 UTC m=+3.563092244" lastFinishedPulling="2026-04-22 16:21:56.637924238 +0000 UTC m=+20.573238444" observedRunningTime="2026-04-22 16:21:57.996703614 +0000 UTC m=+21.932017836" watchObservedRunningTime="2026-04-22 16:21:57.99671931 +0000 UTC m=+21.932033535" Apr 22 16:21:58.008622 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:58.008591 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-zgppz" podStartSLOduration=5.001278912 podStartE2EDuration="22.008580624s" podCreationTimestamp="2026-04-22 16:21:36 +0000 UTC" firstStartedPulling="2026-04-22 16:21:39.630569907 +0000 UTC m=+3.565884110" lastFinishedPulling="2026-04-22 16:21:56.637871615 +0000 UTC m=+20.573185822" observedRunningTime="2026-04-22 16:21:58.008187146 +0000 UTC m=+21.943501382" watchObservedRunningTime="2026-04-22 16:21:58.008580624 +0000 UTC m=+21.943894850" Apr 22 16:21:58.385585 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:58.385555 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-q9qkz" Apr 22 16:21:58.386375 ip-10-0-141-251 
kubenswrapper[2573]: I0422 16:21:58.386350 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-q9qkz" Apr 22 16:21:58.545912 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:58.545817 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T16:21:57.861293283Z","UUID":"4ae70fd4-8339-46aa-9972-5e5526ad8603","Handler":null,"Name":"","Endpoint":""} Apr 22 16:21:58.547526 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:58.547482 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 16:21:58.547526 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:58.547514 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 16:21:58.897294 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:58.897269 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7455g" Apr 22 16:21:58.897413 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:58.897391 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7455g" podUID="f07a62bd-5116-4f09-94bd-5cf21c3a890b" Apr 22 16:21:58.984177 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:58.984147 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-c98f2" event={"ID":"d9d485c9-31be-4577-affa-18e7f5a7b2bf","Type":"ContainerStarted","Data":"fe1ad5ca304c19454f0aa8cb97b7a068bd8142c906930c2ff70980050b7b5008"} Apr 22 16:21:58.986146 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:58.986123 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb" event={"ID":"53214233-7ecc-46c2-acc3-f4fdcd12bddb","Type":"ContainerStarted","Data":"752090efd5a51c14eb1f3ce13cf3a163d5825f730531c2585430e64f97f542af"} Apr 22 16:21:58.986503 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:58.986485 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-q9qkz" Apr 22 16:21:58.987038 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:58.987021 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-q9qkz" Apr 22 16:21:58.997578 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:58.997540 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-c98f2" podStartSLOduration=6.014734568 podStartE2EDuration="22.997525688s" podCreationTimestamp="2026-04-22 16:21:36 +0000 UTC" firstStartedPulling="2026-04-22 16:21:39.633873127 +0000 UTC m=+3.569187333" lastFinishedPulling="2026-04-22 16:21:56.616664244 +0000 UTC m=+20.551978453" observedRunningTime="2026-04-22 16:21:58.996856229 +0000 UTC m=+22.932170453" watchObservedRunningTime="2026-04-22 16:21:58.997525688 +0000 UTC m=+22.932839915" Apr 22 16:21:59.012227 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:59.012182 2573 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-zcdlb" podStartSLOduration=3.871665578 podStartE2EDuration="23.012166664s" podCreationTimestamp="2026-04-22 16:21:36 +0000 UTC" firstStartedPulling="2026-04-22 16:21:39.634153389 +0000 UTC m=+3.569467597" lastFinishedPulling="2026-04-22 16:21:58.774654479 +0000 UTC m=+22.709968683" observedRunningTime="2026-04-22 16:21:59.011480914 +0000 UTC m=+22.946795140" watchObservedRunningTime="2026-04-22 16:21:59.012166664 +0000 UTC m=+22.947480889"
Apr 22 16:21:59.897372 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:59.897177 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cwt8x"
Apr 22 16:21:59.897597 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:59.897181 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2r4t4"
Apr 22 16:21:59.897597 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:59.897416 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cwt8x" podUID="e324836e-ef75-432e-978a-639279d2702e"
Apr 22 16:21:59.897597 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:21:59.897487 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2r4t4" podUID="95147dbd-9393-4f07-9051-3461d90ddadb"
Apr 22 16:21:59.991654 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:21:59.991616 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" event={"ID":"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c","Type":"ContainerStarted","Data":"e86176cf8c03f4d0c0da1510cd7d370913f4b7a02e5e2307768c660b23fdc289"}
Apr 22 16:22:00.897151 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:00.897116 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:22:00.897340 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:00.897243 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7455g" podUID="f07a62bd-5116-4f09-94bd-5cf21c3a890b"
Apr 22 16:22:01.896675 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:01.896641 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2r4t4"
Apr 22 16:22:01.897314 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:01.896743 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2r4t4" podUID="95147dbd-9393-4f07-9051-3461d90ddadb"
Apr 22 16:22:01.897314 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:01.896781 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cwt8x"
Apr 22 16:22:01.897314 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:01.896883 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cwt8x" podUID="e324836e-ef75-432e-978a-639279d2702e"
Apr 22 16:22:01.999949 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:01.999884 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" event={"ID":"239cf2fe-c977-430f-bf19-3a0e5dbd5f8c","Type":"ContainerStarted","Data":"355c80d38ec0fc0c883be7a065ef89f846795affde5ccbb8bd56418e006c51cc"}
Apr 22 16:22:02.000424 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:02.000333 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2mskd"
Apr 22 16:22:02.019560 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:02.019388 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2mskd"
Apr 22 16:22:02.054631 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:02.054584 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" podStartSLOduration=8.634360435 podStartE2EDuration="26.054568869s" podCreationTimestamp="2026-04-22 16:21:36 +0000 UTC" firstStartedPulling="2026-04-22 16:21:39.623003027 +0000 UTC m=+3.558317244" lastFinishedPulling="2026-04-22 16:21:57.043211461 +0000 UTC m=+20.978525678" observedRunningTime="2026-04-22 16:22:02.054282366 +0000 UTC m=+25.989596590" watchObservedRunningTime="2026-04-22 16:22:02.054568869 +0000 UTC m=+25.989883116"
Apr 22 16:22:02.896519 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:02.896480 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:22:02.896694 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:02.896621 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7455g" podUID="f07a62bd-5116-4f09-94bd-5cf21c3a890b"
Apr 22 16:22:03.003110 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:03.003056 2573 generic.go:358] "Generic (PLEG): container finished" podID="c9801573-c6f5-4b2d-a5ea-6b5b53cf411b" containerID="29a773b7ad1a3c92e18e1cd186a71db1bdff9d59fa2c5a19acbd2379eebd9be6" exitCode=0
Apr 22 16:22:03.003110 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:03.003097 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gh4hm" event={"ID":"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b","Type":"ContainerDied","Data":"29a773b7ad1a3c92e18e1cd186a71db1bdff9d59fa2c5a19acbd2379eebd9be6"}
Apr 22 16:22:03.003778 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:03.003714 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2mskd"
Apr 22 16:22:03.003778 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:03.003746 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2mskd"
Apr 22 16:22:03.018732 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:03.018710 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2mskd"
Apr 22 16:22:03.876360 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:03.876116 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7455g"]
Apr 22 16:22:03.876538 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:03.876451 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:22:03.876593 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:03.876567 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7455g" podUID="f07a62bd-5116-4f09-94bd-5cf21c3a890b"
Apr 22 16:22:03.876959 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:03.876940 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cwt8x"]
Apr 22 16:22:03.877040 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:03.877030 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cwt8x"
Apr 22 16:22:03.877120 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:03.877105 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cwt8x" podUID="e324836e-ef75-432e-978a-639279d2702e"
Apr 22 16:22:03.877408 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:03.877388 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2r4t4"]
Apr 22 16:22:03.877490 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:03.877477 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2r4t4"
Apr 22 16:22:03.877572 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:03.877554 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2r4t4" podUID="95147dbd-9393-4f07-9051-3461d90ddadb"
Apr 22 16:22:04.006459 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:04.006378 2573 generic.go:358] "Generic (PLEG): container finished" podID="c9801573-c6f5-4b2d-a5ea-6b5b53cf411b" containerID="49b190680f37018a34266be3db4dfc4fb8ae39a686ae22f938cb3d5473e3494d" exitCode=0
Apr 22 16:22:04.006893 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:04.006461 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gh4hm" event={"ID":"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b","Type":"ContainerDied","Data":"49b190680f37018a34266be3db4dfc4fb8ae39a686ae22f938cb3d5473e3494d"}
Apr 22 16:22:05.010687 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:05.010651 2573 generic.go:358] "Generic (PLEG): container finished" podID="c9801573-c6f5-4b2d-a5ea-6b5b53cf411b" containerID="331b193642f024a7bdac2a7a7443229ca4d961b8f7c0de3fe2961b9d716250d6" exitCode=0
Apr 22 16:22:05.011059 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:05.010733 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gh4hm" event={"ID":"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b","Type":"ContainerDied","Data":"331b193642f024a7bdac2a7a7443229ca4d961b8f7c0de3fe2961b9d716250d6"}
Apr 22 16:22:05.897055 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:05.897024 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cwt8x"
Apr 22 16:22:05.897230 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:05.897024 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2r4t4"
Apr 22 16:22:05.897230 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:05.897159 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cwt8x" podUID="e324836e-ef75-432e-978a-639279d2702e"
Apr 22 16:22:05.897230 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:05.897184 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2r4t4" podUID="95147dbd-9393-4f07-9051-3461d90ddadb"
Apr 22 16:22:05.897230 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:05.897024 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:22:05.897422 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:05.897262 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7455g" podUID="f07a62bd-5116-4f09-94bd-5cf21c3a890b"
Apr 22 16:22:07.896912 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:07.896875 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2r4t4"
Apr 22 16:22:07.897574 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:07.897001 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2r4t4" podUID="95147dbd-9393-4f07-9051-3461d90ddadb"
Apr 22 16:22:07.897574 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:07.896875 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cwt8x"
Apr 22 16:22:07.897574 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:07.897106 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cwt8x" podUID="e324836e-ef75-432e-978a-639279d2702e"
Apr 22 16:22:07.897574 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:07.896883 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:22:07.897574 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:07.897184 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7455g" podUID="f07a62bd-5116-4f09-94bd-5cf21c3a890b"
Apr 22 16:22:09.857288 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:09.857252 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-251.ec2.internal" event="NodeReady"
Apr 22 16:22:09.857896 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:09.857407 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 16:22:09.896217 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:09.896171 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-c2xk7"]
Apr 22 16:22:09.896407 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:09.896242 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:22:09.896407 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:09.896279 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2r4t4"
Apr 22 16:22:09.896516 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:09.896457 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cwt8x"
Apr 22 16:22:09.898880 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:09.898588 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 16:22:09.898880 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:09.898604 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dv4l7\""
Apr 22 16:22:09.898880 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:09.898603 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 16:22:09.898880 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:09.898738 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 16:22:09.898880 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:09.898788 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 16:22:09.899188 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:09.899160 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-phtvm\""
Apr 22 16:22:09.928411 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:09.928352 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8524b"]
Apr 22 16:22:09.928562 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:09.928455 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-c2xk7"
Apr 22 16:22:09.930600 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:09.930577 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 16:22:09.930719 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:09.930610 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pdvf2\""
Apr 22 16:22:09.930719 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:09.930623 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 16:22:09.930719 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:09.930690 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 16:22:09.948647 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:09.948618 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-c2xk7"]
Apr 22 16:22:09.948647 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:09.948648 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8524b"]
Apr 22 16:22:09.948847 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:09.948790 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8524b"
Apr 22 16:22:09.950941 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:09.950919 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 16:22:09.951061 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:09.950964 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 16:22:09.951061 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:09.951021 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fsrq7\""
Apr 22 16:22:10.056177 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:10.056140 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ed730508-b5b5-44cd-b56a-f58225697c5d-tmp-dir\") pod \"dns-default-8524b\" (UID: \"ed730508-b5b5-44cd-b56a-f58225697c5d\") " pod="openshift-dns/dns-default-8524b"
Apr 22 16:22:10.056369 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:10.056231 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls\") pod \"dns-default-8524b\" (UID: \"ed730508-b5b5-44cd-b56a-f58225697c5d\") " pod="openshift-dns/dns-default-8524b"
Apr 22 16:22:10.056369 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:10.056269 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed730508-b5b5-44cd-b56a-f58225697c5d-config-volume\") pod \"dns-default-8524b\" (UID: \"ed730508-b5b5-44cd-b56a-f58225697c5d\") " pod="openshift-dns/dns-default-8524b"
Apr 22 16:22:10.056493 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:10.056364 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert\") pod \"ingress-canary-c2xk7\" (UID: \"bf42ef1a-eb82-48c4-b318-0b119dfdda61\") " pod="openshift-ingress-canary/ingress-canary-c2xk7"
Apr 22 16:22:10.056493 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:10.056405 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd97x\" (UniqueName: \"kubernetes.io/projected/ed730508-b5b5-44cd-b56a-f58225697c5d-kube-api-access-gd97x\") pod \"dns-default-8524b\" (UID: \"ed730508-b5b5-44cd-b56a-f58225697c5d\") " pod="openshift-dns/dns-default-8524b"
Apr 22 16:22:10.056493 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:10.056436 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qztwx\" (UniqueName: \"kubernetes.io/projected/bf42ef1a-eb82-48c4-b318-0b119dfdda61-kube-api-access-qztwx\") pod \"ingress-canary-c2xk7\" (UID: \"bf42ef1a-eb82-48c4-b318-0b119dfdda61\") " pod="openshift-ingress-canary/ingress-canary-c2xk7"
Apr 22 16:22:10.156951 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:10.156867 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qztwx\" (UniqueName: \"kubernetes.io/projected/bf42ef1a-eb82-48c4-b318-0b119dfdda61-kube-api-access-qztwx\") pod \"ingress-canary-c2xk7\" (UID: \"bf42ef1a-eb82-48c4-b318-0b119dfdda61\") " pod="openshift-ingress-canary/ingress-canary-c2xk7"
Apr 22 16:22:10.156951 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:10.156926 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ed730508-b5b5-44cd-b56a-f58225697c5d-tmp-dir\") pod \"dns-default-8524b\" (UID: \"ed730508-b5b5-44cd-b56a-f58225697c5d\") " pod="openshift-dns/dns-default-8524b"
Apr 22 16:22:10.157219 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:10.156969 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls\") pod \"dns-default-8524b\" (UID: \"ed730508-b5b5-44cd-b56a-f58225697c5d\") " pod="openshift-dns/dns-default-8524b"
Apr 22 16:22:10.157219 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:10.156991 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed730508-b5b5-44cd-b56a-f58225697c5d-config-volume\") pod \"dns-default-8524b\" (UID: \"ed730508-b5b5-44cd-b56a-f58225697c5d\") " pod="openshift-dns/dns-default-8524b"
Apr 22 16:22:10.157219 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:10.157029 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert\") pod \"ingress-canary-c2xk7\" (UID: \"bf42ef1a-eb82-48c4-b318-0b119dfdda61\") " pod="openshift-ingress-canary/ingress-canary-c2xk7"
Apr 22 16:22:10.157219 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:10.157055 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gd97x\" (UniqueName: \"kubernetes.io/projected/ed730508-b5b5-44cd-b56a-f58225697c5d-kube-api-access-gd97x\") pod \"dns-default-8524b\" (UID: \"ed730508-b5b5-44cd-b56a-f58225697c5d\") " pod="openshift-dns/dns-default-8524b"
Apr 22 16:22:10.157219 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:10.157125 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 16:22:10.157219 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:10.157196 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 16:22:10.157219 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:10.157211 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls podName:ed730508-b5b5-44cd-b56a-f58225697c5d nodeName:}" failed. No retries permitted until 2026-04-22 16:22:10.657188339 +0000 UTC m=+34.592502558 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls") pod "dns-default-8524b" (UID: "ed730508-b5b5-44cd-b56a-f58225697c5d") : secret "dns-default-metrics-tls" not found
Apr 22 16:22:10.157489 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:10.157263 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert podName:bf42ef1a-eb82-48c4-b318-0b119dfdda61 nodeName:}" failed. No retries permitted until 2026-04-22 16:22:10.657245599 +0000 UTC m=+34.592559823 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert") pod "ingress-canary-c2xk7" (UID: "bf42ef1a-eb82-48c4-b318-0b119dfdda61") : secret "canary-serving-cert" not found
Apr 22 16:22:10.157489 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:10.157333 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ed730508-b5b5-44cd-b56a-f58225697c5d-tmp-dir\") pod \"dns-default-8524b\" (UID: \"ed730508-b5b5-44cd-b56a-f58225697c5d\") " pod="openshift-dns/dns-default-8524b"
Apr 22 16:22:10.157569 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:10.157544 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed730508-b5b5-44cd-b56a-f58225697c5d-config-volume\") pod \"dns-default-8524b\" (UID: \"ed730508-b5b5-44cd-b56a-f58225697c5d\") " pod="openshift-dns/dns-default-8524b"
Apr 22 16:22:10.169232 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:10.169204 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd97x\" (UniqueName: \"kubernetes.io/projected/ed730508-b5b5-44cd-b56a-f58225697c5d-kube-api-access-gd97x\") pod \"dns-default-8524b\" (UID: \"ed730508-b5b5-44cd-b56a-f58225697c5d\") " pod="openshift-dns/dns-default-8524b"
Apr 22 16:22:10.169385 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:10.169367 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qztwx\" (UniqueName: \"kubernetes.io/projected/bf42ef1a-eb82-48c4-b318-0b119dfdda61-kube-api-access-qztwx\") pod \"ingress-canary-c2xk7\" (UID: \"bf42ef1a-eb82-48c4-b318-0b119dfdda61\") " pod="openshift-ingress-canary/ingress-canary-c2xk7"
Apr 22 16:22:10.358264 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:10.358226 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6q76d\" (UniqueName: \"kubernetes.io/projected/95147dbd-9393-4f07-9051-3461d90ddadb-kube-api-access-6q76d\") pod \"network-check-target-2r4t4\" (UID: \"95147dbd-9393-4f07-9051-3461d90ddadb\") " pod="openshift-network-diagnostics/network-check-target-2r4t4"
Apr 22 16:22:10.358426 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:10.358276 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs\") pod \"network-metrics-daemon-cwt8x\" (UID: \"e324836e-ef75-432e-978a-639279d2702e\") " pod="openshift-multus/network-metrics-daemon-cwt8x"
Apr 22 16:22:10.358426 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:10.358405 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 16:22:10.358538 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:10.358479 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs podName:e324836e-ef75-432e-978a-639279d2702e nodeName:}" failed. No retries permitted until 2026-04-22 16:22:42.358459106 +0000 UTC m=+66.293773323 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs") pod "network-metrics-daemon-cwt8x" (UID: "e324836e-ef75-432e-978a-639279d2702e") : secret "metrics-daemon-secret" not found
Apr 22 16:22:10.361121 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:10.361095 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q76d\" (UniqueName: \"kubernetes.io/projected/95147dbd-9393-4f07-9051-3461d90ddadb-kube-api-access-6q76d\") pod \"network-check-target-2r4t4\" (UID: \"95147dbd-9393-4f07-9051-3461d90ddadb\") " pod="openshift-network-diagnostics/network-check-target-2r4t4"
Apr 22 16:22:10.517987 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:10.517893 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2r4t4"
Apr 22 16:22:10.660357 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:10.660325 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert\") pod \"ingress-canary-c2xk7\" (UID: \"bf42ef1a-eb82-48c4-b318-0b119dfdda61\") " pod="openshift-ingress-canary/ingress-canary-c2xk7"
Apr 22 16:22:10.660552 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:10.660421 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls\") pod \"dns-default-8524b\" (UID: \"ed730508-b5b5-44cd-b56a-f58225697c5d\") " pod="openshift-dns/dns-default-8524b"
Apr 22 16:22:10.660552 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:10.660492 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 16:22:10.660552 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:10.660530 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 16:22:10.660702 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:10.660564 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert podName:bf42ef1a-eb82-48c4-b318-0b119dfdda61 nodeName:}" failed. No retries permitted until 2026-04-22 16:22:11.660540949 +0000 UTC m=+35.595855164 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert") pod "ingress-canary-c2xk7" (UID: "bf42ef1a-eb82-48c4-b318-0b119dfdda61") : secret "canary-serving-cert" not found
Apr 22 16:22:10.660702 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:10.660583 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls podName:ed730508-b5b5-44cd-b56a-f58225697c5d nodeName:}" failed. No retries permitted until 2026-04-22 16:22:11.66057623 +0000 UTC m=+35.595890433 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls") pod "dns-default-8524b" (UID: "ed730508-b5b5-44cd-b56a-f58225697c5d") : secret "dns-default-metrics-tls" not found
Apr 22 16:22:10.681134 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:10.681095 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2r4t4"]
Apr 22 16:22:10.686384 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:22:10.686357 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95147dbd_9393_4f07_9051_3461d90ddadb.slice/crio-88a2fc9d2e29035f07bb5ec3b2ce5f3204ca0227da497cc92428b5ec0fb16496 WatchSource:0}: Error finding container 88a2fc9d2e29035f07bb5ec3b2ce5f3204ca0227da497cc92428b5ec0fb16496: Status 404 returned error can't find the container with id 88a2fc9d2e29035f07bb5ec3b2ce5f3204ca0227da497cc92428b5ec0fb16496
Apr 22 16:22:11.023298 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:11.023265 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2r4t4" event={"ID":"95147dbd-9393-4f07-9051-3461d90ddadb","Type":"ContainerStarted","Data":"88a2fc9d2e29035f07bb5ec3b2ce5f3204ca0227da497cc92428b5ec0fb16496"}
Apr 22 16:22:11.669003 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:11.668957 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls\") pod \"dns-default-8524b\" (UID: \"ed730508-b5b5-44cd-b56a-f58225697c5d\") " pod="openshift-dns/dns-default-8524b"
Apr 22 16:22:11.669297 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:11.669044 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert\") pod \"ingress-canary-c2xk7\" (UID: \"bf42ef1a-eb82-48c4-b318-0b119dfdda61\") " pod="openshift-ingress-canary/ingress-canary-c2xk7"
Apr 22 16:22:11.669297 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:11.669117 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 16:22:11.669297 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:11.669187 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls podName:ed730508-b5b5-44cd-b56a-f58225697c5d nodeName:}" failed. No retries permitted until 2026-04-22 16:22:13.669166034 +0000 UTC m=+37.604480256 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls") pod "dns-default-8524b" (UID: "ed730508-b5b5-44cd-b56a-f58225697c5d") : secret "dns-default-metrics-tls" not found
Apr 22 16:22:11.669297 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:11.669200 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 16:22:11.669297 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:11.669290 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert podName:bf42ef1a-eb82-48c4-b318-0b119dfdda61 nodeName:}" failed. No retries permitted until 2026-04-22 16:22:13.669264986 +0000 UTC m=+37.604579211 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert") pod "ingress-canary-c2xk7" (UID: "bf42ef1a-eb82-48c4-b318-0b119dfdda61") : secret "canary-serving-cert" not found
Apr 22 16:22:12.878447 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:12.878412 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f07a62bd-5116-4f09-94bd-5cf21c3a890b-original-pull-secret\") pod \"global-pull-secret-syncer-7455g\" (UID: \"f07a62bd-5116-4f09-94bd-5cf21c3a890b\") " pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:22:12.882568 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:12.882532 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f07a62bd-5116-4f09-94bd-5cf21c3a890b-original-pull-secret\") pod \"global-pull-secret-syncer-7455g\" (UID: \"f07a62bd-5116-4f09-94bd-5cf21c3a890b\") " pod="kube-system/global-pull-secret-syncer-7455g"
Apr 22 16:22:12.909247 ip-10-0-141-251 kubenswrapper[2573]: I0422
16:22:12.909224 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7455g" Apr 22 16:22:13.683023 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:13.682943 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls\") pod \"dns-default-8524b\" (UID: \"ed730508-b5b5-44cd-b56a-f58225697c5d\") " pod="openshift-dns/dns-default-8524b" Apr 22 16:22:13.683023 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:13.683011 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert\") pod \"ingress-canary-c2xk7\" (UID: \"bf42ef1a-eb82-48c4-b318-0b119dfdda61\") " pod="openshift-ingress-canary/ingress-canary-c2xk7" Apr 22 16:22:13.683213 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:13.683110 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 16:22:13.683213 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:13.683131 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 16:22:13.683213 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:13.683188 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert podName:bf42ef1a-eb82-48c4-b318-0b119dfdda61 nodeName:}" failed. No retries permitted until 2026-04-22 16:22:17.683169226 +0000 UTC m=+41.618483434 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert") pod "ingress-canary-c2xk7" (UID: "bf42ef1a-eb82-48c4-b318-0b119dfdda61") : secret "canary-serving-cert" not found Apr 22 16:22:13.683213 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:13.683205 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls podName:ed730508-b5b5-44cd-b56a-f58225697c5d nodeName:}" failed. No retries permitted until 2026-04-22 16:22:17.683197086 +0000 UTC m=+41.618511292 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls") pod "dns-default-8524b" (UID: "ed730508-b5b5-44cd-b56a-f58225697c5d") : secret "dns-default-metrics-tls" not found Apr 22 16:22:14.834289 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:14.834082 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7455g"] Apr 22 16:22:14.837944 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:22:14.837921 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf07a62bd_5116_4f09_94bd_5cf21c3a890b.slice/crio-2e3c83a9eb52f92ebf40557051c487ff4a6e4de607651d03285a80129d14dbb5 WatchSource:0}: Error finding container 2e3c83a9eb52f92ebf40557051c487ff4a6e4de607651d03285a80129d14dbb5: Status 404 returned error can't find the container with id 2e3c83a9eb52f92ebf40557051c487ff4a6e4de607651d03285a80129d14dbb5 Apr 22 16:22:15.033832 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:15.033783 2573 generic.go:358] "Generic (PLEG): container finished" podID="c9801573-c6f5-4b2d-a5ea-6b5b53cf411b" containerID="bbb3b91b1321c01a9746d0297b27be3c86b5bc64a95505731452ffa5247c0c3e" exitCode=0 Apr 22 16:22:15.033942 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:15.033880 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gh4hm" event={"ID":"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b","Type":"ContainerDied","Data":"bbb3b91b1321c01a9746d0297b27be3c86b5bc64a95505731452ffa5247c0c3e"} Apr 22 16:22:15.035318 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:15.035294 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2r4t4" event={"ID":"95147dbd-9393-4f07-9051-3461d90ddadb","Type":"ContainerStarted","Data":"c494eb0a026975842c5b7a3a3c841a6c104f461c223d9174269596cb58417077"} Apr 22 16:22:15.035436 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:15.035419 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-2r4t4" Apr 22 16:22:15.036316 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:15.036294 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7455g" event={"ID":"f07a62bd-5116-4f09-94bd-5cf21c3a890b","Type":"ContainerStarted","Data":"2e3c83a9eb52f92ebf40557051c487ff4a6e4de607651d03285a80129d14dbb5"} Apr 22 16:22:15.078019 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:15.077593 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-2r4t4" podStartSLOduration=35.046768225 podStartE2EDuration="39.077582325s" podCreationTimestamp="2026-04-22 16:21:36 +0000 UTC" firstStartedPulling="2026-04-22 16:22:10.68840192 +0000 UTC m=+34.623716123" lastFinishedPulling="2026-04-22 16:22:14.719216014 +0000 UTC m=+38.654530223" observedRunningTime="2026-04-22 16:22:15.077200519 +0000 UTC m=+39.012514741" watchObservedRunningTime="2026-04-22 16:22:15.077582325 +0000 UTC m=+39.012896582" Apr 22 16:22:16.040870 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:16.040836 2573 generic.go:358] "Generic (PLEG): container finished" 
podID="c9801573-c6f5-4b2d-a5ea-6b5b53cf411b" containerID="7b30669f483b68a8233c87353e223d3eb1890c4a04adbcd80cbc23ac5089a40d" exitCode=0 Apr 22 16:22:16.041548 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:16.040918 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gh4hm" event={"ID":"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b","Type":"ContainerDied","Data":"7b30669f483b68a8233c87353e223d3eb1890c4a04adbcd80cbc23ac5089a40d"} Apr 22 16:22:17.046159 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:17.046121 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gh4hm" event={"ID":"c9801573-c6f5-4b2d-a5ea-6b5b53cf411b","Type":"ContainerStarted","Data":"a7ee5c8387692bea14da7170b966c781a4b8a70360216a7f8cc683ed08f4a3a4"} Apr 22 16:22:17.068478 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:17.068431 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-gh4hm" podStartSLOduration=5.985834234 podStartE2EDuration="41.068417374s" podCreationTimestamp="2026-04-22 16:21:36 +0000 UTC" firstStartedPulling="2026-04-22 16:21:39.632961971 +0000 UTC m=+3.568276188" lastFinishedPulling="2026-04-22 16:22:14.715545105 +0000 UTC m=+38.650859328" observedRunningTime="2026-04-22 16:22:17.067087655 +0000 UTC m=+41.002401893" watchObservedRunningTime="2026-04-22 16:22:17.068417374 +0000 UTC m=+41.003731600" Apr 22 16:22:17.713099 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:17.713057 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert\") pod \"ingress-canary-c2xk7\" (UID: \"bf42ef1a-eb82-48c4-b318-0b119dfdda61\") " pod="openshift-ingress-canary/ingress-canary-c2xk7" Apr 22 16:22:17.713289 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:17.713130 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls\") pod \"dns-default-8524b\" (UID: \"ed730508-b5b5-44cd-b56a-f58225697c5d\") " pod="openshift-dns/dns-default-8524b" Apr 22 16:22:17.713289 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:17.713224 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 16:22:17.713399 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:17.713295 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert podName:bf42ef1a-eb82-48c4-b318-0b119dfdda61 nodeName:}" failed. No retries permitted until 2026-04-22 16:22:25.713275616 +0000 UTC m=+49.648589833 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert") pod "ingress-canary-c2xk7" (UID: "bf42ef1a-eb82-48c4-b318-0b119dfdda61") : secret "canary-serving-cert" not found Apr 22 16:22:17.713399 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:17.713232 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 16:22:17.713399 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:17.713391 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls podName:ed730508-b5b5-44cd-b56a-f58225697c5d nodeName:}" failed. No retries permitted until 2026-04-22 16:22:25.713372268 +0000 UTC m=+49.648686478 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls") pod "dns-default-8524b" (UID: "ed730508-b5b5-44cd-b56a-f58225697c5d") : secret "dns-default-metrics-tls" not found Apr 22 16:22:20.052860 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:20.052821 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7455g" event={"ID":"f07a62bd-5116-4f09-94bd-5cf21c3a890b","Type":"ContainerStarted","Data":"9b7e5f848f4c90fdef7abfc86d182582b6644fb2b2fbf1e832a6063f04645e26"} Apr 22 16:22:20.066817 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:20.066748 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-7455g" podStartSLOduration=34.593638696 podStartE2EDuration="39.066734764s" podCreationTimestamp="2026-04-22 16:21:41 +0000 UTC" firstStartedPulling="2026-04-22 16:22:14.839699548 +0000 UTC m=+38.775013758" lastFinishedPulling="2026-04-22 16:22:19.312795608 +0000 UTC m=+43.248109826" observedRunningTime="2026-04-22 16:22:20.066066769 +0000 UTC m=+44.001380991" watchObservedRunningTime="2026-04-22 16:22:20.066734764 +0000 UTC m=+44.002048989" Apr 22 16:22:25.764618 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:25.764577 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls\") pod \"dns-default-8524b\" (UID: \"ed730508-b5b5-44cd-b56a-f58225697c5d\") " pod="openshift-dns/dns-default-8524b" Apr 22 16:22:25.764618 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:25.764624 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert\") pod \"ingress-canary-c2xk7\" (UID: \"bf42ef1a-eb82-48c4-b318-0b119dfdda61\") " 
pod="openshift-ingress-canary/ingress-canary-c2xk7" Apr 22 16:22:25.765062 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:25.764713 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 16:22:25.765062 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:25.764718 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 16:22:25.765062 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:25.764792 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert podName:bf42ef1a-eb82-48c4-b318-0b119dfdda61 nodeName:}" failed. No retries permitted until 2026-04-22 16:22:41.764777434 +0000 UTC m=+65.700091638 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert") pod "ingress-canary-c2xk7" (UID: "bf42ef1a-eb82-48c4-b318-0b119dfdda61") : secret "canary-serving-cert" not found Apr 22 16:22:25.765062 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:25.764804 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls podName:ed730508-b5b5-44cd-b56a-f58225697c5d nodeName:}" failed. No retries permitted until 2026-04-22 16:22:41.764798785 +0000 UTC m=+65.700112988 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls") pod "dns-default-8524b" (UID: "ed730508-b5b5-44cd-b56a-f58225697c5d") : secret "dns-default-metrics-tls" not found Apr 22 16:22:35.024250 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:35.024219 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2mskd" Apr 22 16:22:41.767193 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:41.767149 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls\") pod \"dns-default-8524b\" (UID: \"ed730508-b5b5-44cd-b56a-f58225697c5d\") " pod="openshift-dns/dns-default-8524b" Apr 22 16:22:41.767193 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:41.767200 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert\") pod \"ingress-canary-c2xk7\" (UID: \"bf42ef1a-eb82-48c4-b318-0b119dfdda61\") " pod="openshift-ingress-canary/ingress-canary-c2xk7" Apr 22 16:22:41.767611 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:41.767299 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 16:22:41.767611 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:41.767307 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 16:22:41.767611 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:41.767374 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert podName:bf42ef1a-eb82-48c4-b318-0b119dfdda61 nodeName:}" failed. 
No retries permitted until 2026-04-22 16:23:13.767337066 +0000 UTC m=+97.702651269 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert") pod "ingress-canary-c2xk7" (UID: "bf42ef1a-eb82-48c4-b318-0b119dfdda61") : secret "canary-serving-cert" not found Apr 22 16:22:41.767611 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:41.767388 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls podName:ed730508-b5b5-44cd-b56a-f58225697c5d nodeName:}" failed. No retries permitted until 2026-04-22 16:23:13.767382135 +0000 UTC m=+97.702696338 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls") pod "dns-default-8524b" (UID: "ed730508-b5b5-44cd-b56a-f58225697c5d") : secret "dns-default-metrics-tls" not found Apr 22 16:22:42.371994 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:42.371954 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs\") pod \"network-metrics-daemon-cwt8x\" (UID: \"e324836e-ef75-432e-978a-639279d2702e\") " pod="openshift-multus/network-metrics-daemon-cwt8x" Apr 22 16:22:42.372167 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:42.372099 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 16:22:42.372167 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:22:42.372164 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs podName:e324836e-ef75-432e-978a-639279d2702e nodeName:}" failed. 
No retries permitted until 2026-04-22 16:23:46.372147445 +0000 UTC m=+130.307461648 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs") pod "network-metrics-daemon-cwt8x" (UID: "e324836e-ef75-432e-978a-639279d2702e") : secret "metrics-daemon-secret" not found Apr 22 16:22:46.042737 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:22:46.042708 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-2r4t4" Apr 22 16:23:13.783295 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:13.783205 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls\") pod \"dns-default-8524b\" (UID: \"ed730508-b5b5-44cd-b56a-f58225697c5d\") " pod="openshift-dns/dns-default-8524b" Apr 22 16:23:13.783295 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:13.783261 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert\") pod \"ingress-canary-c2xk7\" (UID: \"bf42ef1a-eb82-48c4-b318-0b119dfdda61\") " pod="openshift-ingress-canary/ingress-canary-c2xk7" Apr 22 16:23:13.783706 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:13.783346 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 16:23:13.783706 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:13.783405 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls podName:ed730508-b5b5-44cd-b56a-f58225697c5d nodeName:}" failed. No retries permitted until 2026-04-22 16:24:17.783390094 +0000 UTC m=+161.718704301 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls") pod "dns-default-8524b" (UID: "ed730508-b5b5-44cd-b56a-f58225697c5d") : secret "dns-default-metrics-tls" not found Apr 22 16:23:13.783706 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:13.783355 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 16:23:13.783706 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:13.783485 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert podName:bf42ef1a-eb82-48c4-b318-0b119dfdda61 nodeName:}" failed. No retries permitted until 2026-04-22 16:24:17.783472932 +0000 UTC m=+161.718787135 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert") pod "ingress-canary-c2xk7" (UID: "bf42ef1a-eb82-48c4-b318-0b119dfdda61") : secret "canary-serving-cert" not found Apr 22 16:23:40.211613 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:40.211576 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-c7m6g"] Apr 22 16:23:40.214278 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:40.214263 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7m6g" Apr 22 16:23:40.217362 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:40.217340 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 22 16:23:40.218211 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:40.218192 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-5x4sl\"" Apr 22 16:23:40.218313 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:40.218232 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 22 16:23:40.218313 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:40.218241 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 16:23:40.218313 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:40.218234 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 16:23:40.222951 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:40.222932 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-c7m6g"] Apr 22 16:23:40.355927 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:40.355902 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7bw5\" (UniqueName: \"kubernetes.io/projected/c9c547d5-42c1-445c-b145-1e317ab8947a-kube-api-access-q7bw5\") pod \"cluster-monitoring-operator-75587bd455-c7m6g\" (UID: \"c9c547d5-42c1-445c-b145-1e317ab8947a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7m6g" Apr 22 16:23:40.356063 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:40.355934 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/c9c547d5-42c1-445c-b145-1e317ab8947a-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-c7m6g\" (UID: \"c9c547d5-42c1-445c-b145-1e317ab8947a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7m6g" Apr 22 16:23:40.356063 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:40.355957 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9c547d5-42c1-445c-b145-1e317ab8947a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-c7m6g\" (UID: \"c9c547d5-42c1-445c-b145-1e317ab8947a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7m6g" Apr 22 16:23:40.456507 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:40.456450 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7bw5\" (UniqueName: \"kubernetes.io/projected/c9c547d5-42c1-445c-b145-1e317ab8947a-kube-api-access-q7bw5\") pod \"cluster-monitoring-operator-75587bd455-c7m6g\" (UID: \"c9c547d5-42c1-445c-b145-1e317ab8947a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7m6g" Apr 22 16:23:40.456701 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:40.456520 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/c9c547d5-42c1-445c-b145-1e317ab8947a-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-c7m6g\" (UID: \"c9c547d5-42c1-445c-b145-1e317ab8947a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7m6g" Apr 22 16:23:40.456701 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:40.456549 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/c9c547d5-42c1-445c-b145-1e317ab8947a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-c7m6g\" (UID: \"c9c547d5-42c1-445c-b145-1e317ab8947a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7m6g" Apr 22 16:23:40.456701 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:40.456687 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 16:23:40.456866 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:40.456747 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9c547d5-42c1-445c-b145-1e317ab8947a-cluster-monitoring-operator-tls podName:c9c547d5-42c1-445c-b145-1e317ab8947a nodeName:}" failed. No retries permitted until 2026-04-22 16:23:40.956732947 +0000 UTC m=+124.892047150 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c9c547d5-42c1-445c-b145-1e317ab8947a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-c7m6g" (UID: "c9c547d5-42c1-445c-b145-1e317ab8947a") : secret "cluster-monitoring-operator-tls" not found Apr 22 16:23:40.457268 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:40.457247 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/c9c547d5-42c1-445c-b145-1e317ab8947a-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-c7m6g\" (UID: \"c9c547d5-42c1-445c-b145-1e317ab8947a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7m6g" Apr 22 16:23:40.465769 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:40.465689 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7bw5\" (UniqueName: \"kubernetes.io/projected/c9c547d5-42c1-445c-b145-1e317ab8947a-kube-api-access-q7bw5\") pod 
\"cluster-monitoring-operator-75587bd455-c7m6g\" (UID: \"c9c547d5-42c1-445c-b145-1e317ab8947a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7m6g" Apr 22 16:23:40.960238 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:40.960191 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9c547d5-42c1-445c-b145-1e317ab8947a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-c7m6g\" (UID: \"c9c547d5-42c1-445c-b145-1e317ab8947a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7m6g" Apr 22 16:23:40.960429 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:40.960307 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 16:23:40.960429 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:40.960369 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9c547d5-42c1-445c-b145-1e317ab8947a-cluster-monitoring-operator-tls podName:c9c547d5-42c1-445c-b145-1e317ab8947a nodeName:}" failed. No retries permitted until 2026-04-22 16:23:41.960355077 +0000 UTC m=+125.895669279 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c9c547d5-42c1-445c-b145-1e317ab8947a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-c7m6g" (UID: "c9c547d5-42c1-445c-b145-1e317ab8947a") : secret "cluster-monitoring-operator-tls" not found
Apr 22 16:23:41.965889 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:41.965847 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9c547d5-42c1-445c-b145-1e317ab8947a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-c7m6g\" (UID: \"c9c547d5-42c1-445c-b145-1e317ab8947a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7m6g"
Apr 22 16:23:41.966257 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:41.965976 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 16:23:41.966257 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:41.966039 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9c547d5-42c1-445c-b145-1e317ab8947a-cluster-monitoring-operator-tls podName:c9c547d5-42c1-445c-b145-1e317ab8947a nodeName:}" failed. No retries permitted until 2026-04-22 16:23:43.96602352 +0000 UTC m=+127.901337722 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c9c547d5-42c1-445c-b145-1e317ab8947a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-c7m6g" (UID: "c9c547d5-42c1-445c-b145-1e317ab8947a") : secret "cluster-monitoring-operator-tls" not found
Apr 22 16:23:43.979834 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:43.979794 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9c547d5-42c1-445c-b145-1e317ab8947a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-c7m6g\" (UID: \"c9c547d5-42c1-445c-b145-1e317ab8947a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7m6g"
Apr 22 16:23:43.980233 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:43.979936 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 16:23:43.980233 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:43.979997 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9c547d5-42c1-445c-b145-1e317ab8947a-cluster-monitoring-operator-tls podName:c9c547d5-42c1-445c-b145-1e317ab8947a nodeName:}" failed. No retries permitted until 2026-04-22 16:23:47.979982617 +0000 UTC m=+131.915296821 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c9c547d5-42c1-445c-b145-1e317ab8947a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-c7m6g" (UID: "c9c547d5-42c1-445c-b145-1e317ab8947a") : secret "cluster-monitoring-operator-tls" not found
Apr 22 16:23:45.261494 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:45.261459 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-hx648"]
Apr 22 16:23:45.264559 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:45.264538 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-hx648"
Apr 22 16:23:45.266513 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:45.266492 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 22 16:23:45.266594 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:45.266500 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-x5qk2\""
Apr 22 16:23:45.267401 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:45.267380 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 22 16:23:45.267505 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:45.267390 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 22 16:23:45.267505 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:45.267457 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 22 16:23:45.271252 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:45.271011 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 22 16:23:45.273770 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:45.273736 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-hx648"]
Apr 22 16:23:45.389940 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:45.389900 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ae7e321-7d7a-4cff-b23f-dfbc5af07459-config\") pod \"console-operator-9d4b6777b-hx648\" (UID: \"4ae7e321-7d7a-4cff-b23f-dfbc5af07459\") " pod="openshift-console-operator/console-operator-9d4b6777b-hx648"
Apr 22 16:23:45.389940 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:45.389942 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ae7e321-7d7a-4cff-b23f-dfbc5af07459-serving-cert\") pod \"console-operator-9d4b6777b-hx648\" (UID: \"4ae7e321-7d7a-4cff-b23f-dfbc5af07459\") " pod="openshift-console-operator/console-operator-9d4b6777b-hx648"
Apr 22 16:23:45.390199 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:45.390050 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ae7e321-7d7a-4cff-b23f-dfbc5af07459-trusted-ca\") pod \"console-operator-9d4b6777b-hx648\" (UID: \"4ae7e321-7d7a-4cff-b23f-dfbc5af07459\") " pod="openshift-console-operator/console-operator-9d4b6777b-hx648"
Apr 22 16:23:45.390199 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:45.390081 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx5xs\" (UniqueName: \"kubernetes.io/projected/4ae7e321-7d7a-4cff-b23f-dfbc5af07459-kube-api-access-gx5xs\") pod \"console-operator-9d4b6777b-hx648\" (UID: \"4ae7e321-7d7a-4cff-b23f-dfbc5af07459\") " pod="openshift-console-operator/console-operator-9d4b6777b-hx648"
Apr 22 16:23:45.490439 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:45.490410 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ae7e321-7d7a-4cff-b23f-dfbc5af07459-config\") pod \"console-operator-9d4b6777b-hx648\" (UID: \"4ae7e321-7d7a-4cff-b23f-dfbc5af07459\") " pod="openshift-console-operator/console-operator-9d4b6777b-hx648"
Apr 22 16:23:45.490540 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:45.490445 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ae7e321-7d7a-4cff-b23f-dfbc5af07459-serving-cert\") pod \"console-operator-9d4b6777b-hx648\" (UID: \"4ae7e321-7d7a-4cff-b23f-dfbc5af07459\") " pod="openshift-console-operator/console-operator-9d4b6777b-hx648"
Apr 22 16:23:45.490540 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:45.490493 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ae7e321-7d7a-4cff-b23f-dfbc5af07459-trusted-ca\") pod \"console-operator-9d4b6777b-hx648\" (UID: \"4ae7e321-7d7a-4cff-b23f-dfbc5af07459\") " pod="openshift-console-operator/console-operator-9d4b6777b-hx648"
Apr 22 16:23:45.490540 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:45.490514 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gx5xs\" (UniqueName: \"kubernetes.io/projected/4ae7e321-7d7a-4cff-b23f-dfbc5af07459-kube-api-access-gx5xs\") pod \"console-operator-9d4b6777b-hx648\" (UID: \"4ae7e321-7d7a-4cff-b23f-dfbc5af07459\") " pod="openshift-console-operator/console-operator-9d4b6777b-hx648"
Apr 22 16:23:45.491059 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:45.491039 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ae7e321-7d7a-4cff-b23f-dfbc5af07459-config\") pod \"console-operator-9d4b6777b-hx648\" (UID: \"4ae7e321-7d7a-4cff-b23f-dfbc5af07459\") " pod="openshift-console-operator/console-operator-9d4b6777b-hx648"
Apr 22 16:23:45.491266 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:45.491241 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4ae7e321-7d7a-4cff-b23f-dfbc5af07459-trusted-ca\") pod \"console-operator-9d4b6777b-hx648\" (UID: \"4ae7e321-7d7a-4cff-b23f-dfbc5af07459\") " pod="openshift-console-operator/console-operator-9d4b6777b-hx648"
Apr 22 16:23:45.492816 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:45.492797 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ae7e321-7d7a-4cff-b23f-dfbc5af07459-serving-cert\") pod \"console-operator-9d4b6777b-hx648\" (UID: \"4ae7e321-7d7a-4cff-b23f-dfbc5af07459\") " pod="openshift-console-operator/console-operator-9d4b6777b-hx648"
Apr 22 16:23:45.497850 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:45.497827 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx5xs\" (UniqueName: \"kubernetes.io/projected/4ae7e321-7d7a-4cff-b23f-dfbc5af07459-kube-api-access-gx5xs\") pod \"console-operator-9d4b6777b-hx648\" (UID: \"4ae7e321-7d7a-4cff-b23f-dfbc5af07459\") " pod="openshift-console-operator/console-operator-9d4b6777b-hx648"
Apr 22 16:23:45.573860 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:45.573837 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-hx648"
Apr 22 16:23:45.685804 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:45.685777 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-hx648"]
Apr 22 16:23:45.688798 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:23:45.688775 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ae7e321_7d7a_4cff_b23f_dfbc5af07459.slice/crio-d621940a141119846345714620d5d9c8ce6774270ead180d571c60df835ebf5a WatchSource:0}: Error finding container d621940a141119846345714620d5d9c8ce6774270ead180d571c60df835ebf5a: Status 404 returned error can't find the container with id d621940a141119846345714620d5d9c8ce6774270ead180d571c60df835ebf5a
Apr 22 16:23:46.214618 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:46.214589 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-hx648" event={"ID":"4ae7e321-7d7a-4cff-b23f-dfbc5af07459","Type":"ContainerStarted","Data":"d621940a141119846345714620d5d9c8ce6774270ead180d571c60df835ebf5a"}
Apr 22 16:23:46.339358 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:46.339333 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ljkt5_1f2b5ca6-7540-4d0e-88b4-b34788bdeb77/dns-node-resolver/0.log"
Apr 22 16:23:46.396801 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:46.396768 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs\") pod \"network-metrics-daemon-cwt8x\" (UID: \"e324836e-ef75-432e-978a-639279d2702e\") " pod="openshift-multus/network-metrics-daemon-cwt8x"
Apr 22 16:23:46.396950 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:46.396932 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 16:23:46.397018 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:46.397007 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs podName:e324836e-ef75-432e-978a-639279d2702e nodeName:}" failed. No retries permitted until 2026-04-22 16:25:48.396986674 +0000 UTC m=+252.332300880 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs") pod "network-metrics-daemon-cwt8x" (UID: "e324836e-ef75-432e-978a-639279d2702e") : secret "metrics-daemon-secret" not found
Apr 22 16:23:47.137244 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:47.137214 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zgppz_ca6eb44a-3f8d-408b-ae0d-0ef553dc08d8/node-ca/0.log"
Apr 22 16:23:48.008910 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:48.008877 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9c547d5-42c1-445c-b145-1e317ab8947a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-c7m6g\" (UID: \"c9c547d5-42c1-445c-b145-1e317ab8947a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7m6g"
Apr 22 16:23:48.009333 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:48.009025 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 16:23:48.009333 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:48.009136 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9c547d5-42c1-445c-b145-1e317ab8947a-cluster-monitoring-operator-tls podName:c9c547d5-42c1-445c-b145-1e317ab8947a nodeName:}" failed. No retries permitted until 2026-04-22 16:23:56.009115946 +0000 UTC m=+139.944430168 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c9c547d5-42c1-445c-b145-1e317ab8947a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-c7m6g" (UID: "c9c547d5-42c1-445c-b145-1e317ab8947a") : secret "cluster-monitoring-operator-tls" not found
Apr 22 16:23:49.221413 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:49.221387 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/0.log"
Apr 22 16:23:49.221797 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:49.221425 2573 generic.go:358] "Generic (PLEG): container finished" podID="4ae7e321-7d7a-4cff-b23f-dfbc5af07459" containerID="11ead9e2f7a9a5e810b46ede99e627a9fe2c03ca30811cc9fe990801bc1da180" exitCode=255
Apr 22 16:23:49.221797 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:49.221461 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-hx648" event={"ID":"4ae7e321-7d7a-4cff-b23f-dfbc5af07459","Type":"ContainerDied","Data":"11ead9e2f7a9a5e810b46ede99e627a9fe2c03ca30811cc9fe990801bc1da180"}
Apr 22 16:23:49.221797 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:49.221700 2573 scope.go:117] "RemoveContainer" containerID="11ead9e2f7a9a5e810b46ede99e627a9fe2c03ca30811cc9fe990801bc1da180"
Apr 22 16:23:50.224438 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.224415 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log"
Apr 22 16:23:50.224891 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.224782 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/0.log"
Apr 22 16:23:50.224891 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.224815 2573 generic.go:358] "Generic (PLEG): container finished" podID="4ae7e321-7d7a-4cff-b23f-dfbc5af07459" containerID="c099cad984bc935b2d9326ad0e058dab68fe7e109c59c590f772fe0173580d00" exitCode=255
Apr 22 16:23:50.224891 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.224845 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-hx648" event={"ID":"4ae7e321-7d7a-4cff-b23f-dfbc5af07459","Type":"ContainerDied","Data":"c099cad984bc935b2d9326ad0e058dab68fe7e109c59c590f772fe0173580d00"}
Apr 22 16:23:50.224891 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.224873 2573 scope.go:117] "RemoveContainer" containerID="11ead9e2f7a9a5e810b46ede99e627a9fe2c03ca30811cc9fe990801bc1da180"
Apr 22 16:23:50.225185 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.225163 2573 scope.go:117] "RemoveContainer" containerID="c099cad984bc935b2d9326ad0e058dab68fe7e109c59c590f772fe0173580d00"
Apr 22 16:23:50.225363 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:50.225343 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-hx648_openshift-console-operator(4ae7e321-7d7a-4cff-b23f-dfbc5af07459)\"" pod="openshift-console-operator/console-operator-9d4b6777b-hx648" podUID="4ae7e321-7d7a-4cff-b23f-dfbc5af07459"
Apr 22 16:23:50.414686 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.414651 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7867586f55-85f4h"]
Apr 22 16:23:50.417699 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.417685 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:50.419965 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.419943 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 16:23:50.420152 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.420138 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 16:23:50.420227 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.420187 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 16:23:50.420286 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.420247 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-r8rzs\""
Apr 22 16:23:50.424260 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.424240 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 16:23:50.428985 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.428967 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7867586f55-85f4h"]
Apr 22 16:23:50.526545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.526459 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/84d6547c-fe65-4d04-ab06-5184bfe5d36e-ca-trust-extracted\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:50.526545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.526501 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/84d6547c-fe65-4d04-ab06-5184bfe5d36e-installation-pull-secrets\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:50.526545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.526521 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftt7v\" (UniqueName: \"kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-kube-api-access-ftt7v\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:50.526545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.526537 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-tls\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:50.526933 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.526557 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-bound-sa-token\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:50.526933 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.526642 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-certificates\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:50.526933 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.526683 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84d6547c-fe65-4d04-ab06-5184bfe5d36e-trusted-ca\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:50.526933 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.526800 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/84d6547c-fe65-4d04-ab06-5184bfe5d36e-image-registry-private-configuration\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:50.628107 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.628071 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/84d6547c-fe65-4d04-ab06-5184bfe5d36e-ca-trust-extracted\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:50.628107 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.628113 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/84d6547c-fe65-4d04-ab06-5184bfe5d36e-installation-pull-secrets\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:50.628340 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.628130 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftt7v\" (UniqueName: \"kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-kube-api-access-ftt7v\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:50.628340 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.628163 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-tls\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:50.628340 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.628180 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-bound-sa-token\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:50.628340 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.628209 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-certificates\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:50.628340 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.628226 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84d6547c-fe65-4d04-ab06-5184bfe5d36e-trusted-ca\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:50.628593 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.628394 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/84d6547c-fe65-4d04-ab06-5184bfe5d36e-image-registry-private-configuration\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:50.628593 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:50.628422 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 16:23:50.628593 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:50.628445 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7867586f55-85f4h: secret "image-registry-tls" not found
Apr 22 16:23:50.628593 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:50.628500 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-tls podName:84d6547c-fe65-4d04-ab06-5184bfe5d36e nodeName:}" failed. No retries permitted until 2026-04-22 16:23:51.128481416 +0000 UTC m=+135.063795618 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-tls") pod "image-registry-7867586f55-85f4h" (UID: "84d6547c-fe65-4d04-ab06-5184bfe5d36e") : secret "image-registry-tls" not found
Apr 22 16:23:50.628593 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.628538 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/84d6547c-fe65-4d04-ab06-5184bfe5d36e-ca-trust-extracted\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:50.629006 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.628983 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-certificates\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:50.629279 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.629263 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84d6547c-fe65-4d04-ab06-5184bfe5d36e-trusted-ca\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:50.630877 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.630860 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/84d6547c-fe65-4d04-ab06-5184bfe5d36e-image-registry-private-configuration\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:50.630969 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.630951 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/84d6547c-fe65-4d04-ab06-5184bfe5d36e-installation-pull-secrets\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:50.639024 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.639000 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftt7v\" (UniqueName: \"kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-kube-api-access-ftt7v\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:50.639658 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:50.639638 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-bound-sa-token\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:51.133424 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:51.133388 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-tls\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:51.133607 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:51.133550 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 16:23:51.133607 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:51.133569 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7867586f55-85f4h: secret "image-registry-tls" not found
Apr 22 16:23:51.133686 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:51.133635 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-tls podName:84d6547c-fe65-4d04-ab06-5184bfe5d36e nodeName:}" failed. No retries permitted until 2026-04-22 16:23:52.1336171 +0000 UTC m=+136.068931308 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-tls") pod "image-registry-7867586f55-85f4h" (UID: "84d6547c-fe65-4d04-ab06-5184bfe5d36e") : secret "image-registry-tls" not found
Apr 22 16:23:51.227816 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:51.227789 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log"
Apr 22 16:23:51.228255 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:51.228214 2573 scope.go:117] "RemoveContainer" containerID="c099cad984bc935b2d9326ad0e058dab68fe7e109c59c590f772fe0173580d00"
Apr 22 16:23:51.228440 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:51.228419 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-hx648_openshift-console-operator(4ae7e321-7d7a-4cff-b23f-dfbc5af07459)\"" pod="openshift-console-operator/console-operator-9d4b6777b-hx648" podUID="4ae7e321-7d7a-4cff-b23f-dfbc5af07459"
Apr 22 16:23:52.139896 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:52.139863 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-tls\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:52.140063 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:52.140000 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 16:23:52.140063 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:52.140016 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7867586f55-85f4h: secret "image-registry-tls" not found
Apr 22 16:23:52.140147 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:52.140079 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-tls podName:84d6547c-fe65-4d04-ab06-5184bfe5d36e nodeName:}" failed. No retries permitted until 2026-04-22 16:23:54.140063693 +0000 UTC m=+138.075377897 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-tls") pod "image-registry-7867586f55-85f4h" (UID: "84d6547c-fe65-4d04-ab06-5184bfe5d36e") : secret "image-registry-tls" not found
Apr 22 16:23:54.154374 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:54.154329 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-tls\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:23:54.154777 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:54.154475 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 16:23:54.154777 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:54.154495 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7867586f55-85f4h: secret "image-registry-tls" not found
Apr 22 16:23:54.154777 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:54.154549 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-tls podName:84d6547c-fe65-4d04-ab06-5184bfe5d36e nodeName:}" failed. No retries permitted until 2026-04-22 16:23:58.154533005 +0000 UTC m=+142.089847228 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-tls") pod "image-registry-7867586f55-85f4h" (UID: "84d6547c-fe65-4d04-ab06-5184bfe5d36e") : secret "image-registry-tls" not found
Apr 22 16:23:54.337614 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:54.337583 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-2hv2b"]
Apr 22 16:23:54.340247 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:54.340231 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2hv2b"
Apr 22 16:23:54.342432 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:54.342408 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 22 16:23:54.342432 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:54.342427 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 22 16:23:54.343290 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:54.343267 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-dpvw6\""
Apr 22 16:23:54.347768 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:54.347733 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-2hv2b"]
Apr 22 16:23:54.457017 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:54.456956 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwhnx\" (UniqueName: \"kubernetes.io/projected/94ee8e4f-0d0b-4075-95d8-5f19844fb295-kube-api-access-hwhnx\") pod \"migrator-74bb7799d9-2hv2b\" (UID: 
\"94ee8e4f-0d0b-4075-95d8-5f19844fb295\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2hv2b" Apr 22 16:23:54.558300 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:54.558206 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwhnx\" (UniqueName: \"kubernetes.io/projected/94ee8e4f-0d0b-4075-95d8-5f19844fb295-kube-api-access-hwhnx\") pod \"migrator-74bb7799d9-2hv2b\" (UID: \"94ee8e4f-0d0b-4075-95d8-5f19844fb295\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2hv2b" Apr 22 16:23:54.566250 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:54.566225 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwhnx\" (UniqueName: \"kubernetes.io/projected/94ee8e4f-0d0b-4075-95d8-5f19844fb295-kube-api-access-hwhnx\") pod \"migrator-74bb7799d9-2hv2b\" (UID: \"94ee8e4f-0d0b-4075-95d8-5f19844fb295\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2hv2b" Apr 22 16:23:54.649284 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:54.649259 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2hv2b" Apr 22 16:23:54.759374 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:54.759337 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-2hv2b"] Apr 22 16:23:54.762481 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:23:54.762451 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94ee8e4f_0d0b_4075_95d8_5f19844fb295.slice/crio-c00049125a28816e6118329de935e8675fc789c36de9aa671615a16b12f8356e WatchSource:0}: Error finding container c00049125a28816e6118329de935e8675fc789c36de9aa671615a16b12f8356e: Status 404 returned error can't find the container with id c00049125a28816e6118329de935e8675fc789c36de9aa671615a16b12f8356e Apr 22 16:23:55.235923 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:55.235895 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2hv2b" event={"ID":"94ee8e4f-0d0b-4075-95d8-5f19844fb295","Type":"ContainerStarted","Data":"c00049125a28816e6118329de935e8675fc789c36de9aa671615a16b12f8356e"} Apr 22 16:23:55.574418 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:55.574386 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-hx648" Apr 22 16:23:55.574418 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:55.574425 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-hx648" Apr 22 16:23:55.574739 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:55.574728 2573 scope.go:117] "RemoveContainer" containerID="c099cad984bc935b2d9326ad0e058dab68fe7e109c59c590f772fe0173580d00" Apr 22 16:23:55.574930 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:55.574913 2573 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-hx648_openshift-console-operator(4ae7e321-7d7a-4cff-b23f-dfbc5af07459)\"" pod="openshift-console-operator/console-operator-9d4b6777b-hx648" podUID="4ae7e321-7d7a-4cff-b23f-dfbc5af07459" Apr 22 16:23:56.069439 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:56.069414 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9c547d5-42c1-445c-b145-1e317ab8947a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-c7m6g\" (UID: \"c9c547d5-42c1-445c-b145-1e317ab8947a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7m6g" Apr 22 16:23:56.069541 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:56.069530 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 16:23:56.069586 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:56.069578 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9c547d5-42c1-445c-b145-1e317ab8947a-cluster-monitoring-operator-tls podName:c9c547d5-42c1-445c-b145-1e317ab8947a nodeName:}" failed. No retries permitted until 2026-04-22 16:24:12.069565401 +0000 UTC m=+156.004879604 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c9c547d5-42c1-445c-b145-1e317ab8947a-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-c7m6g" (UID: "c9c547d5-42c1-445c-b145-1e317ab8947a") : secret "cluster-monitoring-operator-tls" not found Apr 22 16:23:56.239133 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:56.239046 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2hv2b" event={"ID":"94ee8e4f-0d0b-4075-95d8-5f19844fb295","Type":"ContainerStarted","Data":"71aa30270051d4cbee4c8c820246eee5b03e4dbb3887c1c4a2617b9850189512"} Apr 22 16:23:56.239133 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:56.239085 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2hv2b" event={"ID":"94ee8e4f-0d0b-4075-95d8-5f19844fb295","Type":"ContainerStarted","Data":"4e0870b0fd5940e7b1369dc415271c06d17f958f661153bc5879e0cfc6f2cb82"} Apr 22 16:23:56.254021 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:56.253978 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-2hv2b" podStartSLOduration=1.041058989 podStartE2EDuration="2.253965558s" podCreationTimestamp="2026-04-22 16:23:54 +0000 UTC" firstStartedPulling="2026-04-22 16:23:54.764251496 +0000 UTC m=+138.699565699" lastFinishedPulling="2026-04-22 16:23:55.977158062 +0000 UTC m=+139.912472268" observedRunningTime="2026-04-22 16:23:56.253500804 +0000 UTC m=+140.188815028" watchObservedRunningTime="2026-04-22 16:23:56.253965558 +0000 UTC m=+140.189279783" Apr 22 16:23:56.749020 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:56.748988 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-xp5dc"] Apr 22 16:23:56.751936 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:56.751919 2573 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-xp5dc" Apr 22 16:23:56.754035 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:56.754015 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 22 16:23:56.754132 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:56.754067 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 22 16:23:56.754132 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:56.754077 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 22 16:23:56.754998 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:56.754980 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-6rdk7\"" Apr 22 16:23:56.755137 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:56.755003 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 22 16:23:56.759060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:56.759041 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-xp5dc"] Apr 22 16:23:56.775426 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:56.775400 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e32eb015-60ab-4c71-b711-f99dcc87477c-signing-key\") pod \"service-ca-865cb79987-xp5dc\" (UID: \"e32eb015-60ab-4c71-b711-f99dcc87477c\") " pod="openshift-service-ca/service-ca-865cb79987-xp5dc" Apr 22 16:23:56.775503 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:56.775465 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e32eb015-60ab-4c71-b711-f99dcc87477c-signing-cabundle\") pod \"service-ca-865cb79987-xp5dc\" (UID: \"e32eb015-60ab-4c71-b711-f99dcc87477c\") " pod="openshift-service-ca/service-ca-865cb79987-xp5dc" Apr 22 16:23:56.775503 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:56.775486 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fntln\" (UniqueName: \"kubernetes.io/projected/e32eb015-60ab-4c71-b711-f99dcc87477c-kube-api-access-fntln\") pod \"service-ca-865cb79987-xp5dc\" (UID: \"e32eb015-60ab-4c71-b711-f99dcc87477c\") " pod="openshift-service-ca/service-ca-865cb79987-xp5dc" Apr 22 16:23:56.876702 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:56.876668 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e32eb015-60ab-4c71-b711-f99dcc87477c-signing-key\") pod \"service-ca-865cb79987-xp5dc\" (UID: \"e32eb015-60ab-4c71-b711-f99dcc87477c\") " pod="openshift-service-ca/service-ca-865cb79987-xp5dc" Apr 22 16:23:56.876863 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:56.876781 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e32eb015-60ab-4c71-b711-f99dcc87477c-signing-cabundle\") pod \"service-ca-865cb79987-xp5dc\" (UID: \"e32eb015-60ab-4c71-b711-f99dcc87477c\") " pod="openshift-service-ca/service-ca-865cb79987-xp5dc" Apr 22 16:23:56.876863 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:56.876806 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fntln\" (UniqueName: \"kubernetes.io/projected/e32eb015-60ab-4c71-b711-f99dcc87477c-kube-api-access-fntln\") pod \"service-ca-865cb79987-xp5dc\" (UID: \"e32eb015-60ab-4c71-b711-f99dcc87477c\") " pod="openshift-service-ca/service-ca-865cb79987-xp5dc" Apr 22 16:23:56.878070 
ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:56.878042 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/e32eb015-60ab-4c71-b711-f99dcc87477c-signing-cabundle\") pod \"service-ca-865cb79987-xp5dc\" (UID: \"e32eb015-60ab-4c71-b711-f99dcc87477c\") " pod="openshift-service-ca/service-ca-865cb79987-xp5dc" Apr 22 16:23:56.879106 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:56.879085 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/e32eb015-60ab-4c71-b711-f99dcc87477c-signing-key\") pod \"service-ca-865cb79987-xp5dc\" (UID: \"e32eb015-60ab-4c71-b711-f99dcc87477c\") " pod="openshift-service-ca/service-ca-865cb79987-xp5dc" Apr 22 16:23:56.884381 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:56.884353 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fntln\" (UniqueName: \"kubernetes.io/projected/e32eb015-60ab-4c71-b711-f99dcc87477c-kube-api-access-fntln\") pod \"service-ca-865cb79987-xp5dc\" (UID: \"e32eb015-60ab-4c71-b711-f99dcc87477c\") " pod="openshift-service-ca/service-ca-865cb79987-xp5dc" Apr 22 16:23:57.061084 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:57.061051 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-xp5dc" Apr 22 16:23:57.169667 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:57.169636 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-xp5dc"] Apr 22 16:23:57.173113 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:23:57.173087 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode32eb015_60ab_4c71_b711_f99dcc87477c.slice/crio-b933d2b477b96b1f3138671212b1a64bd09f20fac3bd7931565043e4ed997ea5 WatchSource:0}: Error finding container b933d2b477b96b1f3138671212b1a64bd09f20fac3bd7931565043e4ed997ea5: Status 404 returned error can't find the container with id b933d2b477b96b1f3138671212b1a64bd09f20fac3bd7931565043e4ed997ea5 Apr 22 16:23:57.242533 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:57.242500 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-xp5dc" event={"ID":"e32eb015-60ab-4c71-b711-f99dcc87477c","Type":"ContainerStarted","Data":"b933d2b477b96b1f3138671212b1a64bd09f20fac3bd7931565043e4ed997ea5"} Apr 22 16:23:58.185266 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:58.185223 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-tls\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h" Apr 22 16:23:58.185440 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:58.185387 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 16:23:58.185440 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:58.185404 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-7867586f55-85f4h: secret "image-registry-tls" not found Apr 22 16:23:58.185537 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:23:58.185466 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-tls podName:84d6547c-fe65-4d04-ab06-5184bfe5d36e nodeName:}" failed. No retries permitted until 2026-04-22 16:24:06.185452403 +0000 UTC m=+150.120766629 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-tls") pod "image-registry-7867586f55-85f4h" (UID: "84d6547c-fe65-4d04-ab06-5184bfe5d36e") : secret "image-registry-tls" not found Apr 22 16:23:59.251697 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:59.251610 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-xp5dc" event={"ID":"e32eb015-60ab-4c71-b711-f99dcc87477c","Type":"ContainerStarted","Data":"95ea39b61d7210e5da1e515b67d75c5161166af44741778c37ae609b6ba05aef"} Apr 22 16:23:59.265594 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:23:59.265556 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-xp5dc" podStartSLOduration=1.5366982409999999 podStartE2EDuration="3.265543599s" podCreationTimestamp="2026-04-22 16:23:56 +0000 UTC" firstStartedPulling="2026-04-22 16:23:57.174856066 +0000 UTC m=+141.110170270" lastFinishedPulling="2026-04-22 16:23:58.903701419 +0000 UTC m=+142.839015628" observedRunningTime="2026-04-22 16:23:59.264790167 +0000 UTC m=+143.200104389" watchObservedRunningTime="2026-04-22 16:23:59.265543599 +0000 UTC m=+143.200857824" Apr 22 16:24:06.248121 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:06.248079 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-tls\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h" Apr 22 16:24:06.250553 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:06.250528 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-tls\") pod \"image-registry-7867586f55-85f4h\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") " pod="openshift-image-registry/image-registry-7867586f55-85f4h" Apr 22 16:24:06.327391 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:06.327347 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7867586f55-85f4h" Apr 22 16:24:06.451604 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:06.451580 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7867586f55-85f4h"] Apr 22 16:24:06.454023 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:24:06.453992 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84d6547c_fe65_4d04_ab06_5184bfe5d36e.slice/crio-b9fc76e261e2fb9229784e76c90c7e4ce06f6effbd6c52deb3f4006178e660b5 WatchSource:0}: Error finding container b9fc76e261e2fb9229784e76c90c7e4ce06f6effbd6c52deb3f4006178e660b5: Status 404 returned error can't find the container with id b9fc76e261e2fb9229784e76c90c7e4ce06f6effbd6c52deb3f4006178e660b5 Apr 22 16:24:07.274055 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:07.274015 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7867586f55-85f4h" event={"ID":"84d6547c-fe65-4d04-ab06-5184bfe5d36e","Type":"ContainerStarted","Data":"aa0ef58c645a9aa88346185f839817bc7756ede71e696223a80e8798a908ab8f"} Apr 22 
16:24:07.274055 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:07.274059 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7867586f55-85f4h" event={"ID":"84d6547c-fe65-4d04-ab06-5184bfe5d36e","Type":"ContainerStarted","Data":"b9fc76e261e2fb9229784e76c90c7e4ce06f6effbd6c52deb3f4006178e660b5"} Apr 22 16:24:07.274519 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:07.274091 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7867586f55-85f4h" Apr 22 16:24:07.296415 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:07.296358 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7867586f55-85f4h" podStartSLOduration=17.296340155 podStartE2EDuration="17.296340155s" podCreationTimestamp="2026-04-22 16:23:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:24:07.295136467 +0000 UTC m=+151.230450697" watchObservedRunningTime="2026-04-22 16:24:07.296340155 +0000 UTC m=+151.231654381" Apr 22 16:24:10.897012 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:10.896974 2573 scope.go:117] "RemoveContainer" containerID="c099cad984bc935b2d9326ad0e058dab68fe7e109c59c590f772fe0173580d00" Apr 22 16:24:11.284672 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:11.284595 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log" Apr 22 16:24:11.284852 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:11.284668 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-hx648" event={"ID":"4ae7e321-7d7a-4cff-b23f-dfbc5af07459","Type":"ContainerStarted","Data":"beb138f28310aa2190289d1e833fcc8958d1c9c2bcd0419f4c579f6ea6c182f6"} Apr 
22 16:24:11.284991 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:11.284970 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-hx648" Apr 22 16:24:11.301050 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:11.301007 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-hx648" podStartSLOduration=23.715958509 podStartE2EDuration="26.300995129s" podCreationTimestamp="2026-04-22 16:23:45 +0000 UTC" firstStartedPulling="2026-04-22 16:23:45.690514307 +0000 UTC m=+129.625828510" lastFinishedPulling="2026-04-22 16:23:48.275550924 +0000 UTC m=+132.210865130" observedRunningTime="2026-04-22 16:24:11.29971054 +0000 UTC m=+155.235024765" watchObservedRunningTime="2026-04-22 16:24:11.300995129 +0000 UTC m=+155.236309393" Apr 22 16:24:11.399005 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:11.398971 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-hx648" Apr 22 16:24:12.092994 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:12.092964 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9c547d5-42c1-445c-b145-1e317ab8947a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-c7m6g\" (UID: \"c9c547d5-42c1-445c-b145-1e317ab8947a\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7m6g" Apr 22 16:24:12.095075 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:12.095056 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c9c547d5-42c1-445c-b145-1e317ab8947a-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-c7m6g\" (UID: \"c9c547d5-42c1-445c-b145-1e317ab8947a\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7m6g" Apr 22 16:24:12.322516 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:12.322488 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7m6g" Apr 22 16:24:12.429346 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:12.429321 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-c7m6g"] Apr 22 16:24:12.432340 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:24:12.432310 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9c547d5_42c1_445c_b145_1e317ab8947a.slice/crio-e4ee6121e25410b51ddeb5cf66ea7e377739c56f23ff3aefd8ff0d4d549da28d WatchSource:0}: Error finding container e4ee6121e25410b51ddeb5cf66ea7e377739c56f23ff3aefd8ff0d4d549da28d: Status 404 returned error can't find the container with id e4ee6121e25410b51ddeb5cf66ea7e377739c56f23ff3aefd8ff0d4d549da28d Apr 22 16:24:12.925107 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:24:12.925070 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-cwt8x" podUID="e324836e-ef75-432e-978a-639279d2702e" Apr 22 16:24:12.946208 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:24:12.946189 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-c2xk7" podUID="bf42ef1a-eb82-48c4-b318-0b119dfdda61" Apr 22 16:24:12.960535 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:24:12.960505 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process 
volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-8524b" podUID="ed730508-b5b5-44cd-b56a-f58225697c5d" Apr 22 16:24:13.290703 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:13.290657 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-c2xk7" Apr 22 16:24:13.291178 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:13.290660 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7m6g" event={"ID":"c9c547d5-42c1-445c-b145-1e317ab8947a","Type":"ContainerStarted","Data":"e4ee6121e25410b51ddeb5cf66ea7e377739c56f23ff3aefd8ff0d4d549da28d"} Apr 22 16:24:15.297136 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:15.297102 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7m6g" event={"ID":"c9c547d5-42c1-445c-b145-1e317ab8947a","Type":"ContainerStarted","Data":"99ef85d88a00cf4c3a02ea87b52efabd078b3d5ac49b493efed6b85ee91d4f42"} Apr 22 16:24:15.311901 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:15.311850 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c7m6g" podStartSLOduration=33.191268698 podStartE2EDuration="35.311835012s" podCreationTimestamp="2026-04-22 16:23:40 +0000 UTC" firstStartedPulling="2026-04-22 16:24:12.434073768 +0000 UTC m=+156.369387971" lastFinishedPulling="2026-04-22 16:24:14.554640082 +0000 UTC m=+158.489954285" observedRunningTime="2026-04-22 16:24:15.311328108 +0000 UTC m=+159.246642336" watchObservedRunningTime="2026-04-22 16:24:15.311835012 +0000 UTC m=+159.247149237" Apr 22 16:24:15.961138 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:15.961105 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-65frr"] Apr 22 16:24:15.964039 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:15.964018 2573 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7867586f55-85f4h"] Apr 22 16:24:15.964169 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:15.964152 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-65frr" Apr 22 16:24:15.966384 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:15.966349 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-bkfkj\"" Apr 22 16:24:15.966502 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:15.966386 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 16:24:15.966502 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:15.966429 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 16:24:15.975545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:15.975523 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-65frr"] Apr 22 16:24:16.020920 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.020900 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtqdn\" (UniqueName: \"kubernetes.io/projected/a96be859-8cab-480c-a151-485aa4b28fca-kube-api-access-dtqdn\") pod \"downloads-6bcc868b7-65frr\" (UID: \"a96be859-8cab-480c-a151-485aa4b28fca\") " pod="openshift-console/downloads-6bcc868b7-65frr" Apr 22 16:24:16.075384 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.075361 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-q72tk"] Apr 22 16:24:16.078363 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.078346 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-q72tk"
Apr 22 16:24:16.081056 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.081038 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 16:24:16.081127 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.081069 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 16:24:16.081324 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.081302 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 16:24:16.081397 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.081381 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 16:24:16.081436 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.081381 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-llksk\""
Apr 22 16:24:16.088844 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.088825 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-q72tk"]
Apr 22 16:24:16.122165 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.122142 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dbd5bed6-525a-4375-8584-35c15db9f5ac-data-volume\") pod \"insights-runtime-extractor-q72tk\" (UID: \"dbd5bed6-525a-4375-8584-35c15db9f5ac\") " pod="openshift-insights/insights-runtime-extractor-q72tk"
Apr 22 16:24:16.122269 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.122177 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dbd5bed6-525a-4375-8584-35c15db9f5ac-crio-socket\") pod \"insights-runtime-extractor-q72tk\" (UID: \"dbd5bed6-525a-4375-8584-35c15db9f5ac\") " pod="openshift-insights/insights-runtime-extractor-q72tk"
Apr 22 16:24:16.122269 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.122201 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dbd5bed6-525a-4375-8584-35c15db9f5ac-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q72tk\" (UID: \"dbd5bed6-525a-4375-8584-35c15db9f5ac\") " pod="openshift-insights/insights-runtime-extractor-q72tk"
Apr 22 16:24:16.122342 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.122271 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dtqdn\" (UniqueName: \"kubernetes.io/projected/a96be859-8cab-480c-a151-485aa4b28fca-kube-api-access-dtqdn\") pod \"downloads-6bcc868b7-65frr\" (UID: \"a96be859-8cab-480c-a151-485aa4b28fca\") " pod="openshift-console/downloads-6bcc868b7-65frr"
Apr 22 16:24:16.122342 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.122299 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd7pk\" (UniqueName: \"kubernetes.io/projected/dbd5bed6-525a-4375-8584-35c15db9f5ac-kube-api-access-wd7pk\") pod \"insights-runtime-extractor-q72tk\" (UID: \"dbd5bed6-525a-4375-8584-35c15db9f5ac\") " pod="openshift-insights/insights-runtime-extractor-q72tk"
Apr 22 16:24:16.122342 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.122318 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dbd5bed6-525a-4375-8584-35c15db9f5ac-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q72tk\" (UID: \"dbd5bed6-525a-4375-8584-35c15db9f5ac\") " pod="openshift-insights/insights-runtime-extractor-q72tk"
Apr 22 16:24:16.133098 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.133076 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtqdn\" (UniqueName: \"kubernetes.io/projected/a96be859-8cab-480c-a151-485aa4b28fca-kube-api-access-dtqdn\") pod \"downloads-6bcc868b7-65frr\" (UID: \"a96be859-8cab-480c-a151-485aa4b28fca\") " pod="openshift-console/downloads-6bcc868b7-65frr"
Apr 22 16:24:16.223164 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.223111 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dbd5bed6-525a-4375-8584-35c15db9f5ac-crio-socket\") pod \"insights-runtime-extractor-q72tk\" (UID: \"dbd5bed6-525a-4375-8584-35c15db9f5ac\") " pod="openshift-insights/insights-runtime-extractor-q72tk"
Apr 22 16:24:16.223164 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.223142 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dbd5bed6-525a-4375-8584-35c15db9f5ac-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q72tk\" (UID: \"dbd5bed6-525a-4375-8584-35c15db9f5ac\") " pod="openshift-insights/insights-runtime-extractor-q72tk"
Apr 22 16:24:16.223372 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.223200 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wd7pk\" (UniqueName: \"kubernetes.io/projected/dbd5bed6-525a-4375-8584-35c15db9f5ac-kube-api-access-wd7pk\") pod \"insights-runtime-extractor-q72tk\" (UID: \"dbd5bed6-525a-4375-8584-35c15db9f5ac\") " pod="openshift-insights/insights-runtime-extractor-q72tk"
Apr 22 16:24:16.223372 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.223227 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/dbd5bed6-525a-4375-8584-35c15db9f5ac-crio-socket\") pod \"insights-runtime-extractor-q72tk\" (UID: \"dbd5bed6-525a-4375-8584-35c15db9f5ac\") " pod="openshift-insights/insights-runtime-extractor-q72tk"
Apr 22 16:24:16.223372 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.223232 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dbd5bed6-525a-4375-8584-35c15db9f5ac-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q72tk\" (UID: \"dbd5bed6-525a-4375-8584-35c15db9f5ac\") " pod="openshift-insights/insights-runtime-extractor-q72tk"
Apr 22 16:24:16.223372 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.223304 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dbd5bed6-525a-4375-8584-35c15db9f5ac-data-volume\") pod \"insights-runtime-extractor-q72tk\" (UID: \"dbd5bed6-525a-4375-8584-35c15db9f5ac\") " pod="openshift-insights/insights-runtime-extractor-q72tk"
Apr 22 16:24:16.223655 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.223639 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/dbd5bed6-525a-4375-8584-35c15db9f5ac-data-volume\") pod \"insights-runtime-extractor-q72tk\" (UID: \"dbd5bed6-525a-4375-8584-35c15db9f5ac\") " pod="openshift-insights/insights-runtime-extractor-q72tk"
Apr 22 16:24:16.223797 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.223777 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/dbd5bed6-525a-4375-8584-35c15db9f5ac-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q72tk\" (UID: \"dbd5bed6-525a-4375-8584-35c15db9f5ac\") " pod="openshift-insights/insights-runtime-extractor-q72tk"
Apr 22 16:24:16.225307 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.225292 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/dbd5bed6-525a-4375-8584-35c15db9f5ac-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q72tk\" (UID: \"dbd5bed6-525a-4375-8584-35c15db9f5ac\") " pod="openshift-insights/insights-runtime-extractor-q72tk"
Apr 22 16:24:16.251346 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.251321 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd7pk\" (UniqueName: \"kubernetes.io/projected/dbd5bed6-525a-4375-8584-35c15db9f5ac-kube-api-access-wd7pk\") pod \"insights-runtime-extractor-q72tk\" (UID: \"dbd5bed6-525a-4375-8584-35c15db9f5ac\") " pod="openshift-insights/insights-runtime-extractor-q72tk"
Apr 22 16:24:16.273314 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.273297 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-65frr"
Apr 22 16:24:16.386649 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.386621 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-65frr"]
Apr 22 16:24:16.386968 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.386710 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-q72tk"
Apr 22 16:24:16.389623 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:24:16.389597 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda96be859_8cab_480c_a151_485aa4b28fca.slice/crio-fa537f738044355ac28742f16b89e07df00aef9a7996fe62280672619e90cd7b WatchSource:0}: Error finding container fa537f738044355ac28742f16b89e07df00aef9a7996fe62280672619e90cd7b: Status 404 returned error can't find the container with id fa537f738044355ac28742f16b89e07df00aef9a7996fe62280672619e90cd7b
Apr 22 16:24:16.502317 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:16.502252 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-q72tk"]
Apr 22 16:24:16.505606 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:24:16.505570 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbd5bed6_525a_4375_8584_35c15db9f5ac.slice/crio-c31bacbd69df34660c1d88bad46fe085fe08b32ceb46913e699d4e787997e3bc WatchSource:0}: Error finding container c31bacbd69df34660c1d88bad46fe085fe08b32ceb46913e699d4e787997e3bc: Status 404 returned error can't find the container with id c31bacbd69df34660c1d88bad46fe085fe08b32ceb46913e699d4e787997e3bc
Apr 22 16:24:17.303367 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:17.303322 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-65frr" event={"ID":"a96be859-8cab-480c-a151-485aa4b28fca","Type":"ContainerStarted","Data":"fa537f738044355ac28742f16b89e07df00aef9a7996fe62280672619e90cd7b"}
Apr 22 16:24:17.304711 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:17.304673 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q72tk" event={"ID":"dbd5bed6-525a-4375-8584-35c15db9f5ac","Type":"ContainerStarted","Data":"f6b879a01a1d0e453daec5b92aa52df532ad0349b4926406c6461ca4f0c9f37b"}
Apr 22 16:24:17.304711 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:17.304708 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q72tk" event={"ID":"dbd5bed6-525a-4375-8584-35c15db9f5ac","Type":"ContainerStarted","Data":"c31bacbd69df34660c1d88bad46fe085fe08b32ceb46913e699d4e787997e3bc"}
Apr 22 16:24:17.835192 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:17.835157 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls\") pod \"dns-default-8524b\" (UID: \"ed730508-b5b5-44cd-b56a-f58225697c5d\") " pod="openshift-dns/dns-default-8524b"
Apr 22 16:24:17.835560 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:17.835200 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert\") pod \"ingress-canary-c2xk7\" (UID: \"bf42ef1a-eb82-48c4-b318-0b119dfdda61\") " pod="openshift-ingress-canary/ingress-canary-c2xk7"
Apr 22 16:24:17.837547 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:17.837518 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bf42ef1a-eb82-48c4-b318-0b119dfdda61-cert\") pod \"ingress-canary-c2xk7\" (UID: \"bf42ef1a-eb82-48c4-b318-0b119dfdda61\") " pod="openshift-ingress-canary/ingress-canary-c2xk7"
Apr 22 16:24:17.837647 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:17.837556 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed730508-b5b5-44cd-b56a-f58225697c5d-metrics-tls\") pod \"dns-default-8524b\" (UID: \"ed730508-b5b5-44cd-b56a-f58225697c5d\") " pod="openshift-dns/dns-default-8524b"
Apr 22 16:24:18.093914 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:18.093831 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pdvf2\""
Apr 22 16:24:18.101836 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:18.101789 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-c2xk7"
Apr 22 16:24:18.230033 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:18.230002 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-c2xk7"]
Apr 22 16:24:18.232963 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:24:18.232922 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf42ef1a_eb82_48c4_b318_0b119dfdda61.slice/crio-c2eed9312b09d3c1ef46c3c415ce76d09ed050129d14162a06011a04eaf93c80 WatchSource:0}: Error finding container c2eed9312b09d3c1ef46c3c415ce76d09ed050129d14162a06011a04eaf93c80: Status 404 returned error can't find the container with id c2eed9312b09d3c1ef46c3c415ce76d09ed050129d14162a06011a04eaf93c80
Apr 22 16:24:18.308473 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:18.308441 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-c2xk7" event={"ID":"bf42ef1a-eb82-48c4-b318-0b119dfdda61","Type":"ContainerStarted","Data":"c2eed9312b09d3c1ef46c3c415ce76d09ed050129d14162a06011a04eaf93c80"}
Apr 22 16:24:18.309825 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:18.309793 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q72tk" event={"ID":"dbd5bed6-525a-4375-8584-35c15db9f5ac","Type":"ContainerStarted","Data":"0f33e177917ce37b9e80d1689c170608025cdfb18f0ef3ccf8674c639c2c6ceb"}
Apr 22 16:24:20.117542 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:20.117503 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-gmcgm"]
Apr 22 16:24:20.122391 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:20.122016 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-gmcgm"
Apr 22 16:24:20.125543 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:20.125323 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 22 16:24:20.125543 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:20.125355 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 22 16:24:20.126437 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:20.126413 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 16:24:20.126687 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:20.126667 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-99gtt\""
Apr 22 16:24:20.132528 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:20.132504 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-gmcgm"]
Apr 22 16:24:20.255305 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:20.255058 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w96t\" (UniqueName: \"kubernetes.io/projected/82e21656-91bb-49a6-8a79-6241b36c61e0-kube-api-access-2w96t\") pod \"prometheus-operator-5676c8c784-gmcgm\" (UID: \"82e21656-91bb-49a6-8a79-6241b36c61e0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gmcgm"
Apr 22 16:24:20.255305 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:20.255115 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/82e21656-91bb-49a6-8a79-6241b36c61e0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-gmcgm\" (UID: \"82e21656-91bb-49a6-8a79-6241b36c61e0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gmcgm"
Apr 22 16:24:20.255305 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:20.255169 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/82e21656-91bb-49a6-8a79-6241b36c61e0-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gmcgm\" (UID: \"82e21656-91bb-49a6-8a79-6241b36c61e0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gmcgm"
Apr 22 16:24:20.255305 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:20.255193 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/82e21656-91bb-49a6-8a79-6241b36c61e0-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-gmcgm\" (UID: \"82e21656-91bb-49a6-8a79-6241b36c61e0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gmcgm"
Apr 22 16:24:20.316896 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:20.316858 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-c2xk7" event={"ID":"bf42ef1a-eb82-48c4-b318-0b119dfdda61","Type":"ContainerStarted","Data":"6b1e3ef4e2fc2d9cb4f34de786bd7a777c08f5e2a2afb6ebe435cf77e39d66fd"}
Apr 22 16:24:20.318806 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:20.318782 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q72tk" event={"ID":"dbd5bed6-525a-4375-8584-35c15db9f5ac","Type":"ContainerStarted","Data":"a950dba36e4e3ae2502e68a2ae8e45227dc1c262ee585c99930ec05c52e1e143"}
Apr 22 16:24:20.334310 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:20.334260 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-c2xk7" podStartSLOduration=129.318965488 podStartE2EDuration="2m11.334245267s" podCreationTimestamp="2026-04-22 16:22:09 +0000 UTC" firstStartedPulling="2026-04-22 16:24:18.23494668 +0000 UTC m=+162.170260898" lastFinishedPulling="2026-04-22 16:24:20.250226474 +0000 UTC m=+164.185540677" observedRunningTime="2026-04-22 16:24:20.333982619 +0000 UTC m=+164.269296844" watchObservedRunningTime="2026-04-22 16:24:20.334245267 +0000 UTC m=+164.269559491"
Apr 22 16:24:20.353046 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:20.352979 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-q72tk" podStartSLOduration=1.6753521120000001 podStartE2EDuration="4.352959774s" podCreationTimestamp="2026-04-22 16:24:16 +0000 UTC" firstStartedPulling="2026-04-22 16:24:16.575706798 +0000 UTC m=+160.511021002" lastFinishedPulling="2026-04-22 16:24:19.253314448 +0000 UTC m=+163.188628664" observedRunningTime="2026-04-22 16:24:20.352141009 +0000 UTC m=+164.287455248" watchObservedRunningTime="2026-04-22 16:24:20.352959774 +0000 UTC m=+164.288274002"
Apr 22 16:24:20.355808 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:20.355782 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2w96t\" (UniqueName: \"kubernetes.io/projected/82e21656-91bb-49a6-8a79-6241b36c61e0-kube-api-access-2w96t\") pod \"prometheus-operator-5676c8c784-gmcgm\" (UID: \"82e21656-91bb-49a6-8a79-6241b36c61e0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gmcgm"
Apr 22 16:24:20.355951 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:20.355848 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/82e21656-91bb-49a6-8a79-6241b36c61e0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-gmcgm\" (UID: \"82e21656-91bb-49a6-8a79-6241b36c61e0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gmcgm"
Apr 22 16:24:20.355951 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:20.355893 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/82e21656-91bb-49a6-8a79-6241b36c61e0-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gmcgm\" (UID: \"82e21656-91bb-49a6-8a79-6241b36c61e0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gmcgm"
Apr 22 16:24:20.355951 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:20.355931 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/82e21656-91bb-49a6-8a79-6241b36c61e0-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-gmcgm\" (UID: \"82e21656-91bb-49a6-8a79-6241b36c61e0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gmcgm"
Apr 22 16:24:20.356720 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:20.356673 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/82e21656-91bb-49a6-8a79-6241b36c61e0-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-gmcgm\" (UID: \"82e21656-91bb-49a6-8a79-6241b36c61e0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gmcgm"
Apr 22 16:24:20.358985 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:20.358960 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/82e21656-91bb-49a6-8a79-6241b36c61e0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-gmcgm\" (UID: \"82e21656-91bb-49a6-8a79-6241b36c61e0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gmcgm"
Apr 22 16:24:20.359156 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:20.359135 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/82e21656-91bb-49a6-8a79-6241b36c61e0-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-gmcgm\" (UID: \"82e21656-91bb-49a6-8a79-6241b36c61e0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gmcgm"
Apr 22 16:24:20.368077 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:20.367998 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w96t\" (UniqueName: \"kubernetes.io/projected/82e21656-91bb-49a6-8a79-6241b36c61e0-kube-api-access-2w96t\") pod \"prometheus-operator-5676c8c784-gmcgm\" (UID: \"82e21656-91bb-49a6-8a79-6241b36c61e0\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-gmcgm"
Apr 22 16:24:20.434284 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:20.434257 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-gmcgm"
Apr 22 16:24:20.561647 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:20.561616 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-gmcgm"]
Apr 22 16:24:20.564870 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:24:20.564845 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82e21656_91bb_49a6_8a79_6241b36c61e0.slice/crio-9c0a69ed969832d15b278c645a5035c2039ac43e5ced96f00f788604674fb5cf WatchSource:0}: Error finding container 9c0a69ed969832d15b278c645a5035c2039ac43e5ced96f00f788604674fb5cf: Status 404 returned error can't find the container with id 9c0a69ed969832d15b278c645a5035c2039ac43e5ced96f00f788604674fb5cf
Apr 22 16:24:21.323617 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.323575 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-gmcgm" event={"ID":"82e21656-91bb-49a6-8a79-6241b36c61e0","Type":"ContainerStarted","Data":"9c0a69ed969832d15b278c645a5035c2039ac43e5ced96f00f788604674fb5cf"}
Apr 22 16:24:21.703722 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.703624 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-58697f9fbb-p6fvn"]
Apr 22 16:24:21.707606 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.707560 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58697f9fbb-p6fvn"
Apr 22 16:24:21.711209 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.711131 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 22 16:24:21.712774 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.712729 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 22 16:24:21.712889 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.712824 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 22 16:24:21.712955 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.712917 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 22 16:24:21.713200 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.713179 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-59fnm\""
Apr 22 16:24:21.713472 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.713366 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 22 16:24:21.718081 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.718048 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58697f9fbb-p6fvn"]
Apr 22 16:24:21.774075 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.774045 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r65xv\" (UniqueName: \"kubernetes.io/projected/3f6ac367-d437-4342-9f76-bb16839b4f6a-kube-api-access-r65xv\") pod \"console-58697f9fbb-p6fvn\" (UID: \"3f6ac367-d437-4342-9f76-bb16839b4f6a\") " pod="openshift-console/console-58697f9fbb-p6fvn"
Apr 22 16:24:21.774236 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.774116 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f6ac367-d437-4342-9f76-bb16839b4f6a-console-serving-cert\") pod \"console-58697f9fbb-p6fvn\" (UID: \"3f6ac367-d437-4342-9f76-bb16839b4f6a\") " pod="openshift-console/console-58697f9fbb-p6fvn"
Apr 22 16:24:21.774236 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.774175 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f6ac367-d437-4342-9f76-bb16839b4f6a-oauth-serving-cert\") pod \"console-58697f9fbb-p6fvn\" (UID: \"3f6ac367-d437-4342-9f76-bb16839b4f6a\") " pod="openshift-console/console-58697f9fbb-p6fvn"
Apr 22 16:24:21.774236 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.774214 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3f6ac367-d437-4342-9f76-bb16839b4f6a-console-oauth-config\") pod \"console-58697f9fbb-p6fvn\" (UID: \"3f6ac367-d437-4342-9f76-bb16839b4f6a\") " pod="openshift-console/console-58697f9fbb-p6fvn"
Apr 22 16:24:21.774236 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.774232 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3f6ac367-d437-4342-9f76-bb16839b4f6a-console-config\") pod \"console-58697f9fbb-p6fvn\" (UID: \"3f6ac367-d437-4342-9f76-bb16839b4f6a\") " pod="openshift-console/console-58697f9fbb-p6fvn"
Apr 22 16:24:21.774439 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.774250 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f6ac367-d437-4342-9f76-bb16839b4f6a-service-ca\") pod \"console-58697f9fbb-p6fvn\" (UID: \"3f6ac367-d437-4342-9f76-bb16839b4f6a\") " pod="openshift-console/console-58697f9fbb-p6fvn"
Apr 22 16:24:21.875438 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.875393 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r65xv\" (UniqueName: \"kubernetes.io/projected/3f6ac367-d437-4342-9f76-bb16839b4f6a-kube-api-access-r65xv\") pod \"console-58697f9fbb-p6fvn\" (UID: \"3f6ac367-d437-4342-9f76-bb16839b4f6a\") " pod="openshift-console/console-58697f9fbb-p6fvn"
Apr 22 16:24:21.875622 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.875465 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f6ac367-d437-4342-9f76-bb16839b4f6a-console-serving-cert\") pod \"console-58697f9fbb-p6fvn\" (UID: \"3f6ac367-d437-4342-9f76-bb16839b4f6a\") " pod="openshift-console/console-58697f9fbb-p6fvn"
Apr 22 16:24:21.875622 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.875518 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f6ac367-d437-4342-9f76-bb16839b4f6a-oauth-serving-cert\") pod \"console-58697f9fbb-p6fvn\" (UID: \"3f6ac367-d437-4342-9f76-bb16839b4f6a\") " pod="openshift-console/console-58697f9fbb-p6fvn"
Apr 22 16:24:21.875622 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.875561 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3f6ac367-d437-4342-9f76-bb16839b4f6a-console-oauth-config\") pod \"console-58697f9fbb-p6fvn\" (UID: \"3f6ac367-d437-4342-9f76-bb16839b4f6a\") " pod="openshift-console/console-58697f9fbb-p6fvn"
Apr 22 16:24:21.875622 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.875589 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3f6ac367-d437-4342-9f76-bb16839b4f6a-console-config\") pod \"console-58697f9fbb-p6fvn\" (UID: \"3f6ac367-d437-4342-9f76-bb16839b4f6a\") " pod="openshift-console/console-58697f9fbb-p6fvn"
Apr 22 16:24:21.875622 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.875618 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f6ac367-d437-4342-9f76-bb16839b4f6a-service-ca\") pod \"console-58697f9fbb-p6fvn\" (UID: \"3f6ac367-d437-4342-9f76-bb16839b4f6a\") " pod="openshift-console/console-58697f9fbb-p6fvn"
Apr 22 16:24:21.876539 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.876481 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3f6ac367-d437-4342-9f76-bb16839b4f6a-console-config\") pod \"console-58697f9fbb-p6fvn\" (UID: \"3f6ac367-d437-4342-9f76-bb16839b4f6a\") " pod="openshift-console/console-58697f9fbb-p6fvn"
Apr 22 16:24:21.876655 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.876545 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f6ac367-d437-4342-9f76-bb16839b4f6a-oauth-serving-cert\") pod \"console-58697f9fbb-p6fvn\" (UID: \"3f6ac367-d437-4342-9f76-bb16839b4f6a\") " pod="openshift-console/console-58697f9fbb-p6fvn"
Apr 22 16:24:21.877958 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.877893 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f6ac367-d437-4342-9f76-bb16839b4f6a-service-ca\") pod \"console-58697f9fbb-p6fvn\" (UID: \"3f6ac367-d437-4342-9f76-bb16839b4f6a\") " pod="openshift-console/console-58697f9fbb-p6fvn"
Apr 22 16:24:21.878460 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.878438 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3f6ac367-d437-4342-9f76-bb16839b4f6a-console-oauth-config\") pod \"console-58697f9fbb-p6fvn\" (UID: \"3f6ac367-d437-4342-9f76-bb16839b4f6a\") " pod="openshift-console/console-58697f9fbb-p6fvn"
Apr 22 16:24:21.878938 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.878895 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f6ac367-d437-4342-9f76-bb16839b4f6a-console-serving-cert\") pod \"console-58697f9fbb-p6fvn\" (UID: \"3f6ac367-d437-4342-9f76-bb16839b4f6a\") " pod="openshift-console/console-58697f9fbb-p6fvn"
Apr 22 16:24:21.884407 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:21.884385 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r65xv\" (UniqueName: \"kubernetes.io/projected/3f6ac367-d437-4342-9f76-bb16839b4f6a-kube-api-access-r65xv\") pod \"console-58697f9fbb-p6fvn\" (UID: \"3f6ac367-d437-4342-9f76-bb16839b4f6a\") " pod="openshift-console/console-58697f9fbb-p6fvn"
Apr 22 16:24:22.021105 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:22.021024 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58697f9fbb-p6fvn"
Apr 22 16:24:22.156099 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:22.156068 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58697f9fbb-p6fvn"]
Apr 22 16:24:22.249774 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:24:22.249719 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f6ac367_d437_4342_9f76_bb16839b4f6a.slice/crio-077b82834f807230b6da21693be25e69d91778bfc48d18a93a3d3933c2426093 WatchSource:0}: Error finding container 077b82834f807230b6da21693be25e69d91778bfc48d18a93a3d3933c2426093: Status 404 returned error can't find the container with id 077b82834f807230b6da21693be25e69d91778bfc48d18a93a3d3933c2426093
Apr 22 16:24:22.330043 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:22.330016 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58697f9fbb-p6fvn" event={"ID":"3f6ac367-d437-4342-9f76-bb16839b4f6a","Type":"ContainerStarted","Data":"077b82834f807230b6da21693be25e69d91778bfc48d18a93a3d3933c2426093"}
Apr 22 16:24:23.335158 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:23.335118 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-gmcgm" event={"ID":"82e21656-91bb-49a6-8a79-6241b36c61e0","Type":"ContainerStarted","Data":"1aab34c6d34703fdcf304f2079c2c259d96d3c88f967a81dd5ed2ad02a3e178e"}
Apr 22 16:24:23.335635 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:23.335169 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-gmcgm" event={"ID":"82e21656-91bb-49a6-8a79-6241b36c61e0","Type":"ContainerStarted","Data":"7183becfddb3be05c4e90b018370b09a8e71c01c50259e103ea82318e393a763"}
Apr 22 16:24:23.354906 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:23.354861 2573 pod_startup_latency_tracker.go:104] "Observed pod
startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-gmcgm" podStartSLOduration=1.652778861 podStartE2EDuration="3.354848032s" podCreationTimestamp="2026-04-22 16:24:20 +0000 UTC" firstStartedPulling="2026-04-22 16:24:20.566677979 +0000 UTC m=+164.501992186" lastFinishedPulling="2026-04-22 16:24:22.26874715 +0000 UTC m=+166.204061357" observedRunningTime="2026-04-22 16:24:23.353592179 +0000 UTC m=+167.288906405" watchObservedRunningTime="2026-04-22 16:24:23.354848032 +0000 UTC m=+167.290162256" Apr 22 16:24:25.475086 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.473176 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-62wb6"] Apr 22 16:24:25.485804 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.482787 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-jfwzw"] Apr 22 16:24:25.485804 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.485167 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jfwzw" Apr 22 16:24:25.485804 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.485777 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.494278 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.494118 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 16:24:25.494383 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.494320 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 16:24:25.494533 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.494486 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-6fs7l\"" Apr 22 16:24:25.494696 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.494678 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 22 16:24:25.494897 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.494881 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 16:24:25.496602 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.496579 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-jfwzw"] Apr 22 16:24:25.496694 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.496670 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 22 16:24:25.497104 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.497086 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-q77pt\"" Apr 22 16:24:25.610104 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.610070 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9ff261de-df3b-469b-a87d-1dc5330b2f0c-node-exporter-textfile\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.610256 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.610131 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9ff261de-df3b-469b-a87d-1dc5330b2f0c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.610256 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.610179 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9ff261de-df3b-469b-a87d-1dc5330b2f0c-node-exporter-wtmp\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.610256 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.610206 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9ff261de-df3b-469b-a87d-1dc5330b2f0c-sys\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.610256 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.610236 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7f700c39-c35c-4f3b-b024-463588166278-openshift-state-metrics-kube-rbac-proxy-config\") pod 
\"openshift-state-metrics-9d44df66c-jfwzw\" (UID: \"7f700c39-c35c-4f3b-b024-463588166278\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jfwzw" Apr 22 16:24:25.610481 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.610260 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sr55\" (UniqueName: \"kubernetes.io/projected/9ff261de-df3b-469b-a87d-1dc5330b2f0c-kube-api-access-9sr55\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.610481 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.610288 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7f700c39-c35c-4f3b-b024-463588166278-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-jfwzw\" (UID: \"7f700c39-c35c-4f3b-b024-463588166278\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jfwzw" Apr 22 16:24:25.610481 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.610325 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv49w\" (UniqueName: \"kubernetes.io/projected/7f700c39-c35c-4f3b-b024-463588166278-kube-api-access-gv49w\") pod \"openshift-state-metrics-9d44df66c-jfwzw\" (UID: \"7f700c39-c35c-4f3b-b024-463588166278\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jfwzw" Apr 22 16:24:25.610481 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.610348 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9ff261de-df3b-469b-a87d-1dc5330b2f0c-node-exporter-tls\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.610481 ip-10-0-141-251 
kubenswrapper[2573]: I0422 16:24:25.610373 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9ff261de-df3b-469b-a87d-1dc5330b2f0c-metrics-client-ca\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.610481 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.610431 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9ff261de-df3b-469b-a87d-1dc5330b2f0c-root\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.610481 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.610457 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9ff261de-df3b-469b-a87d-1dc5330b2f0c-node-exporter-accelerators-collector-config\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.610836 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.610488 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f700c39-c35c-4f3b-b024-463588166278-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-jfwzw\" (UID: \"7f700c39-c35c-4f3b-b024-463588166278\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jfwzw" Apr 22 16:24:25.711872 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.711791 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/9ff261de-df3b-469b-a87d-1dc5330b2f0c-root\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.711872 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.711836 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9ff261de-df3b-469b-a87d-1dc5330b2f0c-node-exporter-accelerators-collector-config\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.712095 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.711917 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9ff261de-df3b-469b-a87d-1dc5330b2f0c-root\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.712095 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.711969 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f700c39-c35c-4f3b-b024-463588166278-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-jfwzw\" (UID: \"7f700c39-c35c-4f3b-b024-463588166278\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jfwzw" Apr 22 16:24:25.712095 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.712025 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9ff261de-df3b-469b-a87d-1dc5330b2f0c-node-exporter-textfile\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.712095 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.712068 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9ff261de-df3b-469b-a87d-1dc5330b2f0c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.712293 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.712116 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9ff261de-df3b-469b-a87d-1dc5330b2f0c-node-exporter-wtmp\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.712293 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.712143 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9ff261de-df3b-469b-a87d-1dc5330b2f0c-sys\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.712293 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:24:25.712152 2573 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 22 16:24:25.712293 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.712174 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7f700c39-c35c-4f3b-b024-463588166278-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-jfwzw\" (UID: \"7f700c39-c35c-4f3b-b024-463588166278\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jfwzw" Apr 22 16:24:25.712293 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.712201 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9sr55\" (UniqueName: \"kubernetes.io/projected/9ff261de-df3b-469b-a87d-1dc5330b2f0c-kube-api-access-9sr55\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.712293 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:24:25.712237 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f700c39-c35c-4f3b-b024-463588166278-openshift-state-metrics-tls podName:7f700c39-c35c-4f3b-b024-463588166278 nodeName:}" failed. No retries permitted until 2026-04-22 16:24:26.212215238 +0000 UTC m=+170.147529442 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/7f700c39-c35c-4f3b-b024-463588166278-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-jfwzw" (UID: "7f700c39-c35c-4f3b-b024-463588166278") : secret "openshift-state-metrics-tls" not found Apr 22 16:24:25.712293 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.712271 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7f700c39-c35c-4f3b-b024-463588166278-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-jfwzw\" (UID: \"7f700c39-c35c-4f3b-b024-463588166278\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jfwzw" Apr 22 16:24:25.712591 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.712312 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gv49w\" (UniqueName: \"kubernetes.io/projected/7f700c39-c35c-4f3b-b024-463588166278-kube-api-access-gv49w\") pod \"openshift-state-metrics-9d44df66c-jfwzw\" (UID: \"7f700c39-c35c-4f3b-b024-463588166278\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jfwzw" Apr 22 16:24:25.712591 ip-10-0-141-251 kubenswrapper[2573]: I0422 
16:24:25.712339 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9ff261de-df3b-469b-a87d-1dc5330b2f0c-node-exporter-tls\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.712591 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.712362 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9ff261de-df3b-469b-a87d-1dc5330b2f0c-metrics-client-ca\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.712591 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.712399 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9ff261de-df3b-469b-a87d-1dc5330b2f0c-node-exporter-accelerators-collector-config\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.712801 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.712598 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9ff261de-df3b-469b-a87d-1dc5330b2f0c-node-exporter-wtmp\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.712801 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.712649 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9ff261de-df3b-469b-a87d-1dc5330b2f0c-sys\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.712913 
ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.712862 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9ff261de-df3b-469b-a87d-1dc5330b2f0c-metrics-client-ca\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.713084 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.713008 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9ff261de-df3b-469b-a87d-1dc5330b2f0c-node-exporter-textfile\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.713345 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.713293 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7f700c39-c35c-4f3b-b024-463588166278-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-jfwzw\" (UID: \"7f700c39-c35c-4f3b-b024-463588166278\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jfwzw" Apr 22 16:24:25.715521 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.715480 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9ff261de-df3b-469b-a87d-1dc5330b2f0c-node-exporter-tls\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.716588 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.716563 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9ff261de-df3b-469b-a87d-1dc5330b2f0c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-62wb6\" (UID: 
\"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.716704 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.716575 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7f700c39-c35c-4f3b-b024-463588166278-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-jfwzw\" (UID: \"7f700c39-c35c-4f3b-b024-463588166278\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jfwzw" Apr 22 16:24:25.721580 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.721555 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sr55\" (UniqueName: \"kubernetes.io/projected/9ff261de-df3b-469b-a87d-1dc5330b2f0c-kube-api-access-9sr55\") pod \"node-exporter-62wb6\" (UID: \"9ff261de-df3b-469b-a87d-1dc5330b2f0c\") " pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.722049 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.722008 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv49w\" (UniqueName: \"kubernetes.io/projected/7f700c39-c35c-4f3b-b024-463588166278-kube-api-access-gv49w\") pod \"openshift-state-metrics-9d44df66c-jfwzw\" (UID: \"7f700c39-c35c-4f3b-b024-463588166278\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jfwzw" Apr 22 16:24:25.813498 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.813459 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-62wb6" Apr 22 16:24:25.823419 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:24:25.823384 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ff261de_df3b_469b_a87d_1dc5330b2f0c.slice/crio-f81aa5764fc966155a7309b9d4135939626e48670270b44ecb651d87f874b623 WatchSource:0}: Error finding container f81aa5764fc966155a7309b9d4135939626e48670270b44ecb651d87f874b623: Status 404 returned error can't find the container with id f81aa5764fc966155a7309b9d4135939626e48670270b44ecb651d87f874b623 Apr 22 16:24:25.896513 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.896465 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cwt8x" Apr 22 16:24:25.970269 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.970225 2573 patch_prober.go:28] interesting pod/image-registry-7867586f55-85f4h container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 16:24:25.970435 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:25.970287 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7867586f55-85f4h" podUID="84d6547c-fe65-4d04-ab06-5184bfe5d36e" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 16:24:26.216449 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:26.216371 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f700c39-c35c-4f3b-b024-463588166278-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-jfwzw\" (UID: 
\"7f700c39-c35c-4f3b-b024-463588166278\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jfwzw" Apr 22 16:24:26.218918 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:26.218894 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7f700c39-c35c-4f3b-b024-463588166278-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-jfwzw\" (UID: \"7f700c39-c35c-4f3b-b024-463588166278\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jfwzw" Apr 22 16:24:26.343793 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:26.343740 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58697f9fbb-p6fvn" event={"ID":"3f6ac367-d437-4342-9f76-bb16839b4f6a","Type":"ContainerStarted","Data":"771f04e189544d071cf04eabe39be3c7eebadf5e16e992033ab64a8548b9a698"} Apr 22 16:24:26.344602 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:26.344575 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-62wb6" event={"ID":"9ff261de-df3b-469b-a87d-1dc5330b2f0c","Type":"ContainerStarted","Data":"f81aa5764fc966155a7309b9d4135939626e48670270b44ecb651d87f874b623"} Apr 22 16:24:26.362591 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:26.362551 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-58697f9fbb-p6fvn" podStartSLOduration=1.8186338210000002 podStartE2EDuration="5.362540177s" podCreationTimestamp="2026-04-22 16:24:21 +0000 UTC" firstStartedPulling="2026-04-22 16:24:22.265189821 +0000 UTC m=+166.200504029" lastFinishedPulling="2026-04-22 16:24:25.809096182 +0000 UTC m=+169.744410385" observedRunningTime="2026-04-22 16:24:26.362389197 +0000 UTC m=+170.297703421" watchObservedRunningTime="2026-04-22 16:24:26.362540177 +0000 UTC m=+170.297854402" Apr 22 16:24:26.414920 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:26.414883 2573 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jfwzw" Apr 22 16:24:26.583096 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:26.583062 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-jfwzw"] Apr 22 16:24:26.588458 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:24:26.588427 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f700c39_c35c_4f3b_b024_463588166278.slice/crio-6dcdd879383f6c3caadf9fe10c085986a7fd0f42b0e08d463aed190923340611 WatchSource:0}: Error finding container 6dcdd879383f6c3caadf9fe10c085986a7fd0f42b0e08d463aed190923340611: Status 404 returned error can't find the container with id 6dcdd879383f6c3caadf9fe10c085986a7fd0f42b0e08d463aed190923340611 Apr 22 16:24:27.350668 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:27.350629 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jfwzw" event={"ID":"7f700c39-c35c-4f3b-b024-463588166278","Type":"ContainerStarted","Data":"22c0ff8811ac24d58b15f5280696eae12d208dea50c46161a38899b4d0c12a82"} Apr 22 16:24:27.350668 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:27.350674 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jfwzw" event={"ID":"7f700c39-c35c-4f3b-b024-463588166278","Type":"ContainerStarted","Data":"0860101cfea74faae1ef7307eb947b9edf411a4c89f1e61bb2b23e8dfa6ac205"} Apr 22 16:24:27.350911 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:27.350686 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jfwzw" event={"ID":"7f700c39-c35c-4f3b-b024-463588166278","Type":"ContainerStarted","Data":"6dcdd879383f6c3caadf9fe10c085986a7fd0f42b0e08d463aed190923340611"} Apr 22 16:24:27.352025 ip-10-0-141-251 
kubenswrapper[2573]: I0422 16:24:27.352000 2573 generic.go:358] "Generic (PLEG): container finished" podID="9ff261de-df3b-469b-a87d-1dc5330b2f0c" containerID="e425a885afd2a56449226336911788b92198b916241bfd23de37aaeca054a8e4" exitCode=0 Apr 22 16:24:27.352129 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:27.352085 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-62wb6" event={"ID":"9ff261de-df3b-469b-a87d-1dc5330b2f0c","Type":"ContainerDied","Data":"e425a885afd2a56449226336911788b92198b916241bfd23de37aaeca054a8e4"} Apr 22 16:24:27.896931 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:27.896896 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8524b" Apr 22 16:24:27.899499 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:27.899468 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fsrq7\"" Apr 22 16:24:27.908016 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:27.907981 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-8524b" Apr 22 16:24:28.114567 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:28.114535 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8524b"] Apr 22 16:24:28.118315 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:24:28.118272 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded730508_b5b5_44cd_b56a_f58225697c5d.slice/crio-29b8d352ff039f41c232a1f73fa14f3cf7ae95dcd0acfe075b908655fa1a3e6c WatchSource:0}: Error finding container 29b8d352ff039f41c232a1f73fa14f3cf7ae95dcd0acfe075b908655fa1a3e6c: Status 404 returned error can't find the container with id 29b8d352ff039f41c232a1f73fa14f3cf7ae95dcd0acfe075b908655fa1a3e6c Apr 22 16:24:28.359580 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:28.359545 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jfwzw" event={"ID":"7f700c39-c35c-4f3b-b024-463588166278","Type":"ContainerStarted","Data":"4b24325fef20f2fd3d8a0a30819d62bbd7644358c2d0ba35819068ab2c5aef5a"} Apr 22 16:24:28.362278 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:28.362220 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-62wb6" event={"ID":"9ff261de-df3b-469b-a87d-1dc5330b2f0c","Type":"ContainerStarted","Data":"f108a2ea3a1580c0fe57349ddab96065c38569c7b6e14e71ea3f398f6d419a05"} Apr 22 16:24:28.362278 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:28.362254 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-62wb6" event={"ID":"9ff261de-df3b-469b-a87d-1dc5330b2f0c","Type":"ContainerStarted","Data":"e51f39cf5357068095651766e360e501be242cc7aea19095acbd57da89d4972c"} Apr 22 16:24:28.363337 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:28.363314 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8524b" 
event={"ID":"ed730508-b5b5-44cd-b56a-f58225697c5d","Type":"ContainerStarted","Data":"29b8d352ff039f41c232a1f73fa14f3cf7ae95dcd0acfe075b908655fa1a3e6c"} Apr 22 16:24:28.380551 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:28.380499 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jfwzw" podStartSLOduration=2.27291696 podStartE2EDuration="3.380477707s" podCreationTimestamp="2026-04-22 16:24:25 +0000 UTC" firstStartedPulling="2026-04-22 16:24:26.887016766 +0000 UTC m=+170.822330968" lastFinishedPulling="2026-04-22 16:24:27.994577505 +0000 UTC m=+171.929891715" observedRunningTime="2026-04-22 16:24:28.378860155 +0000 UTC m=+172.314174382" watchObservedRunningTime="2026-04-22 16:24:28.380477707 +0000 UTC m=+172.315791933" Apr 22 16:24:29.876036 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:29.875937 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-62wb6" podStartSLOduration=3.881674619 podStartE2EDuration="4.875916633s" podCreationTimestamp="2026-04-22 16:24:25 +0000 UTC" firstStartedPulling="2026-04-22 16:24:25.826816473 +0000 UTC m=+169.762130682" lastFinishedPulling="2026-04-22 16:24:26.821058488 +0000 UTC m=+170.756372696" observedRunningTime="2026-04-22 16:24:28.398728357 +0000 UTC m=+172.334042576" watchObservedRunningTime="2026-04-22 16:24:29.875916633 +0000 UTC m=+173.811230854" Apr 22 16:24:29.876456 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:29.876414 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-9759db7d6-7t7pd"] Apr 22 16:24:29.878710 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:29.878690 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" Apr 22 16:24:29.881917 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:29.881899 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 16:24:29.882013 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:29.881998 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-z5jpr\"" Apr 22 16:24:29.882083 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:29.882019 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 16:24:29.882083 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:29.882019 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 16:24:29.882083 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:29.882049 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 16:24:29.882240 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:29.881998 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-3cm2ntoutnu1a\"" Apr 22 16:24:29.891110 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:29.891091 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-9759db7d6-7t7pd"] Apr 22 16:24:29.952843 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:29.952808 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7916ac0-f665-4081-a897-ae7825389217-client-ca-bundle\") pod \"metrics-server-9759db7d6-7t7pd\" (UID: \"e7916ac0-f665-4081-a897-ae7825389217\") " 
pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" Apr 22 16:24:29.952976 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:29.952865 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e7916ac0-f665-4081-a897-ae7825389217-secret-metrics-server-tls\") pod \"metrics-server-9759db7d6-7t7pd\" (UID: \"e7916ac0-f665-4081-a897-ae7825389217\") " pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" Apr 22 16:24:29.952976 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:29.952903 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7916ac0-f665-4081-a897-ae7825389217-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-9759db7d6-7t7pd\" (UID: \"e7916ac0-f665-4081-a897-ae7825389217\") " pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" Apr 22 16:24:29.952976 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:29.952955 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e7916ac0-f665-4081-a897-ae7825389217-audit-log\") pod \"metrics-server-9759db7d6-7t7pd\" (UID: \"e7916ac0-f665-4081-a897-ae7825389217\") " pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" Apr 22 16:24:29.953130 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:29.953105 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e7916ac0-f665-4081-a897-ae7825389217-metrics-server-audit-profiles\") pod \"metrics-server-9759db7d6-7t7pd\" (UID: \"e7916ac0-f665-4081-a897-ae7825389217\") " pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" Apr 22 16:24:29.953173 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:29.953149 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e7916ac0-f665-4081-a897-ae7825389217-secret-metrics-server-client-certs\") pod \"metrics-server-9759db7d6-7t7pd\" (UID: \"e7916ac0-f665-4081-a897-ae7825389217\") " pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" Apr 22 16:24:29.953228 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:29.953185 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhwkv\" (UniqueName: \"kubernetes.io/projected/e7916ac0-f665-4081-a897-ae7825389217-kube-api-access-jhwkv\") pod \"metrics-server-9759db7d6-7t7pd\" (UID: \"e7916ac0-f665-4081-a897-ae7825389217\") " pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" Apr 22 16:24:30.054315 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.054286 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e7916ac0-f665-4081-a897-ae7825389217-metrics-server-audit-profiles\") pod \"metrics-server-9759db7d6-7t7pd\" (UID: \"e7916ac0-f665-4081-a897-ae7825389217\") " pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" Apr 22 16:24:30.054454 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.054322 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e7916ac0-f665-4081-a897-ae7825389217-secret-metrics-server-client-certs\") pod \"metrics-server-9759db7d6-7t7pd\" (UID: \"e7916ac0-f665-4081-a897-ae7825389217\") " pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" Apr 22 16:24:30.054454 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.054358 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhwkv\" (UniqueName: 
\"kubernetes.io/projected/e7916ac0-f665-4081-a897-ae7825389217-kube-api-access-jhwkv\") pod \"metrics-server-9759db7d6-7t7pd\" (UID: \"e7916ac0-f665-4081-a897-ae7825389217\") " pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" Apr 22 16:24:30.054454 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.054393 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7916ac0-f665-4081-a897-ae7825389217-client-ca-bundle\") pod \"metrics-server-9759db7d6-7t7pd\" (UID: \"e7916ac0-f665-4081-a897-ae7825389217\") " pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" Apr 22 16:24:30.054454 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.054425 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e7916ac0-f665-4081-a897-ae7825389217-secret-metrics-server-tls\") pod \"metrics-server-9759db7d6-7t7pd\" (UID: \"e7916ac0-f665-4081-a897-ae7825389217\") " pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" Apr 22 16:24:30.054817 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.054781 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7916ac0-f665-4081-a897-ae7825389217-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-9759db7d6-7t7pd\" (UID: \"e7916ac0-f665-4081-a897-ae7825389217\") " pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" Apr 22 16:24:30.054923 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.054861 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e7916ac0-f665-4081-a897-ae7825389217-audit-log\") pod \"metrics-server-9759db7d6-7t7pd\" (UID: \"e7916ac0-f665-4081-a897-ae7825389217\") " pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" Apr 22 
16:24:30.055283 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.055232 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e7916ac0-f665-4081-a897-ae7825389217-audit-log\") pod \"metrics-server-9759db7d6-7t7pd\" (UID: \"e7916ac0-f665-4081-a897-ae7825389217\") " pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" Apr 22 16:24:30.055399 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.055382 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e7916ac0-f665-4081-a897-ae7825389217-metrics-server-audit-profiles\") pod \"metrics-server-9759db7d6-7t7pd\" (UID: \"e7916ac0-f665-4081-a897-ae7825389217\") " pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" Apr 22 16:24:30.055480 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.055459 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7916ac0-f665-4081-a897-ae7825389217-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-9759db7d6-7t7pd\" (UID: \"e7916ac0-f665-4081-a897-ae7825389217\") " pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" Apr 22 16:24:30.058060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.058036 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e7916ac0-f665-4081-a897-ae7825389217-secret-metrics-server-tls\") pod \"metrics-server-9759db7d6-7t7pd\" (UID: \"e7916ac0-f665-4081-a897-ae7825389217\") " pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" Apr 22 16:24:30.058169 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.058066 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e7916ac0-f665-4081-a897-ae7825389217-client-ca-bundle\") pod \"metrics-server-9759db7d6-7t7pd\" (UID: \"e7916ac0-f665-4081-a897-ae7825389217\") " pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" Apr 22 16:24:30.058169 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.058125 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e7916ac0-f665-4081-a897-ae7825389217-secret-metrics-server-client-certs\") pod \"metrics-server-9759db7d6-7t7pd\" (UID: \"e7916ac0-f665-4081-a897-ae7825389217\") " pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" Apr 22 16:24:30.062372 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.062348 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhwkv\" (UniqueName: \"kubernetes.io/projected/e7916ac0-f665-4081-a897-ae7825389217-kube-api-access-jhwkv\") pod \"metrics-server-9759db7d6-7t7pd\" (UID: \"e7916ac0-f665-4081-a897-ae7825389217\") " pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" Apr 22 16:24:30.187409 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.187332 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" Apr 22 16:24:30.253734 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.252916 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-bl9p4"] Apr 22 16:24:30.260372 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.258153 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bl9p4" Apr 22 16:24:30.261091 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.260800 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 16:24:30.261091 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.261066 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-fp2h6\"" Apr 22 16:24:30.264603 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.264584 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-bl9p4"] Apr 22 16:24:30.326714 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.326688 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-9759db7d6-7t7pd"] Apr 22 16:24:30.329795 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:24:30.329749 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7916ac0_f665_4081_a897_ae7825389217.slice/crio-cc35c426bda4b43c407147361c977084a7033f8086e42092dbd80ddcb535ecfa WatchSource:0}: Error finding container cc35c426bda4b43c407147361c977084a7033f8086e42092dbd80ddcb535ecfa: Status 404 returned error can't find the container with id cc35c426bda4b43c407147361c977084a7033f8086e42092dbd80ddcb535ecfa Apr 22 16:24:30.357539 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.357512 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/09d5a02d-d631-44a9-b1f0-97d6de575878-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-bl9p4\" (UID: \"09d5a02d-d631-44a9-b1f0-97d6de575878\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bl9p4" Apr 22 16:24:30.371520 ip-10-0-141-251 kubenswrapper[2573]: I0422 
16:24:30.371492 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8524b" event={"ID":"ed730508-b5b5-44cd-b56a-f58225697c5d","Type":"ContainerStarted","Data":"49b534b4f03382d36a77c2a11d8df5809537635a61ddb48009dfd589afa808b1"} Apr 22 16:24:30.371607 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.371533 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8524b" event={"ID":"ed730508-b5b5-44cd-b56a-f58225697c5d","Type":"ContainerStarted","Data":"ce3f16cdc96b900e825aeaca3ace2f4735e5aa92900f59cac74a748181c28ac2"} Apr 22 16:24:30.371672 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.371634 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-8524b" Apr 22 16:24:30.372565 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.372540 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" event={"ID":"e7916ac0-f665-4081-a897-ae7825389217","Type":"ContainerStarted","Data":"cc35c426bda4b43c407147361c977084a7033f8086e42092dbd80ddcb535ecfa"} Apr 22 16:24:30.390397 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.390352 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8524b" podStartSLOduration=139.92219492 podStartE2EDuration="2m21.390334712s" podCreationTimestamp="2026-04-22 16:22:09 +0000 UTC" firstStartedPulling="2026-04-22 16:24:28.120421512 +0000 UTC m=+172.055735716" lastFinishedPulling="2026-04-22 16:24:29.58856129 +0000 UTC m=+173.523875508" observedRunningTime="2026-04-22 16:24:30.38902651 +0000 UTC m=+174.324340732" watchObservedRunningTime="2026-04-22 16:24:30.390334712 +0000 UTC m=+174.325648938" Apr 22 16:24:30.458220 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.458150 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/09d5a02d-d631-44a9-b1f0-97d6de575878-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-bl9p4\" (UID: \"09d5a02d-d631-44a9-b1f0-97d6de575878\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bl9p4" Apr 22 16:24:30.458334 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:24:30.458310 2573 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 22 16:24:30.458400 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:24:30.458375 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09d5a02d-d631-44a9-b1f0-97d6de575878-monitoring-plugin-cert podName:09d5a02d-d631-44a9-b1f0-97d6de575878 nodeName:}" failed. No retries permitted until 2026-04-22 16:24:30.958356995 +0000 UTC m=+174.893671218 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/09d5a02d-d631-44a9-b1f0-97d6de575878-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-bl9p4" (UID: "09d5a02d-d631-44a9-b1f0-97d6de575878") : secret "monitoring-plugin-cert" not found Apr 22 16:24:30.961535 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.961485 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/09d5a02d-d631-44a9-b1f0-97d6de575878-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-bl9p4\" (UID: \"09d5a02d-d631-44a9-b1f0-97d6de575878\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bl9p4" Apr 22 16:24:30.964614 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:30.964565 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/09d5a02d-d631-44a9-b1f0-97d6de575878-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-bl9p4\" (UID: \"09d5a02d-d631-44a9-b1f0-97d6de575878\") " 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bl9p4" Apr 22 16:24:31.170744 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.170702 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bl9p4" Apr 22 16:24:31.743602 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.743557 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 16:24:31.747788 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.747513 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:24:31.750365 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.750186 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 16:24:31.750365 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.750311 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 16:24:31.750657 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.750637 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 16:24:31.751385 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.751200 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 16:24:31.751951 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.751934 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 16:24:31.752129 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.752110 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 
16:24:31.752213 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.751944 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 16:24:31.752289 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.751994 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 16:24:31.752359 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.752067 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 16:24:31.752457 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.752073 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-lm9s2\"" Apr 22 16:24:31.752561 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.752121 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 16:24:31.752626 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.751934 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-83b7ch0bb6ol1\"" Apr 22 16:24:31.752962 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.752942 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 16:24:31.754377 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.754194 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 16:24:31.774719 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.774695 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 16:24:31.870104 ip-10-0-141-251 kubenswrapper[2573]: I0422 
16:24:31.870070 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:24:31.870104 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.870112 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:24:31.870353 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.870146 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:24:31.870353 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.870171 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:24:31.870353 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.870196 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:24:31.870353 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.870217 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29xsm\" (UniqueName: \"kubernetes.io/projected/30343fd2-6e3f-4118-b2ac-db945987ce03-kube-api-access-29xsm\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:24:31.870353 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.870249 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-web-config\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:24:31.870353 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.870301 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:24:31.870615 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.870349 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:24:31.870615 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.870452 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/30343fd2-6e3f-4118-b2ac-db945987ce03-config-out\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:24:31.870615 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.870510 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:24:31.870738 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.870623 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:24:31.870738 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.870657 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/30343fd2-6e3f-4118-b2ac-db945987ce03-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:24:31.870738 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.870681 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:24:31.870738 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.870710 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:24:31.870738 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.870737 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:24:31.870951 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.870764 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-config\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:24:31.870951 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.870778 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/30343fd2-6e3f-4118-b2ac-db945987ce03-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:24:31.971698 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.971652 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.972237 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.971729 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.972237 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.971793 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/30343fd2-6e3f-4118-b2ac-db945987ce03-config-out\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.972237 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.971824 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.972237 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.971882 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.972237 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.971925 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/30343fd2-6e3f-4118-b2ac-db945987ce03-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.972237 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.971950 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.972237 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.971980 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.972237 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.972019 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.972237 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.972045 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-config\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.972237 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.972069 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/30343fd2-6e3f-4118-b2ac-db945987ce03-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.972237 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.972098 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.972237 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.972125 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.972237 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.972162 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.972237 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.972194 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.972237 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.972224 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.972237 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.972247 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29xsm\" (UniqueName: \"kubernetes.io/projected/30343fd2-6e3f-4118-b2ac-db945987ce03-kube-api-access-29xsm\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.973136 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.972279 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-web-config\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.973519 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.973485 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.974374 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.974348 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.975087 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.975010 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.978709 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.976305 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.979319 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.979021 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-config\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.979526 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.979503 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-web-config\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.980212 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.980047 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/30343fd2-6e3f-4118-b2ac-db945987ce03-config-out\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.980735 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.980506 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.980735 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.980695 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.981098 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.981035 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/30343fd2-6e3f-4118-b2ac-db945987ce03-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.982603 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.982003 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.982603 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.982414 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.982603 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.982550 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.982603 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.982599 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/30343fd2-6e3f-4118-b2ac-db945987ce03-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.982942 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.982656 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.983645 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.983616 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.984681 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.984613 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:31.990138 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:31.990115 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-29xsm\" (UniqueName: \"kubernetes.io/projected/30343fd2-6e3f-4118-b2ac-db945987ce03-kube-api-access-29xsm\") pod \"prometheus-k8s-0\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:32.022167 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:32.021976 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-58697f9fbb-p6fvn"
Apr 22 16:24:32.022615 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:32.022577 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-58697f9fbb-p6fvn"
Apr 22 16:24:32.024254 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:32.024226 2573 patch_prober.go:28] interesting pod/console-58697f9fbb-p6fvn container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.133.0.16:8443/health\": dial tcp 10.133.0.16:8443: connect: connection refused" start-of-body=
Apr 22 16:24:32.024355 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:32.024298 2573 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-58697f9fbb-p6fvn" podUID="3f6ac367-d437-4342-9f76-bb16839b4f6a" containerName="console" probeResult="failure" output="Get \"https://10.133.0.16:8443/health\": dial tcp 10.133.0.16:8443: connect: connection refused"
Apr 22 16:24:32.062356 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:32.062285 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 16:24:35.696200 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:35.696165 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58697f9fbb-p6fvn"]
Apr 22 16:24:35.968950 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:35.968850 2573 patch_prober.go:28] interesting pod/image-registry-7867586f55-85f4h container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 22 16:24:35.969103 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:35.968943 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-7867586f55-85f4h" podUID="84d6547c-fe65-4d04-ab06-5184bfe5d36e" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 16:24:37.651453 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:37.651428 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 16:24:37.652951 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:24:37.652925 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30343fd2_6e3f_4118_b2ac_db945987ce03.slice/crio-638f21a74fbf90d248d4f6ad393f9486f127ddea115cea81a685e7748d488510 WatchSource:0}: Error finding container 638f21a74fbf90d248d4f6ad393f9486f127ddea115cea81a685e7748d488510: Status 404 returned error can't find the container with id 638f21a74fbf90d248d4f6ad393f9486f127ddea115cea81a685e7748d488510
Apr 22 16:24:37.661057 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:37.661037 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-bl9p4"]
Apr 22 16:24:37.663146 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:24:37.663126 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09d5a02d_d631_44a9_b1f0_97d6de575878.slice/crio-f6c36d849a8ad07322c2f96e83d04326ffb69f0af40d79ef3e01141845ce2286 WatchSource:0}: Error finding container f6c36d849a8ad07322c2f96e83d04326ffb69f0af40d79ef3e01141845ce2286: Status 404 returned error can't find the container with id f6c36d849a8ad07322c2f96e83d04326ffb69f0af40d79ef3e01141845ce2286
Apr 22 16:24:38.399882 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:38.399684 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bl9p4" event={"ID":"09d5a02d-d631-44a9-b1f0-97d6de575878","Type":"ContainerStarted","Data":"f6c36d849a8ad07322c2f96e83d04326ffb69f0af40d79ef3e01141845ce2286"}
Apr 22 16:24:38.401393 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:38.401274 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"30343fd2-6e3f-4118-b2ac-db945987ce03","Type":"ContainerStarted","Data":"638f21a74fbf90d248d4f6ad393f9486f127ddea115cea81a685e7748d488510"}
Apr 22 16:24:38.405272 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:38.405207 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-65frr" event={"ID":"a96be859-8cab-480c-a151-485aa4b28fca","Type":"ContainerStarted","Data":"33f3d31248091e6a4cd2b0da2d2d3a2805824e84e905a47ccc2189b13b59ea9b"}
Apr 22 16:24:38.406496 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:38.406463 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-65frr"
Apr 22 16:24:38.425442 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:38.425241 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-65frr" podStartSLOduration=2.22245986 podStartE2EDuration="23.425224186s" podCreationTimestamp="2026-04-22 16:24:15 +0000 UTC" firstStartedPulling="2026-04-22 16:24:16.393939308 +0000 UTC m=+160.329253510" lastFinishedPulling="2026-04-22 16:24:37.596703617 +0000 UTC m=+181.532017836" observedRunningTime="2026-04-22 16:24:38.424790648 +0000 UTC m=+182.360104874" watchObservedRunningTime="2026-04-22 16:24:38.425224186 +0000 UTC m=+182.360538414"
Apr 22 16:24:38.428358 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:38.428337 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-65frr"
Apr 22 16:24:40.388836 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:40.388804 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8524b"
Apr 22 16:24:40.983455 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:40.983412 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7867586f55-85f4h" podUID="84d6547c-fe65-4d04-ab06-5184bfe5d36e" containerName="registry" containerID="cri-o://aa0ef58c645a9aa88346185f839817bc7756ede71e696223a80e8798a908ab8f" gracePeriod=30
Apr 22 16:24:41.252587 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.252563 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:24:41.267287 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.267261 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-tls\") pod \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") "
Apr 22 16:24:41.267421 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.267310 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/84d6547c-fe65-4d04-ab06-5184bfe5d36e-installation-pull-secrets\") pod \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") "
Apr 22 16:24:41.267421 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.267345 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-bound-sa-token\") pod \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") "
Apr 22 16:24:41.267421 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.267384 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-certificates\") pod \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") "
Apr 22 16:24:41.267421 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.267411 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84d6547c-fe65-4d04-ab06-5184bfe5d36e-trusted-ca\") pod \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") "
Apr 22 16:24:41.267748 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.267451 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftt7v\" (UniqueName: \"kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-kube-api-access-ftt7v\") pod \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") "
Apr 22 16:24:41.267748 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.267497 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/84d6547c-fe65-4d04-ab06-5184bfe5d36e-image-registry-private-configuration\") pod \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") "
Apr 22 16:24:41.267748 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.267532 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/84d6547c-fe65-4d04-ab06-5184bfe5d36e-ca-trust-extracted\") pod \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\" (UID: \"84d6547c-fe65-4d04-ab06-5184bfe5d36e\") "
Apr 22 16:24:41.267923 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.267769 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "84d6547c-fe65-4d04-ab06-5184bfe5d36e" (UID: "84d6547c-fe65-4d04-ab06-5184bfe5d36e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:24:41.267923 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.267888 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-certificates\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\""
Apr 22 16:24:41.268020 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.267989 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84d6547c-fe65-4d04-ab06-5184bfe5d36e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "84d6547c-fe65-4d04-ab06-5184bfe5d36e" (UID: "84d6547c-fe65-4d04-ab06-5184bfe5d36e"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 16:24:41.270164 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.270106 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "84d6547c-fe65-4d04-ab06-5184bfe5d36e" (UID: "84d6547c-fe65-4d04-ab06-5184bfe5d36e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:24:41.270164 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.270114 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84d6547c-fe65-4d04-ab06-5184bfe5d36e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "84d6547c-fe65-4d04-ab06-5184bfe5d36e" (UID: "84d6547c-fe65-4d04-ab06-5184bfe5d36e"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 16:24:41.270387 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.270170 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84d6547c-fe65-4d04-ab06-5184bfe5d36e-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "84d6547c-fe65-4d04-ab06-5184bfe5d36e" (UID: "84d6547c-fe65-4d04-ab06-5184bfe5d36e"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 16:24:41.270442 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.270394 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "84d6547c-fe65-4d04-ab06-5184bfe5d36e" (UID: "84d6547c-fe65-4d04-ab06-5184bfe5d36e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:24:41.270748 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.270723 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-kube-api-access-ftt7v" (OuterVolumeSpecName: "kube-api-access-ftt7v") pod "84d6547c-fe65-4d04-ab06-5184bfe5d36e" (UID: "84d6547c-fe65-4d04-ab06-5184bfe5d36e"). InnerVolumeSpecName "kube-api-access-ftt7v". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:24:41.280858 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.280833 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84d6547c-fe65-4d04-ab06-5184bfe5d36e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "84d6547c-fe65-4d04-ab06-5184bfe5d36e" (UID: "84d6547c-fe65-4d04-ab06-5184bfe5d36e"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 16:24:41.368412 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.368376 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-registry-tls\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\""
Apr 22 16:24:41.368412 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.368411 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/84d6547c-fe65-4d04-ab06-5184bfe5d36e-installation-pull-secrets\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\""
Apr 22 16:24:41.368600 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.368423 2573 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-bound-sa-token\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\""
Apr 22 16:24:41.368600 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.368438 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84d6547c-fe65-4d04-ab06-5184bfe5d36e-trusted-ca\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\""
Apr 22 16:24:41.368600 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.368452 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ftt7v\" (UniqueName: \"kubernetes.io/projected/84d6547c-fe65-4d04-ab06-5184bfe5d36e-kube-api-access-ftt7v\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\""
Apr 22 16:24:41.368600 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.368468 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/84d6547c-fe65-4d04-ab06-5184bfe5d36e-image-registry-private-configuration\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\""
Apr 22 16:24:41.368600 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.368480 2573 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/84d6547c-fe65-4d04-ab06-5184bfe5d36e-ca-trust-extracted\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\""
Apr 22 16:24:41.416192 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.416158 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bl9p4" event={"ID":"09d5a02d-d631-44a9-b1f0-97d6de575878","Type":"ContainerStarted","Data":"4db191996eb293bd0c8989543c97cf6fa635e866423d7ae17093d42765a37737"}
Apr 22 16:24:41.416576 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.416552 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bl9p4"
Apr 22 16:24:41.417813 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.417789 2573 generic.go:358] "Generic (PLEG): container finished" podID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerID="45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e" exitCode=0
Apr 22 16:24:41.417978 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.417921 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"30343fd2-6e3f-4118-b2ac-db945987ce03","Type":"ContainerDied","Data":"45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e"}
Apr 22 16:24:41.419227 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.419208 2573 generic.go:358] "Generic (PLEG): container finished" podID="84d6547c-fe65-4d04-ab06-5184bfe5d36e" containerID="aa0ef58c645a9aa88346185f839817bc7756ede71e696223a80e8798a908ab8f" exitCode=0
Apr 22 16:24:41.419359 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.419320 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7867586f55-85f4h"
Apr 22 16:24:41.420545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.419781 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7867586f55-85f4h" event={"ID":"84d6547c-fe65-4d04-ab06-5184bfe5d36e","Type":"ContainerDied","Data":"aa0ef58c645a9aa88346185f839817bc7756ede71e696223a80e8798a908ab8f"}
Apr 22 16:24:41.420545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.419815 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7867586f55-85f4h" event={"ID":"84d6547c-fe65-4d04-ab06-5184bfe5d36e","Type":"ContainerDied","Data":"b9fc76e261e2fb9229784e76c90c7e4ce06f6effbd6c52deb3f4006178e660b5"}
Apr 22 16:24:41.420545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.419838 2573 scope.go:117] "RemoveContainer" containerID="aa0ef58c645a9aa88346185f839817bc7756ede71e696223a80e8798a908ab8f"
Apr 22 16:24:41.421921 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.421897 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" event={"ID":"e7916ac0-f665-4081-a897-ae7825389217","Type":"ContainerStarted","Data":"e57cbf11ca6489908653edf6e0ad2711f0f2d3190c64c223e241f2e6d4421f27"}
Apr 22 16:24:41.423359 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.423330 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bl9p4"
Apr 22 16:24:41.430333 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.430305 2573 scope.go:117] "RemoveContainer" containerID="aa0ef58c645a9aa88346185f839817bc7756ede71e696223a80e8798a908ab8f"
Apr 22 16:24:41.430718 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:24:41.430643 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa0ef58c645a9aa88346185f839817bc7756ede71e696223a80e8798a908ab8f\": container with ID starting with aa0ef58c645a9aa88346185f839817bc7756ede71e696223a80e8798a908ab8f not found: ID does not exist" containerID="aa0ef58c645a9aa88346185f839817bc7756ede71e696223a80e8798a908ab8f"
Apr 22 16:24:41.430834 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.430680 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa0ef58c645a9aa88346185f839817bc7756ede71e696223a80e8798a908ab8f"} err="failed to get container status \"aa0ef58c645a9aa88346185f839817bc7756ede71e696223a80e8798a908ab8f\": rpc error: code = NotFound desc = could not find container \"aa0ef58c645a9aa88346185f839817bc7756ede71e696223a80e8798a908ab8f\": container with ID starting with aa0ef58c645a9aa88346185f839817bc7756ede71e696223a80e8798a908ab8f not found: ID does not exist"
Apr 22 16:24:41.448856 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.448808 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-bl9p4" podStartSLOduration=8.661003025 podStartE2EDuration="11.448794729s" podCreationTimestamp="2026-04-22 16:24:30 +0000 UTC" firstStartedPulling="2026-04-22 16:24:37.665089677 +0000 UTC m=+181.600403880" lastFinishedPulling="2026-04-22 16:24:40.452881381 +0000 UTC m=+184.388195584" observedRunningTime="2026-04-22 16:24:41.446667185 +0000 UTC m=+185.381981401" watchObservedRunningTime="2026-04-22 16:24:41.448794729 +0000 UTC m=+185.384108955"
Apr 22 16:24:41.474584 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.474562 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7867586f55-85f4h"]
Apr 22 16:24:41.483734 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.483710 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7867586f55-85f4h"]
Apr 22 16:24:41.505819 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:41.505726 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" podStartSLOduration=2.389727702 podStartE2EDuration="12.505710963s" podCreationTimestamp="2026-04-22 16:24:29 +0000 UTC" firstStartedPulling="2026-04-22 16:24:30.331945717 +0000 UTC m=+174.267259920" lastFinishedPulling="2026-04-22 16:24:40.44792897 +0000 UTC m=+184.383243181" observedRunningTime="2026-04-22 16:24:41.504034803 +0000 UTC m=+185.439349030" watchObservedRunningTime="2026-04-22 16:24:41.505710963 +0000 UTC m=+185.441025189"
Apr 22 16:24:42.902568 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:42.902210 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84d6547c-fe65-4d04-ab06-5184bfe5d36e" path="/var/lib/kubelet/pods/84d6547c-fe65-4d04-ab06-5184bfe5d36e/volumes"
Apr 22 16:24:46.445376 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:46.445342 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"30343fd2-6e3f-4118-b2ac-db945987ce03","Type":"ContainerStarted","Data":"39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403"}
Apr 22 16:24:46.445914 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:46.445385 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"30343fd2-6e3f-4118-b2ac-db945987ce03","Type":"ContainerStarted","Data":"992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073"}
Apr 22 16:24:48.454963 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:48.454930 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"30343fd2-6e3f-4118-b2ac-db945987ce03","Type":"ContainerStarted","Data":"90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3"}
Apr 22 16:24:48.454963 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:48.454966 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"30343fd2-6e3f-4118-b2ac-db945987ce03","Type":"ContainerStarted","Data":"2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737"}
Apr 22 16:24:48.455489 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:48.454976 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"30343fd2-6e3f-4118-b2ac-db945987ce03","Type":"ContainerStarted","Data":"568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a"}
Apr 22 16:24:48.455489 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:48.454984 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"30343fd2-6e3f-4118-b2ac-db945987ce03","Type":"ContainerStarted","Data":"db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3"}
Apr 22 16:24:48.492417 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:48.492369 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=7.136077057 podStartE2EDuration="17.492353877s" podCreationTimestamp="2026-04-22 16:24:31 +0000 UTC" firstStartedPulling="2026-04-22 16:24:37.654743992 +0000 UTC m=+181.590058199" lastFinishedPulling="2026-04-22 16:24:48.011020813 +0000 UTC m=+191.946335019" observedRunningTime="2026-04-22 16:24:48.490060648 +0000 UTC m=+192.425374912" watchObservedRunningTime="2026-04-22 16:24:48.492353877 +0000 UTC m=+192.427668101"
Apr 22 16:24:50.188075 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:50.188033 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd"
Apr 22 16:24:50.188531 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:50.188091 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd"
Apr 22 16:24:51.114399 ip-10-0-141-251
kubenswrapper[2573]: I0422 16:24:51.114367 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8524b_ed730508-b5b5-44cd-b56a-f58225697c5d/dns/0.log" Apr 22 16:24:51.315115 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:51.315089 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8524b_ed730508-b5b5-44cd-b56a-f58225697c5d/kube-rbac-proxy/0.log" Apr 22 16:24:52.062985 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:52.062955 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:24:52.314657 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:52.314573 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ljkt5_1f2b5ca6-7540-4d0e-88b4-b34788bdeb77/dns-node-resolver/0.log" Apr 22 16:24:52.714735 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:24:52.714655 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-c2xk7_bf42ef1a-eb82-48c4-b318-0b119dfdda61/serve-healthcheck-canary/0.log" Apr 22 16:25:00.717170 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:00.717101 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-58697f9fbb-p6fvn" podUID="3f6ac367-d437-4342-9f76-bb16839b4f6a" containerName="console" containerID="cri-o://771f04e189544d071cf04eabe39be3c7eebadf5e16e992033ab64a8548b9a698" gracePeriod=15 Apr 22 16:25:00.996715 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:00.996686 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58697f9fbb-p6fvn_3f6ac367-d437-4342-9f76-bb16839b4f6a/console/0.log" Apr 22 16:25:00.996894 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:00.996774 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58697f9fbb-p6fvn" Apr 22 16:25:01.053037 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.052995 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f6ac367-d437-4342-9f76-bb16839b4f6a-oauth-serving-cert\") pod \"3f6ac367-d437-4342-9f76-bb16839b4f6a\" (UID: \"3f6ac367-d437-4342-9f76-bb16839b4f6a\") " Apr 22 16:25:01.053221 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.053069 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f6ac367-d437-4342-9f76-bb16839b4f6a-console-serving-cert\") pod \"3f6ac367-d437-4342-9f76-bb16839b4f6a\" (UID: \"3f6ac367-d437-4342-9f76-bb16839b4f6a\") " Apr 22 16:25:01.053221 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.053089 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3f6ac367-d437-4342-9f76-bb16839b4f6a-console-oauth-config\") pod \"3f6ac367-d437-4342-9f76-bb16839b4f6a\" (UID: \"3f6ac367-d437-4342-9f76-bb16839b4f6a\") " Apr 22 16:25:01.053221 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.053131 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f6ac367-d437-4342-9f76-bb16839b4f6a-service-ca\") pod \"3f6ac367-d437-4342-9f76-bb16839b4f6a\" (UID: \"3f6ac367-d437-4342-9f76-bb16839b4f6a\") " Apr 22 16:25:01.053221 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.053162 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3f6ac367-d437-4342-9f76-bb16839b4f6a-console-config\") pod \"3f6ac367-d437-4342-9f76-bb16839b4f6a\" (UID: \"3f6ac367-d437-4342-9f76-bb16839b4f6a\") " Apr 22 16:25:01.053221 ip-10-0-141-251 
kubenswrapper[2573]: I0422 16:25:01.053208 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r65xv\" (UniqueName: \"kubernetes.io/projected/3f6ac367-d437-4342-9f76-bb16839b4f6a-kube-api-access-r65xv\") pod \"3f6ac367-d437-4342-9f76-bb16839b4f6a\" (UID: \"3f6ac367-d437-4342-9f76-bb16839b4f6a\") " Apr 22 16:25:01.053501 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.053473 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f6ac367-d437-4342-9f76-bb16839b4f6a-service-ca" (OuterVolumeSpecName: "service-ca") pod "3f6ac367-d437-4342-9f76-bb16839b4f6a" (UID: "3f6ac367-d437-4342-9f76-bb16839b4f6a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:25:01.053501 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.053480 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f6ac367-d437-4342-9f76-bb16839b4f6a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "3f6ac367-d437-4342-9f76-bb16839b4f6a" (UID: "3f6ac367-d437-4342-9f76-bb16839b4f6a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:25:01.053623 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.053531 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f6ac367-d437-4342-9f76-bb16839b4f6a-console-config" (OuterVolumeSpecName: "console-config") pod "3f6ac367-d437-4342-9f76-bb16839b4f6a" (UID: "3f6ac367-d437-4342-9f76-bb16839b4f6a"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:25:01.055482 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.055450 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f6ac367-d437-4342-9f76-bb16839b4f6a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "3f6ac367-d437-4342-9f76-bb16839b4f6a" (UID: "3f6ac367-d437-4342-9f76-bb16839b4f6a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:25:01.055620 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.055487 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f6ac367-d437-4342-9f76-bb16839b4f6a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "3f6ac367-d437-4342-9f76-bb16839b4f6a" (UID: "3f6ac367-d437-4342-9f76-bb16839b4f6a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:25:01.055843 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.055804 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f6ac367-d437-4342-9f76-bb16839b4f6a-kube-api-access-r65xv" (OuterVolumeSpecName: "kube-api-access-r65xv") pod "3f6ac367-d437-4342-9f76-bb16839b4f6a" (UID: "3f6ac367-d437-4342-9f76-bb16839b4f6a"). InnerVolumeSpecName "kube-api-access-r65xv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:25:01.154384 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.154353 2573 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f6ac367-d437-4342-9f76-bb16839b4f6a-service-ca\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\"" Apr 22 16:25:01.154384 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.154380 2573 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3f6ac367-d437-4342-9f76-bb16839b4f6a-console-config\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\"" Apr 22 16:25:01.154384 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.154389 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r65xv\" (UniqueName: \"kubernetes.io/projected/3f6ac367-d437-4342-9f76-bb16839b4f6a-kube-api-access-r65xv\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\"" Apr 22 16:25:01.154600 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.154399 2573 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3f6ac367-d437-4342-9f76-bb16839b4f6a-oauth-serving-cert\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\"" Apr 22 16:25:01.154600 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.154409 2573 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3f6ac367-d437-4342-9f76-bb16839b4f6a-console-serving-cert\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\"" Apr 22 16:25:01.154600 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.154418 2573 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3f6ac367-d437-4342-9f76-bb16839b4f6a-console-oauth-config\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\"" Apr 22 16:25:01.492881 ip-10-0-141-251 
kubenswrapper[2573]: I0422 16:25:01.492857 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58697f9fbb-p6fvn_3f6ac367-d437-4342-9f76-bb16839b4f6a/console/0.log" Apr 22 16:25:01.493073 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.492895 2573 generic.go:358] "Generic (PLEG): container finished" podID="3f6ac367-d437-4342-9f76-bb16839b4f6a" containerID="771f04e189544d071cf04eabe39be3c7eebadf5e16e992033ab64a8548b9a698" exitCode=2 Apr 22 16:25:01.493073 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.492961 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58697f9fbb-p6fvn" Apr 22 16:25:01.493073 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.492982 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58697f9fbb-p6fvn" event={"ID":"3f6ac367-d437-4342-9f76-bb16839b4f6a","Type":"ContainerDied","Data":"771f04e189544d071cf04eabe39be3c7eebadf5e16e992033ab64a8548b9a698"} Apr 22 16:25:01.493073 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.493026 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58697f9fbb-p6fvn" event={"ID":"3f6ac367-d437-4342-9f76-bb16839b4f6a","Type":"ContainerDied","Data":"077b82834f807230b6da21693be25e69d91778bfc48d18a93a3d3933c2426093"} Apr 22 16:25:01.493073 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.493045 2573 scope.go:117] "RemoveContainer" containerID="771f04e189544d071cf04eabe39be3c7eebadf5e16e992033ab64a8548b9a698" Apr 22 16:25:01.502462 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.502444 2573 scope.go:117] "RemoveContainer" containerID="771f04e189544d071cf04eabe39be3c7eebadf5e16e992033ab64a8548b9a698" Apr 22 16:25:01.502729 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:25:01.502705 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"771f04e189544d071cf04eabe39be3c7eebadf5e16e992033ab64a8548b9a698\": container with ID starting with 771f04e189544d071cf04eabe39be3c7eebadf5e16e992033ab64a8548b9a698 not found: ID does not exist" containerID="771f04e189544d071cf04eabe39be3c7eebadf5e16e992033ab64a8548b9a698" Apr 22 16:25:01.502865 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.502733 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"771f04e189544d071cf04eabe39be3c7eebadf5e16e992033ab64a8548b9a698"} err="failed to get container status \"771f04e189544d071cf04eabe39be3c7eebadf5e16e992033ab64a8548b9a698\": rpc error: code = NotFound desc = could not find container \"771f04e189544d071cf04eabe39be3c7eebadf5e16e992033ab64a8548b9a698\": container with ID starting with 771f04e189544d071cf04eabe39be3c7eebadf5e16e992033ab64a8548b9a698 not found: ID does not exist" Apr 22 16:25:01.516729 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.516701 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58697f9fbb-p6fvn"] Apr 22 16:25:01.522320 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:01.522296 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-58697f9fbb-p6fvn"] Apr 22 16:25:02.900478 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:02.900440 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f6ac367-d437-4342-9f76-bb16839b4f6a" path="/var/lib/kubelet/pods/3f6ac367-d437-4342-9f76-bb16839b4f6a/volumes" Apr 22 16:25:10.193861 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:10.193825 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" Apr 22 16:25:10.197639 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:10.197614 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-9759db7d6-7t7pd" Apr 22 16:25:32.063112 ip-10-0-141-251 
kubenswrapper[2573]: I0422 16:25:32.063056 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:32.082583 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:32.082558 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:32.601191 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:32.601164 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:48.463599 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:48.463561 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs\") pod \"network-metrics-daemon-cwt8x\" (UID: \"e324836e-ef75-432e-978a-639279d2702e\") " pod="openshift-multus/network-metrics-daemon-cwt8x" Apr 22 16:25:48.465938 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:48.465908 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e324836e-ef75-432e-978a-639279d2702e-metrics-certs\") pod \"network-metrics-daemon-cwt8x\" (UID: \"e324836e-ef75-432e-978a-639279d2702e\") " pod="openshift-multus/network-metrics-daemon-cwt8x" Apr 22 16:25:48.699456 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:48.699420 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dv4l7\"" Apr 22 16:25:48.707351 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:48.707330 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cwt8x" Apr 22 16:25:48.820808 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:48.820774 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cwt8x"] Apr 22 16:25:48.824698 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:25:48.824675 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode324836e_ef75_432e_978a_639279d2702e.slice/crio-d9b46ac662a1c44e933e3dc58cc5c60a0809c350e5b5614a6f36b318fa61996f WatchSource:0}: Error finding container d9b46ac662a1c44e933e3dc58cc5c60a0809c350e5b5614a6f36b318fa61996f: Status 404 returned error can't find the container with id d9b46ac662a1c44e933e3dc58cc5c60a0809c350e5b5614a6f36b318fa61996f Apr 22 16:25:49.632029 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:49.631975 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cwt8x" event={"ID":"e324836e-ef75-432e-978a-639279d2702e","Type":"ContainerStarted","Data":"d9b46ac662a1c44e933e3dc58cc5c60a0809c350e5b5614a6f36b318fa61996f"} Apr 22 16:25:50.178123 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.177375 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 16:25:50.178123 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.177984 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerName="prometheus" containerID="cri-o://992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073" gracePeriod=600 Apr 22 16:25:50.178123 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.178071 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" 
containerName="kube-rbac-proxy-thanos" containerID="cri-o://90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3" gracePeriod=600 Apr 22 16:25:50.178423 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.178132 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerName="config-reloader" containerID="cri-o://39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403" gracePeriod=600 Apr 22 16:25:50.178423 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.178145 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerName="thanos-sidecar" containerID="cri-o://db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3" gracePeriod=600 Apr 22 16:25:50.178423 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.178231 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerName="kube-rbac-proxy" containerID="cri-o://2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737" gracePeriod=600 Apr 22 16:25:50.178423 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.178253 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerName="kube-rbac-proxy-web" containerID="cri-o://568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a" gracePeriod=600 Apr 22 16:25:50.422017 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.421995 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.584735 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.584705 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-prometheus-k8s-rulefiles-0\") pod \"30343fd2-6e3f-4118-b2ac-db945987ce03\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " Apr 22 16:25:50.584932 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.584743 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/30343fd2-6e3f-4118-b2ac-db945987ce03-tls-assets\") pod \"30343fd2-6e3f-4118-b2ac-db945987ce03\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " Apr 22 16:25:50.584932 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.584793 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"30343fd2-6e3f-4118-b2ac-db945987ce03\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " Apr 22 16:25:50.584932 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.584812 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-configmap-metrics-client-ca\") pod \"30343fd2-6e3f-4118-b2ac-db945987ce03\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " Apr 22 16:25:50.584932 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.584844 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-configmap-serving-certs-ca-bundle\") pod 
\"30343fd2-6e3f-4118-b2ac-db945987ce03\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " Apr 22 16:25:50.584932 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.584862 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-prometheus-k8s-tls\") pod \"30343fd2-6e3f-4118-b2ac-db945987ce03\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " Apr 22 16:25:50.584932 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.584897 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-kube-rbac-proxy\") pod \"30343fd2-6e3f-4118-b2ac-db945987ce03\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " Apr 22 16:25:50.584932 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.584921 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/30343fd2-6e3f-4118-b2ac-db945987ce03-config-out\") pod \"30343fd2-6e3f-4118-b2ac-db945987ce03\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " Apr 22 16:25:50.585350 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.585295 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "30343fd2-6e3f-4118-b2ac-db945987ce03" (UID: "30343fd2-6e3f-4118-b2ac-db945987ce03"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:25:50.585350 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.585306 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "30343fd2-6e3f-4118-b2ac-db945987ce03" (UID: "30343fd2-6e3f-4118-b2ac-db945987ce03"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:25:50.585542 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.585366 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-prometheus-trusted-ca-bundle\") pod \"30343fd2-6e3f-4118-b2ac-db945987ce03\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " Apr 22 16:25:50.585542 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.585398 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-config\") pod \"30343fd2-6e3f-4118-b2ac-db945987ce03\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " Apr 22 16:25:50.585542 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.585454 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29xsm\" (UniqueName: \"kubernetes.io/projected/30343fd2-6e3f-4118-b2ac-db945987ce03-kube-api-access-29xsm\") pod \"30343fd2-6e3f-4118-b2ac-db945987ce03\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " Apr 22 16:25:50.585542 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.585503 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-web-config\") pod 
\"30343fd2-6e3f-4118-b2ac-db945987ce03\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " Apr 22 16:25:50.585542 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.585532 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-metrics-client-certs\") pod \"30343fd2-6e3f-4118-b2ac-db945987ce03\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " Apr 22 16:25:50.585818 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.585561 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/30343fd2-6e3f-4118-b2ac-db945987ce03-prometheus-k8s-db\") pod \"30343fd2-6e3f-4118-b2ac-db945987ce03\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " Apr 22 16:25:50.585818 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.585589 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-thanos-prometheus-http-client-file\") pod \"30343fd2-6e3f-4118-b2ac-db945987ce03\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " Apr 22 16:25:50.585818 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.585725 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-configmap-kubelet-serving-ca-bundle\") pod \"30343fd2-6e3f-4118-b2ac-db945987ce03\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " Apr 22 16:25:50.585818 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.585779 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"30343fd2-6e3f-4118-b2ac-db945987ce03\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " Apr 22 16:25:50.585818 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.585799 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "30343fd2-6e3f-4118-b2ac-db945987ce03" (UID: "30343fd2-6e3f-4118-b2ac-db945987ce03"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:25:50.586074 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.585837 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-grpc-tls\") pod \"30343fd2-6e3f-4118-b2ac-db945987ce03\" (UID: \"30343fd2-6e3f-4118-b2ac-db945987ce03\") " Apr 22 16:25:50.586074 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.585934 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "30343fd2-6e3f-4118-b2ac-db945987ce03" (UID: "30343fd2-6e3f-4118-b2ac-db945987ce03"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:25:50.586184 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.586088 2573 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-prometheus-trusted-ca-bundle\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\"" Apr 22 16:25:50.586184 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.586107 2573 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\"" Apr 22 16:25:50.586184 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.586122 2573 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-configmap-metrics-client-ca\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\"" Apr 22 16:25:50.586184 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.586137 2573 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\"" Apr 22 16:25:50.587653 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.587387 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "30343fd2-6e3f-4118-b2ac-db945987ce03" (UID: "30343fd2-6e3f-4118-b2ac-db945987ce03"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:25:50.587653 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.587559 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30343fd2-6e3f-4118-b2ac-db945987ce03-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "30343fd2-6e3f-4118-b2ac-db945987ce03" (UID: "30343fd2-6e3f-4118-b2ac-db945987ce03"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:25:50.588008 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.587966 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30343fd2-6e3f-4118-b2ac-db945987ce03-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "30343fd2-6e3f-4118-b2ac-db945987ce03" (UID: "30343fd2-6e3f-4118-b2ac-db945987ce03"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:25:50.588122 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.588094 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "30343fd2-6e3f-4118-b2ac-db945987ce03" (UID: "30343fd2-6e3f-4118-b2ac-db945987ce03"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:25:50.588250 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.588227 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "30343fd2-6e3f-4118-b2ac-db945987ce03" (UID: "30343fd2-6e3f-4118-b2ac-db945987ce03"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:25:50.588425 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.588396 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30343fd2-6e3f-4118-b2ac-db945987ce03-kube-api-access-29xsm" (OuterVolumeSpecName: "kube-api-access-29xsm") pod "30343fd2-6e3f-4118-b2ac-db945987ce03" (UID: "30343fd2-6e3f-4118-b2ac-db945987ce03"). InnerVolumeSpecName "kube-api-access-29xsm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:25:50.588737 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.588712 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "30343fd2-6e3f-4118-b2ac-db945987ce03" (UID: "30343fd2-6e3f-4118-b2ac-db945987ce03"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:25:50.589283 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.589255 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "30343fd2-6e3f-4118-b2ac-db945987ce03" (UID: "30343fd2-6e3f-4118-b2ac-db945987ce03"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:25:50.589811 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.589783 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-config" (OuterVolumeSpecName: "config") pod "30343fd2-6e3f-4118-b2ac-db945987ce03" (UID: "30343fd2-6e3f-4118-b2ac-db945987ce03"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:25:50.589892 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.589806 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "30343fd2-6e3f-4118-b2ac-db945987ce03" (UID: "30343fd2-6e3f-4118-b2ac-db945987ce03"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:25:50.590186 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.590166 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30343fd2-6e3f-4118-b2ac-db945987ce03-config-out" (OuterVolumeSpecName: "config-out") pod "30343fd2-6e3f-4118-b2ac-db945987ce03" (UID: "30343fd2-6e3f-4118-b2ac-db945987ce03"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:25:50.590350 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.590332 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "30343fd2-6e3f-4118-b2ac-db945987ce03" (UID: "30343fd2-6e3f-4118-b2ac-db945987ce03"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:25:50.590854 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.590840 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "30343fd2-6e3f-4118-b2ac-db945987ce03" (UID: "30343fd2-6e3f-4118-b2ac-db945987ce03"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:25:50.600511 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.600487 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-web-config" (OuterVolumeSpecName: "web-config") pod "30343fd2-6e3f-4118-b2ac-db945987ce03" (UID: "30343fd2-6e3f-4118-b2ac-db945987ce03"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:25:50.637146 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.637120 2573 generic.go:358] "Generic (PLEG): container finished" podID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerID="90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3" exitCode=0 Apr 22 16:25:50.637146 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.637142 2573 generic.go:358] "Generic (PLEG): container finished" podID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerID="2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737" exitCode=0 Apr 22 16:25:50.637550 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.637152 2573 generic.go:358] "Generic (PLEG): container finished" podID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerID="568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a" exitCode=0 Apr 22 16:25:50.637550 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.637161 2573 generic.go:358] "Generic (PLEG): container finished" podID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerID="db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3" exitCode=0 Apr 22 16:25:50.637550 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.637167 2573 generic.go:358] "Generic (PLEG): container finished" podID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerID="39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403" exitCode=0 Apr 22 16:25:50.637550 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.637174 2573 generic.go:358] "Generic (PLEG): 
container finished" podID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerID="992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073" exitCode=0 Apr 22 16:25:50.637550 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.637202 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"30343fd2-6e3f-4118-b2ac-db945987ce03","Type":"ContainerDied","Data":"90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3"} Apr 22 16:25:50.637550 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.637227 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.637550 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.637239 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"30343fd2-6e3f-4118-b2ac-db945987ce03","Type":"ContainerDied","Data":"2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737"} Apr 22 16:25:50.637550 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.637255 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"30343fd2-6e3f-4118-b2ac-db945987ce03","Type":"ContainerDied","Data":"568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a"} Apr 22 16:25:50.637550 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.637269 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"30343fd2-6e3f-4118-b2ac-db945987ce03","Type":"ContainerDied","Data":"db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3"} Apr 22 16:25:50.637550 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.637284 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"30343fd2-6e3f-4118-b2ac-db945987ce03","Type":"ContainerDied","Data":"39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403"} Apr 22 16:25:50.637550 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.637297 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"30343fd2-6e3f-4118-b2ac-db945987ce03","Type":"ContainerDied","Data":"992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073"} Apr 22 16:25:50.637550 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.637306 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"30343fd2-6e3f-4118-b2ac-db945987ce03","Type":"ContainerDied","Data":"638f21a74fbf90d248d4f6ad393f9486f127ddea115cea81a685e7748d488510"} Apr 22 16:25:50.637550 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.637315 2573 scope.go:117] "RemoveContainer" containerID="90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3" Apr 22 16:25:50.638914 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.638891 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cwt8x" event={"ID":"e324836e-ef75-432e-978a-639279d2702e","Type":"ContainerStarted","Data":"4cf1ea7abeea32d8af2dc64f2e583768641802f5561c9133a45c9fbae860aa17"} Apr 22 16:25:50.639013 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.638919 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cwt8x" event={"ID":"e324836e-ef75-432e-978a-639279d2702e","Type":"ContainerStarted","Data":"9443b456ab0e245deca81863b202fa32d84ce3744105d11f42b95ba40931ff36"} Apr 22 16:25:50.644969 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.644888 2573 scope.go:117] "RemoveContainer" containerID="2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737" Apr 22 16:25:50.651017 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.651000 2573 scope.go:117] 
"RemoveContainer" containerID="568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a" Apr 22 16:25:50.657089 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.657046 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-cwt8x" podStartSLOduration=253.716942996 podStartE2EDuration="4m14.657030814s" podCreationTimestamp="2026-04-22 16:21:36 +0000 UTC" firstStartedPulling="2026-04-22 16:25:48.826792221 +0000 UTC m=+252.762106424" lastFinishedPulling="2026-04-22 16:25:49.766880024 +0000 UTC m=+253.702194242" observedRunningTime="2026-04-22 16:25:50.656422379 +0000 UTC m=+254.591736674" watchObservedRunningTime="2026-04-22 16:25:50.657030814 +0000 UTC m=+254.592345039" Apr 22 16:25:50.657202 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.657097 2573 scope.go:117] "RemoveContainer" containerID="db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3" Apr 22 16:25:50.663545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.663530 2573 scope.go:117] "RemoveContainer" containerID="39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403" Apr 22 16:25:50.669440 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.669427 2573 scope.go:117] "RemoveContainer" containerID="992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073" Apr 22 16:25:50.676433 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.676391 2573 scope.go:117] "RemoveContainer" containerID="45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e" Apr 22 16:25:50.677714 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.677695 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 16:25:50.682542 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.682522 2573 scope.go:117] "RemoveContainer" containerID="90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3" Apr 22 16:25:50.682820 ip-10-0-141-251 kubenswrapper[2573]: E0422 
16:25:50.682795 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3\": container with ID starting with 90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3 not found: ID does not exist" containerID="90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3" Apr 22 16:25:50.682927 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.682828 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3"} err="failed to get container status \"90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3\": rpc error: code = NotFound desc = could not find container \"90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3\": container with ID starting with 90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3 not found: ID does not exist" Apr 22 16:25:50.682927 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.682855 2573 scope.go:117] "RemoveContainer" containerID="2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737" Apr 22 16:25:50.683116 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:25:50.683091 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737\": container with ID starting with 2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737 not found: ID does not exist" containerID="2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737" Apr 22 16:25:50.683160 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.683122 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737"} err="failed to get container status 
\"2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737\": rpc error: code = NotFound desc = could not find container \"2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737\": container with ID starting with 2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737 not found: ID does not exist" Apr 22 16:25:50.683160 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.683139 2573 scope.go:117] "RemoveContainer" containerID="568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a" Apr 22 16:25:50.683359 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:25:50.683336 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a\": container with ID starting with 568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a not found: ID does not exist" containerID="568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a" Apr 22 16:25:50.683418 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.683362 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a"} err="failed to get container status \"568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a\": rpc error: code = NotFound desc = could not find container \"568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a\": container with ID starting with 568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a not found: ID does not exist" Apr 22 16:25:50.683418 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.683382 2573 scope.go:117] "RemoveContainer" containerID="db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3" Apr 22 16:25:50.683667 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:25:50.683637 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3\": container with ID starting with db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3 not found: ID does not exist" containerID="db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3" Apr 22 16:25:50.683775 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.683673 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3"} err="failed to get container status \"db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3\": rpc error: code = NotFound desc = could not find container \"db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3\": container with ID starting with db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3 not found: ID does not exist" Apr 22 16:25:50.683775 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.683693 2573 scope.go:117] "RemoveContainer" containerID="39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403" Apr 22 16:25:50.684060 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:25:50.684036 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403\": container with ID starting with 39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403 not found: ID does not exist" containerID="39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403" Apr 22 16:25:50.684141 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.684070 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403"} err="failed to get container status \"39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403\": rpc error: code = NotFound desc = could not 
find container \"39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403\": container with ID starting with 39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403 not found: ID does not exist" Apr 22 16:25:50.684141 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.684124 2573 scope.go:117] "RemoveContainer" containerID="992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073" Apr 22 16:25:50.684418 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:25:50.684396 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073\": container with ID starting with 992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073 not found: ID does not exist" containerID="992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073" Apr 22 16:25:50.684516 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.684423 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073"} err="failed to get container status \"992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073\": rpc error: code = NotFound desc = could not find container \"992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073\": container with ID starting with 992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073 not found: ID does not exist" Apr 22 16:25:50.684516 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.684443 2573 scope.go:117] "RemoveContainer" containerID="45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e" Apr 22 16:25:50.684810 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:25:50.684789 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e\": container with ID 
starting with 45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e not found: ID does not exist" containerID="45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e" Apr 22 16:25:50.684878 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.684815 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e"} err="failed to get container status \"45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e\": rpc error: code = NotFound desc = could not find container \"45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e\": container with ID starting with 45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e not found: ID does not exist" Apr 22 16:25:50.684878 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.684832 2573 scope.go:117] "RemoveContainer" containerID="90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3" Apr 22 16:25:50.685110 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.685088 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3"} err="failed to get container status \"90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3\": rpc error: code = NotFound desc = could not find container \"90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3\": container with ID starting with 90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3 not found: ID does not exist" Apr 22 16:25:50.685165 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.685110 2573 scope.go:117] "RemoveContainer" containerID="2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737" Apr 22 16:25:50.685352 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.685331 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737"} err="failed to get container status \"2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737\": rpc error: code = NotFound desc = could not find container \"2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737\": container with ID starting with 2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737 not found: ID does not exist" Apr 22 16:25:50.685424 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.685355 2573 scope.go:117] "RemoveContainer" containerID="568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a" Apr 22 16:25:50.685615 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.685596 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a"} err="failed to get container status \"568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a\": rpc error: code = NotFound desc = could not find container \"568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a\": container with ID starting with 568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a not found: ID does not exist" Apr 22 16:25:50.685700 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.685617 2573 scope.go:117] "RemoveContainer" containerID="db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3" Apr 22 16:25:50.685909 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.685882 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3"} err="failed to get container status \"db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3\": rpc error: code = NotFound desc = could not find container \"db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3\": container with ID starting with 
db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3 not found: ID does not exist" Apr 22 16:25:50.685954 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.685911 2573 scope.go:117] "RemoveContainer" containerID="39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403" Apr 22 16:25:50.685954 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.685930 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 16:25:50.686148 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.686132 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403"} err="failed to get container status \"39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403\": rpc error: code = NotFound desc = could not find container \"39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403\": container with ID starting with 39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403 not found: ID does not exist" Apr 22 16:25:50.686148 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.686148 2573 scope.go:117] "RemoveContainer" containerID="992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073" Apr 22 16:25:50.686371 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.686354 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073"} err="failed to get container status \"992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073\": rpc error: code = NotFound desc = could not find container \"992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073\": container with ID starting with 992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073 not found: ID does not exist" Apr 22 16:25:50.686427 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.686372 2573 scope.go:117] 
"RemoveContainer" containerID="45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e" Apr 22 16:25:50.686497 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.686474 2573 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-grpc-tls\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\"" Apr 22 16:25:50.686542 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.686497 2573 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/30343fd2-6e3f-4118-b2ac-db945987ce03-tls-assets\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\"" Apr 22 16:25:50.686542 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.686510 2573 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\"" Apr 22 16:25:50.686542 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.686524 2573 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-prometheus-k8s-tls\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\"" Apr 22 16:25:50.686542 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.686536 2573 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-kube-rbac-proxy\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\"" Apr 22 16:25:50.686708 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.686545 2573 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/30343fd2-6e3f-4118-b2ac-db945987ce03-config-out\") on node 
\"ip-10-0-141-251.ec2.internal\" DevicePath \"\"" Apr 22 16:25:50.686708 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.686549 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e"} err="failed to get container status \"45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e\": rpc error: code = NotFound desc = could not find container \"45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e\": container with ID starting with 45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e not found: ID does not exist" Apr 22 16:25:50.686708 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.686568 2573 scope.go:117] "RemoveContainer" containerID="90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3" Apr 22 16:25:50.686708 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.686554 2573 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-config\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\"" Apr 22 16:25:50.686708 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.686616 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-29xsm\" (UniqueName: \"kubernetes.io/projected/30343fd2-6e3f-4118-b2ac-db945987ce03-kube-api-access-29xsm\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\"" Apr 22 16:25:50.686708 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.686627 2573 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-web-config\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\"" Apr 22 16:25:50.686708 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.686636 2573 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-metrics-client-certs\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\"" Apr 22 16:25:50.686708 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.686645 2573 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/30343fd2-6e3f-4118-b2ac-db945987ce03-prometheus-k8s-db\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\"" Apr 22 16:25:50.686708 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.686658 2573 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-thanos-prometheus-http-client-file\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\"" Apr 22 16:25:50.686708 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.686673 2573 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/30343fd2-6e3f-4118-b2ac-db945987ce03-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\"" Apr 22 16:25:50.686708 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.686688 2573 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/30343fd2-6e3f-4118-b2ac-db945987ce03-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\"" Apr 22 16:25:50.687110 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.686837 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3"} err="failed to get container status \"90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3\": rpc error: code = NotFound desc = could not find container 
\"90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3\": container with ID starting with 90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3 not found: ID does not exist" Apr 22 16:25:50.687110 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.686852 2573 scope.go:117] "RemoveContainer" containerID="2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737" Apr 22 16:25:50.687110 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.687064 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737"} err="failed to get container status \"2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737\": rpc error: code = NotFound desc = could not find container \"2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737\": container with ID starting with 2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737 not found: ID does not exist" Apr 22 16:25:50.687110 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.687080 2573 scope.go:117] "RemoveContainer" containerID="568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a" Apr 22 16:25:50.687310 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.687292 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a"} err="failed to get container status \"568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a\": rpc error: code = NotFound desc = could not find container \"568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a\": container with ID starting with 568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a not found: ID does not exist" Apr 22 16:25:50.687359 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.687312 2573 scope.go:117] "RemoveContainer" 
containerID="db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3" Apr 22 16:25:50.687527 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.687512 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3"} err="failed to get container status \"db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3\": rpc error: code = NotFound desc = could not find container \"db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3\": container with ID starting with db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3 not found: ID does not exist" Apr 22 16:25:50.687576 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.687527 2573 scope.go:117] "RemoveContainer" containerID="39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403" Apr 22 16:25:50.687737 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.687717 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403"} err="failed to get container status \"39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403\": rpc error: code = NotFound desc = could not find container \"39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403\": container with ID starting with 39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403 not found: ID does not exist" Apr 22 16:25:50.687808 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.687740 2573 scope.go:117] "RemoveContainer" containerID="992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073" Apr 22 16:25:50.687994 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.687971 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073"} err="failed to get container status 
\"992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073\": rpc error: code = NotFound desc = could not find container \"992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073\": container with ID starting with 992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073 not found: ID does not exist" Apr 22 16:25:50.687994 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.687994 2573 scope.go:117] "RemoveContainer" containerID="45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e" Apr 22 16:25:50.688223 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.688206 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e"} err="failed to get container status \"45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e\": rpc error: code = NotFound desc = could not find container \"45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e\": container with ID starting with 45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e not found: ID does not exist" Apr 22 16:25:50.688261 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.688223 2573 scope.go:117] "RemoveContainer" containerID="90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3" Apr 22 16:25:50.688428 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.688412 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3"} err="failed to get container status \"90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3\": rpc error: code = NotFound desc = could not find container \"90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3\": container with ID starting with 90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3 not found: ID does not exist" Apr 22 16:25:50.688428 ip-10-0-141-251 
kubenswrapper[2573]: I0422 16:25:50.688427 2573 scope.go:117] "RemoveContainer" containerID="2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737" Apr 22 16:25:50.688664 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.688646 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737"} err="failed to get container status \"2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737\": rpc error: code = NotFound desc = could not find container \"2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737\": container with ID starting with 2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737 not found: ID does not exist" Apr 22 16:25:50.688701 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.688664 2573 scope.go:117] "RemoveContainer" containerID="568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a" Apr 22 16:25:50.688887 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.688871 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a"} err="failed to get container status \"568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a\": rpc error: code = NotFound desc = could not find container \"568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a\": container with ID starting with 568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a not found: ID does not exist" Apr 22 16:25:50.688936 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.688887 2573 scope.go:117] "RemoveContainer" containerID="db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3" Apr 22 16:25:50.689082 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.689065 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3"} err="failed to get container status \"db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3\": rpc error: code = NotFound desc = could not find container \"db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3\": container with ID starting with db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3 not found: ID does not exist" Apr 22 16:25:50.689132 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.689082 2573 scope.go:117] "RemoveContainer" containerID="39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403" Apr 22 16:25:50.689282 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.689264 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403"} err="failed to get container status \"39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403\": rpc error: code = NotFound desc = could not find container \"39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403\": container with ID starting with 39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403 not found: ID does not exist" Apr 22 16:25:50.689324 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.689283 2573 scope.go:117] "RemoveContainer" containerID="992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073" Apr 22 16:25:50.689476 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.689461 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073"} err="failed to get container status \"992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073\": rpc error: code = NotFound desc = could not find container \"992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073\": container with ID starting with 
992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073 not found: ID does not exist" Apr 22 16:25:50.689476 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.689476 2573 scope.go:117] "RemoveContainer" containerID="45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e" Apr 22 16:25:50.689686 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.689666 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e"} err="failed to get container status \"45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e\": rpc error: code = NotFound desc = could not find container \"45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e\": container with ID starting with 45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e not found: ID does not exist" Apr 22 16:25:50.689686 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.689685 2573 scope.go:117] "RemoveContainer" containerID="90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3" Apr 22 16:25:50.689886 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.689867 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3"} err="failed to get container status \"90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3\": rpc error: code = NotFound desc = could not find container \"90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3\": container with ID starting with 90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3 not found: ID does not exist" Apr 22 16:25:50.689934 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.689888 2573 scope.go:117] "RemoveContainer" containerID="2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737" Apr 22 16:25:50.690071 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.690056 2573 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737"} err="failed to get container status \"2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737\": rpc error: code = NotFound desc = could not find container \"2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737\": container with ID starting with 2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737 not found: ID does not exist" Apr 22 16:25:50.690119 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.690072 2573 scope.go:117] "RemoveContainer" containerID="568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a" Apr 22 16:25:50.690273 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.690255 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a"} err="failed to get container status \"568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a\": rpc error: code = NotFound desc = could not find container \"568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a\": container with ID starting with 568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a not found: ID does not exist" Apr 22 16:25:50.690319 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.690274 2573 scope.go:117] "RemoveContainer" containerID="db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3" Apr 22 16:25:50.690444 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.690427 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3"} err="failed to get container status \"db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3\": rpc error: code = NotFound desc = could not find container 
\"db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3\": container with ID starting with db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3 not found: ID does not exist" Apr 22 16:25:50.690490 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.690445 2573 scope.go:117] "RemoveContainer" containerID="39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403" Apr 22 16:25:50.690604 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.690591 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403"} err="failed to get container status \"39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403\": rpc error: code = NotFound desc = could not find container \"39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403\": container with ID starting with 39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403 not found: ID does not exist" Apr 22 16:25:50.690660 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.690603 2573 scope.go:117] "RemoveContainer" containerID="992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073" Apr 22 16:25:50.690786 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.690769 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073"} err="failed to get container status \"992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073\": rpc error: code = NotFound desc = could not find container \"992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073\": container with ID starting with 992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073 not found: ID does not exist" Apr 22 16:25:50.690835 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.690789 2573 scope.go:117] "RemoveContainer" 
containerID="45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e" Apr 22 16:25:50.690984 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.690965 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e"} err="failed to get container status \"45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e\": rpc error: code = NotFound desc = could not find container \"45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e\": container with ID starting with 45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e not found: ID does not exist" Apr 22 16:25:50.691035 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.690985 2573 scope.go:117] "RemoveContainer" containerID="90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3" Apr 22 16:25:50.691145 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.691130 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3"} err="failed to get container status \"90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3\": rpc error: code = NotFound desc = could not find container \"90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3\": container with ID starting with 90fa77c01c30443d82670f9acc0bc176a92f23c5f88797d9a3b7e90a241cf1b3 not found: ID does not exist" Apr 22 16:25:50.691193 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.691144 2573 scope.go:117] "RemoveContainer" containerID="2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737" Apr 22 16:25:50.691294 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.691280 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737"} err="failed to get container status 
\"2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737\": rpc error: code = NotFound desc = could not find container \"2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737\": container with ID starting with 2950db79eca56d2b8c2a4d4065176452b5f14712aa1b2211540bd4f9e160d737 not found: ID does not exist" Apr 22 16:25:50.691341 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.691293 2573 scope.go:117] "RemoveContainer" containerID="568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a" Apr 22 16:25:50.691457 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.691443 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a"} err="failed to get container status \"568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a\": rpc error: code = NotFound desc = could not find container \"568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a\": container with ID starting with 568afabf120dcae0a4f3ee14e058631fff0c8a6eaaf78b1ca5c2190d1089186a not found: ID does not exist" Apr 22 16:25:50.691457 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.691456 2573 scope.go:117] "RemoveContainer" containerID="db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3" Apr 22 16:25:50.691597 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.691583 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3"} err="failed to get container status \"db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3\": rpc error: code = NotFound desc = could not find container \"db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3\": container with ID starting with db9e2619857808116e3a97d3253353e0b244a4574d5a0365593130c0528136e3 not found: ID does not exist" Apr 22 16:25:50.691597 ip-10-0-141-251 
kubenswrapper[2573]: I0422 16:25:50.691596 2573 scope.go:117] "RemoveContainer" containerID="39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403" Apr 22 16:25:50.691726 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.691710 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403"} err="failed to get container status \"39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403\": rpc error: code = NotFound desc = could not find container \"39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403\": container with ID starting with 39b578eb6e63552851798744fadb299a1593dff58dc6e0b5dee142569c6c4403 not found: ID does not exist" Apr 22 16:25:50.691791 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.691726 2573 scope.go:117] "RemoveContainer" containerID="992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073" Apr 22 16:25:50.691934 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.691920 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073"} err="failed to get container status \"992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073\": rpc error: code = NotFound desc = could not find container \"992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073\": container with ID starting with 992e6786a4aa0c14d10210e37309d0381c3ec510c8af31949c561154a3bee073 not found: ID does not exist" Apr 22 16:25:50.691991 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.691934 2573 scope.go:117] "RemoveContainer" containerID="45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e" Apr 22 16:25:50.692152 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.692126 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e"} err="failed to get container status \"45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e\": rpc error: code = NotFound desc = could not find container \"45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e\": container with ID starting with 45f7995a9a9f43bb2b346b045d15bf41377492fb4c413d02925523854c14785e not found: ID does not exist" Apr 22 16:25:50.713120 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713086 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 16:25:50.713388 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713371 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerName="kube-rbac-proxy-thanos" Apr 22 16:25:50.713388 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713386 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerName="kube-rbac-proxy-thanos" Apr 22 16:25:50.713545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713400 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerName="kube-rbac-proxy-web" Apr 22 16:25:50.713545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713406 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerName="kube-rbac-proxy-web" Apr 22 16:25:50.713545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713418 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84d6547c-fe65-4d04-ab06-5184bfe5d36e" containerName="registry" Apr 22 16:25:50.713545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713424 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d6547c-fe65-4d04-ab06-5184bfe5d36e" containerName="registry" 
Apr 22 16:25:50.713545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713432 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerName="thanos-sidecar" Apr 22 16:25:50.713545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713437 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerName="thanos-sidecar" Apr 22 16:25:50.713545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713448 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f6ac367-d437-4342-9f76-bb16839b4f6a" containerName="console" Apr 22 16:25:50.713545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713454 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f6ac367-d437-4342-9f76-bb16839b4f6a" containerName="console" Apr 22 16:25:50.713545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713460 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerName="config-reloader" Apr 22 16:25:50.713545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713466 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerName="config-reloader" Apr 22 16:25:50.713545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713472 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerName="kube-rbac-proxy" Apr 22 16:25:50.713545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713477 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerName="kube-rbac-proxy" Apr 22 16:25:50.713545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713484 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" 
containerName="init-config-reloader" Apr 22 16:25:50.713545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713489 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerName="init-config-reloader" Apr 22 16:25:50.713545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713495 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerName="prometheus" Apr 22 16:25:50.713545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713500 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerName="prometheus" Apr 22 16:25:50.713545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713539 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerName="config-reloader" Apr 22 16:25:50.713545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713547 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="84d6547c-fe65-4d04-ab06-5184bfe5d36e" containerName="registry" Apr 22 16:25:50.713545 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713553 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f6ac367-d437-4342-9f76-bb16839b4f6a" containerName="console" Apr 22 16:25:50.714179 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713560 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerName="thanos-sidecar" Apr 22 16:25:50.714179 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713567 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerName="prometheus" Apr 22 16:25:50.714179 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713572 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" 
containerName="kube-rbac-proxy-thanos" Apr 22 16:25:50.714179 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713579 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerName="kube-rbac-proxy-web" Apr 22 16:25:50.714179 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.713584 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" containerName="kube-rbac-proxy" Apr 22 16:25:50.718994 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.718971 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.721449 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.721413 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 16:25:50.721449 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.721428 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 16:25:50.721616 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.721469 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 16:25:50.721616 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.721494 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-lm9s2\"" Apr 22 16:25:50.721833 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.721813 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 16:25:50.721923 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.721845 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 
16:25:50.721923 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.721885 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 16:25:50.721923 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.721894 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 16:25:50.722074 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.721813 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 16:25:50.722074 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.722054 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-83b7ch0bb6ol1\"" Apr 22 16:25:50.722165 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.722134 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 16:25:50.722423 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.722408 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 16:25:50.725566 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.725528 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 16:25:50.727904 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.727884 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 16:25:50.734322 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.734299 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 16:25:50.787676 ip-10-0-141-251 kubenswrapper[2573]: 
I0422 16:25:50.787645 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6044f842-c26c-4c13-8987-6ed3e3e35c1b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.787803 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.787695 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6044f842-c26c-4c13-8987-6ed3e3e35c1b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.787803 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.787768 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6044f842-c26c-4c13-8987-6ed3e3e35c1b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.787926 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.787828 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6044f842-c26c-4c13-8987-6ed3e3e35c1b-config-out\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.787926 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.787894 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" 
(UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.788031 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.787928 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.788031 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.787947 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6044f842-c26c-4c13-8987-6ed3e3e35c1b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.788031 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.787971 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.788031 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.787999 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.788172 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.788089 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-web-config\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.788172 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.788150 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6044f842-c26c-4c13-8987-6ed3e3e35c1b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.788172 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.788167 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.788349 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.788208 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.788349 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.788238 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6044f842-c26c-4c13-8987-6ed3e3e35c1b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.788349 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.788283 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6044f842-c26c-4c13-8987-6ed3e3e35c1b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.788349 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.788327 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.788550 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.788361 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-config\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.788550 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.788393 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rff5x\" (UniqueName: \"kubernetes.io/projected/6044f842-c26c-4c13-8987-6ed3e3e35c1b-kube-api-access-rff5x\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.889693 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.889587 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/6044f842-c26c-4c13-8987-6ed3e3e35c1b-config-out\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.889693 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.889645 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.889693 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.889664 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.889693 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.889684 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6044f842-c26c-4c13-8987-6ed3e3e35c1b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.889693 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.889703 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.890099 ip-10-0-141-251 
kubenswrapper[2573]: I0422 16:25:50.889722 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.890099 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.889779 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-web-config\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.890099 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.889822 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6044f842-c26c-4c13-8987-6ed3e3e35c1b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.890302 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.890268 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.890363 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.890322 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 
16:25:50.890363 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.890356 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6044f842-c26c-4c13-8987-6ed3e3e35c1b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.890463 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.890389 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6044f842-c26c-4c13-8987-6ed3e3e35c1b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.890463 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.890422 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.890463 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.890457 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-config\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.890613 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.890487 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rff5x\" (UniqueName: \"kubernetes.io/projected/6044f842-c26c-4c13-8987-6ed3e3e35c1b-kube-api-access-rff5x\") pod \"prometheus-k8s-0\" (UID: 
\"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.890613 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.890517 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6044f842-c26c-4c13-8987-6ed3e3e35c1b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.890613 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.890545 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6044f842-c26c-4c13-8987-6ed3e3e35c1b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.890613 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.890576 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6044f842-c26c-4c13-8987-6ed3e3e35c1b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.890876 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.890718 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6044f842-c26c-4c13-8987-6ed3e3e35c1b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.890934 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.890895 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6044f842-c26c-4c13-8987-6ed3e3e35c1b-prometheus-k8s-db\") 
pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.892537 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.892513 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6044f842-c26c-4c13-8987-6ed3e3e35c1b-config-out\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.892826 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.892648 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.892826 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.892734 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.892999 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.892915 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.894252 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.893571 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-config\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.894252 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.893737 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6044f842-c26c-4c13-8987-6ed3e3e35c1b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.894252 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.893743 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.894252 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.893827 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.894252 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.894160 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6044f842-c26c-4c13-8987-6ed3e3e35c1b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.894501 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.894480 2573 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6044f842-c26c-4c13-8987-6ed3e3e35c1b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.894985 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.894959 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.895376 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.895348 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.895453 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.895426 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6044f842-c26c-4c13-8987-6ed3e3e35c1b-web-config\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.895815 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.895796 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6044f842-c26c-4c13-8987-6ed3e3e35c1b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.903444 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.897851 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6044f842-c26c-4c13-8987-6ed3e3e35c1b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.903444 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.902693 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rff5x\" (UniqueName: \"kubernetes.io/projected/6044f842-c26c-4c13-8987-6ed3e3e35c1b-kube-api-access-rff5x\") pod \"prometheus-k8s-0\" (UID: \"6044f842-c26c-4c13-8987-6ed3e3e35c1b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:50.903962 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:50.903940 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30343fd2-6e3f-4118-b2ac-db945987ce03" path="/var/lib/kubelet/pods/30343fd2-6e3f-4118-b2ac-db945987ce03/volumes" Apr 22 16:25:51.030119 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:51.030074 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:25:51.159502 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:51.159476 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 16:25:51.162271 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:25:51.162243 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6044f842_c26c_4c13_8987_6ed3e3e35c1b.slice/crio-dac57ebd2f948422def7d0744e446971c68ce086c3e60519c8579a578e0aa153 WatchSource:0}: Error finding container dac57ebd2f948422def7d0744e446971c68ce086c3e60519c8579a578e0aa153: Status 404 returned error can't find the container with id dac57ebd2f948422def7d0744e446971c68ce086c3e60519c8579a578e0aa153 Apr 22 16:25:51.643962 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:51.643922 2573 generic.go:358] "Generic (PLEG): container finished" podID="6044f842-c26c-4c13-8987-6ed3e3e35c1b" containerID="689de80578308cd700d1b814a421829e3bbcb1d919805ac3bcc70c29c429accd" exitCode=0 Apr 22 16:25:51.644331 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:51.644016 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6044f842-c26c-4c13-8987-6ed3e3e35c1b","Type":"ContainerDied","Data":"689de80578308cd700d1b814a421829e3bbcb1d919805ac3bcc70c29c429accd"} Apr 22 16:25:51.644331 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:51.644049 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6044f842-c26c-4c13-8987-6ed3e3e35c1b","Type":"ContainerStarted","Data":"dac57ebd2f948422def7d0744e446971c68ce086c3e60519c8579a578e0aa153"} Apr 22 16:25:52.649227 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:52.649190 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"6044f842-c26c-4c13-8987-6ed3e3e35c1b","Type":"ContainerStarted","Data":"cfbc011a1c70cb63c7f92c0eebf3f1b79fcef708d8adb646e7a300e55fb51a4f"} Apr 22 16:25:52.649227 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:52.649226 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6044f842-c26c-4c13-8987-6ed3e3e35c1b","Type":"ContainerStarted","Data":"0b55eacdba6c9962c8731bbb9b83206ed09d046bead86758558e63d59930ceff"} Apr 22 16:25:52.649227 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:52.649236 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6044f842-c26c-4c13-8987-6ed3e3e35c1b","Type":"ContainerStarted","Data":"15c1e4a41a72534780ff1fe2f4aaa606ed41d478ca6addc4a407b56852824ac3"} Apr 22 16:25:52.649806 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:52.649245 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6044f842-c26c-4c13-8987-6ed3e3e35c1b","Type":"ContainerStarted","Data":"1cc2675fa41aea18f104420fcd4f89ad22ce4af2f403f83c6431a09787cf4751"} Apr 22 16:25:52.649806 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:52.649253 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6044f842-c26c-4c13-8987-6ed3e3e35c1b","Type":"ContainerStarted","Data":"643adc381f902728b5a75313f2fbaf519966031f2efa271b36101813422eeac8"} Apr 22 16:25:52.649806 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:52.649261 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6044f842-c26c-4c13-8987-6ed3e3e35c1b","Type":"ContainerStarted","Data":"0e5c008cb9f89d72b5d7ff42faddb5f60e8e6849fd8571b4f78865f11005ce92"} Apr 22 16:25:52.677788 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:52.677729 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.677704055 podStartE2EDuration="2.677704055s" podCreationTimestamp="2026-04-22 16:25:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:25:52.676858375 +0000 UTC m=+256.612172601" watchObservedRunningTime="2026-04-22 16:25:52.677704055 +0000 UTC m=+256.613018280" Apr 22 16:25:56.031223 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:25:56.031143 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:26:36.491233 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:26:36.491203 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log" Apr 22 16:26:36.491723 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:26:36.491211 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log" Apr 22 16:26:36.499685 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:26:36.499666 2573 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 16:26:51.030448 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:26:51.030408 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:26:51.046621 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:26:51.046586 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:26:51.826266 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:26:51.826240 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 16:31:36.511790 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:31:36.511744 2573 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log" Apr 22 16:31:36.512278 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:31:36.512022 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log" Apr 22 16:36:36.530485 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:36:36.530444 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log" Apr 22 16:36:36.531239 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:36:36.531222 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log" Apr 22 16:41:36.549055 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:41:36.548970 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log" Apr 22 16:41:36.551541 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:41:36.550516 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log" Apr 22 16:46:36.569313 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:46:36.569205 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log" Apr 22 16:46:36.573567 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:46:36.570797 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log" Apr 22 16:49:16.522428 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:16.522397 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-db9w9"] Apr 22 16:49:16.525429 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:16.525413 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-db9w9" Apr 22 16:49:16.528958 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:16.528940 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 22 16:49:16.529594 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:16.529573 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-hbk5z\"" Apr 22 16:49:16.529643 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:16.529581 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 22 16:49:16.542173 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:16.542155 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-db9w9"] Apr 22 16:49:16.552748 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:16.552727 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zdcz\" (UniqueName: \"kubernetes.io/projected/21a124ae-4b8c-4622-8f94-700bb74f767d-kube-api-access-9zdcz\") pod \"cert-manager-cainjector-68b757865b-db9w9\" (UID: \"21a124ae-4b8c-4622-8f94-700bb74f767d\") " pod="cert-manager/cert-manager-cainjector-68b757865b-db9w9" Apr 22 16:49:16.552855 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:16.552815 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21a124ae-4b8c-4622-8f94-700bb74f767d-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-db9w9\" (UID: \"21a124ae-4b8c-4622-8f94-700bb74f767d\") " pod="cert-manager/cert-manager-cainjector-68b757865b-db9w9" Apr 22 16:49:16.653999 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:16.653944 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zdcz\" (UniqueName: \"kubernetes.io/projected/21a124ae-4b8c-4622-8f94-700bb74f767d-kube-api-access-9zdcz\") pod \"cert-manager-cainjector-68b757865b-db9w9\" (UID: \"21a124ae-4b8c-4622-8f94-700bb74f767d\") " pod="cert-manager/cert-manager-cainjector-68b757865b-db9w9" Apr 22 16:49:16.654113 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:16.654026 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21a124ae-4b8c-4622-8f94-700bb74f767d-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-db9w9\" (UID: \"21a124ae-4b8c-4622-8f94-700bb74f767d\") " pod="cert-manager/cert-manager-cainjector-68b757865b-db9w9" Apr 22 16:49:16.663502 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:16.663476 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21a124ae-4b8c-4622-8f94-700bb74f767d-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-db9w9\" (UID: \"21a124ae-4b8c-4622-8f94-700bb74f767d\") " pod="cert-manager/cert-manager-cainjector-68b757865b-db9w9" Apr 22 16:49:16.663633 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:16.663615 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zdcz\" (UniqueName: \"kubernetes.io/projected/21a124ae-4b8c-4622-8f94-700bb74f767d-kube-api-access-9zdcz\") pod \"cert-manager-cainjector-68b757865b-db9w9\" (UID: 
\"21a124ae-4b8c-4622-8f94-700bb74f767d\") " pod="cert-manager/cert-manager-cainjector-68b757865b-db9w9" Apr 22 16:49:16.846190 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:16.846169 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-db9w9" Apr 22 16:49:16.969838 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:16.969805 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-db9w9"] Apr 22 16:49:16.973279 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:49:16.973252 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21a124ae_4b8c_4622_8f94_700bb74f767d.slice/crio-ee6c10c44a26cda10553b12ba1dbd639c14b33eca1124ba0641f2fd1fe4a0248 WatchSource:0}: Error finding container ee6c10c44a26cda10553b12ba1dbd639c14b33eca1124ba0641f2fd1fe4a0248: Status 404 returned error can't find the container with id ee6c10c44a26cda10553b12ba1dbd639c14b33eca1124ba0641f2fd1fe4a0248 Apr 22 16:49:16.974931 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:16.974915 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 16:49:17.465191 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:17.465151 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-db9w9" event={"ID":"21a124ae-4b8c-4622-8f94-700bb74f767d","Type":"ContainerStarted","Data":"ee6c10c44a26cda10553b12ba1dbd639c14b33eca1124ba0641f2fd1fe4a0248"} Apr 22 16:49:20.477270 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:20.477237 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-db9w9" event={"ID":"21a124ae-4b8c-4622-8f94-700bb74f767d","Type":"ContainerStarted","Data":"e3d989c16b8c6dc8f61e8ad4c3e6c83778d0d9c9c15686ca6bb78317462ca733"} Apr 22 16:49:20.493078 ip-10-0-141-251 
kubenswrapper[2573]: I0422 16:49:20.493034 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-db9w9" podStartSLOduration=1.493356795 podStartE2EDuration="4.493020607s" podCreationTimestamp="2026-04-22 16:49:16 +0000 UTC" firstStartedPulling="2026-04-22 16:49:16.97503797 +0000 UTC m=+1660.910352173" lastFinishedPulling="2026-04-22 16:49:19.974701769 +0000 UTC m=+1663.910015985" observedRunningTime="2026-04-22 16:49:20.491028027 +0000 UTC m=+1664.426342275" watchObservedRunningTime="2026-04-22 16:49:20.493020607 +0000 UTC m=+1664.428334832" Apr 22 16:49:27.427792 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:27.427749 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-sfr8p"] Apr 22 16:49:27.430992 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:27.430976 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sfr8p" Apr 22 16:49:27.433333 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:27.433305 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-nwgc2\"" Apr 22 16:49:27.433333 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:27.433327 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 22 16:49:27.434119 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:27.434099 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 22 16:49:27.438584 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:27.438560 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-sfr8p"] Apr 22 16:49:27.532467 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:27.532442 
2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6c935793-a12a-459a-beed-c39393195983-tmp\") pod \"openshift-lws-operator-bfc7f696d-sfr8p\" (UID: \"6c935793-a12a-459a-beed-c39393195983\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sfr8p" Apr 22 16:49:27.532583 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:27.532470 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj526\" (UniqueName: \"kubernetes.io/projected/6c935793-a12a-459a-beed-c39393195983-kube-api-access-mj526\") pod \"openshift-lws-operator-bfc7f696d-sfr8p\" (UID: \"6c935793-a12a-459a-beed-c39393195983\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sfr8p" Apr 22 16:49:27.633714 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:27.633685 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6c935793-a12a-459a-beed-c39393195983-tmp\") pod \"openshift-lws-operator-bfc7f696d-sfr8p\" (UID: \"6c935793-a12a-459a-beed-c39393195983\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sfr8p" Apr 22 16:49:27.633714 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:27.633715 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mj526\" (UniqueName: \"kubernetes.io/projected/6c935793-a12a-459a-beed-c39393195983-kube-api-access-mj526\") pod \"openshift-lws-operator-bfc7f696d-sfr8p\" (UID: \"6c935793-a12a-459a-beed-c39393195983\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sfr8p" Apr 22 16:49:27.634037 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:27.634020 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6c935793-a12a-459a-beed-c39393195983-tmp\") pod \"openshift-lws-operator-bfc7f696d-sfr8p\" 
(UID: \"6c935793-a12a-459a-beed-c39393195983\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sfr8p" Apr 22 16:49:27.641076 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:27.641047 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj526\" (UniqueName: \"kubernetes.io/projected/6c935793-a12a-459a-beed-c39393195983-kube-api-access-mj526\") pod \"openshift-lws-operator-bfc7f696d-sfr8p\" (UID: \"6c935793-a12a-459a-beed-c39393195983\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sfr8p" Apr 22 16:49:27.740830 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:27.740745 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sfr8p" Apr 22 16:49:27.855459 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:27.855434 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-sfr8p"] Apr 22 16:49:27.857927 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:49:27.857900 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c935793_a12a_459a_beed_c39393195983.slice/crio-6d141efa3fc86bc0a8614a3fc6ed388279fb918912140f452209b997bb442e73 WatchSource:0}: Error finding container 6d141efa3fc86bc0a8614a3fc6ed388279fb918912140f452209b997bb442e73: Status 404 returned error can't find the container with id 6d141efa3fc86bc0a8614a3fc6ed388279fb918912140f452209b997bb442e73 Apr 22 16:49:28.499804 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:28.499770 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sfr8p" event={"ID":"6c935793-a12a-459a-beed-c39393195983","Type":"ContainerStarted","Data":"6d141efa3fc86bc0a8614a3fc6ed388279fb918912140f452209b997bb442e73"} Apr 22 16:49:31.509584 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:31.509550 
2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sfr8p" event={"ID":"6c935793-a12a-459a-beed-c39393195983","Type":"ContainerStarted","Data":"b4e2eaa3cbf3f0e2246e8a3da961778e5e508b13d39c7aa579483f8184bdb69b"} Apr 22 16:49:31.528380 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:31.528328 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sfr8p" podStartSLOduration=1.907383802 podStartE2EDuration="4.528313124s" podCreationTimestamp="2026-04-22 16:49:27 +0000 UTC" firstStartedPulling="2026-04-22 16:49:27.859257012 +0000 UTC m=+1671.794571218" lastFinishedPulling="2026-04-22 16:49:30.480186333 +0000 UTC m=+1674.415500540" observedRunningTime="2026-04-22 16:49:31.527342256 +0000 UTC m=+1675.462656480" watchObservedRunningTime="2026-04-22 16:49:31.528313124 +0000 UTC m=+1675.463627350" Apr 22 16:49:46.557240 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.557206 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-7c5749599b-rgl2r"] Apr 22 16:49:46.567326 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.567298 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7c5749599b-rgl2r" Apr 22 16:49:46.571233 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.571211 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 22 16:49:46.571335 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.571254 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-wf8fc\"" Apr 22 16:49:46.571335 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.571300 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 22 16:49:46.571335 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.571300 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 22 16:49:46.579246 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.579228 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7c5749599b-rgl2r"] Apr 22 16:49:46.676264 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.676230 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-787qw\" (UniqueName: \"kubernetes.io/projected/067d2728-bc89-461e-8360-3524d1c1865b-kube-api-access-787qw\") pod \"lws-controller-manager-7c5749599b-rgl2r\" (UID: \"067d2728-bc89-461e-8360-3524d1c1865b\") " pod="openshift-lws-operator/lws-controller-manager-7c5749599b-rgl2r" Apr 22 16:49:46.676422 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.676271 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/067d2728-bc89-461e-8360-3524d1c1865b-cert\") pod \"lws-controller-manager-7c5749599b-rgl2r\" (UID: 
\"067d2728-bc89-461e-8360-3524d1c1865b\") " pod="openshift-lws-operator/lws-controller-manager-7c5749599b-rgl2r" Apr 22 16:49:46.676422 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.676295 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/067d2728-bc89-461e-8360-3524d1c1865b-metrics-cert\") pod \"lws-controller-manager-7c5749599b-rgl2r\" (UID: \"067d2728-bc89-461e-8360-3524d1c1865b\") " pod="openshift-lws-operator/lws-controller-manager-7c5749599b-rgl2r" Apr 22 16:49:46.676422 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.676378 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/067d2728-bc89-461e-8360-3524d1c1865b-manager-config\") pod \"lws-controller-manager-7c5749599b-rgl2r\" (UID: \"067d2728-bc89-461e-8360-3524d1c1865b\") " pod="openshift-lws-operator/lws-controller-manager-7c5749599b-rgl2r" Apr 22 16:49:46.777275 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.777248 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/067d2728-bc89-461e-8360-3524d1c1865b-metrics-cert\") pod \"lws-controller-manager-7c5749599b-rgl2r\" (UID: \"067d2728-bc89-461e-8360-3524d1c1865b\") " pod="openshift-lws-operator/lws-controller-manager-7c5749599b-rgl2r" Apr 22 16:49:46.777393 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.777283 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/067d2728-bc89-461e-8360-3524d1c1865b-manager-config\") pod \"lws-controller-manager-7c5749599b-rgl2r\" (UID: \"067d2728-bc89-461e-8360-3524d1c1865b\") " pod="openshift-lws-operator/lws-controller-manager-7c5749599b-rgl2r" Apr 22 16:49:46.777436 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.777410 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-787qw\" (UniqueName: \"kubernetes.io/projected/067d2728-bc89-461e-8360-3524d1c1865b-kube-api-access-787qw\") pod \"lws-controller-manager-7c5749599b-rgl2r\" (UID: \"067d2728-bc89-461e-8360-3524d1c1865b\") " pod="openshift-lws-operator/lws-controller-manager-7c5749599b-rgl2r" Apr 22 16:49:46.777479 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.777463 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/067d2728-bc89-461e-8360-3524d1c1865b-cert\") pod \"lws-controller-manager-7c5749599b-rgl2r\" (UID: \"067d2728-bc89-461e-8360-3524d1c1865b\") " pod="openshift-lws-operator/lws-controller-manager-7c5749599b-rgl2r" Apr 22 16:49:46.777833 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.777815 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/067d2728-bc89-461e-8360-3524d1c1865b-manager-config\") pod \"lws-controller-manager-7c5749599b-rgl2r\" (UID: \"067d2728-bc89-461e-8360-3524d1c1865b\") " pod="openshift-lws-operator/lws-controller-manager-7c5749599b-rgl2r" Apr 22 16:49:46.779714 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.779694 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/067d2728-bc89-461e-8360-3524d1c1865b-cert\") pod \"lws-controller-manager-7c5749599b-rgl2r\" (UID: \"067d2728-bc89-461e-8360-3524d1c1865b\") " pod="openshift-lws-operator/lws-controller-manager-7c5749599b-rgl2r" Apr 22 16:49:46.779825 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.779788 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/067d2728-bc89-461e-8360-3524d1c1865b-metrics-cert\") pod \"lws-controller-manager-7c5749599b-rgl2r\" (UID: \"067d2728-bc89-461e-8360-3524d1c1865b\") " 
pod="openshift-lws-operator/lws-controller-manager-7c5749599b-rgl2r" Apr 22 16:49:46.785980 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.785958 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-787qw\" (UniqueName: \"kubernetes.io/projected/067d2728-bc89-461e-8360-3524d1c1865b-kube-api-access-787qw\") pod \"lws-controller-manager-7c5749599b-rgl2r\" (UID: \"067d2728-bc89-461e-8360-3524d1c1865b\") " pod="openshift-lws-operator/lws-controller-manager-7c5749599b-rgl2r" Apr 22 16:49:46.875925 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.875868 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7c5749599b-rgl2r" Apr 22 16:49:46.876787 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.876597 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-57c8d5d679-jt9gf"] Apr 22 16:49:46.881289 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.881272 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-jt9gf" Apr 22 16:49:46.886863 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.886558 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 22 16:49:46.886863 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.886614 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 22 16:49:46.886863 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.886628 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 22 16:49:46.886863 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.886686 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 22 16:49:46.887116 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.886916 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-b4qgt\"" Apr 22 16:49:46.895141 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.895097 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-57c8d5d679-jt9gf"] Apr 22 16:49:46.978700 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.978672 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9b05e3e8-670f-4b6c-b3d9-eee53548969c-webhook-cert\") pod \"opendatahub-operator-controller-manager-57c8d5d679-jt9gf\" (UID: \"9b05e3e8-670f-4b6c-b3d9-eee53548969c\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-jt9gf" Apr 22 16:49:46.978893 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.978718 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzmg9\" (UniqueName: \"kubernetes.io/projected/9b05e3e8-670f-4b6c-b3d9-eee53548969c-kube-api-access-zzmg9\") pod \"opendatahub-operator-controller-manager-57c8d5d679-jt9gf\" (UID: \"9b05e3e8-670f-4b6c-b3d9-eee53548969c\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-jt9gf" Apr 22 16:49:46.978893 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:46.978854 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9b05e3e8-670f-4b6c-b3d9-eee53548969c-apiservice-cert\") pod \"opendatahub-operator-controller-manager-57c8d5d679-jt9gf\" (UID: \"9b05e3e8-670f-4b6c-b3d9-eee53548969c\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-jt9gf" Apr 22 16:49:47.007628 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:47.007609 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7c5749599b-rgl2r"] Apr 22 16:49:47.009250 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:49:47.009226 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod067d2728_bc89_461e_8360_3524d1c1865b.slice/crio-c5fefb35f3c65fc4c336e6b7e6a081f05dd028a0a765c862bd62c4a360c5b307 WatchSource:0}: Error finding container c5fefb35f3c65fc4c336e6b7e6a081f05dd028a0a765c862bd62c4a360c5b307: Status 404 returned error can't find the container with id c5fefb35f3c65fc4c336e6b7e6a081f05dd028a0a765c862bd62c4a360c5b307 Apr 22 16:49:47.079913 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:47.079888 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9b05e3e8-670f-4b6c-b3d9-eee53548969c-webhook-cert\") pod \"opendatahub-operator-controller-manager-57c8d5d679-jt9gf\" 
(UID: \"9b05e3e8-670f-4b6c-b3d9-eee53548969c\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-jt9gf" Apr 22 16:49:47.080057 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:47.079935 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzmg9\" (UniqueName: \"kubernetes.io/projected/9b05e3e8-670f-4b6c-b3d9-eee53548969c-kube-api-access-zzmg9\") pod \"opendatahub-operator-controller-manager-57c8d5d679-jt9gf\" (UID: \"9b05e3e8-670f-4b6c-b3d9-eee53548969c\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-jt9gf" Apr 22 16:49:47.080057 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:47.079980 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9b05e3e8-670f-4b6c-b3d9-eee53548969c-apiservice-cert\") pod \"opendatahub-operator-controller-manager-57c8d5d679-jt9gf\" (UID: \"9b05e3e8-670f-4b6c-b3d9-eee53548969c\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-jt9gf" Apr 22 16:49:47.082232 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:47.082206 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9b05e3e8-670f-4b6c-b3d9-eee53548969c-webhook-cert\") pod \"opendatahub-operator-controller-manager-57c8d5d679-jt9gf\" (UID: \"9b05e3e8-670f-4b6c-b3d9-eee53548969c\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-jt9gf" Apr 22 16:49:47.082323 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:47.082250 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9b05e3e8-670f-4b6c-b3d9-eee53548969c-apiservice-cert\") pod \"opendatahub-operator-controller-manager-57c8d5d679-jt9gf\" (UID: \"9b05e3e8-670f-4b6c-b3d9-eee53548969c\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-jt9gf" Apr 22 
16:49:47.092518 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:47.092498 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzmg9\" (UniqueName: \"kubernetes.io/projected/9b05e3e8-670f-4b6c-b3d9-eee53548969c-kube-api-access-zzmg9\") pod \"opendatahub-operator-controller-manager-57c8d5d679-jt9gf\" (UID: \"9b05e3e8-670f-4b6c-b3d9-eee53548969c\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-jt9gf" Apr 22 16:49:47.205054 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:47.205003 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-jt9gf" Apr 22 16:49:47.326626 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:47.326602 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-57c8d5d679-jt9gf"] Apr 22 16:49:47.329549 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:49:47.329516 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b05e3e8_670f_4b6c_b3d9_eee53548969c.slice/crio-75fbbe2c3a2490326805170f85cbacb7a56e1117f9e53b5ed2fe0ae5c40281e5 WatchSource:0}: Error finding container 75fbbe2c3a2490326805170f85cbacb7a56e1117f9e53b5ed2fe0ae5c40281e5: Status 404 returned error can't find the container with id 75fbbe2c3a2490326805170f85cbacb7a56e1117f9e53b5ed2fe0ae5c40281e5 Apr 22 16:49:47.554059 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:47.554022 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7c5749599b-rgl2r" event={"ID":"067d2728-bc89-461e-8360-3524d1c1865b","Type":"ContainerStarted","Data":"c5fefb35f3c65fc4c336e6b7e6a081f05dd028a0a765c862bd62c4a360c5b307"} Apr 22 16:49:47.555078 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:47.555056 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-jt9gf" event={"ID":"9b05e3e8-670f-4b6c-b3d9-eee53548969c","Type":"ContainerStarted","Data":"75fbbe2c3a2490326805170f85cbacb7a56e1117f9e53b5ed2fe0ae5c40281e5"} Apr 22 16:49:50.567325 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:50.567281 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7c5749599b-rgl2r" event={"ID":"067d2728-bc89-461e-8360-3524d1c1865b","Type":"ContainerStarted","Data":"f7ad59a0a2bbf6a48a615b03a85b0a32aabd699aece06dbb2da28c9df659de6c"} Apr 22 16:49:50.567742 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:50.567376 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-7c5749599b-rgl2r" Apr 22 16:49:50.568663 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:50.568643 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-jt9gf" event={"ID":"9b05e3e8-670f-4b6c-b3d9-eee53548969c","Type":"ContainerStarted","Data":"89670b1a698e01db0169b43d484f42641f32bb9bc765c0c43e76137bfe1c0ce0"} Apr 22 16:49:50.568809 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:50.568797 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-jt9gf" Apr 22 16:49:50.588454 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:50.588418 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-7c5749599b-rgl2r" podStartSLOduration=1.6397797920000001 podStartE2EDuration="4.588407779s" podCreationTimestamp="2026-04-22 16:49:46 +0000 UTC" firstStartedPulling="2026-04-22 16:49:47.011203911 +0000 UTC m=+1690.946518115" lastFinishedPulling="2026-04-22 16:49:49.959831896 +0000 UTC m=+1693.895146102" observedRunningTime="2026-04-22 16:49:50.58607008 +0000 UTC 
m=+1694.521384304" watchObservedRunningTime="2026-04-22 16:49:50.588407779 +0000 UTC m=+1694.523722004" Apr 22 16:49:50.604549 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:49:50.604504 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-jt9gf" podStartSLOduration=1.937686791 podStartE2EDuration="4.604493545s" podCreationTimestamp="2026-04-22 16:49:46 +0000 UTC" firstStartedPulling="2026-04-22 16:49:47.331360259 +0000 UTC m=+1691.266674462" lastFinishedPulling="2026-04-22 16:49:49.998167005 +0000 UTC m=+1693.933481216" observedRunningTime="2026-04-22 16:49:50.603793479 +0000 UTC m=+1694.539107706" watchObservedRunningTime="2026-04-22 16:49:50.604493545 +0000 UTC m=+1694.539807772" Apr 22 16:50:01.573352 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:50:01.573273 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-7c5749599b-rgl2r" Apr 22 16:50:01.573821 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:50:01.573464 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-jt9gf" Apr 22 16:50:04.674427 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:50:04.674392 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-644d48748b-q526j"] Apr 22 16:50:04.677541 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:50:04.677519 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-644d48748b-q526j" Apr 22 16:50:04.680246 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:50:04.680225 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 16:50:04.680373 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:50:04.680225 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 22 16:50:04.680373 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:50:04.680323 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 16:50:04.681060 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:50:04.681044 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 22 16:50:04.681158 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:50:04.681107 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-6859h\"" Apr 22 16:50:04.688666 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:50:04.688643 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-644d48748b-q526j"] Apr 22 16:50:04.827917 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:50:04.827881 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgmkl\" (UniqueName: \"kubernetes.io/projected/0ee2d791-3f1a-4d82-bde7-7cd8765d2850-kube-api-access-fgmkl\") pod \"kube-auth-proxy-644d48748b-q526j\" (UID: \"0ee2d791-3f1a-4d82-bde7-7cd8765d2850\") " pod="openshift-ingress/kube-auth-proxy-644d48748b-q526j" Apr 22 16:50:04.828082 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:50:04.827954 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0ee2d791-3f1a-4d82-bde7-7cd8765d2850-tls-certs\") pod \"kube-auth-proxy-644d48748b-q526j\" (UID: \"0ee2d791-3f1a-4d82-bde7-7cd8765d2850\") " pod="openshift-ingress/kube-auth-proxy-644d48748b-q526j"
Apr 22 16:50:04.828082 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:50:04.827977 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0ee2d791-3f1a-4d82-bde7-7cd8765d2850-tmp\") pod \"kube-auth-proxy-644d48748b-q526j\" (UID: \"0ee2d791-3f1a-4d82-bde7-7cd8765d2850\") " pod="openshift-ingress/kube-auth-proxy-644d48748b-q526j"
Apr 22 16:50:04.929127 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:50:04.929049 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0ee2d791-3f1a-4d82-bde7-7cd8765d2850-tmp\") pod \"kube-auth-proxy-644d48748b-q526j\" (UID: \"0ee2d791-3f1a-4d82-bde7-7cd8765d2850\") " pod="openshift-ingress/kube-auth-proxy-644d48748b-q526j"
Apr 22 16:50:04.929127 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:50:04.929106 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgmkl\" (UniqueName: \"kubernetes.io/projected/0ee2d791-3f1a-4d82-bde7-7cd8765d2850-kube-api-access-fgmkl\") pod \"kube-auth-proxy-644d48748b-q526j\" (UID: \"0ee2d791-3f1a-4d82-bde7-7cd8765d2850\") " pod="openshift-ingress/kube-auth-proxy-644d48748b-q526j"
Apr 22 16:50:04.929349 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:50:04.929171 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0ee2d791-3f1a-4d82-bde7-7cd8765d2850-tls-certs\") pod \"kube-auth-proxy-644d48748b-q526j\" (UID: \"0ee2d791-3f1a-4d82-bde7-7cd8765d2850\") " pod="openshift-ingress/kube-auth-proxy-644d48748b-q526j"
Apr 22 16:50:04.931246 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:50:04.931217 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0ee2d791-3f1a-4d82-bde7-7cd8765d2850-tmp\") pod \"kube-auth-proxy-644d48748b-q526j\" (UID: \"0ee2d791-3f1a-4d82-bde7-7cd8765d2850\") " pod="openshift-ingress/kube-auth-proxy-644d48748b-q526j"
Apr 22 16:50:04.931559 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:50:04.931537 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0ee2d791-3f1a-4d82-bde7-7cd8765d2850-tls-certs\") pod \"kube-auth-proxy-644d48748b-q526j\" (UID: \"0ee2d791-3f1a-4d82-bde7-7cd8765d2850\") " pod="openshift-ingress/kube-auth-proxy-644d48748b-q526j"
Apr 22 16:50:04.939548 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:50:04.939523 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgmkl\" (UniqueName: \"kubernetes.io/projected/0ee2d791-3f1a-4d82-bde7-7cd8765d2850-kube-api-access-fgmkl\") pod \"kube-auth-proxy-644d48748b-q526j\" (UID: \"0ee2d791-3f1a-4d82-bde7-7cd8765d2850\") " pod="openshift-ingress/kube-auth-proxy-644d48748b-q526j"
Apr 22 16:50:04.989459 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:50:04.989429 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-644d48748b-q526j"
Apr 22 16:50:05.117935 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:50:05.117793 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-644d48748b-q526j"]
Apr 22 16:50:05.120630 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:50:05.120601 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ee2d791_3f1a_4d82_bde7_7cd8765d2850.slice/crio-86a25778264eea7924f7846e8409821383a3d96fcc8415fe35c9f80abba3a78e WatchSource:0}: Error finding container 86a25778264eea7924f7846e8409821383a3d96fcc8415fe35c9f80abba3a78e: Status 404 returned error can't find the container with id 86a25778264eea7924f7846e8409821383a3d96fcc8415fe35c9f80abba3a78e
Apr 22 16:50:05.625458 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:50:05.625426 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-644d48748b-q526j" event={"ID":"0ee2d791-3f1a-4d82-bde7-7cd8765d2850","Type":"ContainerStarted","Data":"86a25778264eea7924f7846e8409821383a3d96fcc8415fe35c9f80abba3a78e"}
Apr 22 16:50:09.639571 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:50:09.639534 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-644d48748b-q526j" event={"ID":"0ee2d791-3f1a-4d82-bde7-7cd8765d2850","Type":"ContainerStarted","Data":"6efd1c2c132c5ef3fa6506ea4fd3e4a000a271400cda4aa1e4b642c3fffdbf92"}
Apr 22 16:50:09.655299 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:50:09.655238 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-644d48748b-q526j" podStartSLOduration=2.089453158 podStartE2EDuration="5.655222164s" podCreationTimestamp="2026-04-22 16:50:04 +0000 UTC" firstStartedPulling="2026-04-22 16:50:05.122321997 +0000 UTC m=+1709.057636204" lastFinishedPulling="2026-04-22 16:50:08.688090992 +0000 UTC m=+1712.623405210" observedRunningTime="2026-04-22 16:50:09.653554821 +0000 UTC m=+1713.588869045" watchObservedRunningTime="2026-04-22 16:50:09.655222164 +0000 UTC m=+1713.590536391"
Apr 22 16:51:36.590325 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:51:36.590209 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log"
Apr 22 16:51:36.595407 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:51:36.592476 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log"
Apr 22 16:51:54.393537 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:51:54.393508 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-dhl7x"]
Apr 22 16:51:54.396732 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:51:54.396713 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-dhl7x"
Apr 22 16:51:54.398946 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:51:54.398921 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 22 16:51:54.399061 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:51:54.398968 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 22 16:51:54.399796 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:51:54.399773 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-9455s\""
Apr 22 16:51:54.399925 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:51:54.399800 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 22 16:51:54.399925 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:51:54.399839 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 22 16:51:54.406409 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:51:54.406390 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-dhl7x"]
Apr 22 16:51:54.502466 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:51:54.502436 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f3683328-89c2-455f-a9cd-958a10c37c07-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-dhl7x\" (UID: \"f3683328-89c2-455f-a9cd-958a10c37c07\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-dhl7x"
Apr 22 16:51:54.502612 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:51:54.502491 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt6pw\" (UniqueName: \"kubernetes.io/projected/f3683328-89c2-455f-a9cd-958a10c37c07-kube-api-access-xt6pw\") pod \"kuadrant-console-plugin-6cb54b5c86-dhl7x\" (UID: \"f3683328-89c2-455f-a9cd-958a10c37c07\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-dhl7x"
Apr 22 16:51:54.502612 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:51:54.502581 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3683328-89c2-455f-a9cd-958a10c37c07-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-dhl7x\" (UID: \"f3683328-89c2-455f-a9cd-958a10c37c07\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-dhl7x"
Apr 22 16:51:54.603825 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:51:54.603801 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3683328-89c2-455f-a9cd-958a10c37c07-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-dhl7x\" (UID: \"f3683328-89c2-455f-a9cd-958a10c37c07\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-dhl7x"
Apr 22 16:51:54.603926 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:51:54.603843 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f3683328-89c2-455f-a9cd-958a10c37c07-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-dhl7x\" (UID: \"f3683328-89c2-455f-a9cd-958a10c37c07\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-dhl7x"
Apr 22 16:51:54.603926 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:51:54.603885 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xt6pw\" (UniqueName: \"kubernetes.io/projected/f3683328-89c2-455f-a9cd-958a10c37c07-kube-api-access-xt6pw\") pod \"kuadrant-console-plugin-6cb54b5c86-dhl7x\" (UID: \"f3683328-89c2-455f-a9cd-958a10c37c07\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-dhl7x"
Apr 22 16:51:54.604031 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:51:54.603958 2573 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found
Apr 22 16:51:54.604081 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:51:54.604032 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3683328-89c2-455f-a9cd-958a10c37c07-plugin-serving-cert podName:f3683328-89c2-455f-a9cd-958a10c37c07 nodeName:}" failed. No retries permitted until 2026-04-22 16:51:55.1040105 +0000 UTC m=+1819.039324719 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/f3683328-89c2-455f-a9cd-958a10c37c07-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-dhl7x" (UID: "f3683328-89c2-455f-a9cd-958a10c37c07") : secret "plugin-serving-cert" not found
Apr 22 16:51:54.604592 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:51:54.604570 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f3683328-89c2-455f-a9cd-958a10c37c07-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-dhl7x\" (UID: \"f3683328-89c2-455f-a9cd-958a10c37c07\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-dhl7x"
Apr 22 16:51:54.612084 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:51:54.612064 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt6pw\" (UniqueName: \"kubernetes.io/projected/f3683328-89c2-455f-a9cd-958a10c37c07-kube-api-access-xt6pw\") pod \"kuadrant-console-plugin-6cb54b5c86-dhl7x\" (UID: \"f3683328-89c2-455f-a9cd-958a10c37c07\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-dhl7x"
Apr 22 16:51:55.107493 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:51:55.107461 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3683328-89c2-455f-a9cd-958a10c37c07-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-dhl7x\" (UID: \"f3683328-89c2-455f-a9cd-958a10c37c07\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-dhl7x"
Apr 22 16:51:55.109850 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:51:55.109820 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3683328-89c2-455f-a9cd-958a10c37c07-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-dhl7x\" (UID: \"f3683328-89c2-455f-a9cd-958a10c37c07\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-dhl7x"
Apr 22 16:51:55.307527 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:51:55.307498 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-dhl7x"
Apr 22 16:51:55.439469 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:51:55.439448 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-dhl7x"]
Apr 22 16:51:55.441928 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:51:55.441902 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3683328_89c2_455f_a9cd_958a10c37c07.slice/crio-7db84e0c98ba7082f43b0934be92d5a8c331dc701a890c15a09453b0c59a221c WatchSource:0}: Error finding container 7db84e0c98ba7082f43b0934be92d5a8c331dc701a890c15a09453b0c59a221c: Status 404 returned error can't find the container with id 7db84e0c98ba7082f43b0934be92d5a8c331dc701a890c15a09453b0c59a221c
Apr 22 16:51:55.971808 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:51:55.971773 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-dhl7x" event={"ID":"f3683328-89c2-455f-a9cd-958a10c37c07","Type":"ContainerStarted","Data":"7db84e0c98ba7082f43b0934be92d5a8c331dc701a890c15a09453b0c59a221c"}
Apr 22 16:52:21.058244 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:52:21.058205 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-dhl7x" event={"ID":"f3683328-89c2-455f-a9cd-958a10c37c07","Type":"ContainerStarted","Data":"6ccbfb8c031354459862ca4796042b0694b99a68d89200edd946bffa99b6548d"}
Apr 22 16:52:21.077212 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:52:21.077113 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-dhl7x" podStartSLOduration=2.316361864 podStartE2EDuration="27.077094895s" podCreationTimestamp="2026-04-22 16:51:54 +0000 UTC" firstStartedPulling="2026-04-22 16:51:55.44353547 +0000 UTC m=+1819.378849674" lastFinishedPulling="2026-04-22 16:52:20.204268502 +0000 UTC m=+1844.139582705" observedRunningTime="2026-04-22 16:52:21.07550594 +0000 UTC m=+1845.010820189" watchObservedRunningTime="2026-04-22 16:52:21.077094895 +0000 UTC m=+1845.012409120"
Apr 22 16:52:42.493175 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:52:42.493141 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"]
Apr 22 16:52:42.495255 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:52:42.495238 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-wqr5w"
Apr 22 16:52:42.497488 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:52:42.497469 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 22 16:52:42.504413 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:52:42.504391 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"]
Apr 22 16:52:42.526849 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:52:42.526827 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"]
Apr 22 16:52:42.612694 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:52:42.612668 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/78efe79c-f859-4a19-bbe9-8c92c8dee215-config-file\") pod \"limitador-limitador-78c99df468-wqr5w\" (UID: \"78efe79c-f859-4a19-bbe9-8c92c8dee215\") " pod="kuadrant-system/limitador-limitador-78c99df468-wqr5w"
Apr 22 16:52:42.612822 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:52:42.612704 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wvcp\" (UniqueName: \"kubernetes.io/projected/78efe79c-f859-4a19-bbe9-8c92c8dee215-kube-api-access-2wvcp\") pod \"limitador-limitador-78c99df468-wqr5w\" (UID: \"78efe79c-f859-4a19-bbe9-8c92c8dee215\") " pod="kuadrant-system/limitador-limitador-78c99df468-wqr5w"
Apr 22 16:52:42.713853 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:52:42.713830 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/78efe79c-f859-4a19-bbe9-8c92c8dee215-config-file\") pod \"limitador-limitador-78c99df468-wqr5w\" (UID: \"78efe79c-f859-4a19-bbe9-8c92c8dee215\") " pod="kuadrant-system/limitador-limitador-78c99df468-wqr5w"
Apr 22 16:52:42.713958 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:52:42.713871 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wvcp\" (UniqueName: \"kubernetes.io/projected/78efe79c-f859-4a19-bbe9-8c92c8dee215-kube-api-access-2wvcp\") pod \"limitador-limitador-78c99df468-wqr5w\" (UID: \"78efe79c-f859-4a19-bbe9-8c92c8dee215\") " pod="kuadrant-system/limitador-limitador-78c99df468-wqr5w"
Apr 22 16:52:42.714435 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:52:42.714416 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/78efe79c-f859-4a19-bbe9-8c92c8dee215-config-file\") pod \"limitador-limitador-78c99df468-wqr5w\" (UID: \"78efe79c-f859-4a19-bbe9-8c92c8dee215\") " pod="kuadrant-system/limitador-limitador-78c99df468-wqr5w"
Apr 22 16:52:42.722402 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:52:42.722382 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wvcp\" (UniqueName: \"kubernetes.io/projected/78efe79c-f859-4a19-bbe9-8c92c8dee215-kube-api-access-2wvcp\") pod \"limitador-limitador-78c99df468-wqr5w\" (UID: \"78efe79c-f859-4a19-bbe9-8c92c8dee215\") " pod="kuadrant-system/limitador-limitador-78c99df468-wqr5w"
Apr 22 16:52:42.806024 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:52:42.806003 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-wqr5w"
Apr 22 16:52:42.921412 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:52:42.921391 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"]
Apr 22 16:52:42.923484 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:52:42.923457 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78efe79c_f859_4a19_bbe9_8c92c8dee215.slice/crio-29c6b73faed7f0aa69377c7b7516baf63029d8b06d734919a47bd8d32aa2de37 WatchSource:0}: Error finding container 29c6b73faed7f0aa69377c7b7516baf63029d8b06d734919a47bd8d32aa2de37: Status 404 returned error can't find the container with id 29c6b73faed7f0aa69377c7b7516baf63029d8b06d734919a47bd8d32aa2de37
Apr 22 16:52:43.131032 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:52:43.130953 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-wqr5w" event={"ID":"78efe79c-f859-4a19-bbe9-8c92c8dee215","Type":"ContainerStarted","Data":"29c6b73faed7f0aa69377c7b7516baf63029d8b06d734919a47bd8d32aa2de37"}
Apr 22 16:52:46.142450 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:52:46.142417 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-wqr5w" event={"ID":"78efe79c-f859-4a19-bbe9-8c92c8dee215","Type":"ContainerStarted","Data":"b4b83b2c78c23120a3360309c93f8c6ca080cd4c23c07abbd400ddde54fdaedd"}
Apr 22 16:52:46.142882 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:52:46.142557 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-wqr5w"
Apr 22 16:52:46.157589 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:52:46.157543 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-wqr5w" podStartSLOduration=1.592143984 podStartE2EDuration="4.157529392s" podCreationTimestamp="2026-04-22 16:52:42 +0000 UTC" firstStartedPulling="2026-04-22 16:52:42.92518518 +0000 UTC m=+1866.860499382" lastFinishedPulling="2026-04-22 16:52:45.490570573 +0000 UTC m=+1869.425884790" observedRunningTime="2026-04-22 16:52:46.156206513 +0000 UTC m=+1870.091520738" watchObservedRunningTime="2026-04-22 16:52:46.157529392 +0000 UTC m=+1870.092843619"
Apr 22 16:52:57.146622 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:52:57.146550 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-wqr5w"
Apr 22 16:53:10.847667 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:10.847632 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8666ccb95d-h78nc"]
Apr 22 16:53:10.849747 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:10.849729 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8666ccb95d-h78nc"
Apr 22 16:53:10.851798 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:10.851776 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-wgb6c\""
Apr 22 16:53:10.856286 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:10.856262 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8666ccb95d-h78nc"]
Apr 22 16:53:10.936674 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:10.936641 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdwgw\" (UniqueName: \"kubernetes.io/projected/43ed479a-fc50-47b4-9a87-058c565111c9-kube-api-access-gdwgw\") pod \"authorino-8666ccb95d-h78nc\" (UID: \"43ed479a-fc50-47b4-9a87-058c565111c9\") " pod="kuadrant-system/authorino-8666ccb95d-h78nc"
Apr 22 16:53:11.037655 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:11.037627 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdwgw\" (UniqueName: \"kubernetes.io/projected/43ed479a-fc50-47b4-9a87-058c565111c9-kube-api-access-gdwgw\") pod \"authorino-8666ccb95d-h78nc\" (UID: \"43ed479a-fc50-47b4-9a87-058c565111c9\") " pod="kuadrant-system/authorino-8666ccb95d-h78nc"
Apr 22 16:53:11.044645 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:11.044617 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdwgw\" (UniqueName: \"kubernetes.io/projected/43ed479a-fc50-47b4-9a87-058c565111c9-kube-api-access-gdwgw\") pod \"authorino-8666ccb95d-h78nc\" (UID: \"43ed479a-fc50-47b4-9a87-058c565111c9\") " pod="kuadrant-system/authorino-8666ccb95d-h78nc"
Apr 22 16:53:11.052472 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:11.052452 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8666ccb95d-h78nc"]
Apr 22 16:53:11.052647 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:11.052636 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8666ccb95d-h78nc"
Apr 22 16:53:11.167256 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:11.167218 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8666ccb95d-h78nc"]
Apr 22 16:53:11.169487 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:53:11.169459 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43ed479a_fc50_47b4_9a87_058c565111c9.slice/crio-58684282cdfd63b6258dd53c00c271819b2bc77b622c634f1ad6090a4e96c9e5 WatchSource:0}: Error finding container 58684282cdfd63b6258dd53c00c271819b2bc77b622c634f1ad6090a4e96c9e5: Status 404 returned error can't find the container with id 58684282cdfd63b6258dd53c00c271819b2bc77b622c634f1ad6090a4e96c9e5
Apr 22 16:53:11.226342 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:11.226316 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8666ccb95d-h78nc" event={"ID":"43ed479a-fc50-47b4-9a87-058c565111c9","Type":"ContainerStarted","Data":"58684282cdfd63b6258dd53c00c271819b2bc77b622c634f1ad6090a4e96c9e5"}
Apr 22 16:53:14.237108 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:14.237022 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8666ccb95d-h78nc" event={"ID":"43ed479a-fc50-47b4-9a87-058c565111c9","Type":"ContainerStarted","Data":"a3dc9a9b5d6753ef5bb4cabce65f0a076d49545f97c8183d69a00410e0b00c18"}
Apr 22 16:53:14.237507 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:14.237107 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8666ccb95d-h78nc" podUID="43ed479a-fc50-47b4-9a87-058c565111c9" containerName="authorino" containerID="cri-o://a3dc9a9b5d6753ef5bb4cabce65f0a076d49545f97c8183d69a00410e0b00c18" gracePeriod=30
Apr 22 16:53:14.252747 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:14.252707 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8666ccb95d-h78nc" podStartSLOduration=1.524087489 podStartE2EDuration="4.252691988s" podCreationTimestamp="2026-04-22 16:53:10 +0000 UTC" firstStartedPulling="2026-04-22 16:53:11.170558819 +0000 UTC m=+1895.105873021" lastFinishedPulling="2026-04-22 16:53:13.899163299 +0000 UTC m=+1897.834477520" observedRunningTime="2026-04-22 16:53:14.251426378 +0000 UTC m=+1898.186740603" watchObservedRunningTime="2026-04-22 16:53:14.252691988 +0000 UTC m=+1898.188006272"
Apr 22 16:53:14.476676 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:14.476650 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8666ccb95d-h78nc"
Apr 22 16:53:14.567727 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:14.567706 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdwgw\" (UniqueName: \"kubernetes.io/projected/43ed479a-fc50-47b4-9a87-058c565111c9-kube-api-access-gdwgw\") pod \"43ed479a-fc50-47b4-9a87-058c565111c9\" (UID: \"43ed479a-fc50-47b4-9a87-058c565111c9\") "
Apr 22 16:53:14.569743 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:14.569713 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43ed479a-fc50-47b4-9a87-058c565111c9-kube-api-access-gdwgw" (OuterVolumeSpecName: "kube-api-access-gdwgw") pod "43ed479a-fc50-47b4-9a87-058c565111c9" (UID: "43ed479a-fc50-47b4-9a87-058c565111c9"). InnerVolumeSpecName "kube-api-access-gdwgw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:53:14.668538 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:14.668515 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gdwgw\" (UniqueName: \"kubernetes.io/projected/43ed479a-fc50-47b4-9a87-058c565111c9-kube-api-access-gdwgw\") on node \"ip-10-0-141-251.ec2.internal\" DevicePath \"\""
Apr 22 16:53:15.241126 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:15.241051 2573 generic.go:358] "Generic (PLEG): container finished" podID="43ed479a-fc50-47b4-9a87-058c565111c9" containerID="a3dc9a9b5d6753ef5bb4cabce65f0a076d49545f97c8183d69a00410e0b00c18" exitCode=0
Apr 22 16:53:15.241126 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:15.241102 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8666ccb95d-h78nc"
Apr 22 16:53:15.241554 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:15.241101 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8666ccb95d-h78nc" event={"ID":"43ed479a-fc50-47b4-9a87-058c565111c9","Type":"ContainerDied","Data":"a3dc9a9b5d6753ef5bb4cabce65f0a076d49545f97c8183d69a00410e0b00c18"}
Apr 22 16:53:15.241554 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:15.241216 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8666ccb95d-h78nc" event={"ID":"43ed479a-fc50-47b4-9a87-058c565111c9","Type":"ContainerDied","Data":"58684282cdfd63b6258dd53c00c271819b2bc77b622c634f1ad6090a4e96c9e5"}
Apr 22 16:53:15.241554 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:15.241240 2573 scope.go:117] "RemoveContainer" containerID="a3dc9a9b5d6753ef5bb4cabce65f0a076d49545f97c8183d69a00410e0b00c18"
Apr 22 16:53:15.248874 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:15.248854 2573 scope.go:117] "RemoveContainer" containerID="a3dc9a9b5d6753ef5bb4cabce65f0a076d49545f97c8183d69a00410e0b00c18"
Apr 22 16:53:15.249115 ip-10-0-141-251 kubenswrapper[2573]: E0422 16:53:15.249095 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3dc9a9b5d6753ef5bb4cabce65f0a076d49545f97c8183d69a00410e0b00c18\": container with ID starting with a3dc9a9b5d6753ef5bb4cabce65f0a076d49545f97c8183d69a00410e0b00c18 not found: ID does not exist" containerID="a3dc9a9b5d6753ef5bb4cabce65f0a076d49545f97c8183d69a00410e0b00c18"
Apr 22 16:53:15.249185 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:15.249123 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3dc9a9b5d6753ef5bb4cabce65f0a076d49545f97c8183d69a00410e0b00c18"} err="failed to get container status \"a3dc9a9b5d6753ef5bb4cabce65f0a076d49545f97c8183d69a00410e0b00c18\": rpc error: code = NotFound desc = could not find container \"a3dc9a9b5d6753ef5bb4cabce65f0a076d49545f97c8183d69a00410e0b00c18\": container with ID starting with a3dc9a9b5d6753ef5bb4cabce65f0a076d49545f97c8183d69a00410e0b00c18 not found: ID does not exist"
Apr 22 16:53:15.256983 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:15.256958 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8666ccb95d-h78nc"]
Apr 22 16:53:15.261260 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:15.261241 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8666ccb95d-h78nc"]
Apr 22 16:53:16.900848 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:16.900809 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43ed479a-fc50-47b4-9a87-058c565111c9" path="/var/lib/kubelet/pods/43ed479a-fc50-47b4-9a87-058c565111c9/volumes"
Apr 22 16:53:39.793411 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:53:39.793376 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"]
Apr 22 16:54:06.906284 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:54:06.906255 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"]
Apr 22 16:54:14.801994 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:54:14.801957 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"]
Apr 22 16:54:34.610500 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:54:34.610423 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"]
Apr 22 16:54:40.298691 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:54:40.298642 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"]
Apr 22 16:54:43.511478 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:54:43.511435 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"]
Apr 22 16:55:12.311841 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:55:12.311805 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"]
Apr 22 16:55:37.166296 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:55:37.166265 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79f986cbc5-26kjp"]
Apr 22 16:55:37.166671 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:55:37.166608 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43ed479a-fc50-47b4-9a87-058c565111c9" containerName="authorino"
Apr 22 16:55:37.166671 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:55:37.166619 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ed479a-fc50-47b4-9a87-058c565111c9" containerName="authorino"
Apr 22 16:55:37.166671 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:55:37.166668 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="43ed479a-fc50-47b4-9a87-058c565111c9" containerName="authorino"
Apr 22 16:55:37.169512 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:55:37.169495 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79f986cbc5-26kjp"
Apr 22 16:55:37.172483 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:55:37.172456 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 22 16:55:37.172483 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:55:37.172475 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-wgb6c\""
Apr 22 16:55:37.177090 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:55:37.177065 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79f986cbc5-26kjp"]
Apr 22 16:55:37.246966 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:55:37.246938 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnqcv\" (UniqueName: \"kubernetes.io/projected/31574b08-4d39-40f4-83bf-6bce64bd938d-kube-api-access-hnqcv\") pod \"authorino-79f986cbc5-26kjp\" (UID: \"31574b08-4d39-40f4-83bf-6bce64bd938d\") " pod="kuadrant-system/authorino-79f986cbc5-26kjp"
Apr 22 16:55:37.247127 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:55:37.247005 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/31574b08-4d39-40f4-83bf-6bce64bd938d-tls-cert\") pod \"authorino-79f986cbc5-26kjp\" (UID: \"31574b08-4d39-40f4-83bf-6bce64bd938d\") " pod="kuadrant-system/authorino-79f986cbc5-26kjp"
Apr 22 16:55:37.347697 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:55:37.347670 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hnqcv\" (UniqueName: \"kubernetes.io/projected/31574b08-4d39-40f4-83bf-6bce64bd938d-kube-api-access-hnqcv\") pod \"authorino-79f986cbc5-26kjp\" (UID: \"31574b08-4d39-40f4-83bf-6bce64bd938d\") " pod="kuadrant-system/authorino-79f986cbc5-26kjp"
Apr 22 16:55:37.347819 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:55:37.347727 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/31574b08-4d39-40f4-83bf-6bce64bd938d-tls-cert\") pod \"authorino-79f986cbc5-26kjp\" (UID: \"31574b08-4d39-40f4-83bf-6bce64bd938d\") " pod="kuadrant-system/authorino-79f986cbc5-26kjp"
Apr 22 16:55:37.350090 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:55:37.350071 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/31574b08-4d39-40f4-83bf-6bce64bd938d-tls-cert\") pod \"authorino-79f986cbc5-26kjp\" (UID: \"31574b08-4d39-40f4-83bf-6bce64bd938d\") " pod="kuadrant-system/authorino-79f986cbc5-26kjp"
Apr 22 16:55:37.355535 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:55:37.355516 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnqcv\" (UniqueName: \"kubernetes.io/projected/31574b08-4d39-40f4-83bf-6bce64bd938d-kube-api-access-hnqcv\") pod \"authorino-79f986cbc5-26kjp\" (UID: \"31574b08-4d39-40f4-83bf-6bce64bd938d\") " pod="kuadrant-system/authorino-79f986cbc5-26kjp"
Apr 22 16:55:37.478661 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:55:37.478591 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79f986cbc5-26kjp"
Apr 22 16:55:37.599849 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:55:37.599653 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79f986cbc5-26kjp"]
Apr 22 16:55:37.602561 ip-10-0-141-251 kubenswrapper[2573]: W0422 16:55:37.602534 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31574b08_4d39_40f4_83bf_6bce64bd938d.slice/crio-9d83a78285f81c64ed2649f9aa5504f66e3099ed46a58079c05aea34042beafa WatchSource:0}: Error finding container 9d83a78285f81c64ed2649f9aa5504f66e3099ed46a58079c05aea34042beafa: Status 404 returned error can't find the container with id 9d83a78285f81c64ed2649f9aa5504f66e3099ed46a58079c05aea34042beafa
Apr 22 16:55:37.603782 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:55:37.603735 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 16:55:37.695508 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:55:37.695475 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79f986cbc5-26kjp" event={"ID":"31574b08-4d39-40f4-83bf-6bce64bd938d","Type":"ContainerStarted","Data":"9d83a78285f81c64ed2649f9aa5504f66e3099ed46a58079c05aea34042beafa"}
Apr 22 16:55:38.702085 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:55:38.702045 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79f986cbc5-26kjp" event={"ID":"31574b08-4d39-40f4-83bf-6bce64bd938d","Type":"ContainerStarted","Data":"35b2a59c8ab2629ff3b0f8f04d1ecb66e7b4d8d0338f16912f14c85b7a440faf"}
Apr 22 16:55:38.718277 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:55:38.718214 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79f986cbc5-26kjp" podStartSLOduration=1.291981567 podStartE2EDuration="1.718197124s" podCreationTimestamp="2026-04-22 16:55:37 +0000
UTC" firstStartedPulling="2026-04-22 16:55:37.603880381 +0000 UTC m=+2041.539194584" lastFinishedPulling="2026-04-22 16:55:38.030095938 +0000 UTC m=+2041.965410141" observedRunningTime="2026-04-22 16:55:38.716069034 +0000 UTC m=+2042.651383253" watchObservedRunningTime="2026-04-22 16:55:38.718197124 +0000 UTC m=+2042.653511349" Apr 22 16:55:57.410581 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:55:57.410499 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 16:56:02.711585 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:56:02.711538 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 16:56:08.802386 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:56:08.802355 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 16:56:19.229901 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:56:19.229858 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 16:56:27.897554 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:56:27.897517 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 16:56:36.612108 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:56:36.612000 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log" Apr 22 16:56:36.617434 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:56:36.615188 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log" Apr 22 16:56:38.107112 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:56:38.107067 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 16:56:46.806097 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:56:46.806068 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 16:56:57.302307 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:56:57.302263 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 16:58:02.610943 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:58:02.610907 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 16:58:17.308986 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:58:17.308945 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 16:58:57.077989 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:58:57.077910 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 16:59:14.251571 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:59:14.251532 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 16:59:27.909613 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:59:27.909582 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 16:59:44.098989 ip-10-0-141-251 kubenswrapper[2573]: I0422 16:59:44.098954 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:00:12.399771 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:00:12.399713 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:00:16.798246 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:00:16.798208 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:00:22.202277 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:00:22.202238 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:00:43.000181 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:00:43.000096 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:00:52.597232 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:00:52.597197 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:01:08.606765 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:01:08.606719 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:01:16.903935 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:01:16.903904 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:01:33.704260 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:01:33.704222 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:01:36.634496 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:01:36.634386 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log" Apr 22 17:01:36.638677 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:01:36.638106 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log" Apr 22 17:01:42.907106 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:01:42.907070 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:02:14.606496 
ip-10-0-141-251 kubenswrapper[2573]: I0422 17:02:14.606404 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:02:23.000132 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:02:23.000096 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:02:32.099074 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:02:32.099036 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:02:40.487922 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:02:40.487884 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:02:48.992018 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:02:48.991984 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:03:05.997547 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:03:05.997508 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:03:19.099653 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:03:19.099614 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:04:05.103525 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:04:05.103486 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:04:14.417061 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:04:14.417022 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:04:22.599454 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:04:22.599415 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:04:31.505665 ip-10-0-141-251 
kubenswrapper[2573]: I0422 17:04:31.505631 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:04:41.012634 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:04:41.012603 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:04:48.403522 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:04:48.403487 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:04:58.000821 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:04:58.000721 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:05:06.795556 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:05:06.795517 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:05:16.095529 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:05:16.095495 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:05:23.813321 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:05:23.813286 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:05:33.211015 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:05:33.210976 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:05:41.804558 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:05:41.804520 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:05:50.699016 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:05:50.698981 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:05:59.203820 ip-10-0-141-251 kubenswrapper[2573]: I0422 
17:05:59.203784 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:06:08.098510 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:06:08.098478 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:06:16.801096 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:06:16.801059 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:06:25.072472 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:06:25.072440 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:06:33.304019 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:06:33.303980 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:06:36.655653 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:06:36.655537 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log" Apr 22 17:06:36.660135 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:06:36.660116 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log" Apr 22 17:08:50.407198 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:08:50.407158 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:08:55.203558 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:08:55.203522 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:09:20.601016 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:09:20.600983 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:09:24.999911 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:09:24.999881 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:09:35.199247 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:09:35.199214 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:09:45.694114 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:09:45.694079 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:09:54.503545 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:09:54.503512 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:10:05.003870 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:10:05.003831 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:10:13.698064 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:10:13.698023 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:10:24.208666 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:10:24.208630 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:10:33.005268 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:10:33.005235 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:10:43.198096 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:10:43.198064 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:10:53.198557 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:10:53.198516 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:11:25.807803 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:11:25.807690 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:11:36.677814 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:11:36.677689 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log" Apr 22 17:11:36.682494 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:11:36.682477 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log" Apr 22 17:12:07.697432 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:12:07.697395 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:12:17.103403 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:12:17.103362 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:12:25.298269 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:12:25.298171 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:12:33.713351 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:12:33.713318 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:12:43.108090 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:12:43.108052 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:12:53.599727 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:12:53.599691 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:13:03.399699 
ip-10-0-141-251 kubenswrapper[2573]: I0422 17:13:03.399666 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:13:08.696239 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:13:08.696201 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:13:18.790857 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:13:18.790821 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:13:26.401188 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:13:26.401159 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:13:35.402006 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:13:35.401971 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:13:45.197311 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:13:45.197271 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:14:04.700415 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:14:04.700336 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:14:12.105592 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:14:12.105551 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:14:20.800799 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:14:20.800746 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:14:28.704812 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:14:28.704771 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:14:45.502557 ip-10-0-141-251 
kubenswrapper[2573]: I0422 17:14:45.502515 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:14:54.212135 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:14:54.212102 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:15:03.441736 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:15:03.441702 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:15:11.410646 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:15:11.410612 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:15:20.752318 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:15:20.752280 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:15:28.902370 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:15:28.902303 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:15:37.605205 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:15:37.605167 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:15:49.100835 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:15:49.100801 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:15:58.204012 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:15:58.203974 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:16:11.099073 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:16:11.099038 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:16:20.204832 ip-10-0-141-251 kubenswrapper[2573]: I0422 
17:16:20.204797 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:16:27.896153 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:16:27.896114 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:16:36.499955 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:16:36.499920 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:16:36.701260 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:16:36.700992 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log" Apr 22 17:16:36.712406 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:16:36.712378 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log" Apr 22 17:16:43.597048 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:16:43.597008 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:17:01.100399 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:17:01.100323 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:17:09.599849 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:17:09.599815 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:17:18.009950 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:17:18.009914 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:17:26.400990 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:17:26.400949 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:17:50.593523 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:17:50.593486 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:18:02.505799 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:02.505750 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-wqr5w"] Apr 22 17:18:04.591772 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:04.591718 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-79f986cbc5-26kjp_31574b08-4d39-40f4-83bf-6bce64bd938d/authorino/0.log" Apr 22 17:18:09.189543 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:09.189511 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-57c8d5d679-jt9gf_9b05e3e8-670f-4b6c-b3d9-eee53548969c/manager/0.log" Apr 22 17:18:10.600197 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:10.600134 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-79f986cbc5-26kjp_31574b08-4d39-40f4-83bf-6bce64bd938d/authorino/0.log" Apr 22 17:18:10.924320 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:10.924249 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-dhl7x_f3683328-89c2-455f-a9cd-958a10c37c07/kuadrant-console-plugin/0.log" Apr 22 17:18:11.265282 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:11.265215 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-wqr5w_78efe79c-f859-4a19-bbe9-8c92c8dee215/limitador/0.log" Apr 22 17:18:12.036665 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:12.036637 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-644d48748b-q526j_0ee2d791-3f1a-4d82-bde7-7cd8765d2850/kube-auth-proxy/0.log" Apr 22 
17:18:20.032211 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:20.032181 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-7455g_f07a62bd-5116-4f09-94bd-5cf21c3a890b/global-pull-secret-syncer/0.log" Apr 22 17:18:20.201465 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:20.201436 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-q9qkz_751cb35b-96a4-4016-bb77-4c55bff2e4d6/konnectivity-agent/0.log" Apr 22 17:18:20.290302 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:20.290271 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-251.ec2.internal_0c1aad3db1097ad1e69a197a09060d7e/haproxy/0.log" Apr 22 17:18:24.392015 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:24.391983 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-79f986cbc5-26kjp_31574b08-4d39-40f4-83bf-6bce64bd938d/authorino/0.log" Apr 22 17:18:24.491744 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:24.491711 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-dhl7x_f3683328-89c2-455f-a9cd-958a10c37c07/kuadrant-console-plugin/0.log" Apr 22 17:18:24.643824 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:24.643710 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-wqr5w_78efe79c-f859-4a19-bbe9-8c92c8dee215/limitador/0.log" Apr 22 17:18:26.325832 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:26.325806 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-c7m6g_c9c547d5-42c1-445c-b145-1e317ab8947a/cluster-monitoring-operator/0.log" Apr 22 17:18:26.446776 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:26.446727 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_metrics-server-9759db7d6-7t7pd_e7916ac0-f665-4081-a897-ae7825389217/metrics-server/0.log" Apr 22 17:18:26.471881 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:26.471858 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-bl9p4_09d5a02d-d631-44a9-b1f0-97d6de575878/monitoring-plugin/0.log" Apr 22 17:18:26.505863 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:26.505842 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-62wb6_9ff261de-df3b-469b-a87d-1dc5330b2f0c/node-exporter/0.log" Apr 22 17:18:26.528957 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:26.528938 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-62wb6_9ff261de-df3b-469b-a87d-1dc5330b2f0c/kube-rbac-proxy/0.log" Apr 22 17:18:26.550903 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:26.550881 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-62wb6_9ff261de-df3b-469b-a87d-1dc5330b2f0c/init-textfile/0.log" Apr 22 17:18:26.776445 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:26.776426 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-jfwzw_7f700c39-c35c-4f3b-b024-463588166278/kube-rbac-proxy-main/0.log" Apr 22 17:18:26.797481 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:26.797440 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-jfwzw_7f700c39-c35c-4f3b-b024-463588166278/kube-rbac-proxy-self/0.log" Apr 22 17:18:26.818467 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:26.818449 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-jfwzw_7f700c39-c35c-4f3b-b024-463588166278/openshift-state-metrics/0.log" Apr 22 17:18:26.851962 ip-10-0-141-251 
kubenswrapper[2573]: I0422 17:18:26.851940 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6044f842-c26c-4c13-8987-6ed3e3e35c1b/prometheus/0.log" Apr 22 17:18:26.871698 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:26.871644 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6044f842-c26c-4c13-8987-6ed3e3e35c1b/config-reloader/0.log" Apr 22 17:18:26.904398 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:26.904379 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6044f842-c26c-4c13-8987-6ed3e3e35c1b/thanos-sidecar/0.log" Apr 22 17:18:26.928946 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:26.928914 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6044f842-c26c-4c13-8987-6ed3e3e35c1b/kube-rbac-proxy-web/0.log" Apr 22 17:18:26.957837 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:26.957818 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6044f842-c26c-4c13-8987-6ed3e3e35c1b/kube-rbac-proxy/0.log" Apr 22 17:18:26.979508 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:26.979491 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6044f842-c26c-4c13-8987-6ed3e3e35c1b/kube-rbac-proxy-thanos/0.log" Apr 22 17:18:26.999574 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:26.999559 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6044f842-c26c-4c13-8987-6ed3e3e35c1b/init-config-reloader/0.log" Apr 22 17:18:27.037085 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:27.037065 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-gmcgm_82e21656-91bb-49a6-8a79-6241b36c61e0/prometheus-operator/0.log" Apr 22 17:18:27.057123 ip-10-0-141-251 
kubenswrapper[2573]: I0422 17:18:27.057104 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-gmcgm_82e21656-91bb-49a6-8a79-6241b36c61e0/kube-rbac-proxy/0.log" Apr 22 17:18:28.866166 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:28.866139 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/1.log" Apr 22 17:18:28.870535 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:28.870512 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-hx648_4ae7e321-7d7a-4cff-b23f-dfbc5af07459/console-operator/2.log" Apr 22 17:18:29.075342 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:29.069805 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lqwg2/perf-node-gather-daemonset-xbdbq"] Apr 22 17:18:29.075342 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:29.075292 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-xbdbq" Apr 22 17:18:29.078136 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:29.078107 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-lqwg2\"/\"default-dockercfg-w4hxb\"" Apr 22 17:18:29.078269 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:29.078150 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lqwg2\"/\"openshift-service-ca.crt\"" Apr 22 17:18:29.078834 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:29.078816 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lqwg2\"/\"kube-root-ca.crt\"" Apr 22 17:18:29.079536 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:29.079514 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lqwg2/perf-node-gather-daemonset-xbdbq"] Apr 22 17:18:29.248565 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:29.248495 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0022d91e-080c-4885-a9ba-2ff3437c8273-lib-modules\") pod \"perf-node-gather-daemonset-xbdbq\" (UID: \"0022d91e-080c-4885-a9ba-2ff3437c8273\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-xbdbq" Apr 22 17:18:29.248565 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:29.248532 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0022d91e-080c-4885-a9ba-2ff3437c8273-podres\") pod \"perf-node-gather-daemonset-xbdbq\" (UID: \"0022d91e-080c-4885-a9ba-2ff3437c8273\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-xbdbq" Apr 22 17:18:29.248750 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:29.248592 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0022d91e-080c-4885-a9ba-2ff3437c8273-sys\") pod \"perf-node-gather-daemonset-xbdbq\" (UID: \"0022d91e-080c-4885-a9ba-2ff3437c8273\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-xbdbq" Apr 22 17:18:29.248750 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:29.248628 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0022d91e-080c-4885-a9ba-2ff3437c8273-proc\") pod \"perf-node-gather-daemonset-xbdbq\" (UID: \"0022d91e-080c-4885-a9ba-2ff3437c8273\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-xbdbq" Apr 22 17:18:29.248750 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:29.248652 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzx76\" (UniqueName: \"kubernetes.io/projected/0022d91e-080c-4885-a9ba-2ff3437c8273-kube-api-access-hzx76\") pod \"perf-node-gather-daemonset-xbdbq\" (UID: \"0022d91e-080c-4885-a9ba-2ff3437c8273\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-xbdbq" Apr 22 17:18:29.349182 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:29.349158 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hzx76\" (UniqueName: \"kubernetes.io/projected/0022d91e-080c-4885-a9ba-2ff3437c8273-kube-api-access-hzx76\") pod \"perf-node-gather-daemonset-xbdbq\" (UID: \"0022d91e-080c-4885-a9ba-2ff3437c8273\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-xbdbq" Apr 22 17:18:29.349300 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:29.349213 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0022d91e-080c-4885-a9ba-2ff3437c8273-lib-modules\") pod \"perf-node-gather-daemonset-xbdbq\" (UID: 
\"0022d91e-080c-4885-a9ba-2ff3437c8273\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-xbdbq" Apr 22 17:18:29.349300 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:29.349244 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0022d91e-080c-4885-a9ba-2ff3437c8273-podres\") pod \"perf-node-gather-daemonset-xbdbq\" (UID: \"0022d91e-080c-4885-a9ba-2ff3437c8273\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-xbdbq" Apr 22 17:18:29.349300 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:29.349292 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0022d91e-080c-4885-a9ba-2ff3437c8273-sys\") pod \"perf-node-gather-daemonset-xbdbq\" (UID: \"0022d91e-080c-4885-a9ba-2ff3437c8273\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-xbdbq" Apr 22 17:18:29.349441 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:29.349317 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0022d91e-080c-4885-a9ba-2ff3437c8273-proc\") pod \"perf-node-gather-daemonset-xbdbq\" (UID: \"0022d91e-080c-4885-a9ba-2ff3437c8273\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-xbdbq" Apr 22 17:18:29.349441 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:29.349348 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0022d91e-080c-4885-a9ba-2ff3437c8273-lib-modules\") pod \"perf-node-gather-daemonset-xbdbq\" (UID: \"0022d91e-080c-4885-a9ba-2ff3437c8273\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-xbdbq" Apr 22 17:18:29.349441 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:29.349385 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/0022d91e-080c-4885-a9ba-2ff3437c8273-proc\") pod \"perf-node-gather-daemonset-xbdbq\" (UID: \"0022d91e-080c-4885-a9ba-2ff3437c8273\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-xbdbq" Apr 22 17:18:29.349441 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:29.349401 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0022d91e-080c-4885-a9ba-2ff3437c8273-podres\") pod \"perf-node-gather-daemonset-xbdbq\" (UID: \"0022d91e-080c-4885-a9ba-2ff3437c8273\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-xbdbq" Apr 22 17:18:29.349441 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:29.349396 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0022d91e-080c-4885-a9ba-2ff3437c8273-sys\") pod \"perf-node-gather-daemonset-xbdbq\" (UID: \"0022d91e-080c-4885-a9ba-2ff3437c8273\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-xbdbq" Apr 22 17:18:29.357178 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:29.357154 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzx76\" (UniqueName: \"kubernetes.io/projected/0022d91e-080c-4885-a9ba-2ff3437c8273-kube-api-access-hzx76\") pod \"perf-node-gather-daemonset-xbdbq\" (UID: \"0022d91e-080c-4885-a9ba-2ff3437c8273\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-xbdbq" Apr 22 17:18:29.386733 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:29.386713 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-xbdbq" Apr 22 17:18:29.400624 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:29.400605 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-65frr_a96be859-8cab-480c-a151-485aa4b28fca/download-server/0.log" Apr 22 17:18:29.709815 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:29.709790 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lqwg2/perf-node-gather-daemonset-xbdbq"] Apr 22 17:18:29.712111 ip-10-0-141-251 kubenswrapper[2573]: W0422 17:18:29.712078 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0022d91e_080c_4885_a9ba_2ff3437c8273.slice/crio-394ebff1fb4d3474e108656293c422806651bcdbec7bb9a2ef75aadd2afce0cb WatchSource:0}: Error finding container 394ebff1fb4d3474e108656293c422806651bcdbec7bb9a2ef75aadd2afce0cb: Status 404 returned error can't find the container with id 394ebff1fb4d3474e108656293c422806651bcdbec7bb9a2ef75aadd2afce0cb Apr 22 17:18:29.713823 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:29.713804 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 17:18:30.100768 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:30.100720 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-xbdbq" event={"ID":"0022d91e-080c-4885-a9ba-2ff3437c8273","Type":"ContainerStarted","Data":"90f4e20e6cfa3562dd18dffe2109ce9ff155d2e58302d5d3fc4203206de529bd"} Apr 22 17:18:30.101129 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:30.100789 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-xbdbq" event={"ID":"0022d91e-080c-4885-a9ba-2ff3437c8273","Type":"ContainerStarted","Data":"394ebff1fb4d3474e108656293c422806651bcdbec7bb9a2ef75aadd2afce0cb"} Apr 22 17:18:30.101129 
ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:30.100840 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-xbdbq" Apr 22 17:18:30.116207 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:30.116164 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-xbdbq" podStartSLOduration=1.116151712 podStartE2EDuration="1.116151712s" podCreationTimestamp="2026-04-22 17:18:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:18:30.114106535 +0000 UTC m=+3414.049420775" watchObservedRunningTime="2026-04-22 17:18:30.116151712 +0000 UTC m=+3414.051465937" Apr 22 17:18:30.857917 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:30.857892 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8524b_ed730508-b5b5-44cd-b56a-f58225697c5d/dns/0.log" Apr 22 17:18:30.906072 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:30.906049 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8524b_ed730508-b5b5-44cd-b56a-f58225697c5d/kube-rbac-proxy/0.log" Apr 22 17:18:31.029394 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:31.029371 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ljkt5_1f2b5ca6-7540-4d0e-88b4-b34788bdeb77/dns-node-resolver/0.log" Apr 22 17:18:31.568742 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:31.568718 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-zgppz_ca6eb44a-3f8d-408b-ae0d-0ef553dc08d8/node-ca/0.log" Apr 22 17:18:32.556739 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:32.556711 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_kube-auth-proxy-644d48748b-q526j_0ee2d791-3f1a-4d82-bde7-7cd8765d2850/kube-auth-proxy/0.log" Apr 22 17:18:33.150692 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:33.150666 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-c2xk7_bf42ef1a-eb82-48c4-b318-0b119dfdda61/serve-healthcheck-canary/0.log" Apr 22 17:18:33.774966 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:33.774938 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-q72tk_dbd5bed6-525a-4375-8584-35c15db9f5ac/kube-rbac-proxy/0.log" Apr 22 17:18:33.793568 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:33.793546 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-q72tk_dbd5bed6-525a-4375-8584-35c15db9f5ac/exporter/0.log" Apr 22 17:18:33.813049 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:33.813029 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-q72tk_dbd5bed6-525a-4375-8584-35c15db9f5ac/extractor/0.log" Apr 22 17:18:36.056346 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:36.056319 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-57c8d5d679-jt9gf_9b05e3e8-670f-4b6c-b3d9-eee53548969c/manager/0.log" Apr 22 17:18:36.112774 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:36.112733 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-xbdbq" Apr 22 17:18:37.208082 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:37.208054 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-7c5749599b-rgl2r_067d2728-bc89-461e-8360-3524d1c1865b/manager/0.log" Apr 22 17:18:37.234069 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:37.234047 2573 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-sfr8p_6c935793-a12a-459a-beed-c39393195983/openshift-lws-operator/0.log" Apr 22 17:18:41.530327 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:41.530296 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-2hv2b_94ee8e4f-0d0b-4075-95d8-5f19844fb295/migrator/0.log" Apr 22 17:18:41.549407 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:41.549378 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-2hv2b_94ee8e4f-0d0b-4075-95d8-5f19844fb295/graceful-termination/0.log" Apr 22 17:18:42.998388 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:42.998359 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gh4hm_c9801573-c6f5-4b2d-a5ea-6b5b53cf411b/kube-multus-additional-cni-plugins/0.log" Apr 22 17:18:43.020467 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:43.020443 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gh4hm_c9801573-c6f5-4b2d-a5ea-6b5b53cf411b/egress-router-binary-copy/0.log" Apr 22 17:18:43.040663 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:43.040636 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gh4hm_c9801573-c6f5-4b2d-a5ea-6b5b53cf411b/cni-plugins/0.log" Apr 22 17:18:43.060597 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:43.060574 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gh4hm_c9801573-c6f5-4b2d-a5ea-6b5b53cf411b/bond-cni-plugin/0.log" Apr 22 17:18:43.081516 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:43.081493 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gh4hm_c9801573-c6f5-4b2d-a5ea-6b5b53cf411b/routeoverride-cni/0.log" Apr 22 17:18:43.102440 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:43.102421 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gh4hm_c9801573-c6f5-4b2d-a5ea-6b5b53cf411b/whereabouts-cni-bincopy/0.log" Apr 22 17:18:43.122580 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:43.122563 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-gh4hm_c9801573-c6f5-4b2d-a5ea-6b5b53cf411b/whereabouts-cni/0.log" Apr 22 17:18:43.352839 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:43.352812 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-k6bwx_bbeeecf5-1dc0-40b2-bd6f-5a62c3da9927/kube-multus/0.log" Apr 22 17:18:43.416308 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:43.416279 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cwt8x_e324836e-ef75-432e-978a-639279d2702e/network-metrics-daemon/0.log" Apr 22 17:18:43.435457 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:43.435436 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cwt8x_e324836e-ef75-432e-978a-639279d2702e/kube-rbac-proxy/0.log" Apr 22 17:18:44.243402 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:44.243372 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mskd_239cf2fe-c977-430f-bf19-3a0e5dbd5f8c/ovn-controller/0.log" Apr 22 17:18:44.278200 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:44.278176 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mskd_239cf2fe-c977-430f-bf19-3a0e5dbd5f8c/ovn-acl-logging/0.log" Apr 22 17:18:44.295558 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:44.295537 2573 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mskd_239cf2fe-c977-430f-bf19-3a0e5dbd5f8c/kube-rbac-proxy-node/0.log" Apr 22 17:18:44.314773 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:44.314745 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mskd_239cf2fe-c977-430f-bf19-3a0e5dbd5f8c/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 17:18:44.330955 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:44.330935 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mskd_239cf2fe-c977-430f-bf19-3a0e5dbd5f8c/northd/0.log" Apr 22 17:18:44.353809 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:44.353793 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mskd_239cf2fe-c977-430f-bf19-3a0e5dbd5f8c/nbdb/0.log" Apr 22 17:18:44.372224 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:44.372208 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mskd_239cf2fe-c977-430f-bf19-3a0e5dbd5f8c/sbdb/0.log" Apr 22 17:18:44.466864 ip-10-0-141-251 kubenswrapper[2573]: I0422 17:18:44.466839 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2mskd_239cf2fe-c977-430f-bf19-3a0e5dbd5f8c/ovnkube-controller/0.log"