Apr 22 19:58:27.254635 ip-10-0-128-61 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 19:58:27.664121 ip-10-0-128-61 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:58:27.664121 ip-10-0-128-61 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 19:58:27.664121 ip-10-0-128-61 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:58:27.664121 ip-10-0-128-61 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 19:58:27.664121 ip-10-0-128-61 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 19:58:27.665635 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.665554 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 19:58:27.668211 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668195 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
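The deprecation warnings above all point at the same remedy: move the listed command-line flags into the KubeletConfiguration file passed via --config. A minimal sketch of what that migration could look like (field names from the kubelet.config.k8s.io/v1beta1 API; the values below are illustrative assumptions, not this node's actual settings):

```yaml
# Hypothetical fragment of the file passed to the kubelet via --config
# (e.g. /etc/kubernetes/kubelet.conf on this node). Values are examples only.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
# replaces --system-reserved
systemReserved:
  cpu: "500m"
  memory: "1Gi"
# --minimum-container-ttl-duration is superseded by eviction settings
evictionHard:
  memory.available: "100Mi"
```

The --pod-infra-container-image warning is different: per the log, the sandbox image should now also be configured on the container runtime side (e.g. in the CRI-O configuration), since the kubelet's image garbage collector takes it from the CRI.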
Apr 22 19:58:27.668248 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668212 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:58:27.668248 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668224 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:58:27.668248 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668228 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:58:27.668248 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668232 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:58:27.668248 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668235 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:58:27.668248 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668238 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:58:27.668248 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668241 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:58:27.668248 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668244 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:58:27.668248 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668247 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:58:27.668248 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668250 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:58:27.668248 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668253 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:58:27.668523 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668256 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:58:27.668523 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668259 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:58:27.668523 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668261 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:58:27.668523 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668264 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:58:27.668523 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668267 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:58:27.668523 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668269 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:58:27.668523 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668272 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:58:27.668523 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668279 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:58:27.668523 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668282 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:58:27.668523 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668284 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:58:27.668523 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668287 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:58:27.668523 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668290 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:58:27.668523 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668292 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:58:27.668523 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668295 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:58:27.668523 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668297 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:58:27.668523 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668300 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:58:27.668523 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668302 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:58:27.668523 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668305 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:58:27.668523 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668307 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:58:27.668523 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668312 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:58:27.668996 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668316 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:58:27.668996 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668319 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:58:27.668996 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668322 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:58:27.668996 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668324 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:58:27.668996 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668327 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:58:27.668996 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668330 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:58:27.668996 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668332 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:58:27.668996 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668336 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:58:27.668996 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668338 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:58:27.668996 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668341 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:58:27.668996 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668344 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:58:27.668996 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668346 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:58:27.668996 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668349 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:58:27.668996 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668367 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:58:27.668996 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668370 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:58:27.668996 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668373 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:58:27.668996 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668376 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:58:27.668996 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668378 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:58:27.668996 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668381 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:58:27.669500 ip-10-0-128-61
kubenswrapper[2574]: W0422 19:58:27.668384 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:58:27.669500 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668386 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:58:27.669500 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668389 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:58:27.669500 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668391 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:58:27.669500 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668394 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:58:27.669500 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668396 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:58:27.669500 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668400 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:58:27.669500 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668404 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:58:27.669500 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668408 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:58:27.669500 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668412 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:58:27.669500 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668415 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:58:27.669500 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668418 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:58:27.669500 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668421 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:58:27.669500 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668423 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:58:27.669500 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668426 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:58:27.669500 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668435 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:58:27.669500 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668438 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:58:27.669500 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668440 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:58:27.669500 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668443 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:58:27.669500 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668446 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:58:27.669978 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668448 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:58:27.669978 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668451 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:58:27.669978 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668453 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:58:27.669978 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668457 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:58:27.669978 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668460 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:58:27.669978 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668484 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:58:27.669978 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668488 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:58:27.669978 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668491 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:58:27.669978 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668494 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:58:27.669978 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668497 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:58:27.669978 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668500 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:58:27.669978 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668502 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:58:27.669978 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668505 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:58:27.669978 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668508 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:58:27.669978 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668511 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:58:27.669978 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668865 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:58:27.669978 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668870 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:58:27.669978 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668873 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:58:27.669978 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668875 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:58:27.670438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668879 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:58:27.670438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668881 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:58:27.670438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668884 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:58:27.670438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668887 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:58:27.670438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668889 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:58:27.670438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668892 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 19:58:27.670438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668894 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:58:27.670438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668897 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:58:27.670438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668901 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:58:27.670438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668903 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:58:27.670438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668906 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:58:27.670438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668909 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:58:27.670438
ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668912 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:58:27.670438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668915 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:58:27.670438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668917 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:58:27.670438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668920 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:58:27.670438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668922 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:58:27.670438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668925 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:58:27.670438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668928 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:58:27.670438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668931 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:58:27.670928 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668933 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:58:27.670928 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668936 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:58:27.670928 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668938 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:58:27.670928 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668941 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:58:27.670928 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668944 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:58:27.670928 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668946 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:58:27.670928 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668949 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:58:27.670928 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668951 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:58:27.670928 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668953 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:58:27.670928 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668956 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:58:27.670928 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668958 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:58:27.670928 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668960 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:58:27.670928 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668964 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:58:27.670928 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668967 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:58:27.670928 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668970 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:58:27.670928 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668972 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:58:27.670928 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668975 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:58:27.670928 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668978 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:58:27.670928 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668980 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:58:27.670928 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668983 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:58:27.671438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668985 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:58:27.671438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668989 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:58:27.671438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668991 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:58:27.671438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668994 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:58:27.671438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668997 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:58:27.671438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.668999 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:58:27.671438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669002 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:58:27.671438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669006 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:58:27.671438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669010 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:58:27.671438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669014 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 19:58:27.671438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669017 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:58:27.671438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669020 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:58:27.671438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669022 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:58:27.671438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669025 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:58:27.671438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669028 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:58:27.671438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669030 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:58:27.671438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669032 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:58:27.671438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669035 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:58:27.671438 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669038 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:58:27.671897 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669040 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:58:27.671897 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669043 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:58:27.671897 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669045 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:58:27.671897 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669048 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:58:27.671897 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669050 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:58:27.671897 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669053 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:58:27.671897 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669056 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:58:27.671897 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669058 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:58:27.671897 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669061 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:58:27.671897 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669064 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:58:27.671897 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669068 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:58:27.671897 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669072 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:58:27.671897 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669078 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:58:27.671897 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669081 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:58:27.671897 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669084 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:58:27.671897 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669087 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:58:27.671897 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669090 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:58:27.671897 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669093 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:58:27.671897 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669096 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:58:27.672379 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669098 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:58:27.672379 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669101 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:58:27.672379 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669104 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:58:27.672379 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669106 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:58:27.672379 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669193 
2574 flags.go:64] FLAG: --address="0.0.0.0" Apr 22 19:58:27.672379 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669200 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 22 19:58:27.672379 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669207 2574 flags.go:64] FLAG: --anonymous-auth="true" Apr 22 19:58:27.672379 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669212 2574 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 22 19:58:27.672379 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669216 2574 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 22 19:58:27.672379 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669219 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 22 19:58:27.672379 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669224 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 22 19:58:27.672379 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669229 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 22 19:58:27.672379 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669232 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 22 19:58:27.672379 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669235 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 22 19:58:27.672379 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669238 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 22 19:58:27.672379 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669241 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 22 19:58:27.672379 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669245 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 22 19:58:27.672379 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669251 2574 flags.go:64] FLAG: --cgroup-root="" Apr 22 19:58:27.672379 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669254 2574 
flags.go:64] FLAG: --cgroups-per-qos="true" Apr 22 19:58:27.672379 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669257 2574 flags.go:64] FLAG: --client-ca-file="" Apr 22 19:58:27.672379 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669260 2574 flags.go:64] FLAG: --cloud-config="" Apr 22 19:58:27.672379 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669263 2574 flags.go:64] FLAG: --cloud-provider="external" Apr 22 19:58:27.672379 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669266 2574 flags.go:64] FLAG: --cluster-dns="[]" Apr 22 19:58:27.672379 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669271 2574 flags.go:64] FLAG: --cluster-domain="" Apr 22 19:58:27.672959 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669274 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 22 19:58:27.672959 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669278 2574 flags.go:64] FLAG: --config-dir="" Apr 22 19:58:27.672959 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669281 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 22 19:58:27.672959 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669285 2574 flags.go:64] FLAG: --container-log-max-files="5" Apr 22 19:58:27.672959 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669289 2574 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 22 19:58:27.672959 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669296 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 22 19:58:27.672959 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669299 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 22 19:58:27.672959 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669302 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 22 19:58:27.672959 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669305 2574 flags.go:64] FLAG: --contention-profiling="false" Apr 22 19:58:27.672959 ip-10-0-128-61 
kubenswrapper[2574]: I0422 19:58:27.669308 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 19:58:27.672959 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669312 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 19:58:27.672959 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669315 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 19:58:27.672959 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669318 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 19:58:27.672959 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669321 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 19:58:27.672959 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669324 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 19:58:27.672959 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669327 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 19:58:27.672959 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669330 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 19:58:27.672959 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669333 2574 flags.go:64] FLAG: --enable-server="true"
Apr 22 19:58:27.672959 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669336 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 19:58:27.672959 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669343 2574 flags.go:64] FLAG: --event-burst="100"
Apr 22 19:58:27.672959 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669346 2574 flags.go:64] FLAG: --event-qps="50"
Apr 22 19:58:27.672959 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669349 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 19:58:27.672959 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669352 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 19:58:27.672959 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669369 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 22
19:58:27.672959 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669373 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 19:58:27.673614 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669377 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 19:58:27.673614 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669380 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 19:58:27.673614 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669384 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 22 19:58:27.673614 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669386 2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 19:58:27.673614 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669389 2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 19:58:27.673614 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669392 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 19:58:27.673614 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669395 2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 19:58:27.673614 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669398 2574 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 19:58:27.673614 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669402 2574 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 19:58:27.673614 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669405 2574 flags.go:64] FLAG: --feature-gates=""
Apr 22 19:58:27.673614 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669409 2574 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 19:58:27.673614 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669412 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 19:58:27.673614 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669414 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 19:58:27.673614 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669418 2574
flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 19:58:27.673614 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669422 2574 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 19:58:27.673614 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669425 2574 flags.go:64] FLAG: --help="false"
Apr 22 19:58:27.673614 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669428 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-128-61.ec2.internal"
Apr 22 19:58:27.673614 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669431 2574 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 19:58:27.673614 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669434 2574 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 19:58:27.673614 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669436 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 22 19:58:27.673614 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669440 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 22 19:58:27.673614 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669443 2574 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 22 19:58:27.673614 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669446 2574 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 22 19:58:27.673614 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669448 2574 flags.go:64] FLAG: --image-service-endpoint=""
Apr 22 19:58:27.674206 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669451 2574 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 22 19:58:27.674206 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669454 2574 flags.go:64] FLAG: --kube-api-burst="100"
Apr 22 19:58:27.674206 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669457 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 22 19:58:27.674206
ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669460 2574 flags.go:64] FLAG: --kube-api-qps="50"
Apr 22 19:58:27.674206 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669463 2574 flags.go:64] FLAG: --kube-reserved=""
Apr 22 19:58:27.674206 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669466 2574 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 22 19:58:27.674206 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669469 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 22 19:58:27.674206 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669472 2574 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 22 19:58:27.674206 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669476 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 22 19:58:27.674206 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669479 2574 flags.go:64] FLAG: --lock-file=""
Apr 22 19:58:27.674206 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669481 2574 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 22 19:58:27.674206 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669484 2574 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 22 19:58:27.674206 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669487 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 22 19:58:27.674206 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669492 2574 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 22 19:58:27.674206 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669495 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 22 19:58:27.674206 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669498 2574 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 22 19:58:27.674206 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669502 2574 flags.go:64] FLAG: --logging-format="text"
Apr 22 19:58:27.674206 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669505 2574 flags.go:64] FLAG:
--machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 22 19:58:27.674206 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669509 2574 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 22 19:58:27.674206 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669512 2574 flags.go:64] FLAG: --manifest-url=""
Apr 22 19:58:27.674206 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669515 2574 flags.go:64] FLAG: --manifest-url-header=""
Apr 22 19:58:27.674206 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669519 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 22 19:58:27.674206 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669526 2574 flags.go:64] FLAG: --max-open-files="1000000"
Apr 22 19:58:27.674206 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669530 2574 flags.go:64] FLAG: --max-pods="110"
Apr 22 19:58:27.674206 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669533 2574 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 22 19:58:27.674858 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669536 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 22 19:58:27.674858 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669538 2574 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 22 19:58:27.674858 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669541 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 22 19:58:27.674858 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669544 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 22 19:58:27.674858 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669547 2574 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 22 19:58:27.674858 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669550 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 22 19:58:27.674858 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669557 2574 flags.go:64] FLAG: --node-status-max-images="50"
Apr
22 19:58:27.674858 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669560 2574 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 22 19:58:27.674858 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669563 2574 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 22 19:58:27.674858 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669566 2574 flags.go:64] FLAG: --pod-cidr=""
Apr 22 19:58:27.674858 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669568 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 22 19:58:27.674858 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669574 2574 flags.go:64] FLAG: --pod-manifest-path=""
Apr 22 19:58:27.674858 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669577 2574 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 22 19:58:27.674858 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669580 2574 flags.go:64] FLAG: --pods-per-core="0"
Apr 22 19:58:27.674858 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669583 2574 flags.go:64] FLAG: --port="10250"
Apr 22 19:58:27.674858 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669586 2574 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 22 19:58:27.674858 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669589 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-00bf6fa47d49627e4"
Apr 22 19:58:27.674858 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669592 2574 flags.go:64] FLAG: --qos-reserved=""
Apr 22 19:58:27.674858 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669595 2574 flags.go:64] FLAG: --read-only-port="10255"
Apr 22 19:58:27.674858 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669598 2574 flags.go:64] FLAG: --register-node="true"
Apr 22 19:58:27.674858 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669601 2574 flags.go:64] FLAG: --register-schedulable="true"
Apr 22 19:58:27.674858 ip-10-0-128-61
kubenswrapper[2574]: I0422 19:58:27.669604 2574 flags.go:64] FLAG: --register-with-taints=""
Apr 22 19:58:27.674858 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669607 2574 flags.go:64] FLAG: --registry-burst="10"
Apr 22 19:58:27.674858 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669611 2574 flags.go:64] FLAG: --registry-qps="5"
Apr 22 19:58:27.675482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669614 2574 flags.go:64] FLAG: --reserved-cpus=""
Apr 22 19:58:27.675482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669617 2574 flags.go:64] FLAG: --reserved-memory=""
Apr 22 19:58:27.675482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669620 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 22 19:58:27.675482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669623 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 22 19:58:27.675482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669626 2574 flags.go:64] FLAG: --rotate-certificates="false"
Apr 22 19:58:27.675482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669629 2574 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 22 19:58:27.675482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669632 2574 flags.go:64] FLAG: --runonce="false"
Apr 22 19:58:27.675482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669635 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 22 19:58:27.675482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669638 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 22 19:58:27.675482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669641 2574 flags.go:64] FLAG: --seccomp-default="false"
Apr 22 19:58:27.675482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669644 2574 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 22 19:58:27.675482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669646 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 22 19:58:27.675482
ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669649 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 22 19:58:27.675482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669652 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 22 19:58:27.675482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669655 2574 flags.go:64] FLAG: --storage-driver-password="root"
Apr 22 19:58:27.675482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669658 2574 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 22 19:58:27.675482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669661 2574 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 19:58:27.675482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669664 2574 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 19:58:27.675482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669667 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 19:58:27.675482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669670 2574 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 19:58:27.675482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669672 2574 flags.go:64] FLAG: --system-cgroups=""
Apr 22 19:58:27.675482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669675 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 19:58:27.675482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669680 2574 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 19:58:27.675482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669683 2574 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 19:58:27.675482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669686 2574 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 19:58:27.676114 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669694 2574 flags.go:64] FLAG: --tls-min-version=""
Apr 22 19:58:27.676114 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669697 2574 flags.go:64] FLAG:
--tls-private-key-file=""
Apr 22 19:58:27.676114 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669700 2574 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 19:58:27.676114 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669703 2574 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 19:58:27.676114 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669706 2574 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 19:58:27.676114 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669709 2574 flags.go:64] FLAG: --v="2"
Apr 22 19:58:27.676114 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669714 2574 flags.go:64] FLAG: --version="false"
Apr 22 19:58:27.676114 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669718 2574 flags.go:64] FLAG: --vmodule=""
Apr 22 19:58:27.676114 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669722 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 19:58:27.676114 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.669725 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 22 19:58:27.676114 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669825 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 19:58:27.676114 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669829 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 19:58:27.676114 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669833 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 19:58:27.676114 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669836 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 19:58:27.676114 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669840 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 19:58:27.676114 ip-10-0-128-61 kubenswrapper[2574]: W0422
19:58:27.669842 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 19:58:27.676114 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669845 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 19:58:27.676114 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669848 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 19:58:27.676114 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669851 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 19:58:27.676114 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669853 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 19:58:27.676114 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669856 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 19:58:27.676114 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669859 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 19:58:27.676650 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669862 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 19:58:27.676650 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669865 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 19:58:27.676650 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669867 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 19:58:27.676650 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669870 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 19:58:27.676650 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669872 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 19:58:27.676650 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669875 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 19:58:27.676650
ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669878 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 19:58:27.676650 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669880 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 19:58:27.676650 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669883 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 19:58:27.676650 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669885 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 19:58:27.676650 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669888 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 19:58:27.676650 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669890 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 19:58:27.676650 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669893 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 19:58:27.676650 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669896 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 19:58:27.676650 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669898 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 19:58:27.676650 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669901 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 19:58:27.676650 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669904 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 19:58:27.676650 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669907 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 19:58:27.676650 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669909 2574 feature_gate.go:328] unrecognized feature gate:
AzureDedicatedHosts
Apr 22 19:58:27.676650 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669912 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:58:27.677152 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669915 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 19:58:27.677152 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669917 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:58:27.677152 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669920 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 19:58:27.677152 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669922 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 19:58:27.677152 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669925 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 19:58:27.677152 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669927 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 19:58:27.677152 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669930 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 19:58:27.677152 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669932 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 19:58:27.677152 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669935 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 19:58:27.677152 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669938 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 19:58:27.677152 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669940 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 19:58:27.677152 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669942 2574 feature_gate.go:328] unrecognized feature gate:
PinnedImages
Apr 22 19:58:27.677152 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669945 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 19:58:27.677152 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669947 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 19:58:27.677152 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669951 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:58:27.677152 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669955 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 19:58:27.677152 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669958 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 19:58:27.677152 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669961 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 19:58:27.677152 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669964 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 19:58:27.677152 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669966 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 19:58:27.677653 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669969 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 19:58:27.677653 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669972 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:58:27.677653 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669975 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 19:58:27.677653 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669977 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 19:58:27.677653 ip-10-0-128-61
kubenswrapper[2574]: W0422 19:58:27.669980 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:58:27.677653 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669983 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 19:58:27.677653 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669985 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 19:58:27.677653 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669988 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 19:58:27.677653 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669993 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 19:58:27.677653 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669996 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 19:58:27.677653 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.669999 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 19:58:27.677653 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.670002 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 19:58:27.677653 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.670004 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 19:58:27.677653 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.670007 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 19:58:27.677653 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.670009 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 19:58:27.677653 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.670012 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 19:58:27.677653 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.670014 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 19:58:27.677653 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.670017 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 19:58:27.677653 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.670026 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 19:58:27.677653 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.670028 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 19:58:27.678144 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.670031 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 19:58:27.678144 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.670033 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 19:58:27.678144 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.670036 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 19:58:27.678144 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.670039 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 19:58:27.678144 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.670044 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 19:58:27.678144 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.670046 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 19:58:27.678144 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.670049 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 19:58:27.678144 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.670052 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 19:58:27.678144 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.670054 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 19:58:27.678144 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.670057 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 19:58:27.678144 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.670059 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 19:58:27.678144 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.670062 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 19:58:27.678144 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.670065 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 19:58:27.678144 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.670067 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 19:58:27.678144 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.670700 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true
StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 19:58:27.678526 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.676971 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 19:58:27.678526 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.676987 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 19:58:27.678526 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677035 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 19:58:27.678526 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677039 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 19:58:27.678526 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677043 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 19:58:27.678526 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677046 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 19:58:27.678526 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677050 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 19:58:27.678526 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677056 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:58:27.678526 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677059 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:58:27.678526 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677062 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:58:27.678526 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677065 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:58:27.678526 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677067 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:58:27.678526 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677070 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:58:27.678526 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677073 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:58:27.678526 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677076 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:58:27.678526 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677078 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 19:58:27.678526 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677081 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:58:27.678526 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677083 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:58:27.678526 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677086 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:58:27.678526 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677089 2574 feature_gate.go:328] unrecognized feature 
gate: HighlyAvailableArbiter Apr 22 19:58:27.679029 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677091 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:58:27.679029 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677094 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:58:27.679029 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677097 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:58:27.679029 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677099 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:58:27.679029 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677102 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:58:27.679029 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677105 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:58:27.679029 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677107 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:58:27.679029 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677110 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:58:27.679029 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677112 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:58:27.679029 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677115 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:58:27.679029 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677117 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:58:27.679029 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677120 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:58:27.679029 
ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677122 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:58:27.679029 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677127 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:58:27.679029 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677129 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:58:27.679029 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677132 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:58:27.679029 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677134 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 19:58:27.679029 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677137 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:58:27.679029 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677140 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:58:27.679029 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677142 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:58:27.679640 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677145 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:58:27.679640 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677148 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:58:27.679640 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677150 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:58:27.679640 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677153 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:58:27.679640 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677156 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 
22 19:58:27.679640 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677158 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:58:27.679640 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677161 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:58:27.679640 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677164 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:58:27.679640 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677167 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:58:27.679640 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677169 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:58:27.679640 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677172 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:58:27.679640 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677175 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:58:27.679640 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677178 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:58:27.679640 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677181 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:58:27.679640 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677183 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:58:27.679640 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677186 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:58:27.679640 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677188 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:58:27.679640 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677191 2574 feature_gate.go:328] unrecognized feature gate: 
VSphereHostVMGroupZonal Apr 22 19:58:27.679640 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677193 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:58:27.679640 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677196 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:58:27.680129 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677198 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:58:27.680129 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677201 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:58:27.680129 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677203 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:58:27.680129 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677206 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:58:27.680129 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677208 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:58:27.680129 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677211 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:58:27.680129 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677214 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:58:27.680129 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677217 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:58:27.680129 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677221 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 19:58:27.680129 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677224 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:58:27.680129 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677227 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:58:27.680129 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677230 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:58:27.680129 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677232 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:58:27.680129 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677235 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:58:27.680129 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677238 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:58:27.680129 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677240 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:58:27.680129 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677243 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:58:27.680129 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677246 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:58:27.680129 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677248 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:58:27.680659 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677251 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:58:27.680659 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677253 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:58:27.680659 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677256 2574 feature_gate.go:328] unrecognized 
feature gate: AdminNetworkPolicy Apr 22 19:58:27.680659 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677258 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:58:27.680659 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677261 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:58:27.680659 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677263 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:58:27.680659 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677266 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:58:27.680659 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677268 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:58:27.680659 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677271 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:58:27.680659 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.677276 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:58:27.680659 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677404 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 19:58:27.680659 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677410 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 19:58:27.680659 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677413 2574 feature_gate.go:328] unrecognized feature 
gate: NewOLMCatalogdAPIV1Metas Apr 22 19:58:27.680659 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677416 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 19:58:27.680659 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677419 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 19:58:27.680659 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677422 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 19:58:27.681065 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677425 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 19:58:27.681065 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677428 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 19:58:27.681065 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677431 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 19:58:27.681065 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677434 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 19:58:27.681065 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677437 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 19:58:27.681065 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677439 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 19:58:27.681065 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677442 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 19:58:27.681065 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677444 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 19:58:27.681065 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677447 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 19:58:27.681065 ip-10-0-128-61 kubenswrapper[2574]: W0422 
19:58:27.677449 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 19:58:27.681065 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677452 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 19:58:27.681065 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677455 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 19:58:27.681065 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677457 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 19:58:27.681065 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677460 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 19:58:27.681065 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677462 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 22 19:58:27.681065 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677465 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 19:58:27.681065 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677467 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 19:58:27.681065 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677470 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 19:58:27.681065 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677472 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 19:58:27.681065 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677475 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 19:58:27.681559 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677477 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 19:58:27.681559 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677480 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 19:58:27.681559 
ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677482 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 19:58:27.681559 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677484 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 19:58:27.681559 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677487 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 19:58:27.681559 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677489 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 19:58:27.681559 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677493 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 19:58:27.681559 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677497 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 19:58:27.681559 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677500 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 19:58:27.681559 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677503 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 19:58:27.681559 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677505 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 19:58:27.681559 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677508 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 19:58:27.681559 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677510 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 19:58:27.681559 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677513 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 19:58:27.681559 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677516 2574 feature_gate.go:328] 
unrecognized feature gate: InsightsConfigAPI Apr 22 19:58:27.681559 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677518 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 19:58:27.681559 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677521 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 19:58:27.681559 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677524 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 19:58:27.681559 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677527 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 19:58:27.681559 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677529 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 19:58:27.682037 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677532 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 19:58:27.682037 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677534 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 19:58:27.682037 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677536 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 19:58:27.682037 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677539 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 19:58:27.682037 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677541 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 19:58:27.682037 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677544 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 19:58:27.682037 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677546 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 19:58:27.682037 ip-10-0-128-61 
kubenswrapper[2574]: W0422 19:58:27.677549 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 19:58:27.682037 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677551 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 19:58:27.682037 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677554 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 19:58:27.682037 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677556 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 19:58:27.682037 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677559 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 19:58:27.682037 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677561 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 19:58:27.682037 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677563 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 19:58:27.682037 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677566 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 19:58:27.682037 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677569 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 19:58:27.682037 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677571 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 19:58:27.682037 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677574 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 19:58:27.682037 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677576 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 19:58:27.682037 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677578 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 19:58:27.682662 
ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677581 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 19:58:27.682662 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677583 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 19:58:27.682662 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677588 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 19:58:27.682662 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677591 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 19:58:27.682662 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677594 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 19:58:27.682662 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677597 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 19:58:27.682662 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677600 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 19:58:27.682662 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677603 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 19:58:27.682662 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677605 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 19:58:27.682662 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677608 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 19:58:27.682662 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677611 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 19:58:27.682662 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677614 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 19:58:27.682662 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677616 2574 
feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 19:58:27.682662 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677619 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 19:58:27.682662 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677621 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 19:58:27.682662 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677624 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 19:58:27.682662 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677626 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 19:58:27.682662 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677628 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 19:58:27.682662 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677631 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 19:58:27.683110 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:27.677633 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 19:58:27.683110 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.677638 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 19:58:27.683110 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.678239 2574 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 19:58:27.683110 ip-10-0-128-61 
kubenswrapper[2574]: I0422 19:58:27.680093 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 19:58:27.683110 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.680899 2574 server.go:1019] "Starting client certificate rotation" Apr 22 19:58:27.683110 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.680999 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 19:58:27.683110 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.681045 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 19:58:27.706248 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.706225 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 19:58:27.708666 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.708532 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 19:58:27.718037 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.718021 2574 log.go:25] "Validated CRI v1 runtime API" Apr 22 19:58:27.722831 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.722815 2574 log.go:25] "Validated CRI v1 image API" Apr 22 19:58:27.725794 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.725779 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 19:58:27.729664 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.729645 2574 fs.go:135] Filesystem UUIDs: map[033fe98d-c9a2-47eb-8925-20f1a7c116f0:/dev/nvme0n1p4 73c3bad4-8013-4848-9e25-9a77e09e1fc5:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 22 19:58:27.729739 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.729664 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} 
/dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 19:58:27.732156 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.732140 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 19:58:27.734855 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.734748 2574 manager.go:217] Machine: {Timestamp:2026-04-22 19:58:27.733158428 +0000 UTC m=+0.368542057 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3095552 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec264c5100588f5508844e9c856c8ac5 SystemUUID:ec264c51-0058-8f55-0884-4e9c856c8ac5 BootID:99718bda-9a17-4104-8a36-7e2e4b81aa28 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] 
NetworkDevices:[{Name:br-ex MacAddress:02:eb:f3:4e:36:5f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:eb:f3:4e:36:5f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:8e:3d:c2:b7:3b:fd Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 19:58:27.734855 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.734851 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 22 19:58:27.734955 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.734917 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 19:58:27.735779 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.735755 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 19:58:27.735903 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.735782 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-61.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 19:58:27.735949 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.735912 2574 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 19:58:27.735949 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.735920 2574 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 19:58:27.735949 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.735932 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:58:27.736530 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.736520 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 19:58:27.738022 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.738012 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:58:27.738121 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.738112 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 19:58:27.741226 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.741216 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 22 19:58:27.741270 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.741229 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 19:58:27.741270 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.741243 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 19:58:27.741270 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.741253 2574 kubelet.go:397] "Adding apiserver pod source" Apr 22 19:58:27.741270 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.741265 2574 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 22 19:58:27.744164 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.744143 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:58:27.744249 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.744172 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 19:58:27.748224 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.748206 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 19:58:27.749540 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.749527 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 19:58:27.751032 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.751014 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 19:58:27.751032 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.751035 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 19:58:27.751187 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.751044 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 19:58:27.751187 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.751052 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 19:58:27.751187 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.751060 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 19:58:27.751187 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.751067 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 19:58:27.751187 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.751076 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 
19:58:27.751187 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.751084 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 19:58:27.751187 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.751093 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 19:58:27.751187 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.751101 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 19:58:27.751187 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.751112 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 19:58:27.751187 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.751124 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 19:58:27.751876 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.751865 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 19:58:27.751929 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.751879 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 19:58:27.755059 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.755041 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-61.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 19:58:27.755379 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.755349 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 19:58:27.755452 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.755408 2574 server.go:1295] "Started kubelet" Apr 22 19:58:27.755510 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.755485 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 19:58:27.755756 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:27.755737 2574 reflector.go:200] "Failed to watch" 
err="failed to list *v1.Node: nodes \"ip-10-0-128-61.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 19:58:27.755810 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:27.755736 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 19:58:27.755810 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.755771 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 19:58:27.755867 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.755840 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 19:58:27.756242 ip-10-0-128-61 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 19:58:27.757075 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.756930 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 19:58:27.757334 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.757319 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 22 19:58:27.759972 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.759952 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9d85x" Apr 22 19:58:27.761001 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.760979 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 19:58:27.761370 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.761347 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 19:58:27.762159 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.762143 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 19:58:27.762284 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.762273 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 19:58:27.762476 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.762462 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 19:58:27.762574 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:27.762157 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-61.ec2.internal\" not found" Apr 22 19:58:27.762643 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.762616 2574 factory.go:55] Registering systemd factory Apr 22 19:58:27.762695 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.762681 2574 factory.go:223] Registration of the systemd container factory successfully Apr 22 19:58:27.762791 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.762779 2574 reconstruct.go:97] "Volume 
reconstruction finished" Apr 22 19:58:27.762886 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.762868 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 22 19:58:27.762980 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.762901 2574 factory.go:153] Registering CRI-O factory Apr 22 19:58:27.762980 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.762911 2574 factory.go:223] Registration of the crio container factory successfully Apr 22 19:58:27.762980 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:27.761628 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-61.ec2.internal.18a8c62343724629 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-61.ec2.internal,UID:ip-10-0-128-61.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-61.ec2.internal,},FirstTimestamp:2026-04-22 19:58:27.755378217 +0000 UTC m=+0.390761861,LastTimestamp:2026-04-22 19:58:27.755378217 +0000 UTC m=+0.390761861,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-61.ec2.internal,}" Apr 22 19:58:27.762980 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.762951 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 19:58:27.762980 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.762975 2574 factory.go:103] Registering Raw factory Apr 22 19:58:27.762980 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.762986 2574 manager.go:1196] 
Started watching for new ooms in manager Apr 22 19:58:27.763627 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.763611 2574 manager.go:319] Starting recovery of all containers Apr 22 19:58:27.768945 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.768917 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9d85x" Apr 22 19:58:27.769056 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:27.769034 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 19:58:27.771503 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:27.771403 2574 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-61.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 19:58:27.776292 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.776278 2574 manager.go:324] Recovery completed Apr 22 19:58:27.777601 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:27.777579 2574 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 22 19:58:27.780307 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.780295 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:58:27.782291 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.782273 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-61.ec2.internal" event="NodeHasSufficientMemory" 
Apr 22 19:58:27.782392 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.782301 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-61.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:58:27.782392 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.782311 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-61.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:58:27.782770 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.782758 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 19:58:27.782770 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.782768 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 19:58:27.782874 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.782782 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 22 19:58:27.784274 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:27.784211 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-61.ec2.internal.18a8c623450ce5ca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-61.ec2.internal,UID:ip-10-0-128-61.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-128-61.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-128-61.ec2.internal,},FirstTimestamp:2026-04-22 19:58:27.782288842 +0000 UTC m=+0.417672473,LastTimestamp:2026-04-22 19:58:27.782288842 +0000 UTC m=+0.417672473,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-61.ec2.internal,}" Apr 22 19:58:27.785917 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.785906 2574 
policy_none.go:49] "None policy: Start" Apr 22 19:58:27.785968 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.785921 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 19:58:27.785968 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.785932 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 22 19:58:27.835002 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.821868 2574 manager.go:341] "Starting Device Plugin manager" Apr 22 19:58:27.835002 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:27.821897 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 19:58:27.835002 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.821907 2574 server.go:85] "Starting device plugin registration server" Apr 22 19:58:27.835002 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.822092 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 19:58:27.835002 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.822105 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 19:58:27.835002 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.822197 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 19:58:27.835002 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.822261 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 19:58:27.835002 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.822268 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 19:58:27.835002 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:27.822906 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 22 19:58:27.835002 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:27.822944 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-61.ec2.internal\" not found" Apr 22 19:58:27.886941 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.886911 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 19:58:27.888064 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.888049 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 19:58:27.888146 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.888078 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 19:58:27.888146 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.888098 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 19:58:27.888146 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.888106 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 19:58:27.888298 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:27.888181 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 19:58:27.890407 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.890386 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 19:58:27.922587 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.922544 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:58:27.923311 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.923296 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-61.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:58:27.923390 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.923325 2574 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-61.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:58:27.923390 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.923336 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-61.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:58:27.923390 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.923372 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-61.ec2.internal" Apr 22 19:58:27.929295 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.929282 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-61.ec2.internal" Apr 22 19:58:27.929338 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:27.929301 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-61.ec2.internal\": node \"ip-10-0-128-61.ec2.internal\" not found" Apr 22 19:58:27.947937 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:27.947913 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-61.ec2.internal\" not found" Apr 22 19:58:27.988701 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.988675 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-61.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-61.ec2.internal"] Apr 22 19:58:27.988828 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.988744 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:58:27.990545 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.990528 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-61.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:58:27.990646 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.990556 2574 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-128-61.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:58:27.990646 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.990566 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-61.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:58:27.992925 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.992912 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:58:27.993079 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.993062 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-61.ec2.internal" Apr 22 19:58:27.993133 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.993101 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:58:27.995707 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.995680 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-61.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:58:27.995797 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.995717 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-61.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:58:27.995797 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.995729 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-61.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:58:27.995797 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.995683 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-61.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:58:27.995797 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.995788 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-61.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:58:27.995929 
ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.995804 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-61.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:58:27.998008 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.997993 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-61.ec2.internal" Apr 22 19:58:27.998066 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.998026 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 19:58:27.998852 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.998838 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-61.ec2.internal" event="NodeHasSufficientMemory" Apr 22 19:58:27.998922 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.998865 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-61.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 19:58:27.998922 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:27.998883 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-61.ec2.internal" event="NodeHasSufficientPID" Apr 22 19:58:28.020856 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:28.020833 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-61.ec2.internal\" not found" node="ip-10-0-128-61.ec2.internal" Apr 22 19:58:28.025308 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:28.025292 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-61.ec2.internal\" not found" node="ip-10-0-128-61.ec2.internal" Apr 22 19:58:28.048137 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:28.048117 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-61.ec2.internal\" not found" Apr 
22 19:58:28.148387 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:28.148353 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-61.ec2.internal\" not found" Apr 22 19:58:28.163855 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.163832 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5327cd329c9207a4f0ec6817995e74eb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-61.ec2.internal\" (UID: \"5327cd329c9207a4f0ec6817995e74eb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-61.ec2.internal" Apr 22 19:58:28.163941 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.163863 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5327cd329c9207a4f0ec6817995e74eb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-61.ec2.internal\" (UID: \"5327cd329c9207a4f0ec6817995e74eb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-61.ec2.internal" Apr 22 19:58:28.163941 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.163890 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/38ecb0e569cd2e23a0f7c4c92d20d963-config\") pod \"kube-apiserver-proxy-ip-10-0-128-61.ec2.internal\" (UID: \"38ecb0e569cd2e23a0f7c4c92d20d963\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-61.ec2.internal" Apr 22 19:58:28.249217 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:28.249166 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-61.ec2.internal\" not found" Apr 22 19:58:28.264533 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.264515 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" 
(UniqueName: \"kubernetes.io/host-path/5327cd329c9207a4f0ec6817995e74eb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-61.ec2.internal\" (UID: \"5327cd329c9207a4f0ec6817995e74eb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-61.ec2.internal" Apr 22 19:58:28.264582 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.264538 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5327cd329c9207a4f0ec6817995e74eb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-61.ec2.internal\" (UID: \"5327cd329c9207a4f0ec6817995e74eb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-61.ec2.internal" Apr 22 19:58:28.264582 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.264555 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/38ecb0e569cd2e23a0f7c4c92d20d963-config\") pod \"kube-apiserver-proxy-ip-10-0-128-61.ec2.internal\" (UID: \"38ecb0e569cd2e23a0f7c4c92d20d963\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-61.ec2.internal" Apr 22 19:58:28.264669 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.264585 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/38ecb0e569cd2e23a0f7c4c92d20d963-config\") pod \"kube-apiserver-proxy-ip-10-0-128-61.ec2.internal\" (UID: \"38ecb0e569cd2e23a0f7c4c92d20d963\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-61.ec2.internal" Apr 22 19:58:28.264669 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.264597 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5327cd329c9207a4f0ec6817995e74eb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-61.ec2.internal\" (UID: \"5327cd329c9207a4f0ec6817995e74eb\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-61.ec2.internal"
Apr 22 19:58:28.264669 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.264619 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5327cd329c9207a4f0ec6817995e74eb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-61.ec2.internal\" (UID: \"5327cd329c9207a4f0ec6817995e74eb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-61.ec2.internal"
Apr 22 19:58:28.322657 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.322641 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-61.ec2.internal"
Apr 22 19:58:28.328048 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.328032 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-61.ec2.internal"
Apr 22 19:58:28.349875 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:28.349850 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-61.ec2.internal\" not found"
Apr 22 19:58:28.450380 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:28.450347 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-61.ec2.internal\" not found"
Apr 22 19:58:28.550932 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:28.550887 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-61.ec2.internal\" not found"
Apr 22 19:58:28.651540 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:28.651521 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-61.ec2.internal\" not found"
Apr 22 19:58:28.680947 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.680929 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 19:58:28.681529 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.681049 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 19:58:28.752522 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:28.752496 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-61.ec2.internal\" not found"
Apr 22 19:58:28.761216 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.761197 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 19:58:28.772180 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.772138 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 19:53:27 +0000 UTC" deadline="2027-10-19 12:48:41.089074737 +0000 UTC"
Apr 22 19:58:28.772180 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.772177 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13072h50m12.316901715s"
Apr 22 19:58:28.781622 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.781602 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 19:58:28.797900 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.797764 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:58:28.802578 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:28.802532 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5327cd329c9207a4f0ec6817995e74eb.slice/crio-6860722cf709b03f8875d21110046e0e7570f54914b5ad88a8b1e93eb873cf0e WatchSource:0}: Error finding container 6860722cf709b03f8875d21110046e0e7570f54914b5ad88a8b1e93eb873cf0e: Status 404 returned error can't find the container with id 6860722cf709b03f8875d21110046e0e7570f54914b5ad88a8b1e93eb873cf0e
Apr 22 19:58:28.802997 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:28.802978 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38ecb0e569cd2e23a0f7c4c92d20d963.slice/crio-25526c851c62d8e324be04d13760cd46fc5162594a2d755fb4bc4a642e525585 WatchSource:0}: Error finding container 25526c851c62d8e324be04d13760cd46fc5162594a2d755fb4bc4a642e525585: Status 404 returned error can't find the container with id 25526c851c62d8e324be04d13760cd46fc5162594a2d755fb4bc4a642e525585
Apr 22 19:58:28.805219 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.805204 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-wmkmw"
Apr 22 19:58:28.809426 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.809411 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:58:28.813104 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.813090 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-wmkmw"
Apr 22 19:58:28.853492 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:28.853474 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-61.ec2.internal\" not found"
Apr 22 19:58:28.870508 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.870490 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:58:28.891287 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.891240 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-61.ec2.internal" event={"ID":"5327cd329c9207a4f0ec6817995e74eb","Type":"ContainerStarted","Data":"6860722cf709b03f8875d21110046e0e7570f54914b5ad88a8b1e93eb873cf0e"}
Apr 22 19:58:28.892131 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.892115 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-61.ec2.internal" event={"ID":"38ecb0e569cd2e23a0f7c4c92d20d963","Type":"ContainerStarted","Data":"25526c851c62d8e324be04d13760cd46fc5162594a2d755fb4bc4a642e525585"}
Apr 22 19:58:28.962138 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.962111 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-61.ec2.internal"
Apr 22 19:58:28.974632 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.974607 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 19:58:28.976294 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.976280 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-61.ec2.internal"
Apr 22 19:58:28.985091 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:28.985077 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 19:58:29.098125 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.098074 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:58:29.743192 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.743152 2574 apiserver.go:52] "Watching apiserver"
Apr 22 19:58:29.750928 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.750908 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 19:58:29.752390 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.752351 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-2sr9c","openshift-ovn-kubernetes/ovnkube-node-mpl4p","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd","openshift-cluster-node-tuning-operator/tuned-mqslj","openshift-dns/node-resolver-s75m7","openshift-image-registry/node-ca-m58j4","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-61.ec2.internal","kube-system/konnectivity-agent-z2q6l","kube-system/kube-apiserver-proxy-ip-10-0-128-61.ec2.internal","openshift-multus/multus-additional-cni-plugins-6q2sq","openshift-multus/multus-ch4j4","openshift-multus/network-metrics-daemon-4qs28","openshift-network-diagnostics/network-check-target-8blrm"]
Apr 22 19:58:29.755349 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.755324 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6q2sq"
Apr 22 19:58:29.757688 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.757664 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.758188 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.758058 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 19:58:29.758188 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.758078 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 19:58:29.758325 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.758212 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hpcg5\""
Apr 22 19:58:29.758325 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.758295 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 19:58:29.758516 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.758369 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 19:58:29.758516 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.758393 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 19:58:29.760069 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.760001 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd"
Apr 22 19:58:29.760171 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.760146 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 19:58:29.760379 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.760325 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 19:58:29.760484 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.760405 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 19:58:29.760543 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.760523 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 19:58:29.760597 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.760576 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 19:58:29.761073 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.761053 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 19:58:29.761196 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.761098 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-j5lfq\""
Apr 22 19:58:29.762097 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.762079 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 19:58:29.762512 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.762347 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 19:58:29.762512 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.762384 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-snqz7\""
Apr 22 19:58:29.762512 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.762350 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 19:58:29.762788 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.762642 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2sr9c"
Apr 22 19:58:29.764695 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.764620 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:58:29.764782 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.764709 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 19:58:29.764782 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.764752 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 19:58:29.764991 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.764977 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-s75m7"
Apr 22 19:58:29.765068 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.765052 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-fxvrl\""
Apr 22 19:58:29.768834 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.767567 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-cktzz\""
Apr 22 19:58:29.768834 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.767827 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 19:58:29.768834 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.767870 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 19:58:29.769137 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.769105 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8blrm"
Apr 22 19:58:29.769717 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:29.769564 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8blrm" podUID="7099cbe7-07ec-402e-846b-d9dddfeea3e4"
Apr 22 19:58:29.771776 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.771739 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qs28"
Apr 22 19:58:29.771843 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:29.771815 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qs28" podUID="6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4"
Apr 22 19:58:29.773091 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773068 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-host-cni-bin\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.773195 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773107 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4c8642cf-9d39-42d2-bdf4-178ffbc1f890-cni-binary-copy\") pod \"multus-additional-cni-plugins-6q2sq\" (UID: \"4c8642cf-9d39-42d2-bdf4-178ffbc1f890\") " pod="openshift-multus/multus-additional-cni-plugins-6q2sq"
Apr 22 19:58:29.773195 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773134 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-run-openvswitch\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.773195 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773158 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpff4\" (UniqueName: \"kubernetes.io/projected/cef96626-b47d-47c7-97d8-32bb8a23c1f6-kube-api-access-zpff4\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.773195 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773189 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cb719ba-85f7-48ad-a93a-f86e8eff6450-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9xgkd\" (UID: \"1cb719ba-85f7-48ad-a93a-f86e8eff6450\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd"
Apr 22 19:58:29.773424 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773245 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1cb719ba-85f7-48ad-a93a-f86e8eff6450-socket-dir\") pod \"aws-ebs-csi-driver-node-9xgkd\" (UID: \"1cb719ba-85f7-48ad-a93a-f86e8eff6450\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd"
Apr 22 19:58:29.773424 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773275 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1cb719ba-85f7-48ad-a93a-f86e8eff6450-device-dir\") pod \"aws-ebs-csi-driver-node-9xgkd\" (UID: \"1cb719ba-85f7-48ad-a93a-f86e8eff6450\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd"
Apr 22 19:58:29.773424 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773303 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbd6j\" (UniqueName: \"kubernetes.io/projected/1cb719ba-85f7-48ad-a93a-f86e8eff6450-kube-api-access-dbd6j\") pod \"aws-ebs-csi-driver-node-9xgkd\" (UID: \"1cb719ba-85f7-48ad-a93a-f86e8eff6450\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd"
Apr 22 19:58:29.773424 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773329 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4c8642cf-9d39-42d2-bdf4-178ffbc1f890-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6q2sq\" (UID: \"4c8642cf-9d39-42d2-bdf4-178ffbc1f890\") " pod="openshift-multus/multus-additional-cni-plugins-6q2sq"
Apr 22 19:58:29.773424 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773399 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-run-ovn\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.773424 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773423 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/125e85b6-fd01-43bd-8fce-5440d140a0a1-host-slash\") pod \"iptables-alerter-2sr9c\" (UID: \"125e85b6-fd01-43bd-8fce-5440d140a0a1\") " pod="openshift-network-operator/iptables-alerter-2sr9c"
Apr 22 19:58:29.773699 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773449 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4c8642cf-9d39-42d2-bdf4-178ffbc1f890-system-cni-dir\") pod \"multus-additional-cni-plugins-6q2sq\" (UID: \"4c8642cf-9d39-42d2-bdf4-178ffbc1f890\") " pod="openshift-multus/multus-additional-cni-plugins-6q2sq"
Apr 22 19:58:29.773699 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773472 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-var-lib-openvswitch\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.773699 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773497 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/125e85b6-fd01-43bd-8fce-5440d140a0a1-iptables-alerter-script\") pod \"iptables-alerter-2sr9c\" (UID: \"125e85b6-fd01-43bd-8fce-5440d140a0a1\") " pod="openshift-network-operator/iptables-alerter-2sr9c"
Apr 22 19:58:29.773699 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773537 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncfkw\" (UniqueName: \"kubernetes.io/projected/125e85b6-fd01-43bd-8fce-5440d140a0a1-kube-api-access-ncfkw\") pod \"iptables-alerter-2sr9c\" (UID: \"125e85b6-fd01-43bd-8fce-5440d140a0a1\") " pod="openshift-network-operator/iptables-alerter-2sr9c"
Apr 22 19:58:29.773699 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773570 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/64949cb3-7087-4f51-8a7a-81b46c0895c9-tmp-dir\") pod \"node-resolver-s75m7\" (UID: \"64949cb3-7087-4f51-8a7a-81b46c0895c9\") " pod="openshift-dns/node-resolver-s75m7"
Apr 22 19:58:29.773699 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773598 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-systemd-units\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.773699 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773621 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-host-run-netns\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.773699 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773639 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-etc-openvswitch\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.773699 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773654 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/64949cb3-7087-4f51-8a7a-81b46c0895c9-hosts-file\") pod \"node-resolver-s75m7\" (UID: \"64949cb3-7087-4f51-8a7a-81b46c0895c9\") " pod="openshift-dns/node-resolver-s75m7"
Apr 22 19:58:29.773699 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773670 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.773699 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773700 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4c8642cf-9d39-42d2-bdf4-178ffbc1f890-os-release\") pod \"multus-additional-cni-plugins-6q2sq\" (UID: \"4c8642cf-9d39-42d2-bdf4-178ffbc1f890\") " pod="openshift-multus/multus-additional-cni-plugins-6q2sq"
Apr 22 19:58:29.774273 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773725 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-host-kubelet\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.774273 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773747 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-host-slash\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.774273 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773763 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-host-run-ovn-kubernetes\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.774273 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773779 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cef96626-b47d-47c7-97d8-32bb8a23c1f6-ovn-node-metrics-cert\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.774273 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773795 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1cb719ba-85f7-48ad-a93a-f86e8eff6450-sys-fs\") pod \"aws-ebs-csi-driver-node-9xgkd\" (UID: \"1cb719ba-85f7-48ad-a93a-f86e8eff6450\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd"
Apr 22 19:58:29.774273 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773821 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gg5n\" (UniqueName: \"kubernetes.io/projected/64949cb3-7087-4f51-8a7a-81b46c0895c9-kube-api-access-9gg5n\") pod \"node-resolver-s75m7\" (UID: \"64949cb3-7087-4f51-8a7a-81b46c0895c9\") " pod="openshift-dns/node-resolver-s75m7"
Apr 22 19:58:29.774273 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773854 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4c8642cf-9d39-42d2-bdf4-178ffbc1f890-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6q2sq\" (UID: \"4c8642cf-9d39-42d2-bdf4-178ffbc1f890\") " pod="openshift-multus/multus-additional-cni-plugins-6q2sq"
Apr 22 19:58:29.774273 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773877 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1cb719ba-85f7-48ad-a93a-f86e8eff6450-etc-selinux\") pod \"aws-ebs-csi-driver-node-9xgkd\" (UID: \"1cb719ba-85f7-48ad-a93a-f86e8eff6450\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd"
Apr 22 19:58:29.774273 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773893 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4c8642cf-9d39-42d2-bdf4-178ffbc1f890-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6q2sq\" (UID: \"4c8642cf-9d39-42d2-bdf4-178ffbc1f890\") " pod="openshift-multus/multus-additional-cni-plugins-6q2sq"
Apr 22 19:58:29.774273 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773906 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-run-systemd\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.774273 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773923 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cef96626-b47d-47c7-97d8-32bb8a23c1f6-env-overrides\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.774273 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773948 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cef96626-b47d-47c7-97d8-32bb8a23c1f6-ovnkube-script-lib\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.774273 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773970 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1cb719ba-85f7-48ad-a93a-f86e8eff6450-registration-dir\") pod \"aws-ebs-csi-driver-node-9xgkd\" (UID: \"1cb719ba-85f7-48ad-a93a-f86e8eff6450\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd"
Apr 22 19:58:29.774273 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.773992 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-host-cni-netd\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.774273 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.774014 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cef96626-b47d-47c7-97d8-32bb8a23c1f6-ovnkube-config\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.774273 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.774039 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4c8642cf-9d39-42d2-bdf4-178ffbc1f890-cnibin\") pod \"multus-additional-cni-plugins-6q2sq\" (UID: \"4c8642cf-9d39-42d2-bdf4-178ffbc1f890\") " pod="openshift-multus/multus-additional-cni-plugins-6q2sq"
Apr 22 19:58:29.775055 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.774061 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xstl\" (UniqueName: \"kubernetes.io/projected/4c8642cf-9d39-42d2-bdf4-178ffbc1f890-kube-api-access-8xstl\") pod \"multus-additional-cni-plugins-6q2sq\" (UID: \"4c8642cf-9d39-42d2-bdf4-178ffbc1f890\") " pod="openshift-multus/multus-additional-cni-plugins-6q2sq"
Apr 22 19:58:29.775055 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.774101 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-node-log\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.775055 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.774109 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-mqslj"
Apr 22 19:58:29.775055 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.774124 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-log-socket\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.776552 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.776465 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-z2q6l"
Apr 22 19:58:29.776735 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.776715 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:58:29.776839 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.776770 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-fg666\""
Apr 22 19:58:29.776945 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.776927 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 19:58:29.778577 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.778556 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 19:58:29.778806 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.778696 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wfwz5\""
Apr 22 19:58:29.778862 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.778812 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 19:58:29.778955 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.778937 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.780922 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.780904 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 19:58:29.781002 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.780935 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-kcjxn\""
Apr 22 19:58:29.781250 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.781231 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-m58j4"
Apr 22 19:58:29.783290 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.783228 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 19:58:29.783290 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.783255 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 19:58:29.783481 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.783369 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6c4lr\""
Apr 22 19:58:29.783481 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.783471 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 19:58:29.813896 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.813874 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:53:28 +0000 UTC" deadline="2027-11-02 05:52:14.800884688 +0000 UTC"
Apr 22 19:58:29.813896 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.813894 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13401h53m44.986993085s"
Apr 22 19:58:29.865868 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.865838 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 19:58:29.875243 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.875213 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cef96626-b47d-47c7-97d8-32bb8a23c1f6-ovnkube-script-lib\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.875243 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.875254 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1cb719ba-85f7-48ad-a93a-f86e8eff6450-registration-dir\") pod \"aws-ebs-csi-driver-node-9xgkd\" (UID: \"1cb719ba-85f7-48ad-a93a-f86e8eff6450\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd"
Apr 22 19:58:29.875426 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.875283 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-var-lib-kubelet\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj"
Apr 22 19:58:29.875426 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.875352 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1cb719ba-85f7-48ad-a93a-f86e8eff6450-registration-dir\") pod \"aws-ebs-csi-driver-node-9xgkd\" (UID: \"1cb719ba-85f7-48ad-a93a-f86e8eff6450\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd" Apr 22 19:58:29.875426 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.875397 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-multus-cni-dir\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4" Apr 22 19:58:29.875554 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.875434 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-os-release\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4" Apr 22 19:58:29.875554 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.875457 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cef96626-b47d-47c7-97d8-32bb8a23c1f6-ovnkube-config\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" Apr 22 19:58:29.875554 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.875475 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4c8642cf-9d39-42d2-bdf4-178ffbc1f890-cnibin\") pod \"multus-additional-cni-plugins-6q2sq\" (UID: \"4c8642cf-9d39-42d2-bdf4-178ffbc1f890\") " pod="openshift-multus/multus-additional-cni-plugins-6q2sq" Apr 22 19:58:29.875554 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.875499 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8xstl\" (UniqueName: \"kubernetes.io/projected/4c8642cf-9d39-42d2-bdf4-178ffbc1f890-kube-api-access-8xstl\") pod \"multus-additional-cni-plugins-6q2sq\" (UID: \"4c8642cf-9d39-42d2-bdf4-178ffbc1f890\") " pod="openshift-multus/multus-additional-cni-plugins-6q2sq" Apr 22 19:58:29.875554 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.875526 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-log-socket\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" Apr 22 19:58:29.875738 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.875546 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-etc-modprobe-d\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj" Apr 22 19:58:29.875738 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.875561 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4c8642cf-9d39-42d2-bdf4-178ffbc1f890-cnibin\") pod \"multus-additional-cni-plugins-6q2sq\" (UID: \"4c8642cf-9d39-42d2-bdf4-178ffbc1f890\") " pod="openshift-multus/multus-additional-cni-plugins-6q2sq" Apr 22 19:58:29.875738 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.875626 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-log-socket\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" Apr 22 19:58:29.875738 ip-10-0-128-61 kubenswrapper[2574]: I0422 
19:58:29.875668 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-sys\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj" Apr 22 19:58:29.875901 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.875745 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4c8642cf-9d39-42d2-bdf4-178ffbc1f890-cni-binary-copy\") pod \"multus-additional-cni-plugins-6q2sq\" (UID: \"4c8642cf-9d39-42d2-bdf4-178ffbc1f890\") " pod="openshift-multus/multus-additional-cni-plugins-6q2sq" Apr 22 19:58:29.875901 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.875767 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cb719ba-85f7-48ad-a93a-f86e8eff6450-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9xgkd\" (UID: \"1cb719ba-85f7-48ad-a93a-f86e8eff6450\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd" Apr 22 19:58:29.875901 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.875790 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1cb719ba-85f7-48ad-a93a-f86e8eff6450-socket-dir\") pod \"aws-ebs-csi-driver-node-9xgkd\" (UID: \"1cb719ba-85f7-48ad-a93a-f86e8eff6450\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd" Apr 22 19:58:29.875901 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.875814 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1cb719ba-85f7-48ad-a93a-f86e8eff6450-device-dir\") pod \"aws-ebs-csi-driver-node-9xgkd\" (UID: \"1cb719ba-85f7-48ad-a93a-f86e8eff6450\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd" Apr 22 19:58:29.875901 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.875837 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbd6j\" (UniqueName: \"kubernetes.io/projected/1cb719ba-85f7-48ad-a93a-f86e8eff6450-kube-api-access-dbd6j\") pod \"aws-ebs-csi-driver-node-9xgkd\" (UID: \"1cb719ba-85f7-48ad-a93a-f86e8eff6450\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd" Apr 22 19:58:29.875901 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.875861 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1cb719ba-85f7-48ad-a93a-f86e8eff6450-kubelet-dir\") pod \"aws-ebs-csi-driver-node-9xgkd\" (UID: \"1cb719ba-85f7-48ad-a93a-f86e8eff6450\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd" Apr 22 19:58:29.875901 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.875864 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-tmp\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj" Apr 22 19:58:29.875901 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.875899 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bjrq\" (UniqueName: \"kubernetes.io/projected/77f8f3a1-aebf-4a43-97ef-0a217a8920be-kube-api-access-6bjrq\") pod \"node-ca-m58j4\" (UID: \"77f8f3a1-aebf-4a43-97ef-0a217a8920be\") " pod="openshift-image-registry/node-ca-m58j4" Apr 22 19:58:29.876255 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.875915 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs\") pod \"network-metrics-daemon-4qs28\" (UID: \"6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4\") " pod="openshift-multus/network-metrics-daemon-4qs28" Apr 22 19:58:29.876255 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.875949 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4c8642cf-9d39-42d2-bdf4-178ffbc1f890-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6q2sq\" (UID: \"4c8642cf-9d39-42d2-bdf4-178ffbc1f890\") " pod="openshift-multus/multus-additional-cni-plugins-6q2sq" Apr 22 19:58:29.876255 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.875971 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1cb719ba-85f7-48ad-a93a-f86e8eff6450-socket-dir\") pod \"aws-ebs-csi-driver-node-9xgkd\" (UID: \"1cb719ba-85f7-48ad-a93a-f86e8eff6450\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd" Apr 22 19:58:29.876255 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876028 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1cb719ba-85f7-48ad-a93a-f86e8eff6450-device-dir\") pod \"aws-ebs-csi-driver-node-9xgkd\" (UID: \"1cb719ba-85f7-48ad-a93a-f86e8eff6450\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd" Apr 22 19:58:29.876255 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876112 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4c8642cf-9d39-42d2-bdf4-178ffbc1f890-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6q2sq\" (UID: \"4c8642cf-9d39-42d2-bdf4-178ffbc1f890\") " pod="openshift-multus/multus-additional-cni-plugins-6q2sq" Apr 22 19:58:29.876255 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.875975 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-etc-sysctl-conf\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj" Apr 22 19:58:29.876255 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876158 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-var-lib-openvswitch\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" Apr 22 19:58:29.876255 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876185 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-lib-modules\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj" Apr 22 19:58:29.876255 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876210 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-cnibin\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4" Apr 22 19:58:29.876255 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876227 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-host-var-lib-cni-bin\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4" Apr 22 19:58:29.876690 ip-10-0-128-61 
kubenswrapper[2574]: I0422 19:58:29.876268 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-var-lib-openvswitch\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" Apr 22 19:58:29.876690 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876269 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-host-run-netns\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" Apr 22 19:58:29.876690 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876303 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-etc-openvswitch\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" Apr 22 19:58:29.876690 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876324 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4c8642cf-9d39-42d2-bdf4-178ffbc1f890-cni-binary-copy\") pod \"multus-additional-cni-plugins-6q2sq\" (UID: \"4c8642cf-9d39-42d2-bdf4-178ffbc1f890\") " pod="openshift-multus/multus-additional-cni-plugins-6q2sq" Apr 22 19:58:29.876690 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876336 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-etc-openvswitch\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" Apr 22 
19:58:29.876690 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876305 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-host-run-netns\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" Apr 22 19:58:29.876690 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876348 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-etc-tuned\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj" Apr 22 19:58:29.876690 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876393 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-multus-conf-dir\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4" Apr 22 19:58:29.876690 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876393 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cef96626-b47d-47c7-97d8-32bb8a23c1f6-ovnkube-script-lib\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" Apr 22 19:58:29.876690 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876440 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-etc-systemd\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " 
pod="openshift-cluster-node-tuning-operator/tuned-mqslj" Apr 22 19:58:29.876690 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876478 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4c8642cf-9d39-42d2-bdf4-178ffbc1f890-os-release\") pod \"multus-additional-cni-plugins-6q2sq\" (UID: \"4c8642cf-9d39-42d2-bdf4-178ffbc1f890\") " pod="openshift-multus/multus-additional-cni-plugins-6q2sq" Apr 22 19:58:29.876690 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876509 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-host-kubelet\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" Apr 22 19:58:29.876690 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876532 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-host-slash\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" Apr 22 19:58:29.876690 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876554 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-host-slash\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" Apr 22 19:58:29.876690 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876570 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnbgx\" (UniqueName: \"kubernetes.io/projected/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-kube-api-access-rnbgx\") pod \"tuned-mqslj\" (UID: 
\"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj" Apr 22 19:58:29.876690 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876606 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4c8642cf-9d39-42d2-bdf4-178ffbc1f890-os-release\") pod \"multus-additional-cni-plugins-6q2sq\" (UID: \"4c8642cf-9d39-42d2-bdf4-178ffbc1f890\") " pod="openshift-multus/multus-additional-cni-plugins-6q2sq" Apr 22 19:58:29.876690 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876607 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-multus-socket-dir-parent\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4" Apr 22 19:58:29.877515 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876637 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-host-kubelet\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" Apr 22 19:58:29.877515 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876699 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1cb719ba-85f7-48ad-a93a-f86e8eff6450-etc-selinux\") pod \"aws-ebs-csi-driver-node-9xgkd\" (UID: \"1cb719ba-85f7-48ad-a93a-f86e8eff6450\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd" Apr 22 19:58:29.877515 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876729 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjvcf\" (UniqueName: 
\"kubernetes.io/projected/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-kube-api-access-wjvcf\") pod \"network-metrics-daemon-4qs28\" (UID: \"6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4\") " pod="openshift-multus/network-metrics-daemon-4qs28" Apr 22 19:58:29.877515 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876758 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-host-var-lib-cni-multus\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4" Apr 22 19:58:29.877515 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876766 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1cb719ba-85f7-48ad-a93a-f86e8eff6450-etc-selinux\") pod \"aws-ebs-csi-driver-node-9xgkd\" (UID: \"1cb719ba-85f7-48ad-a93a-f86e8eff6450\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd" Apr 22 19:58:29.877515 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876787 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4c8642cf-9d39-42d2-bdf4-178ffbc1f890-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6q2sq\" (UID: \"4c8642cf-9d39-42d2-bdf4-178ffbc1f890\") " pod="openshift-multus/multus-additional-cni-plugins-6q2sq" Apr 22 19:58:29.877515 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876802 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cef96626-b47d-47c7-97d8-32bb8a23c1f6-ovnkube-config\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" Apr 22 19:58:29.877515 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876813 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-run-systemd\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" Apr 22 19:58:29.877515 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876847 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cef96626-b47d-47c7-97d8-32bb8a23c1f6-env-overrides\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" Apr 22 19:58:29.877515 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876855 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-run-systemd\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" Apr 22 19:58:29.877515 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.876875 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/abb9a485-04a7-4c17-a721-ef4a0635e91f-agent-certs\") pod \"konnectivity-agent-z2q6l\" (UID: \"abb9a485-04a7-4c17-a721-ef4a0635e91f\") " pod="kube-system/konnectivity-agent-z2q6l" Apr 22 19:58:29.877515 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877011 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-host-cni-netd\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" Apr 22 19:58:29.877515 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877042 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-node-log\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" Apr 22 19:58:29.877515 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877067 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/64949cb3-7087-4f51-8a7a-81b46c0895c9-tmp-dir\") pod \"node-resolver-s75m7\" (UID: \"64949cb3-7087-4f51-8a7a-81b46c0895c9\") " pod="openshift-dns/node-resolver-s75m7" Apr 22 19:58:29.877515 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877112 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-node-log\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" Apr 22 19:58:29.877515 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877119 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-host-cni-netd\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" Apr 22 19:58:29.877515 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877126 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-host-cni-bin\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" Apr 22 19:58:29.878380 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877172 2574 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-host-cni-bin\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.878380 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877201 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-run-openvswitch\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.878380 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877230 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-run-openvswitch\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.878380 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877202 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cef96626-b47d-47c7-97d8-32bb8a23c1f6-env-overrides\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.878380 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877230 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpff4\" (UniqueName: \"kubernetes.io/projected/cef96626-b47d-47c7-97d8-32bb8a23c1f6-kube-api-access-zpff4\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.878380 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877254 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4c8642cf-9d39-42d2-bdf4-178ffbc1f890-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6q2sq\" (UID: \"4c8642cf-9d39-42d2-bdf4-178ffbc1f890\") " pod="openshift-multus/multus-additional-cni-plugins-6q2sq"
Apr 22 19:58:29.878380 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877279 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-host-run-netns\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.878380 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877309 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-host-run-multus-certs\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.878380 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877332 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkhwt\" (UniqueName: \"kubernetes.io/projected/3ee96bf5-836a-4404-833d-bc0c54aa990f-kube-api-access-lkhwt\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.878380 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877377 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-run-ovn\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.878380 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877414 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/64949cb3-7087-4f51-8a7a-81b46c0895c9-tmp-dir\") pod \"node-resolver-s75m7\" (UID: \"64949cb3-7087-4f51-8a7a-81b46c0895c9\") " pod="openshift-dns/node-resolver-s75m7"
Apr 22 19:58:29.878380 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877423 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/125e85b6-fd01-43bd-8fce-5440d140a0a1-host-slash\") pod \"iptables-alerter-2sr9c\" (UID: \"125e85b6-fd01-43bd-8fce-5440d140a0a1\") " pod="openshift-network-operator/iptables-alerter-2sr9c"
Apr 22 19:58:29.878380 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877438 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-run-ovn\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.878380 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877482 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w8wz\" (UniqueName: \"kubernetes.io/projected/7099cbe7-07ec-402e-846b-d9dddfeea3e4-kube-api-access-7w8wz\") pod \"network-check-target-8blrm\" (UID: \"7099cbe7-07ec-402e-846b-d9dddfeea3e4\") " pod="openshift-network-diagnostics/network-check-target-8blrm"
Apr 22 19:58:29.878380 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877512 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-etc-sysconfig\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj"
Apr 22 19:58:29.878380 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877515 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/125e85b6-fd01-43bd-8fce-5440d140a0a1-host-slash\") pod \"iptables-alerter-2sr9c\" (UID: \"125e85b6-fd01-43bd-8fce-5440d140a0a1\") " pod="openshift-network-operator/iptables-alerter-2sr9c"
Apr 22 19:58:29.878380 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877534 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-run\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj"
Apr 22 19:58:29.879112 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877558 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-host\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj"
Apr 22 19:58:29.879112 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877582 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/abb9a485-04a7-4c17-a721-ef4a0635e91f-konnectivity-ca\") pod \"konnectivity-agent-z2q6l\" (UID: \"abb9a485-04a7-4c17-a721-ef4a0635e91f\") " pod="kube-system/konnectivity-agent-z2q6l"
Apr 22 19:58:29.879112 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877619 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4c8642cf-9d39-42d2-bdf4-178ffbc1f890-system-cni-dir\") pod \"multus-additional-cni-plugins-6q2sq\" (UID: \"4c8642cf-9d39-42d2-bdf4-178ffbc1f890\") " pod="openshift-multus/multus-additional-cni-plugins-6q2sq"
Apr 22 19:58:29.879112 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877645 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/125e85b6-fd01-43bd-8fce-5440d140a0a1-iptables-alerter-script\") pod \"iptables-alerter-2sr9c\" (UID: \"125e85b6-fd01-43bd-8fce-5440d140a0a1\") " pod="openshift-network-operator/iptables-alerter-2sr9c"
Apr 22 19:58:29.879112 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877664 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4c8642cf-9d39-42d2-bdf4-178ffbc1f890-system-cni-dir\") pod \"multus-additional-cni-plugins-6q2sq\" (UID: \"4c8642cf-9d39-42d2-bdf4-178ffbc1f890\") " pod="openshift-multus/multus-additional-cni-plugins-6q2sq"
Apr 22 19:58:29.879112 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877670 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ncfkw\" (UniqueName: \"kubernetes.io/projected/125e85b6-fd01-43bd-8fce-5440d140a0a1-kube-api-access-ncfkw\") pod \"iptables-alerter-2sr9c\" (UID: \"125e85b6-fd01-43bd-8fce-5440d140a0a1\") " pod="openshift-network-operator/iptables-alerter-2sr9c"
Apr 22 19:58:29.879112 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877703 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-etc-kubernetes\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj"
Apr 22 19:58:29.879112 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877727 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77f8f3a1-aebf-4a43-97ef-0a217a8920be-host\") pod \"node-ca-m58j4\" (UID: \"77f8f3a1-aebf-4a43-97ef-0a217a8920be\") " pod="openshift-image-registry/node-ca-m58j4"
Apr 22 19:58:29.879112 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877751 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/77f8f3a1-aebf-4a43-97ef-0a217a8920be-serviceca\") pod \"node-ca-m58j4\" (UID: \"77f8f3a1-aebf-4a43-97ef-0a217a8920be\") " pod="openshift-image-registry/node-ca-m58j4"
Apr 22 19:58:29.879112 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877784 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-system-cni-dir\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.879112 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877820 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-host-run-k8s-cni-cncf-io\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.879112 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877873 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-systemd-units\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.879112 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877901 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/64949cb3-7087-4f51-8a7a-81b46c0895c9-hosts-file\") pod \"node-resolver-s75m7\" (UID: \"64949cb3-7087-4f51-8a7a-81b46c0895c9\") " pod="openshift-dns/node-resolver-s75m7"
Apr 22 19:58:29.879112 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877934 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-host-var-lib-kubelet\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.879112 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877966 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-systemd-units\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.879112 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.877959 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-hostroot\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.879112 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.878049 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/64949cb3-7087-4f51-8a7a-81b46c0895c9-hosts-file\") pod \"node-resolver-s75m7\" (UID: \"64949cb3-7087-4f51-8a7a-81b46c0895c9\") " pod="openshift-dns/node-resolver-s75m7"
Apr 22 19:58:29.879782 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.878106 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName:
\"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-etc-kubernetes\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.879782 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.878111 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/125e85b6-fd01-43bd-8fce-5440d140a0a1-iptables-alerter-script\") pod \"iptables-alerter-2sr9c\" (UID: \"125e85b6-fd01-43bd-8fce-5440d140a0a1\") " pod="openshift-network-operator/iptables-alerter-2sr9c"
Apr 22 19:58:29.879782 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.878140 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-etc-sysctl-d\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj"
Apr 22 19:58:29.879782 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.878169 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.879782 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.878205 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-host-run-ovn-kubernetes\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.879782 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.878230 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cef96626-b47d-47c7-97d8-32bb8a23c1f6-ovn-node-metrics-cert\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.879782 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.878255 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1cb719ba-85f7-48ad-a93a-f86e8eff6450-sys-fs\") pod \"aws-ebs-csi-driver-node-9xgkd\" (UID: \"1cb719ba-85f7-48ad-a93a-f86e8eff6450\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd"
Apr 22 19:58:29.879782 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.878280 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9gg5n\" (UniqueName: \"kubernetes.io/projected/64949cb3-7087-4f51-8a7a-81b46c0895c9-kube-api-access-9gg5n\") pod \"node-resolver-s75m7\" (UID: \"64949cb3-7087-4f51-8a7a-81b46c0895c9\") " pod="openshift-dns/node-resolver-s75m7"
Apr 22 19:58:29.879782 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.878341 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3ee96bf5-836a-4404-833d-bc0c54aa990f-cni-binary-copy\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.879782 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.878422 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1cb719ba-85f7-48ad-a93a-f86e8eff6450-sys-fs\") pod \"aws-ebs-csi-driver-node-9xgkd\" (UID: \"1cb719ba-85f7-48ad-a93a-f86e8eff6450\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd"
Apr 22 19:58:29.879782 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.878421 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4c8642cf-9d39-42d2-bdf4-178ffbc1f890-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6q2sq\" (UID: \"4c8642cf-9d39-42d2-bdf4-178ffbc1f890\") " pod="openshift-multus/multus-additional-cni-plugins-6q2sq"
Apr 22 19:58:29.879782 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.878468 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3ee96bf5-836a-4404-833d-bc0c54aa990f-multus-daemon-config\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.879782 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.878492 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-host-run-ovn-kubernetes\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.879782 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.878553 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cef96626-b47d-47c7-97d8-32bb8a23c1f6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.879782 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.878669 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 19:58:29.879782 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.878818 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4c8642cf-9d39-42d2-bdf4-178ffbc1f890-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6q2sq\" (UID: \"4c8642cf-9d39-42d2-bdf4-178ffbc1f890\") " pod="openshift-multus/multus-additional-cni-plugins-6q2sq"
Apr 22 19:58:29.881540 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.881515 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cef96626-b47d-47c7-97d8-32bb8a23c1f6-ovn-node-metrics-cert\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.884813 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.884793 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xstl\" (UniqueName: \"kubernetes.io/projected/4c8642cf-9d39-42d2-bdf4-178ffbc1f890-kube-api-access-8xstl\") pod \"multus-additional-cni-plugins-6q2sq\" (UID: \"4c8642cf-9d39-42d2-bdf4-178ffbc1f890\") " pod="openshift-multus/multus-additional-cni-plugins-6q2sq"
Apr 22 19:58:29.884913 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.884821 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbd6j\" (UniqueName: \"kubernetes.io/projected/1cb719ba-85f7-48ad-a93a-f86e8eff6450-kube-api-access-dbd6j\") pod \"aws-ebs-csi-driver-node-9xgkd\" (UID: \"1cb719ba-85f7-48ad-a93a-f86e8eff6450\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd"
Apr 22 19:58:29.885547 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.885506 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpff4\" (UniqueName: \"kubernetes.io/projected/cef96626-b47d-47c7-97d8-32bb8a23c1f6-kube-api-access-zpff4\") pod \"ovnkube-node-mpl4p\" (UID: \"cef96626-b47d-47c7-97d8-32bb8a23c1f6\") " pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:29.885966 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.885944 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncfkw\" (UniqueName: \"kubernetes.io/projected/125e85b6-fd01-43bd-8fce-5440d140a0a1-kube-api-access-ncfkw\") pod \"iptables-alerter-2sr9c\" (UID: \"125e85b6-fd01-43bd-8fce-5440d140a0a1\") " pod="openshift-network-operator/iptables-alerter-2sr9c"
Apr 22 19:58:29.886693 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.886675 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gg5n\" (UniqueName: \"kubernetes.io/projected/64949cb3-7087-4f51-8a7a-81b46c0895c9-kube-api-access-9gg5n\") pod \"node-resolver-s75m7\" (UID: \"64949cb3-7087-4f51-8a7a-81b46c0895c9\") " pod="openshift-dns/node-resolver-s75m7"
Apr 22 19:58:29.970371 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.970331 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 19:58:29.979422 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.979395 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-tmp\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj"
Apr 22 19:58:29.979553 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.979431 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6bjrq\" (UniqueName: \"kubernetes.io/projected/77f8f3a1-aebf-4a43-97ef-0a217a8920be-kube-api-access-6bjrq\") pod \"node-ca-m58j4\" (UID: \"77f8f3a1-aebf-4a43-97ef-0a217a8920be\") " pod="openshift-image-registry/node-ca-m58j4"
Apr 22 19:58:29.979553 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.979455 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs\") pod \"network-metrics-daemon-4qs28\" (UID: \"6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4\") " pod="openshift-multus/network-metrics-daemon-4qs28"
Apr 22 19:58:29.979553 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.979479 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-etc-sysctl-conf\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj"
Apr 22 19:58:29.979553 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.979530 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-lib-modules\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj"
Apr 22 19:58:29.979759 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.979570 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-cnibin\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.979759 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.979597 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-host-var-lib-cni-bin\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.979759 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.979621 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-etc-sysctl-conf\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj"
Apr 22 19:58:29.979759 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.979632 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-etc-tuned\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj"
Apr 22 19:58:29.979759 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:29.979640 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:29.979759 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.979676 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-cnibin\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.979759 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:29.979705 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs podName:6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:30.479685222 +0000 UTC m=+3.115068853 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs") pod "network-metrics-daemon-4qs28" (UID: "6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:29.979759 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.979727 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-multus-conf-dir\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.979759 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.979756 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-etc-systemd\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj"
Apr 22 19:58:29.980155 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.979770 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-lib-modules\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj"
Apr 22 19:58:29.980155 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.979785 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnbgx\" (UniqueName: \"kubernetes.io/projected/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-kube-api-access-rnbgx\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj"
Apr 22 19:58:29.980155 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.979803 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-multus-conf-dir\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.980155 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.979811 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-multus-socket-dir-parent\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.980155 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.979727 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-host-var-lib-cni-bin\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.980155 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.979831 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-etc-systemd\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj"
Apr 22 19:58:29.980155 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.979854 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wjvcf\" (UniqueName: \"kubernetes.io/projected/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-kube-api-access-wjvcf\") pod \"network-metrics-daemon-4qs28\" (UID: \"6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4\") " pod="openshift-multus/network-metrics-daemon-4qs28"
Apr 22 19:58:29.980155 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.979870 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-multus-socket-dir-parent\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.980155 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.979895 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-host-var-lib-cni-multus\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.980155 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.979925 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/abb9a485-04a7-4c17-a721-ef4a0635e91f-agent-certs\") pod \"konnectivity-agent-z2q6l\" (UID: \"abb9a485-04a7-4c17-a721-ef4a0635e91f\") " pod="kube-system/konnectivity-agent-z2q6l"
Apr 22 19:58:29.980155 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.979956 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-host-run-netns\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.980155 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.979982 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-host-run-multus-certs\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.980155 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.979985 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-host-var-lib-cni-multus\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.980155 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980019 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lkhwt\" (UniqueName: \"kubernetes.io/projected/3ee96bf5-836a-4404-833d-bc0c54aa990f-kube-api-access-lkhwt\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.980155 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980053 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7w8wz\" (UniqueName: \"kubernetes.io/projected/7099cbe7-07ec-402e-846b-d9dddfeea3e4-kube-api-access-7w8wz\") pod \"network-check-target-8blrm\" (UID: \"7099cbe7-07ec-402e-846b-d9dddfeea3e4\") " pod="openshift-network-diagnostics/network-check-target-8blrm"
Apr 22 19:58:29.980155 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980082 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-host-run-netns\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.980155 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980089 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-etc-sysconfig\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj"
Apr 22 19:58:29.980155 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980054 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-host-run-multus-certs\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.980984 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980125 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-run\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj"
Apr 22 19:58:29.980984 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980135 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-etc-sysconfig\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj"
Apr 22 19:58:29.980984 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980153 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-host\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj"
Apr 22 19:58:29.980984 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980177 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/abb9a485-04a7-4c17-a721-ef4a0635e91f-konnectivity-ca\") pod \"konnectivity-agent-z2q6l\" (UID: \"abb9a485-04a7-4c17-a721-ef4a0635e91f\") " pod="kube-system/konnectivity-agent-z2q6l"
Apr 22 19:58:29.980984 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980191 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-run\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj"
Apr 22 19:58:29.980984 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980208 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-etc-kubernetes\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj"
Apr 22 19:58:29.980984 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980251 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-host\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj"
Apr 22 19:58:29.980984 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980264 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-etc-kubernetes\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj"
Apr 22 19:58:29.980984 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980352 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77f8f3a1-aebf-4a43-97ef-0a217a8920be-host\") pod \"node-ca-m58j4\" (UID: \"77f8f3a1-aebf-4a43-97ef-0a217a8920be\") " pod="openshift-image-registry/node-ca-m58j4"
Apr 22 19:58:29.980984 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980416 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/77f8f3a1-aebf-4a43-97ef-0a217a8920be-serviceca\") pod \"node-ca-m58j4\" (UID: \"77f8f3a1-aebf-4a43-97ef-0a217a8920be\") " pod="openshift-image-registry/node-ca-m58j4"
Apr 22 19:58:29.980984 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980443 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-system-cni-dir\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.980984 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980456 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/77f8f3a1-aebf-4a43-97ef-0a217a8920be-host\") pod \"node-ca-m58j4\" (UID: \"77f8f3a1-aebf-4a43-97ef-0a217a8920be\") " pod="openshift-image-registry/node-ca-m58j4"
Apr 22 19:58:29.980984 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980469 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-host-run-k8s-cni-cncf-io\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.980984 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980497 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-host-var-lib-kubelet\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4"
Apr 22 19:58:29.980984 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980522 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-hostroot\")
pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4" Apr 22 19:58:29.980984 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980547 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-etc-kubernetes\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4" Apr 22 19:58:29.980984 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980572 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-etc-sysctl-d\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj" Apr 22 19:58:29.980984 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980604 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3ee96bf5-836a-4404-833d-bc0c54aa990f-cni-binary-copy\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4" Apr 22 19:58:29.981793 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980630 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3ee96bf5-836a-4404-833d-bc0c54aa990f-multus-daemon-config\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4" Apr 22 19:58:29.981793 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980657 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-var-lib-kubelet\") pod \"tuned-mqslj\" (UID: 
\"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj" Apr 22 19:58:29.981793 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980688 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-multus-cni-dir\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4" Apr 22 19:58:29.981793 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980715 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-os-release\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4" Apr 22 19:58:29.981793 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980746 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-etc-modprobe-d\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj" Apr 22 19:58:29.981793 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980772 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-sys\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj" Apr 22 19:58:29.981793 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980782 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/abb9a485-04a7-4c17-a721-ef4a0635e91f-konnectivity-ca\") pod \"konnectivity-agent-z2q6l\" (UID: \"abb9a485-04a7-4c17-a721-ef4a0635e91f\") " 
pod="kube-system/konnectivity-agent-z2q6l" Apr 22 19:58:29.981793 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980875 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/77f8f3a1-aebf-4a43-97ef-0a217a8920be-serviceca\") pod \"node-ca-m58j4\" (UID: \"77f8f3a1-aebf-4a43-97ef-0a217a8920be\") " pod="openshift-image-registry/node-ca-m58j4" Apr 22 19:58:29.981793 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980880 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-hostroot\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4" Apr 22 19:58:29.981793 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980887 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-etc-kubernetes\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4" Apr 22 19:58:29.981793 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980945 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-host-run-k8s-cni-cncf-io\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4" Apr 22 19:58:29.981793 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.980960 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-multus-cni-dir\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4" Apr 22 19:58:29.981793 ip-10-0-128-61 kubenswrapper[2574]: I0422 
19:58:29.980995 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-system-cni-dir\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4" Apr 22 19:58:29.981793 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.981009 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-etc-modprobe-d\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj" Apr 22 19:58:29.981793 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.981010 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-etc-sysctl-d\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj" Apr 22 19:58:29.981793 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.981050 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-sys\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj" Apr 22 19:58:29.981793 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.981093 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-var-lib-kubelet\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj" Apr 22 19:58:29.981793 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.981105 2574 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-os-release\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4" Apr 22 19:58:29.982668 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.981133 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3ee96bf5-836a-4404-833d-bc0c54aa990f-host-var-lib-kubelet\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4" Apr 22 19:58:29.982668 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.981471 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3ee96bf5-836a-4404-833d-bc0c54aa990f-multus-daemon-config\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4" Apr 22 19:58:29.982668 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.982078 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-etc-tuned\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj" Apr 22 19:58:29.982668 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.982185 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3ee96bf5-836a-4404-833d-bc0c54aa990f-cni-binary-copy\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4" Apr 22 19:58:29.982668 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.982270 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-tmp\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj" Apr 22 19:58:29.982668 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.982593 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/abb9a485-04a7-4c17-a721-ef4a0635e91f-agent-certs\") pod \"konnectivity-agent-z2q6l\" (UID: \"abb9a485-04a7-4c17-a721-ef4a0635e91f\") " pod="kube-system/konnectivity-agent-z2q6l" Apr 22 19:58:29.995176 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:29.995065 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:58:29.995176 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:29.995090 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:58:29.995176 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:29.995104 2574 projected.go:194] Error preparing data for projected volume kube-api-access-7w8wz for pod openshift-network-diagnostics/network-check-target-8blrm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:29.995176 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:29.995164 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7099cbe7-07ec-402e-846b-d9dddfeea3e4-kube-api-access-7w8wz podName:7099cbe7-07ec-402e-846b-d9dddfeea3e4 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:30.495145987 +0000 UTC m=+3.130529606 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7w8wz" (UniqueName: "kubernetes.io/projected/7099cbe7-07ec-402e-846b-d9dddfeea3e4-kube-api-access-7w8wz") pod "network-check-target-8blrm" (UID: "7099cbe7-07ec-402e-846b-d9dddfeea3e4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:29.996025 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.996006 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnbgx\" (UniqueName: \"kubernetes.io/projected/6fcb38ad-26c2-4519-ba9e-b5d0da90b12c-kube-api-access-rnbgx\") pod \"tuned-mqslj\" (UID: \"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c\") " pod="openshift-cluster-node-tuning-operator/tuned-mqslj" Apr 22 19:58:29.996974 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.996949 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bjrq\" (UniqueName: \"kubernetes.io/projected/77f8f3a1-aebf-4a43-97ef-0a217a8920be-kube-api-access-6bjrq\") pod \"node-ca-m58j4\" (UID: \"77f8f3a1-aebf-4a43-97ef-0a217a8920be\") " pod="openshift-image-registry/node-ca-m58j4" Apr 22 19:58:29.997434 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.997400 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkhwt\" (UniqueName: \"kubernetes.io/projected/3ee96bf5-836a-4404-833d-bc0c54aa990f-kube-api-access-lkhwt\") pod \"multus-ch4j4\" (UID: \"3ee96bf5-836a-4404-833d-bc0c54aa990f\") " pod="openshift-multus/multus-ch4j4" Apr 22 19:58:29.997434 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:29.997407 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjvcf\" (UniqueName: \"kubernetes.io/projected/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-kube-api-access-wjvcf\") pod \"network-metrics-daemon-4qs28\" (UID: \"6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4\") " 
pod="openshift-multus/network-metrics-daemon-4qs28" Apr 22 19:58:30.068887 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:30.068857 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6q2sq" Apr 22 19:58:30.075503 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:30.075482 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" Apr 22 19:58:30.083195 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:30.083172 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd" Apr 22 19:58:30.089767 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:30.089743 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2sr9c" Apr 22 19:58:30.096290 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:30.096274 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-s75m7" Apr 22 19:58:30.102874 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:30.102858 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-mqslj" Apr 22 19:58:30.108458 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:30.108437 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-z2q6l" Apr 22 19:58:30.113985 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:30.113966 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ch4j4" Apr 22 19:58:30.118437 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:30.118418 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-m58j4" Apr 22 19:58:30.393025 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:30.393003 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fcb38ad_26c2_4519_ba9e_b5d0da90b12c.slice/crio-ddcb41c2d7abec1e8e06b5c09cb671e5287adb80fcf43d520cd28f88e9008262 WatchSource:0}: Error finding container ddcb41c2d7abec1e8e06b5c09cb671e5287adb80fcf43d520cd28f88e9008262: Status 404 returned error can't find the container with id ddcb41c2d7abec1e8e06b5c09cb671e5287adb80fcf43d520cd28f88e9008262 Apr 22 19:58:30.394259 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:30.394239 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ee96bf5_836a_4404_833d_bc0c54aa990f.slice/crio-d67fe3b2be2a37740c9a392601381d737a10a5ecad74657a026ae8620745a9dc WatchSource:0}: Error finding container d67fe3b2be2a37740c9a392601381d737a10a5ecad74657a026ae8620745a9dc: Status 404 returned error can't find the container with id d67fe3b2be2a37740c9a392601381d737a10a5ecad74657a026ae8620745a9dc Apr 22 19:58:30.396428 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:30.396407 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64949cb3_7087_4f51_8a7a_81b46c0895c9.slice/crio-1ecd4e52a8b0601efdad98f736bbb4484badd19cb1cd6f0de3e8f4ee608f2f54 WatchSource:0}: Error finding container 1ecd4e52a8b0601efdad98f736bbb4484badd19cb1cd6f0de3e8f4ee608f2f54: Status 404 returned error can't find the container with id 1ecd4e52a8b0601efdad98f736bbb4484badd19cb1cd6f0de3e8f4ee608f2f54 Apr 22 19:58:30.405260 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:30.405236 2574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabb9a485_04a7_4c17_a721_ef4a0635e91f.slice/crio-ee58c9a32bb1e8d8d55960a0dcbb702a69a9d7d740820034d7be3571be8309cd WatchSource:0}: Error finding container ee58c9a32bb1e8d8d55960a0dcbb702a69a9d7d740820034d7be3571be8309cd: Status 404 returned error can't find the container with id ee58c9a32bb1e8d8d55960a0dcbb702a69a9d7d740820034d7be3571be8309cd Apr 22 19:58:30.405558 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:58:30.405520 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77f8f3a1_aebf_4a43_97ef_0a217a8920be.slice/crio-1f1d61487bdb040fa1e8b10ecfd50297a7b75189719057144072228486020505 WatchSource:0}: Error finding container 1f1d61487bdb040fa1e8b10ecfd50297a7b75189719057144072228486020505: Status 404 returned error can't find the container with id 1f1d61487bdb040fa1e8b10ecfd50297a7b75189719057144072228486020505 Apr 22 19:58:30.484041 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:30.484017 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs\") pod \"network-metrics-daemon-4qs28\" (UID: \"6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4\") " pod="openshift-multus/network-metrics-daemon-4qs28" Apr 22 19:58:30.484138 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:30.484122 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:30.484181 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:30.484171 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs podName:6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:31.484158037 +0000 UTC m=+4.119541654 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs") pod "network-metrics-daemon-4qs28" (UID: "6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:30.584950 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:30.584927 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7w8wz\" (UniqueName: \"kubernetes.io/projected/7099cbe7-07ec-402e-846b-d9dddfeea3e4-kube-api-access-7w8wz\") pod \"network-check-target-8blrm\" (UID: \"7099cbe7-07ec-402e-846b-d9dddfeea3e4\") " pod="openshift-network-diagnostics/network-check-target-8blrm" Apr 22 19:58:30.585104 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:30.585087 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:58:30.585159 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:30.585108 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:58:30.585159 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:30.585118 2574 projected.go:194] Error preparing data for projected volume kube-api-access-7w8wz for pod openshift-network-diagnostics/network-check-target-8blrm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:30.585227 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:30.585163 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7099cbe7-07ec-402e-846b-d9dddfeea3e4-kube-api-access-7w8wz podName:7099cbe7-07ec-402e-846b-d9dddfeea3e4 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:58:31.585146901 +0000 UTC m=+4.220530533 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-7w8wz" (UniqueName: "kubernetes.io/projected/7099cbe7-07ec-402e-846b-d9dddfeea3e4-kube-api-access-7w8wz") pod "network-check-target-8blrm" (UID: "7099cbe7-07ec-402e-846b-d9dddfeea3e4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:30.814991 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:30.814901 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 19:53:28 +0000 UTC" deadline="2028-01-03 08:12:18.732385376 +0000 UTC" Apr 22 19:58:30.814991 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:30.814939 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14892h13m47.917449351s" Apr 22 19:58:30.898767 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:30.898731 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-m58j4" event={"ID":"77f8f3a1-aebf-4a43-97ef-0a217a8920be","Type":"ContainerStarted","Data":"1f1d61487bdb040fa1e8b10ecfd50297a7b75189719057144072228486020505"} Apr 22 19:58:30.903031 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:30.902977 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6q2sq" event={"ID":"4c8642cf-9d39-42d2-bdf4-178ffbc1f890","Type":"ContainerStarted","Data":"2ce6362c3530dcc07f647b7c171774e4fec025b149b28fa5fee415376ad1a564"} Apr 22 19:58:30.905827 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:30.905774 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s75m7" 
event={"ID":"64949cb3-7087-4f51-8a7a-81b46c0895c9","Type":"ContainerStarted","Data":"1ecd4e52a8b0601efdad98f736bbb4484badd19cb1cd6f0de3e8f4ee608f2f54"} Apr 22 19:58:30.916755 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:30.916698 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ch4j4" event={"ID":"3ee96bf5-836a-4404-833d-bc0c54aa990f","Type":"ContainerStarted","Data":"d67fe3b2be2a37740c9a392601381d737a10a5ecad74657a026ae8620745a9dc"} Apr 22 19:58:30.932672 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:30.932596 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" event={"ID":"cef96626-b47d-47c7-97d8-32bb8a23c1f6","Type":"ContainerStarted","Data":"b748294913cfc655ee1771f9a8258ec0c11b5ebeb35e58b6ebcf5c844105e6ef"} Apr 22 19:58:30.941640 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:30.940981 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-61.ec2.internal" event={"ID":"38ecb0e569cd2e23a0f7c4c92d20d963","Type":"ContainerStarted","Data":"191b8fc0cbd16f098eb19f6b49e3d0be6e4099e20206870adeab080da99e5128"} Apr 22 19:58:30.951182 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:30.951131 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-z2q6l" event={"ID":"abb9a485-04a7-4c17-a721-ef4a0635e91f","Type":"ContainerStarted","Data":"ee58c9a32bb1e8d8d55960a0dcbb702a69a9d7d740820034d7be3571be8309cd"} Apr 22 19:58:30.956246 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:30.956131 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd" event={"ID":"1cb719ba-85f7-48ad-a93a-f86e8eff6450","Type":"ContainerStarted","Data":"1ebeb138284dc6560591b6687c4b71e86da14409c21c941b29a747f1cd0b4127"} Apr 22 19:58:30.959161 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:30.958780 2574 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-61.ec2.internal" podStartSLOduration=2.958764622 podStartE2EDuration="2.958764622s" podCreationTimestamp="2026-04-22 19:58:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:58:30.958330175 +0000 UTC m=+3.593713818" watchObservedRunningTime="2026-04-22 19:58:30.958764622 +0000 UTC m=+3.594148262" Apr 22 19:58:30.965571 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:30.965542 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2sr9c" event={"ID":"125e85b6-fd01-43bd-8fce-5440d140a0a1","Type":"ContainerStarted","Data":"98cc386ab4738e4a349f13ea84a410934e8d795dc1903bdbc2351ed74e8e9212"} Apr 22 19:58:30.968673 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:30.968646 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-mqslj" event={"ID":"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c","Type":"ContainerStarted","Data":"ddcb41c2d7abec1e8e06b5c09cb671e5287adb80fcf43d520cd28f88e9008262"} Apr 22 19:58:31.491870 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:31.491831 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs\") pod \"network-metrics-daemon-4qs28\" (UID: \"6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4\") " pod="openshift-multus/network-metrics-daemon-4qs28" Apr 22 19:58:31.492051 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:31.491976 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:31.492051 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:31.492038 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs podName:6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:33.49201941 +0000 UTC m=+6.127403033 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs") pod "network-metrics-daemon-4qs28" (UID: "6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:31.593401 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:31.592791 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7w8wz\" (UniqueName: \"kubernetes.io/projected/7099cbe7-07ec-402e-846b-d9dddfeea3e4-kube-api-access-7w8wz\") pod \"network-check-target-8blrm\" (UID: \"7099cbe7-07ec-402e-846b-d9dddfeea3e4\") " pod="openshift-network-diagnostics/network-check-target-8blrm" Apr 22 19:58:31.593401 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:31.592979 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:58:31.593401 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:31.592996 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:58:31.593401 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:31.593009 2574 projected.go:194] Error preparing data for projected volume kube-api-access-7w8wz for pod openshift-network-diagnostics/network-check-target-8blrm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:31.593401 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:31.593062 2574 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7099cbe7-07ec-402e-846b-d9dddfeea3e4-kube-api-access-7w8wz podName:7099cbe7-07ec-402e-846b-d9dddfeea3e4 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:33.593043707 +0000 UTC m=+6.228427329 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-7w8wz" (UniqueName: "kubernetes.io/projected/7099cbe7-07ec-402e-846b-d9dddfeea3e4-kube-api-access-7w8wz") pod "network-check-target-8blrm" (UID: "7099cbe7-07ec-402e-846b-d9dddfeea3e4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:31.890208 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:31.889431 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8blrm" Apr 22 19:58:31.890208 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:31.889557 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8blrm" podUID="7099cbe7-07ec-402e-846b-d9dddfeea3e4" Apr 22 19:58:31.890208 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:31.890004 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qs28" Apr 22 19:58:31.890208 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:31.890108 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qs28" podUID="6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4" Apr 22 19:58:31.982379 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:31.982265 2574 generic.go:358] "Generic (PLEG): container finished" podID="5327cd329c9207a4f0ec6817995e74eb" containerID="d6af8c6266679a0c61a39cbb1ef52c62c8149ef7970600d645f426d8f356a561" exitCode=0 Apr 22 19:58:31.982541 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:31.982389 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-61.ec2.internal" event={"ID":"5327cd329c9207a4f0ec6817995e74eb","Type":"ContainerDied","Data":"d6af8c6266679a0c61a39cbb1ef52c62c8149ef7970600d645f426d8f356a561"} Apr 22 19:58:32.345449 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:32.345411 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-kdnth"] Apr 22 19:58:32.348957 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:32.348934 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kdnth" Apr 22 19:58:32.349073 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:32.349029 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kdnth" podUID="c2238649-8094-4f67-abfd-33276e6b9b3a" Apr 22 19:58:32.400309 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:32.400062 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c2238649-8094-4f67-abfd-33276e6b9b3a-dbus\") pod \"global-pull-secret-syncer-kdnth\" (UID: \"c2238649-8094-4f67-abfd-33276e6b9b3a\") " pod="kube-system/global-pull-secret-syncer-kdnth" Apr 22 19:58:32.400309 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:32.400137 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c2238649-8094-4f67-abfd-33276e6b9b3a-kubelet-config\") pod \"global-pull-secret-syncer-kdnth\" (UID: \"c2238649-8094-4f67-abfd-33276e6b9b3a\") " pod="kube-system/global-pull-secret-syncer-kdnth" Apr 22 19:58:32.400309 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:32.400163 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c2238649-8094-4f67-abfd-33276e6b9b3a-original-pull-secret\") pod \"global-pull-secret-syncer-kdnth\" (UID: \"c2238649-8094-4f67-abfd-33276e6b9b3a\") " pod="kube-system/global-pull-secret-syncer-kdnth" Apr 22 19:58:32.501545 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:32.500860 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c2238649-8094-4f67-abfd-33276e6b9b3a-dbus\") pod \"global-pull-secret-syncer-kdnth\" (UID: \"c2238649-8094-4f67-abfd-33276e6b9b3a\") " pod="kube-system/global-pull-secret-syncer-kdnth" Apr 22 19:58:32.501545 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:32.500922 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/c2238649-8094-4f67-abfd-33276e6b9b3a-kubelet-config\") pod \"global-pull-secret-syncer-kdnth\" (UID: \"c2238649-8094-4f67-abfd-33276e6b9b3a\") " pod="kube-system/global-pull-secret-syncer-kdnth" Apr 22 19:58:32.501545 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:32.500949 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c2238649-8094-4f67-abfd-33276e6b9b3a-original-pull-secret\") pod \"global-pull-secret-syncer-kdnth\" (UID: \"c2238649-8094-4f67-abfd-33276e6b9b3a\") " pod="kube-system/global-pull-secret-syncer-kdnth" Apr 22 19:58:32.501545 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:32.501061 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:32.501545 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:32.501118 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2238649-8094-4f67-abfd-33276e6b9b3a-original-pull-secret podName:c2238649-8094-4f67-abfd-33276e6b9b3a nodeName:}" failed. No retries permitted until 2026-04-22 19:58:33.001099465 +0000 UTC m=+5.636483085 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c2238649-8094-4f67-abfd-33276e6b9b3a-original-pull-secret") pod "global-pull-secret-syncer-kdnth" (UID: "c2238649-8094-4f67-abfd-33276e6b9b3a") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:32.501545 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:32.501410 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c2238649-8094-4f67-abfd-33276e6b9b3a-dbus\") pod \"global-pull-secret-syncer-kdnth\" (UID: \"c2238649-8094-4f67-abfd-33276e6b9b3a\") " pod="kube-system/global-pull-secret-syncer-kdnth" Apr 22 19:58:32.501545 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:32.501469 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c2238649-8094-4f67-abfd-33276e6b9b3a-kubelet-config\") pod \"global-pull-secret-syncer-kdnth\" (UID: \"c2238649-8094-4f67-abfd-33276e6b9b3a\") " pod="kube-system/global-pull-secret-syncer-kdnth" Apr 22 19:58:32.991746 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:32.991663 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-61.ec2.internal" event={"ID":"5327cd329c9207a4f0ec6817995e74eb","Type":"ContainerStarted","Data":"4785d744ba337d4918dcc6810d529d947e984ee403aa229520968538b3a762a3"} Apr 22 19:58:33.005220 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:33.005173 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c2238649-8094-4f67-abfd-33276e6b9b3a-original-pull-secret\") pod \"global-pull-secret-syncer-kdnth\" (UID: \"c2238649-8094-4f67-abfd-33276e6b9b3a\") " pod="kube-system/global-pull-secret-syncer-kdnth" Apr 22 19:58:33.005347 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:33.005312 2574 secret.go:189] 
Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:33.005444 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:33.005385 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2238649-8094-4f67-abfd-33276e6b9b3a-original-pull-secret podName:c2238649-8094-4f67-abfd-33276e6b9b3a nodeName:}" failed. No retries permitted until 2026-04-22 19:58:34.005348553 +0000 UTC m=+6.640732184 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c2238649-8094-4f67-abfd-33276e6b9b3a-original-pull-secret") pod "global-pull-secret-syncer-kdnth" (UID: "c2238649-8094-4f67-abfd-33276e6b9b3a") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:33.510472 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:33.510426 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs\") pod \"network-metrics-daemon-4qs28\" (UID: \"6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4\") " pod="openshift-multus/network-metrics-daemon-4qs28" Apr 22 19:58:33.510640 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:33.510568 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:33.510640 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:33.510632 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs podName:6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:37.510614454 +0000 UTC m=+10.145998077 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs") pod "network-metrics-daemon-4qs28" (UID: "6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:33.611232 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:33.611187 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7w8wz\" (UniqueName: \"kubernetes.io/projected/7099cbe7-07ec-402e-846b-d9dddfeea3e4-kube-api-access-7w8wz\") pod \"network-check-target-8blrm\" (UID: \"7099cbe7-07ec-402e-846b-d9dddfeea3e4\") " pod="openshift-network-diagnostics/network-check-target-8blrm" Apr 22 19:58:33.611442 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:33.611344 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:58:33.611442 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:33.611385 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:58:33.611442 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:33.611399 2574 projected.go:194] Error preparing data for projected volume kube-api-access-7w8wz for pod openshift-network-diagnostics/network-check-target-8blrm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:33.611607 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:33.611460 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7099cbe7-07ec-402e-846b-d9dddfeea3e4-kube-api-access-7w8wz podName:7099cbe7-07ec-402e-846b-d9dddfeea3e4 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:58:37.611440029 +0000 UTC m=+10.246823668 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7w8wz" (UniqueName: "kubernetes.io/projected/7099cbe7-07ec-402e-846b-d9dddfeea3e4-kube-api-access-7w8wz") pod "network-check-target-8blrm" (UID: "7099cbe7-07ec-402e-846b-d9dddfeea3e4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:33.889422 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:33.889335 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kdnth" Apr 22 19:58:33.889579 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:33.889477 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kdnth" podUID="c2238649-8094-4f67-abfd-33276e6b9b3a" Apr 22 19:58:33.889860 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:33.889834 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qs28" Apr 22 19:58:33.890009 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:33.889945 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4qs28" podUID="6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4" Apr 22 19:58:33.891572 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:33.891426 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8blrm" Apr 22 19:58:33.891572 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:33.891530 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8blrm" podUID="7099cbe7-07ec-402e-846b-d9dddfeea3e4" Apr 22 19:58:34.013519 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:34.013476 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c2238649-8094-4f67-abfd-33276e6b9b3a-original-pull-secret\") pod \"global-pull-secret-syncer-kdnth\" (UID: \"c2238649-8094-4f67-abfd-33276e6b9b3a\") " pod="kube-system/global-pull-secret-syncer-kdnth" Apr 22 19:58:34.013896 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:34.013660 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:34.013896 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:34.013715 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2238649-8094-4f67-abfd-33276e6b9b3a-original-pull-secret podName:c2238649-8094-4f67-abfd-33276e6b9b3a nodeName:}" failed. No retries permitted until 2026-04-22 19:58:36.013697327 +0000 UTC m=+8.649080947 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c2238649-8094-4f67-abfd-33276e6b9b3a-original-pull-secret") pod "global-pull-secret-syncer-kdnth" (UID: "c2238649-8094-4f67-abfd-33276e6b9b3a") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:35.888563 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:35.888444 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qs28" Apr 22 19:58:35.888563 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:35.888448 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8blrm" Apr 22 19:58:35.889052 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:35.888581 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qs28" podUID="6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4" Apr 22 19:58:35.889052 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:35.888677 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8blrm" podUID="7099cbe7-07ec-402e-846b-d9dddfeea3e4" Apr 22 19:58:35.889052 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:35.888457 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-kdnth" Apr 22 19:58:35.889052 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:35.888783 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kdnth" podUID="c2238649-8094-4f67-abfd-33276e6b9b3a" Apr 22 19:58:36.030461 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:36.030343 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c2238649-8094-4f67-abfd-33276e6b9b3a-original-pull-secret\") pod \"global-pull-secret-syncer-kdnth\" (UID: \"c2238649-8094-4f67-abfd-33276e6b9b3a\") " pod="kube-system/global-pull-secret-syncer-kdnth" Apr 22 19:58:36.030646 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:36.030469 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:36.030646 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:36.030530 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2238649-8094-4f67-abfd-33276e6b9b3a-original-pull-secret podName:c2238649-8094-4f67-abfd-33276e6b9b3a nodeName:}" failed. No retries permitted until 2026-04-22 19:58:40.030512201 +0000 UTC m=+12.665895823 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c2238649-8094-4f67-abfd-33276e6b9b3a-original-pull-secret") pod "global-pull-secret-syncer-kdnth" (UID: "c2238649-8094-4f67-abfd-33276e6b9b3a") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:37.541690 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:37.541651 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs\") pod \"network-metrics-daemon-4qs28\" (UID: \"6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4\") " pod="openshift-multus/network-metrics-daemon-4qs28" Apr 22 19:58:37.542080 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:37.541837 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:37.542080 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:37.541903 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs podName:6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4 nodeName:}" failed. No retries permitted until 2026-04-22 19:58:45.541882164 +0000 UTC m=+18.177265782 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs") pod "network-metrics-daemon-4qs28" (UID: "6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 19:58:37.642290 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:37.642241 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7w8wz\" (UniqueName: \"kubernetes.io/projected/7099cbe7-07ec-402e-846b-d9dddfeea3e4-kube-api-access-7w8wz\") pod \"network-check-target-8blrm\" (UID: \"7099cbe7-07ec-402e-846b-d9dddfeea3e4\") " pod="openshift-network-diagnostics/network-check-target-8blrm" Apr 22 19:58:37.642489 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:37.642421 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 19:58:37.642489 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:37.642444 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 19:58:37.642489 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:37.642459 2574 projected.go:194] Error preparing data for projected volume kube-api-access-7w8wz for pod openshift-network-diagnostics/network-check-target-8blrm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:37.642662 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:37.642527 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7099cbe7-07ec-402e-846b-d9dddfeea3e4-kube-api-access-7w8wz podName:7099cbe7-07ec-402e-846b-d9dddfeea3e4 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:58:45.64250653 +0000 UTC m=+18.277890171 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-7w8wz" (UniqueName: "kubernetes.io/projected/7099cbe7-07ec-402e-846b-d9dddfeea3e4-kube-api-access-7w8wz") pod "network-check-target-8blrm" (UID: "7099cbe7-07ec-402e-846b-d9dddfeea3e4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 19:58:37.890569 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:37.890256 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8blrm" Apr 22 19:58:37.890726 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:37.890670 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8blrm" podUID="7099cbe7-07ec-402e-846b-d9dddfeea3e4" Apr 22 19:58:37.890726 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:37.890413 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kdnth" Apr 22 19:58:37.890840 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:37.890774 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-kdnth" podUID="c2238649-8094-4f67-abfd-33276e6b9b3a" Apr 22 19:58:37.890840 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:37.890388 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qs28" Apr 22 19:58:37.890934 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:37.890861 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qs28" podUID="6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4" Apr 22 19:58:39.889328 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:39.889297 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kdnth" Apr 22 19:58:39.889808 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:39.889296 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qs28" Apr 22 19:58:39.889808 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:39.889418 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kdnth" podUID="c2238649-8094-4f67-abfd-33276e6b9b3a" Apr 22 19:58:39.889808 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:39.889296 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8blrm" Apr 22 19:58:39.889808 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:39.889520 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qs28" podUID="6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4" Apr 22 19:58:39.889808 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:39.889586 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8blrm" podUID="7099cbe7-07ec-402e-846b-d9dddfeea3e4" Apr 22 19:58:40.061635 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:40.061606 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c2238649-8094-4f67-abfd-33276e6b9b3a-original-pull-secret\") pod \"global-pull-secret-syncer-kdnth\" (UID: \"c2238649-8094-4f67-abfd-33276e6b9b3a\") " pod="kube-system/global-pull-secret-syncer-kdnth" Apr 22 19:58:40.061790 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:40.061727 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:40.061790 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:40.061786 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2238649-8094-4f67-abfd-33276e6b9b3a-original-pull-secret podName:c2238649-8094-4f67-abfd-33276e6b9b3a nodeName:}" failed. 
No retries permitted until 2026-04-22 19:58:48.061768256 +0000 UTC m=+20.697151875 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c2238649-8094-4f67-abfd-33276e6b9b3a-original-pull-secret") pod "global-pull-secret-syncer-kdnth" (UID: "c2238649-8094-4f67-abfd-33276e6b9b3a") : object "kube-system"/"original-pull-secret" not registered Apr 22 19:58:41.889104 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:41.889069 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8blrm" Apr 22 19:58:41.889104 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:41.889069 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kdnth" Apr 22 19:58:41.889104 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:41.889087 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qs28" Apr 22 19:58:41.889636 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:41.889194 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kdnth" podUID="c2238649-8094-4f67-abfd-33276e6b9b3a" Apr 22 19:58:41.889636 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:41.889247 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4qs28" podUID="6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4"
Apr 22 19:58:41.889636 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:41.889272 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8blrm" podUID="7099cbe7-07ec-402e-846b-d9dddfeea3e4"
Apr 22 19:58:43.888924 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:43.888892 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8blrm"
Apr 22 19:58:43.888924 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:43.888908 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qs28"
Apr 22 19:58:43.889397 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:43.888897 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kdnth"
Apr 22 19:58:43.889397 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:43.889011 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8blrm" podUID="7099cbe7-07ec-402e-846b-d9dddfeea3e4"
Apr 22 19:58:43.889397 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:43.889083 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kdnth" podUID="c2238649-8094-4f67-abfd-33276e6b9b3a"
Apr 22 19:58:43.889397 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:43.889163 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qs28" podUID="6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4"
Apr 22 19:58:45.609190 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:45.609140 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs\") pod \"network-metrics-daemon-4qs28\" (UID: \"6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4\") " pod="openshift-multus/network-metrics-daemon-4qs28"
Apr 22 19:58:45.609699 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:45.609290 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:45.609699 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:45.609380 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs podName:6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:01.6093449 +0000 UTC m=+34.244728533 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs") pod "network-metrics-daemon-4qs28" (UID: "6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 19:58:45.710033 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:45.710002 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7w8wz\" (UniqueName: \"kubernetes.io/projected/7099cbe7-07ec-402e-846b-d9dddfeea3e4-kube-api-access-7w8wz\") pod \"network-check-target-8blrm\" (UID: \"7099cbe7-07ec-402e-846b-d9dddfeea3e4\") " pod="openshift-network-diagnostics/network-check-target-8blrm"
Apr 22 19:58:45.710190 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:45.710152 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 19:58:45.710190 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:45.710177 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 19:58:45.710190 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:45.710190 2574 projected.go:194] Error preparing data for projected volume kube-api-access-7w8wz for pod openshift-network-diagnostics/network-check-target-8blrm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:45.710330 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:45.710252 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7099cbe7-07ec-402e-846b-d9dddfeea3e4-kube-api-access-7w8wz podName:7099cbe7-07ec-402e-846b-d9dddfeea3e4 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:01.710233371 +0000 UTC m=+34.345616993 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-7w8wz" (UniqueName: "kubernetes.io/projected/7099cbe7-07ec-402e-846b-d9dddfeea3e4-kube-api-access-7w8wz") pod "network-check-target-8blrm" (UID: "7099cbe7-07ec-402e-846b-d9dddfeea3e4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 19:58:45.889036 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:45.888959 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8blrm"
Apr 22 19:58:45.889177 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:45.888964 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qs28"
Apr 22 19:58:45.889177 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:45.889081 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8blrm" podUID="7099cbe7-07ec-402e-846b-d9dddfeea3e4"
Apr 22 19:58:45.889294 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:45.889186 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qs28" podUID="6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4"
Apr 22 19:58:45.889294 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:45.888974 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kdnth"
Apr 22 19:58:45.889413 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:45.889287 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kdnth" podUID="c2238649-8094-4f67-abfd-33276e6b9b3a"
Apr 22 19:58:47.889878 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:47.889726 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kdnth"
Apr 22 19:58:47.890436 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:47.889920 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kdnth" podUID="c2238649-8094-4f67-abfd-33276e6b9b3a"
Apr 22 19:58:47.890436 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:47.889841 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8blrm"
Apr 22 19:58:47.890436 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:47.889973 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8blrm" podUID="7099cbe7-07ec-402e-846b-d9dddfeea3e4"
Apr 22 19:58:47.890436 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:47.889814 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qs28"
Apr 22 19:58:47.890436 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:47.890045 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qs28" podUID="6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4"
Apr 22 19:58:48.019830 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:48.019758 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-z2q6l" event={"ID":"abb9a485-04a7-4c17-a721-ef4a0635e91f","Type":"ContainerStarted","Data":"8d126569e45facecc992f914054e3722ac5fedd1df6aeba42d7374987978a57c"}
Apr 22 19:58:48.021036 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:48.021007 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd" event={"ID":"1cb719ba-85f7-48ad-a93a-f86e8eff6450","Type":"ContainerStarted","Data":"ea565a2948b069adfdeb0ab3dbfb5b2c4ad9d0becdad9b1ff6808bea6cca044e"}
Apr 22 19:58:48.022343 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:48.022298 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-mqslj" event={"ID":"6fcb38ad-26c2-4519-ba9e-b5d0da90b12c","Type":"ContainerStarted","Data":"1ae7ac0e66382238a89dbd2a9b9e8c2b480205084fddd85f95b4b7cecdee0a54"}
Apr 22 19:58:48.023701 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:48.023682 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-m58j4" event={"ID":"77f8f3a1-aebf-4a43-97ef-0a217a8920be","Type":"ContainerStarted","Data":"18316887cc57866d927b62006ffa471598bf4bc31b8dd8af5d5e4b06920d3820"}
Apr 22 19:58:48.024995 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:48.024970 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6q2sq" event={"ID":"4c8642cf-9d39-42d2-bdf4-178ffbc1f890","Type":"ContainerStarted","Data":"97b931b513ea97a1359582e48e0b0a763f06f6330b1446bb339a8d62b2ed8caf"}
Apr 22 19:58:48.026227 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:48.026209 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s75m7" event={"ID":"64949cb3-7087-4f51-8a7a-81b46c0895c9","Type":"ContainerStarted","Data":"b6d2cff18edba1fcf418d9e4372bffb2324f8609cdc4f07bb3f7b4c471f403b5"}
Apr 22 19:58:48.027537 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:48.027457 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ch4j4" event={"ID":"3ee96bf5-836a-4404-833d-bc0c54aa990f","Type":"ContainerStarted","Data":"21167e7a8a163ca06a2bf410a5f8e4c63e95c8beadee5fdbf334bc58f042c6c7"}
Apr 22 19:58:48.029570 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:48.029548 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" event={"ID":"cef96626-b47d-47c7-97d8-32bb8a23c1f6","Type":"ContainerStarted","Data":"9b9560ebeaebf193e13067b25ed31bc5afd4e01a3c7067b60e0b77a586975e03"}
Apr 22 19:58:48.029653 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:48.029579 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" event={"ID":"cef96626-b47d-47c7-97d8-32bb8a23c1f6","Type":"ContainerStarted","Data":"a0b6128afa73fcd44bb611cab2c3335e6e2cdfaa7b5105f7df7ade56ea0f6db5"}
Apr 22 19:58:48.029653 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:48.029594 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" event={"ID":"cef96626-b47d-47c7-97d8-32bb8a23c1f6","Type":"ContainerStarted","Data":"4553179bc390b54a27512a890d7c31494289b585122f1105a51423d56601beab"}
Apr 22 19:58:48.032957 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:48.032925 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-61.ec2.internal" podStartSLOduration=20.032915638 podStartE2EDuration="20.032915638s" podCreationTimestamp="2026-04-22 19:58:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:58:33.009199327 +0000 UTC m=+5.644582968" watchObservedRunningTime="2026-04-22 19:58:48.032915638 +0000 UTC m=+20.668299256"
Apr 22 19:58:48.033034 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:48.033010 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-z2q6l" podStartSLOduration=2.865996546 podStartE2EDuration="20.033005177s" podCreationTimestamp="2026-04-22 19:58:28 +0000 UTC" firstStartedPulling="2026-04-22 19:58:30.407829674 +0000 UTC m=+3.043213305" lastFinishedPulling="2026-04-22 19:58:47.574838305 +0000 UTC m=+20.210221936" observedRunningTime="2026-04-22 19:58:48.032658282 +0000 UTC m=+20.668041920" watchObservedRunningTime="2026-04-22 19:58:48.033005177 +0000 UTC m=+20.668388812"
Apr 22 19:58:48.045410 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:48.045298 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-s75m7" podStartSLOduration=4.190215121 podStartE2EDuration="21.045286695s" podCreationTimestamp="2026-04-22 19:58:27 +0000 UTC" firstStartedPulling="2026-04-22 19:58:30.3991769 +0000 UTC m=+3.034560520" lastFinishedPulling="2026-04-22 19:58:47.254248474 +0000 UTC m=+19.889632094" observedRunningTime="2026-04-22 19:58:48.044940565 +0000 UTC m=+20.680324203" watchObservedRunningTime="2026-04-22 19:58:48.045286695 +0000 UTC m=+20.680670333"
Apr 22 19:58:48.060397 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:48.060341 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-mqslj" podStartSLOduration=2.878694711 podStartE2EDuration="20.060331138s" podCreationTimestamp="2026-04-22 19:58:28 +0000 UTC" firstStartedPulling="2026-04-22 19:58:30.395070746 +0000 UTC m=+3.030454363" lastFinishedPulling="2026-04-22 19:58:47.576707159 +0000 UTC m=+20.212090790" observedRunningTime="2026-04-22 19:58:48.059711793 +0000 UTC m=+20.695095426" watchObservedRunningTime="2026-04-22 19:58:48.060331138 +0000 UTC m=+20.695714777"
Apr 22 19:58:48.072765 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:48.072731 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-m58j4" podStartSLOduration=11.243867823 podStartE2EDuration="20.072721615s" podCreationTimestamp="2026-04-22 19:58:28 +0000 UTC" firstStartedPulling="2026-04-22 19:58:30.407667197 +0000 UTC m=+3.043050814" lastFinishedPulling="2026-04-22 19:58:39.236520975 +0000 UTC m=+11.871904606" observedRunningTime="2026-04-22 19:58:48.072665362 +0000 UTC m=+20.708049002" watchObservedRunningTime="2026-04-22 19:58:48.072721615 +0000 UTC m=+20.708105254"
Apr 22 19:58:48.107003 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:48.106963 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-ch4j4" podStartSLOduration=2.561453025 podStartE2EDuration="20.106950558s" podCreationTimestamp="2026-04-22 19:58:28 +0000 UTC" firstStartedPulling="2026-04-22 19:58:30.396625863 +0000 UTC m=+3.032009483" lastFinishedPulling="2026-04-22 19:58:47.942123399 +0000 UTC m=+20.577507016" observedRunningTime="2026-04-22 19:58:48.106905052 +0000 UTC m=+20.742288691" watchObservedRunningTime="2026-04-22 19:58:48.106950558 +0000 UTC m=+20.742334196"
Apr 22 19:58:48.125021 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:48.124199 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c2238649-8094-4f67-abfd-33276e6b9b3a-original-pull-secret\") pod \"global-pull-secret-syncer-kdnth\" (UID: \"c2238649-8094-4f67-abfd-33276e6b9b3a\") " pod="kube-system/global-pull-secret-syncer-kdnth"
Apr 22 19:58:48.125021 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:48.124298 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 19:58:48.125021 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:48.124346 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2238649-8094-4f67-abfd-33276e6b9b3a-original-pull-secret podName:c2238649-8094-4f67-abfd-33276e6b9b3a nodeName:}" failed. No retries permitted until 2026-04-22 19:59:04.124330801 +0000 UTC m=+36.759714434 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c2238649-8094-4f67-abfd-33276e6b9b3a-original-pull-secret") pod "global-pull-secret-syncer-kdnth" (UID: "c2238649-8094-4f67-abfd-33276e6b9b3a") : object "kube-system"/"original-pull-secret" not registered
Apr 22 19:58:48.787020 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:48.786904 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 19:58:48.833884 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:48.833809 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T19:58:48.78692412Z","UUID":"3187ccbb-8aa3-472f-9e8b-5da4e0f95cf1","Handler":null,"Name":"","Endpoint":""}
Apr 22 19:58:48.835179 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:48.835164 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 19:58:48.835252 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:48.835187 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 19:58:49.032659 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:49.032580 2574 generic.go:358] "Generic (PLEG): container finished" podID="4c8642cf-9d39-42d2-bdf4-178ffbc1f890" containerID="97b931b513ea97a1359582e48e0b0a763f06f6330b1446bb339a8d62b2ed8caf" exitCode=0
Apr 22 19:58:49.033087 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:49.032654 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6q2sq" event={"ID":"4c8642cf-9d39-42d2-bdf4-178ffbc1f890","Type":"ContainerDied","Data":"97b931b513ea97a1359582e48e0b0a763f06f6330b1446bb339a8d62b2ed8caf"}
Apr 22 19:58:49.035240 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:49.035072 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" event={"ID":"cef96626-b47d-47c7-97d8-32bb8a23c1f6","Type":"ContainerStarted","Data":"3785d565716b2dc66e626a4750aac7e664b0ad37cfea18313e0cde9e23b867e5"}
Apr 22 19:58:49.035240 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:49.035101 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" event={"ID":"cef96626-b47d-47c7-97d8-32bb8a23c1f6","Type":"ContainerStarted","Data":"1486f0a418f076eb09b5d536d33c93599d186c4ba35c7f7457ad4b53df520ba0"}
Apr 22 19:58:49.035240 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:49.035114 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" event={"ID":"cef96626-b47d-47c7-97d8-32bb8a23c1f6","Type":"ContainerStarted","Data":"8daa78a080f7f7310f484023c82ad1b94ee30497bf9cd6b750c71013599ead38"}
Apr 22 19:58:49.036658 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:49.036638 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd" event={"ID":"1cb719ba-85f7-48ad-a93a-f86e8eff6450","Type":"ContainerStarted","Data":"2b571d02983d3051991dac197d7da6719b11e4488d85363cb171b755a9b033d3"}
Apr 22 19:58:49.037890 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:49.037843 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2sr9c" event={"ID":"125e85b6-fd01-43bd-8fce-5440d140a0a1","Type":"ContainerStarted","Data":"897eb0d98031dc2f8f718750780fa672414906aba58ee2abe7d63ac306e0abfc"}
Apr 22 19:58:49.073383 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:49.073328 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2sr9c" podStartSLOduration=4.946461133 podStartE2EDuration="22.073318737s" podCreationTimestamp="2026-04-22 19:58:27 +0000 UTC" firstStartedPulling="2026-04-22 19:58:30.406789614 +0000 UTC m=+3.042173234" lastFinishedPulling="2026-04-22 19:58:47.533647214 +0000 UTC m=+20.169030838" observedRunningTime="2026-04-22 19:58:49.073186402 +0000 UTC m=+21.708570041" watchObservedRunningTime="2026-04-22 19:58:49.073318737 +0000 UTC m=+21.708702406"
Apr 22 19:58:49.889174 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:49.889090 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kdnth"
Apr 22 19:58:49.889333 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:49.889095 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qs28"
Apr 22 19:58:49.889333 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:49.889196 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kdnth" podUID="c2238649-8094-4f67-abfd-33276e6b9b3a"
Apr 22 19:58:49.889333 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:49.889095 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8blrm"
Apr 22 19:58:49.889333 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:49.889272 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qs28" podUID="6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4"
Apr 22 19:58:49.889568 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:49.889372 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8blrm" podUID="7099cbe7-07ec-402e-846b-d9dddfeea3e4"
Apr 22 19:58:50.041808 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:50.041773 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd" event={"ID":"1cb719ba-85f7-48ad-a93a-f86e8eff6450","Type":"ContainerStarted","Data":"165b1ee4b314fff92d8497fb970ee4308003c2e48107a9853ef86b933d0b69e6"}
Apr 22 19:58:50.058584 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:50.058537 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-9xgkd" podStartSLOduration=3.9666876 podStartE2EDuration="23.058520496s" podCreationTimestamp="2026-04-22 19:58:27 +0000 UTC" firstStartedPulling="2026-04-22 19:58:30.406419339 +0000 UTC m=+3.041802956" lastFinishedPulling="2026-04-22 19:58:49.498252229 +0000 UTC m=+22.133635852" observedRunningTime="2026-04-22 19:58:50.058214125 +0000 UTC m=+22.693597786" watchObservedRunningTime="2026-04-22 19:58:50.058520496 +0000 UTC m=+22.693904137"
Apr 22 19:58:50.228535 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:50.228459 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-z2q6l"
Apr 22 19:58:50.229119 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:50.229098 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-z2q6l"
Apr 22 19:58:51.047123 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:51.047084 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" event={"ID":"cef96626-b47d-47c7-97d8-32bb8a23c1f6","Type":"ContainerStarted","Data":"582bf65419991685a8c56309c56d124f314882babe8ba4d24d71985dc5b82318"}
Apr 22 19:58:51.047785 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:51.047316 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-z2q6l"
Apr 22 19:58:51.047902 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:51.047885 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-z2q6l"
Apr 22 19:58:51.888465 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:51.888434 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qs28"
Apr 22 19:58:51.888625 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:51.888482 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8blrm"
Apr 22 19:58:51.888625 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:51.888544 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qs28" podUID="6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4"
Apr 22 19:58:51.888625 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:51.888593 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8blrm" podUID="7099cbe7-07ec-402e-846b-d9dddfeea3e4"
Apr 22 19:58:51.888815 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:51.888646 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kdnth"
Apr 22 19:58:51.888815 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:51.888743 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kdnth" podUID="c2238649-8094-4f67-abfd-33276e6b9b3a"
Apr 22 19:58:53.889184 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:53.888952 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8blrm"
Apr 22 19:58:53.889577 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:53.888952 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qs28"
Apr 22 19:58:53.889577 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:53.889276 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8blrm" podUID="7099cbe7-07ec-402e-846b-d9dddfeea3e4"
Apr 22 19:58:53.889577 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:53.888952 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kdnth"
Apr 22 19:58:53.889577 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:53.889338 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qs28" podUID="6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4"
Apr 22 19:58:53.889577 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:53.889430 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kdnth" podUID="c2238649-8094-4f67-abfd-33276e6b9b3a"
Apr 22 19:58:54.056335 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:54.056269 2574 generic.go:358] "Generic (PLEG): container finished" podID="4c8642cf-9d39-42d2-bdf4-178ffbc1f890" containerID="105418c94c212bf6dbeb1d4a39a5fdfaecb7a75f13007ebcb832e44d9614f03b" exitCode=0
Apr 22 19:58:54.056457 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:54.056343 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6q2sq" event={"ID":"4c8642cf-9d39-42d2-bdf4-178ffbc1f890","Type":"ContainerDied","Data":"105418c94c212bf6dbeb1d4a39a5fdfaecb7a75f13007ebcb832e44d9614f03b"}
Apr 22 19:58:54.059393 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:54.059371 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" event={"ID":"cef96626-b47d-47c7-97d8-32bb8a23c1f6","Type":"ContainerStarted","Data":"4c511705cf1b28554a0bef2993071fde117ce280cadfe191573c1302696cf6d8"}
Apr 22 19:58:54.059655 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:54.059625 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:54.059655 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:54.059650 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:54.059655 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:54.059662 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:54.073498 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:54.073481 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:54.073593 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:54.073563 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:58:54.104877 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:54.104842 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p" podStartSLOduration=9.874194095 podStartE2EDuration="27.104833754s" podCreationTimestamp="2026-04-22 19:58:27 +0000 UTC" firstStartedPulling="2026-04-22 19:58:30.393061916 +0000 UTC m=+3.028445533" lastFinishedPulling="2026-04-22 19:58:47.623701575 +0000 UTC m=+20.259085192" observedRunningTime="2026-04-22 19:58:54.103490169 +0000 UTC m=+26.738873807" watchObservedRunningTime="2026-04-22 19:58:54.104833754 +0000 UTC m=+26.740217392"
Apr 22 19:58:55.063572 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:55.063234 2574 generic.go:358] "Generic (PLEG): container finished" podID="4c8642cf-9d39-42d2-bdf4-178ffbc1f890" containerID="80602f7fbe8c97c7362a14468a785dbe70742d01ac1484a060dff9c5dd75aa06" exitCode=0
Apr 22 19:58:55.064243 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:55.063313 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6q2sq" event={"ID":"4c8642cf-9d39-42d2-bdf4-178ffbc1f890","Type":"ContainerDied","Data":"80602f7fbe8c97c7362a14468a785dbe70742d01ac1484a060dff9c5dd75aa06"}
Apr 22 19:58:55.064243 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:55.063733 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kdnth"]
Apr 22 19:58:55.064243 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:55.064122 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kdnth"
Apr 22 19:58:55.064243 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:55.064210 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kdnth" podUID="c2238649-8094-4f67-abfd-33276e6b9b3a"
Apr 22 19:58:55.064772 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:55.064743 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-8blrm"]
Apr 22 19:58:55.064883 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:55.064825 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8blrm"
Apr 22 19:58:55.065509 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:55.065477 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8blrm" podUID="7099cbe7-07ec-402e-846b-d9dddfeea3e4"
Apr 22 19:58:55.068086 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:55.067292 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4qs28"]
Apr 22 19:58:55.068086 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:55.067427 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qs28"
Apr 22 19:58:55.068086 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:55.067547 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qs28" podUID="6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4"
Apr 22 19:58:56.067314 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:56.067232 2574 generic.go:358] "Generic (PLEG): container finished" podID="4c8642cf-9d39-42d2-bdf4-178ffbc1f890" containerID="97bbd5ef87a8b56ecd7e9acd9902975fce3d92e1b6e4ac70c5094c3286f909a2" exitCode=0
Apr 22 19:58:56.067751 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:56.067321 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6q2sq" event={"ID":"4c8642cf-9d39-42d2-bdf4-178ffbc1f890","Type":"ContainerDied","Data":"97bbd5ef87a8b56ecd7e9acd9902975fce3d92e1b6e4ac70c5094c3286f909a2"}
Apr 22 19:58:56.889282 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:56.889202 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kdnth"
Apr 22 19:58:56.889282 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:56.889220 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8blrm"
Apr 22 19:58:56.889502 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:56.889202 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qs28"
Apr 22 19:58:56.889502 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:56.889304 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kdnth" podUID="c2238649-8094-4f67-abfd-33276e6b9b3a"
Apr 22 19:58:56.889502 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:56.889416 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-8blrm" podUID="7099cbe7-07ec-402e-846b-d9dddfeea3e4"
Apr 22 19:58:56.889659 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:56.889513 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qs28" podUID="6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4"
Apr 22 19:58:58.889250 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:58.889179 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kdnth"
Apr 22 19:58:58.889250 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:58.889194 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qs28" Apr 22 19:58:58.890042 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:58:58.889180 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8blrm" Apr 22 19:58:58.890042 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:58.889283 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-kdnth" podUID="c2238649-8094-4f67-abfd-33276e6b9b3a" Apr 22 19:58:58.890042 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:58.889375 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4qs28" podUID="6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4" Apr 22 19:58:58.890042 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:58:58.889470 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-8blrm" podUID="7099cbe7-07ec-402e-846b-d9dddfeea3e4" Apr 22 19:59:00.710312 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.710051 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-61.ec2.internal" event="NodeReady" Apr 22 19:59:00.710749 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.710406 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 19:59:00.756509 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.756480 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-x2267"] Apr 22 19:59:00.787456 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.787435 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-s2p2q"] Apr 22 19:59:00.787612 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.787571 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-x2267" Apr 22 19:59:00.791329 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.791084 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 19:59:00.791329 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.791102 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 19:59:00.791329 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.791181 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2xtr6\"" Apr 22 19:59:00.805931 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.805911 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x2267"] Apr 22 19:59:00.806028 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.805941 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-s2p2q"] 
Apr 22 19:59:00.806074 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.806040 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-s2p2q" Apr 22 19:59:00.808542 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.808522 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 19:59:00.808705 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.808686 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 19:59:00.808780 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.808754 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 19:59:00.808780 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.808764 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-g9zd7\"" Apr 22 19:59:00.888605 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.888577 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8blrm" Apr 22 19:59:00.888757 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.888613 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kdnth" Apr 22 19:59:00.888757 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.888741 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qs28" Apr 22 19:59:00.891579 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.891436 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 19:59:00.891579 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.891438 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 19:59:00.891579 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.891506 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vnzkf\"" Apr 22 19:59:00.891579 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.891439 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-h4q6q\"" Apr 22 19:59:00.891579 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.891528 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 19:59:00.891870 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.891585 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 19:59:00.932456 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.932433 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/41de400a-d8c0-4987-9f2a-ec97460903ec-metrics-tls\") pod \"dns-default-x2267\" (UID: \"41de400a-d8c0-4987-9f2a-ec97460903ec\") " pod="openshift-dns/dns-default-x2267" Apr 22 19:59:00.932572 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.932469 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/52f5be1a-f9aa-4bf7-992f-277ec5922772-cert\") pod \"ingress-canary-s2p2q\" (UID: \"52f5be1a-f9aa-4bf7-992f-277ec5922772\") " pod="openshift-ingress-canary/ingress-canary-s2p2q" Apr 22 19:59:00.932572 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.932498 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/41de400a-d8c0-4987-9f2a-ec97460903ec-tmp-dir\") pod \"dns-default-x2267\" (UID: \"41de400a-d8c0-4987-9f2a-ec97460903ec\") " pod="openshift-dns/dns-default-x2267" Apr 22 19:59:00.932572 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.932526 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bzrd\" (UniqueName: \"kubernetes.io/projected/41de400a-d8c0-4987-9f2a-ec97460903ec-kube-api-access-7bzrd\") pod \"dns-default-x2267\" (UID: \"41de400a-d8c0-4987-9f2a-ec97460903ec\") " pod="openshift-dns/dns-default-x2267" Apr 22 19:59:00.932718 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.932581 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wchxc\" (UniqueName: \"kubernetes.io/projected/52f5be1a-f9aa-4bf7-992f-277ec5922772-kube-api-access-wchxc\") pod \"ingress-canary-s2p2q\" (UID: \"52f5be1a-f9aa-4bf7-992f-277ec5922772\") " pod="openshift-ingress-canary/ingress-canary-s2p2q" Apr 22 19:59:00.932718 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:00.932598 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41de400a-d8c0-4987-9f2a-ec97460903ec-config-volume\") pod \"dns-default-x2267\" (UID: \"41de400a-d8c0-4987-9f2a-ec97460903ec\") " pod="openshift-dns/dns-default-x2267" Apr 22 19:59:01.033501 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:01.033387 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wchxc\" (UniqueName: \"kubernetes.io/projected/52f5be1a-f9aa-4bf7-992f-277ec5922772-kube-api-access-wchxc\") pod \"ingress-canary-s2p2q\" (UID: \"52f5be1a-f9aa-4bf7-992f-277ec5922772\") " pod="openshift-ingress-canary/ingress-canary-s2p2q" Apr 22 19:59:01.033501 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:01.033425 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41de400a-d8c0-4987-9f2a-ec97460903ec-config-volume\") pod \"dns-default-x2267\" (UID: \"41de400a-d8c0-4987-9f2a-ec97460903ec\") " pod="openshift-dns/dns-default-x2267" Apr 22 19:59:01.033501 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:01.033478 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/41de400a-d8c0-4987-9f2a-ec97460903ec-metrics-tls\") pod \"dns-default-x2267\" (UID: \"41de400a-d8c0-4987-9f2a-ec97460903ec\") " pod="openshift-dns/dns-default-x2267" Apr 22 19:59:01.033771 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:01.033508 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52f5be1a-f9aa-4bf7-992f-277ec5922772-cert\") pod \"ingress-canary-s2p2q\" (UID: \"52f5be1a-f9aa-4bf7-992f-277ec5922772\") " pod="openshift-ingress-canary/ingress-canary-s2p2q" Apr 22 19:59:01.033771 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:01.033619 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:59:01.033771 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:01.033644 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/41de400a-d8c0-4987-9f2a-ec97460903ec-tmp-dir\") pod \"dns-default-x2267\" (UID: \"41de400a-d8c0-4987-9f2a-ec97460903ec\") " 
pod="openshift-dns/dns-default-x2267" Apr 22 19:59:01.033771 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:01.033692 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41de400a-d8c0-4987-9f2a-ec97460903ec-metrics-tls podName:41de400a-d8c0-4987-9f2a-ec97460903ec nodeName:}" failed. No retries permitted until 2026-04-22 19:59:01.533668694 +0000 UTC m=+34.169052322 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/41de400a-d8c0-4987-9f2a-ec97460903ec-metrics-tls") pod "dns-default-x2267" (UID: "41de400a-d8c0-4987-9f2a-ec97460903ec") : secret "dns-default-metrics-tls" not found Apr 22 19:59:01.033771 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:01.033715 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bzrd\" (UniqueName: \"kubernetes.io/projected/41de400a-d8c0-4987-9f2a-ec97460903ec-kube-api-access-7bzrd\") pod \"dns-default-x2267\" (UID: \"41de400a-d8c0-4987-9f2a-ec97460903ec\") " pod="openshift-dns/dns-default-x2267" Apr 22 19:59:01.033771 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:01.033619 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:59:01.034053 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:01.033805 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f5be1a-f9aa-4bf7-992f-277ec5922772-cert podName:52f5be1a-f9aa-4bf7-992f-277ec5922772 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:01.533787141 +0000 UTC m=+34.169170769 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/52f5be1a-f9aa-4bf7-992f-277ec5922772-cert") pod "ingress-canary-s2p2q" (UID: "52f5be1a-f9aa-4bf7-992f-277ec5922772") : secret "canary-serving-cert" not found Apr 22 19:59:01.034053 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:01.033884 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/41de400a-d8c0-4987-9f2a-ec97460903ec-tmp-dir\") pod \"dns-default-x2267\" (UID: \"41de400a-d8c0-4987-9f2a-ec97460903ec\") " pod="openshift-dns/dns-default-x2267" Apr 22 19:59:01.034131 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:01.034049 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41de400a-d8c0-4987-9f2a-ec97460903ec-config-volume\") pod \"dns-default-x2267\" (UID: \"41de400a-d8c0-4987-9f2a-ec97460903ec\") " pod="openshift-dns/dns-default-x2267" Apr 22 19:59:01.043887 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:01.043866 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wchxc\" (UniqueName: \"kubernetes.io/projected/52f5be1a-f9aa-4bf7-992f-277ec5922772-kube-api-access-wchxc\") pod \"ingress-canary-s2p2q\" (UID: \"52f5be1a-f9aa-4bf7-992f-277ec5922772\") " pod="openshift-ingress-canary/ingress-canary-s2p2q" Apr 22 19:59:01.044020 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:01.043867 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bzrd\" (UniqueName: \"kubernetes.io/projected/41de400a-d8c0-4987-9f2a-ec97460903ec-kube-api-access-7bzrd\") pod \"dns-default-x2267\" (UID: \"41de400a-d8c0-4987-9f2a-ec97460903ec\") " pod="openshift-dns/dns-default-x2267" Apr 22 19:59:01.536942 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:01.536916 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/52f5be1a-f9aa-4bf7-992f-277ec5922772-cert\") pod \"ingress-canary-s2p2q\" (UID: \"52f5be1a-f9aa-4bf7-992f-277ec5922772\") " pod="openshift-ingress-canary/ingress-canary-s2p2q" Apr 22 19:59:01.537157 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:01.537063 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:59:01.537157 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:01.537102 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/41de400a-d8c0-4987-9f2a-ec97460903ec-metrics-tls\") pod \"dns-default-x2267\" (UID: \"41de400a-d8c0-4987-9f2a-ec97460903ec\") " pod="openshift-dns/dns-default-x2267" Apr 22 19:59:01.537157 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:01.537120 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f5be1a-f9aa-4bf7-992f-277ec5922772-cert podName:52f5be1a-f9aa-4bf7-992f-277ec5922772 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:02.537104849 +0000 UTC m=+35.172488470 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/52f5be1a-f9aa-4bf7-992f-277ec5922772-cert") pod "ingress-canary-s2p2q" (UID: "52f5be1a-f9aa-4bf7-992f-277ec5922772") : secret "canary-serving-cert" not found Apr 22 19:59:01.537287 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:01.537179 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:59:01.537287 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:01.537221 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41de400a-d8c0-4987-9f2a-ec97460903ec-metrics-tls podName:41de400a-d8c0-4987-9f2a-ec97460903ec nodeName:}" failed. 
No retries permitted until 2026-04-22 19:59:02.537210326 +0000 UTC m=+35.172593943 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/41de400a-d8c0-4987-9f2a-ec97460903ec-metrics-tls") pod "dns-default-x2267" (UID: "41de400a-d8c0-4987-9f2a-ec97460903ec") : secret "dns-default-metrics-tls" not found Apr 22 19:59:01.637805 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:01.637780 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs\") pod \"network-metrics-daemon-4qs28\" (UID: \"6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4\") " pod="openshift-multus/network-metrics-daemon-4qs28" Apr 22 19:59:01.637922 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:01.637864 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 19:59:01.637922 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:01.637911 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs podName:6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:33.637897639 +0000 UTC m=+66.273281262 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs") pod "network-metrics-daemon-4qs28" (UID: "6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4") : secret "metrics-daemon-secret" not found Apr 22 19:59:01.738638 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:01.738614 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7w8wz\" (UniqueName: \"kubernetes.io/projected/7099cbe7-07ec-402e-846b-d9dddfeea3e4-kube-api-access-7w8wz\") pod \"network-check-target-8blrm\" (UID: \"7099cbe7-07ec-402e-846b-d9dddfeea3e4\") " pod="openshift-network-diagnostics/network-check-target-8blrm" Apr 22 19:59:01.741070 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:01.741049 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w8wz\" (UniqueName: \"kubernetes.io/projected/7099cbe7-07ec-402e-846b-d9dddfeea3e4-kube-api-access-7w8wz\") pod \"network-check-target-8blrm\" (UID: \"7099cbe7-07ec-402e-846b-d9dddfeea3e4\") " pod="openshift-network-diagnostics/network-check-target-8blrm" Apr 22 19:59:01.807032 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:01.806980 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-8blrm" Apr 22 19:59:02.013912 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:02.013886 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-8blrm"] Apr 22 19:59:02.017769 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:59:02.017746 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7099cbe7_07ec_402e_846b_d9dddfeea3e4.slice/crio-8bb256e19c0d6afb4f2af14c42adc3e01f140039726e8afb26d020dbc5a4d973 WatchSource:0}: Error finding container 8bb256e19c0d6afb4f2af14c42adc3e01f140039726e8afb26d020dbc5a4d973: Status 404 returned error can't find the container with id 8bb256e19c0d6afb4f2af14c42adc3e01f140039726e8afb26d020dbc5a4d973 Apr 22 19:59:02.078306 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:02.078277 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-8blrm" event={"ID":"7099cbe7-07ec-402e-846b-d9dddfeea3e4","Type":"ContainerStarted","Data":"8bb256e19c0d6afb4f2af14c42adc3e01f140039726e8afb26d020dbc5a4d973"} Apr 22 19:59:02.081473 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:02.081447 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6q2sq" event={"ID":"4c8642cf-9d39-42d2-bdf4-178ffbc1f890","Type":"ContainerStarted","Data":"6372db74b1707147f7baacdbb3d4b27e4ded55d0b6087651559e2e9552e8ac9b"} Apr 22 19:59:02.544189 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:02.544115 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/41de400a-d8c0-4987-9f2a-ec97460903ec-metrics-tls\") pod \"dns-default-x2267\" (UID: \"41de400a-d8c0-4987-9f2a-ec97460903ec\") " pod="openshift-dns/dns-default-x2267" Apr 22 19:59:02.544189 ip-10-0-128-61 kubenswrapper[2574]: I0422 
19:59:02.544156 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52f5be1a-f9aa-4bf7-992f-277ec5922772-cert\") pod \"ingress-canary-s2p2q\" (UID: \"52f5be1a-f9aa-4bf7-992f-277ec5922772\") " pod="openshift-ingress-canary/ingress-canary-s2p2q" Apr 22 19:59:02.544427 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:02.544271 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 19:59:02.544427 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:02.544293 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 19:59:02.544427 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:02.544348 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f5be1a-f9aa-4bf7-992f-277ec5922772-cert podName:52f5be1a-f9aa-4bf7-992f-277ec5922772 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:04.544328212 +0000 UTC m=+37.179711829 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/52f5be1a-f9aa-4bf7-992f-277ec5922772-cert") pod "ingress-canary-s2p2q" (UID: "52f5be1a-f9aa-4bf7-992f-277ec5922772") : secret "canary-serving-cert" not found Apr 22 19:59:02.544427 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:02.544382 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41de400a-d8c0-4987-9f2a-ec97460903ec-metrics-tls podName:41de400a-d8c0-4987-9f2a-ec97460903ec nodeName:}" failed. No retries permitted until 2026-04-22 19:59:04.5443724 +0000 UTC m=+37.179756024 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/41de400a-d8c0-4987-9f2a-ec97460903ec-metrics-tls") pod "dns-default-x2267" (UID: "41de400a-d8c0-4987-9f2a-ec97460903ec") : secret "dns-default-metrics-tls" not found Apr 22 19:59:03.086489 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:03.086449 2574 generic.go:358] "Generic (PLEG): container finished" podID="4c8642cf-9d39-42d2-bdf4-178ffbc1f890" containerID="6372db74b1707147f7baacdbb3d4b27e4ded55d0b6087651559e2e9552e8ac9b" exitCode=0 Apr 22 19:59:03.086858 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:03.086502 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6q2sq" event={"ID":"4c8642cf-9d39-42d2-bdf4-178ffbc1f890","Type":"ContainerDied","Data":"6372db74b1707147f7baacdbb3d4b27e4ded55d0b6087651559e2e9552e8ac9b"} Apr 22 19:59:04.091954 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:04.091741 2574 generic.go:358] "Generic (PLEG): container finished" podID="4c8642cf-9d39-42d2-bdf4-178ffbc1f890" containerID="047bf56252a943d91b67b0352865db833025f374e0892feae4e5d3567fa91ad2" exitCode=0 Apr 22 19:59:04.092299 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:04.091853 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6q2sq" event={"ID":"4c8642cf-9d39-42d2-bdf4-178ffbc1f890","Type":"ContainerDied","Data":"047bf56252a943d91b67b0352865db833025f374e0892feae4e5d3567fa91ad2"} Apr 22 19:59:04.157103 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:04.157071 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c2238649-8094-4f67-abfd-33276e6b9b3a-original-pull-secret\") pod \"global-pull-secret-syncer-kdnth\" (UID: \"c2238649-8094-4f67-abfd-33276e6b9b3a\") " pod="kube-system/global-pull-secret-syncer-kdnth" Apr 22 19:59:04.160938 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:04.160907 
2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c2238649-8094-4f67-abfd-33276e6b9b3a-original-pull-secret\") pod \"global-pull-secret-syncer-kdnth\" (UID: \"c2238649-8094-4f67-abfd-33276e6b9b3a\") " pod="kube-system/global-pull-secret-syncer-kdnth"
Apr 22 19:59:04.211060 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:04.211032 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-kdnth"
Apr 22 19:59:04.560261 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:04.560177 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/41de400a-d8c0-4987-9f2a-ec97460903ec-metrics-tls\") pod \"dns-default-x2267\" (UID: \"41de400a-d8c0-4987-9f2a-ec97460903ec\") " pod="openshift-dns/dns-default-x2267"
Apr 22 19:59:04.560261 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:04.560221 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52f5be1a-f9aa-4bf7-992f-277ec5922772-cert\") pod \"ingress-canary-s2p2q\" (UID: \"52f5be1a-f9aa-4bf7-992f-277ec5922772\") " pod="openshift-ingress-canary/ingress-canary-s2p2q"
Apr 22 19:59:04.560509 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:04.560302 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:59:04.560509 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:04.560330 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:59:04.560509 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:04.560379 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41de400a-d8c0-4987-9f2a-ec97460903ec-metrics-tls podName:41de400a-d8c0-4987-9f2a-ec97460903ec nodeName:}" failed. No retries permitted until 2026-04-22 19:59:08.560345019 +0000 UTC m=+41.195728652 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/41de400a-d8c0-4987-9f2a-ec97460903ec-metrics-tls") pod "dns-default-x2267" (UID: "41de400a-d8c0-4987-9f2a-ec97460903ec") : secret "dns-default-metrics-tls" not found
Apr 22 19:59:04.560509 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:04.560397 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f5be1a-f9aa-4bf7-992f-277ec5922772-cert podName:52f5be1a-f9aa-4bf7-992f-277ec5922772 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:08.560389974 +0000 UTC m=+41.195773591 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/52f5be1a-f9aa-4bf7-992f-277ec5922772-cert") pod "ingress-canary-s2p2q" (UID: "52f5be1a-f9aa-4bf7-992f-277ec5922772") : secret "canary-serving-cert" not found
Apr 22 19:59:04.836437 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:04.836412 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-kdnth"]
Apr 22 19:59:04.839721 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:59:04.839697 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2238649_8094_4f67_abfd_33276e6b9b3a.slice/crio-b449e5f22c7cf7641fe4de503d9a6dae4d214d305036e1852ebc3140cbfcd906 WatchSource:0}: Error finding container b449e5f22c7cf7641fe4de503d9a6dae4d214d305036e1852ebc3140cbfcd906: Status 404 returned error can't find the container with id b449e5f22c7cf7641fe4de503d9a6dae4d214d305036e1852ebc3140cbfcd906
Apr 22 19:59:05.095321 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:05.095149 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-8blrm" event={"ID":"7099cbe7-07ec-402e-846b-d9dddfeea3e4","Type":"ContainerStarted","Data":"795e5ad8d728d3af0671aba2e9e50df4d50af3e19ebfde27b3cb4da266286678"}
Apr 22 19:59:05.095924 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:05.095327 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-8blrm"
Apr 22 19:59:05.096232 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:05.096203 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kdnth" event={"ID":"c2238649-8094-4f67-abfd-33276e6b9b3a","Type":"ContainerStarted","Data":"b449e5f22c7cf7641fe4de503d9a6dae4d214d305036e1852ebc3140cbfcd906"}
Apr 22 19:59:05.098940 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:05.098920 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6q2sq" event={"ID":"4c8642cf-9d39-42d2-bdf4-178ffbc1f890","Type":"ContainerStarted","Data":"f52fe0ad9120c4650deeb79ac860213f0b54d6b014a4e2edd3461baf32550cf1"}
Apr 22 19:59:05.125185 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:05.125147 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-8blrm" podStartSLOduration=35.160703699 podStartE2EDuration="38.125135657s" podCreationTimestamp="2026-04-22 19:58:27 +0000 UTC" firstStartedPulling="2026-04-22 19:59:02.019747334 +0000 UTC m=+34.655130955" lastFinishedPulling="2026-04-22 19:59:04.98417929 +0000 UTC m=+37.619562913" observedRunningTime="2026-04-22 19:59:05.123649118 +0000 UTC m=+37.759032756" watchObservedRunningTime="2026-04-22 19:59:05.125135657 +0000 UTC m=+37.760519287"
Apr 22 19:59:05.199090 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:05.199050 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6q2sq" podStartSLOduration=6.724458805 podStartE2EDuration="38.199038057s" podCreationTimestamp="2026-04-22 19:58:27 +0000 UTC" firstStartedPulling="2026-04-22 19:58:30.402741369 +0000 UTC m=+3.038124999" lastFinishedPulling="2026-04-22 19:59:01.877320621 +0000 UTC m=+34.512704251" observedRunningTime="2026-04-22 19:59:05.190774579 +0000 UTC m=+37.826158219" watchObservedRunningTime="2026-04-22 19:59:05.199038057 +0000 UTC m=+37.834421696"
Apr 22 19:59:08.586510 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:08.586463 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/41de400a-d8c0-4987-9f2a-ec97460903ec-metrics-tls\") pod \"dns-default-x2267\" (UID: \"41de400a-d8c0-4987-9f2a-ec97460903ec\") " pod="openshift-dns/dns-default-x2267"
Apr 22 19:59:08.586510 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:08.586510 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52f5be1a-f9aa-4bf7-992f-277ec5922772-cert\") pod \"ingress-canary-s2p2q\" (UID: \"52f5be1a-f9aa-4bf7-992f-277ec5922772\") " pod="openshift-ingress-canary/ingress-canary-s2p2q"
Apr 22 19:59:08.586939 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:08.586617 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:59:08.586939 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:08.586621 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:59:08.586939 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:08.586671 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f5be1a-f9aa-4bf7-992f-277ec5922772-cert podName:52f5be1a-f9aa-4bf7-992f-277ec5922772 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:16.586656664 +0000 UTC m=+49.222040281 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/52f5be1a-f9aa-4bf7-992f-277ec5922772-cert") pod "ingress-canary-s2p2q" (UID: "52f5be1a-f9aa-4bf7-992f-277ec5922772") : secret "canary-serving-cert" not found
Apr 22 19:59:08.586939 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:08.586683 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41de400a-d8c0-4987-9f2a-ec97460903ec-metrics-tls podName:41de400a-d8c0-4987-9f2a-ec97460903ec nodeName:}" failed. No retries permitted until 2026-04-22 19:59:16.586677847 +0000 UTC m=+49.222061465 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/41de400a-d8c0-4987-9f2a-ec97460903ec-metrics-tls") pod "dns-default-x2267" (UID: "41de400a-d8c0-4987-9f2a-ec97460903ec") : secret "dns-default-metrics-tls" not found
Apr 22 19:59:09.107404 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:09.107299 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-kdnth" event={"ID":"c2238649-8094-4f67-abfd-33276e6b9b3a","Type":"ContainerStarted","Data":"97ae36b1f0a3009a59a18d0e24f90db4f75001a90dfd8bc8539b3a791901c091"}
Apr 22 19:59:16.640484 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:16.640445 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/41de400a-d8c0-4987-9f2a-ec97460903ec-metrics-tls\") pod \"dns-default-x2267\" (UID: \"41de400a-d8c0-4987-9f2a-ec97460903ec\") " pod="openshift-dns/dns-default-x2267"
Apr 22 19:59:16.640484 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:16.640490 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52f5be1a-f9aa-4bf7-992f-277ec5922772-cert\") pod \"ingress-canary-s2p2q\" (UID: \"52f5be1a-f9aa-4bf7-992f-277ec5922772\") " pod="openshift-ingress-canary/ingress-canary-s2p2q"
Apr 22 19:59:16.640928 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:16.640588 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:59:16.640928 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:16.640600 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:59:16.640928 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:16.640649 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41de400a-d8c0-4987-9f2a-ec97460903ec-metrics-tls podName:41de400a-d8c0-4987-9f2a-ec97460903ec nodeName:}" failed. No retries permitted until 2026-04-22 19:59:32.640633443 +0000 UTC m=+65.276017059 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/41de400a-d8c0-4987-9f2a-ec97460903ec-metrics-tls") pod "dns-default-x2267" (UID: "41de400a-d8c0-4987-9f2a-ec97460903ec") : secret "dns-default-metrics-tls" not found
Apr 22 19:59:16.640928 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:16.640665 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f5be1a-f9aa-4bf7-992f-277ec5922772-cert podName:52f5be1a-f9aa-4bf7-992f-277ec5922772 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:32.640657832 +0000 UTC m=+65.276041448 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/52f5be1a-f9aa-4bf7-992f-277ec5922772-cert") pod "ingress-canary-s2p2q" (UID: "52f5be1a-f9aa-4bf7-992f-277ec5922772") : secret "canary-serving-cert" not found
Apr 22 19:59:26.082986 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:26.082957 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mpl4p"
Apr 22 19:59:26.110720 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:26.110667 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-kdnth" podStartSLOduration=50.127566083 podStartE2EDuration="54.110656032s" podCreationTimestamp="2026-04-22 19:58:32 +0000 UTC" firstStartedPulling="2026-04-22 19:59:04.841618673 +0000 UTC m=+37.477002306" lastFinishedPulling="2026-04-22 19:59:08.824708635 +0000 UTC m=+41.460092255" observedRunningTime="2026-04-22 19:59:09.141025401 +0000 UTC m=+41.776409115" watchObservedRunningTime="2026-04-22 19:59:26.110656032 +0000 UTC m=+58.746039662"
Apr 22 19:59:32.655329 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:32.655299 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/41de400a-d8c0-4987-9f2a-ec97460903ec-metrics-tls\") pod \"dns-default-x2267\" (UID: \"41de400a-d8c0-4987-9f2a-ec97460903ec\") " pod="openshift-dns/dns-default-x2267"
Apr 22 19:59:32.655329 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:32.655332 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52f5be1a-f9aa-4bf7-992f-277ec5922772-cert\") pod \"ingress-canary-s2p2q\" (UID: \"52f5be1a-f9aa-4bf7-992f-277ec5922772\") " pod="openshift-ingress-canary/ingress-canary-s2p2q"
Apr 22 19:59:32.655733 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:32.655432 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 19:59:32.655733 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:32.655443 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 19:59:32.655733 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:32.655495 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52f5be1a-f9aa-4bf7-992f-277ec5922772-cert podName:52f5be1a-f9aa-4bf7-992f-277ec5922772 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:04.655478764 +0000 UTC m=+97.290862398 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/52f5be1a-f9aa-4bf7-992f-277ec5922772-cert") pod "ingress-canary-s2p2q" (UID: "52f5be1a-f9aa-4bf7-992f-277ec5922772") : secret "canary-serving-cert" not found
Apr 22 19:59:32.655733 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:32.655509 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41de400a-d8c0-4987-9f2a-ec97460903ec-metrics-tls podName:41de400a-d8c0-4987-9f2a-ec97460903ec nodeName:}" failed. No retries permitted until 2026-04-22 20:00:04.655502418 +0000 UTC m=+97.290886035 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/41de400a-d8c0-4987-9f2a-ec97460903ec-metrics-tls") pod "dns-default-x2267" (UID: "41de400a-d8c0-4987-9f2a-ec97460903ec") : secret "dns-default-metrics-tls" not found
Apr 22 19:59:33.662341 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:33.662302 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs\") pod \"network-metrics-daemon-4qs28\" (UID: \"6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4\") " pod="openshift-multus/network-metrics-daemon-4qs28"
Apr 22 19:59:33.662740 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:33.662461 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 19:59:33.662740 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:33.662534 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs podName:6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:37.662515116 +0000 UTC m=+130.297898739 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs") pod "network-metrics-daemon-4qs28" (UID: "6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4") : secret "metrics-daemon-secret" not found
Apr 22 19:59:36.104379 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:36.104338 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-8blrm"
Apr 22 19:59:39.056495 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.056458 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-c9bk2"]
Apr 22 19:59:39.059263 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.059242 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c9bk2"
Apr 22 19:59:39.063562 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.063540 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 22 19:59:39.064477 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.064454 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 19:59:39.064477 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.064458 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-d4q5h\""
Apr 22 19:59:39.064619 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.064493 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 19:59:39.064741 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.064724 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 22 19:59:39.068583 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.068561 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-c9bk2"]
Apr 22 19:59:39.100079 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.100054 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/be2244ac-77cc-4970-bda6-135ba736f55c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-c9bk2\" (UID: \"be2244ac-77cc-4970-bda6-135ba736f55c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c9bk2"
Apr 22 19:59:39.100160 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.100107 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/be2244ac-77cc-4970-bda6-135ba736f55c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-c9bk2\" (UID: \"be2244ac-77cc-4970-bda6-135ba736f55c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c9bk2"
Apr 22 19:59:39.100160 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.100127 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn8lb\" (UniqueName: \"kubernetes.io/projected/be2244ac-77cc-4970-bda6-135ba736f55c-kube-api-access-mn8lb\") pod \"cluster-monitoring-operator-75587bd455-c9bk2\" (UID: \"be2244ac-77cc-4970-bda6-135ba736f55c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c9bk2"
Apr 22 19:59:39.169036 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.169014 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-bbdd547d6-cdfxx"]
Apr 22 19:59:39.170676 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.170662 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-5jcmz"]
Apr 22 19:59:39.170809 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.170794 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:39.172305 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.172290 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5jcmz"
Apr 22 19:59:39.173883 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.173860 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 19:59:39.173999 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.173979 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 19:59:39.174117 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.174099 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-v9fvc\""
Apr 22 19:59:39.174497 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.174475 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 19:59:39.174805 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.174780 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-5c9xb\""
Apr 22 19:59:39.180021 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.180002 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 19:59:39.185949 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.185930 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-5jcmz"]
Apr 22 19:59:39.190482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.190465 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-bbdd547d6-cdfxx"]
Apr 22 19:59:39.200852 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.200835 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-certificates\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:39.200943 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.200869 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/be2244ac-77cc-4970-bda6-135ba736f55c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-c9bk2\" (UID: \"be2244ac-77cc-4970-bda6-135ba736f55c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c9bk2"
Apr 22 19:59:39.200943 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.200892 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15904536-b225-4e88-a726-c2cc5e1d8dd9-trusted-ca\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:39.200943 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.200906 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/15904536-b225-4e88-a726-c2cc5e1d8dd9-installation-pull-secrets\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:39.201048 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.200965 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvl6b\" (UniqueName: \"kubernetes.io/projected/4f409ab1-ad8c-4f11-b4fa-cc97501e9dc9-kube-api-access-nvl6b\") pod \"network-check-source-8894fc9bd-5jcmz\" (UID: \"4f409ab1-ad8c-4f11-b4fa-cc97501e9dc9\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5jcmz"
Apr 22 19:59:39.201048 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.201003 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/be2244ac-77cc-4970-bda6-135ba736f55c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-c9bk2\" (UID: \"be2244ac-77cc-4970-bda6-135ba736f55c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c9bk2"
Apr 22 19:59:39.201048 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:39.201008 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 19:59:39.201048 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.201019 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mn8lb\" (UniqueName: \"kubernetes.io/projected/be2244ac-77cc-4970-bda6-135ba736f55c-kube-api-access-mn8lb\") pod \"cluster-monitoring-operator-75587bd455-c9bk2\" (UID: \"be2244ac-77cc-4970-bda6-135ba736f55c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c9bk2"
Apr 22 19:59:39.201048 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.201034 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-tls\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:39.201246 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.201055 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/15904536-b225-4e88-a726-c2cc5e1d8dd9-ca-trust-extracted\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:39.201246 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:39.201094 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be2244ac-77cc-4970-bda6-135ba736f55c-cluster-monitoring-operator-tls podName:be2244ac-77cc-4970-bda6-135ba736f55c nodeName:}" failed. No retries permitted until 2026-04-22 19:59:39.701053177 +0000 UTC m=+72.336436800 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/be2244ac-77cc-4970-bda6-135ba736f55c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-c9bk2" (UID: "be2244ac-77cc-4970-bda6-135ba736f55c") : secret "cluster-monitoring-operator-tls" not found
Apr 22 19:59:39.201246 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.201190 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/15904536-b225-4e88-a726-c2cc5e1d8dd9-image-registry-private-configuration\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:39.201412 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.201241 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-bound-sa-token\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:39.201412 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.201267 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcvlf\" (UniqueName: \"kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-kube-api-access-rcvlf\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:39.202057 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.202041 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/be2244ac-77cc-4970-bda6-135ba736f55c-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-c9bk2\" (UID: \"be2244ac-77cc-4970-bda6-135ba736f55c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c9bk2"
Apr 22 19:59:39.221425 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.221407 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn8lb\" (UniqueName: \"kubernetes.io/projected/be2244ac-77cc-4970-bda6-135ba736f55c-kube-api-access-mn8lb\") pod \"cluster-monitoring-operator-75587bd455-c9bk2\" (UID: \"be2244ac-77cc-4970-bda6-135ba736f55c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c9bk2"
Apr 22 19:59:39.260041 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.260019 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-5hfh2"]
Apr 22 19:59:39.261938 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.261924 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2"
Apr 22 19:59:39.264284 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.264266 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 22 19:59:39.264387 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.264296 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 22 19:59:39.264387 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.264269 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-nklfb\""
Apr 22 19:59:39.264502 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.264385 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 22 19:59:39.264502 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.264427 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 22 19:59:39.270830 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.270810 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-5hfh2"]
Apr 22 19:59:39.271979 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.271964 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 22 19:59:39.302027 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.302007 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9453f8c-04f5-4b72-b26e-c5ccc3bfed06-serving-cert\") pod \"console-operator-9d4b6777b-5hfh2\" (UID: \"e9453f8c-04f5-4b72-b26e-c5ccc3bfed06\") " pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2"
Apr 22 19:59:39.302128 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.302043 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-tls\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:39.302128 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.302062 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/15904536-b225-4e88-a726-c2cc5e1d8dd9-ca-trust-extracted\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:39.302128 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.302102 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/15904536-b225-4e88-a726-c2cc5e1d8dd9-image-registry-private-configuration\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:39.302280 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.302143 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-bound-sa-token\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:39.302280 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:39.302146 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:59:39.302280 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:39.302160 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bbdd547d6-cdfxx: secret "image-registry-tls" not found
Apr 22 19:59:39.302280 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:39.302223 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-tls podName:15904536-b225-4e88-a726-c2cc5e1d8dd9 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:39.802209214 +0000 UTC m=+72.437592831 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-tls") pod "image-registry-bbdd547d6-cdfxx" (UID: "15904536-b225-4e88-a726-c2cc5e1d8dd9") : secret "image-registry-tls" not found
Apr 22 19:59:39.302280 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.302160 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rcvlf\" (UniqueName: \"kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-kube-api-access-rcvlf\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:39.302560 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.302300 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-certificates\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:39.302560 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.302351 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15904536-b225-4e88-a726-c2cc5e1d8dd9-trusted-ca\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:39.302560 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.302383 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/15904536-b225-4e88-a726-c2cc5e1d8dd9-ca-trust-extracted\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:39.302560 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.302459 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/15904536-b225-4e88-a726-c2cc5e1d8dd9-installation-pull-secrets\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:39.302560 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.302500 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9453f8c-04f5-4b72-b26e-c5ccc3bfed06-trusted-ca\") pod \"console-operator-9d4b6777b-5hfh2\" (UID: \"e9453f8c-04f5-4b72-b26e-c5ccc3bfed06\") " pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2"
Apr 22 19:59:39.302560 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.302538 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9453f8c-04f5-4b72-b26e-c5ccc3bfed06-config\") pod \"console-operator-9d4b6777b-5hfh2\" (UID: \"e9453f8c-04f5-4b72-b26e-c5ccc3bfed06\") " pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2"
Apr 22 19:59:39.302840 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.302579 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nnrv\" (UniqueName: \"kubernetes.io/projected/e9453f8c-04f5-4b72-b26e-c5ccc3bfed06-kube-api-access-4nnrv\") pod \"console-operator-9d4b6777b-5hfh2\" (UID: \"e9453f8c-04f5-4b72-b26e-c5ccc3bfed06\") " pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2"
Apr 22 19:59:39.302840 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.302654 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvl6b\" (UniqueName: \"kubernetes.io/projected/4f409ab1-ad8c-4f11-b4fa-cc97501e9dc9-kube-api-access-nvl6b\") pod \"network-check-source-8894fc9bd-5jcmz\" (UID: \"4f409ab1-ad8c-4f11-b4fa-cc97501e9dc9\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5jcmz"
Apr 22 19:59:39.302958 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.302936 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-certificates\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:39.303661 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.303645 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15904536-b225-4e88-a726-c2cc5e1d8dd9-trusted-ca\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:39.304526 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.304511 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/15904536-b225-4e88-a726-c2cc5e1d8dd9-image-registry-private-configuration\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:39.304704 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.304689 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/15904536-b225-4e88-a726-c2cc5e1d8dd9-installation-pull-secrets\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr
22 19:59:39.316084 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.313744 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvl6b\" (UniqueName: \"kubernetes.io/projected/4f409ab1-ad8c-4f11-b4fa-cc97501e9dc9-kube-api-access-nvl6b\") pod \"network-check-source-8894fc9bd-5jcmz\" (UID: \"4f409ab1-ad8c-4f11-b4fa-cc97501e9dc9\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5jcmz" Apr 22 19:59:39.316084 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.315057 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-bound-sa-token\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx" Apr 22 19:59:39.317083 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.317059 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcvlf\" (UniqueName: \"kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-kube-api-access-rcvlf\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx" Apr 22 19:59:39.403860 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.403836 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9453f8c-04f5-4b72-b26e-c5ccc3bfed06-trusted-ca\") pod \"console-operator-9d4b6777b-5hfh2\" (UID: \"e9453f8c-04f5-4b72-b26e-c5ccc3bfed06\") " pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2" Apr 22 19:59:39.403960 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.403863 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9453f8c-04f5-4b72-b26e-c5ccc3bfed06-config\") pod 
\"console-operator-9d4b6777b-5hfh2\" (UID: \"e9453f8c-04f5-4b72-b26e-c5ccc3bfed06\") " pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2" Apr 22 19:59:39.403960 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.403881 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nnrv\" (UniqueName: \"kubernetes.io/projected/e9453f8c-04f5-4b72-b26e-c5ccc3bfed06-kube-api-access-4nnrv\") pod \"console-operator-9d4b6777b-5hfh2\" (UID: \"e9453f8c-04f5-4b72-b26e-c5ccc3bfed06\") " pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2" Apr 22 19:59:39.403960 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.403904 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9453f8c-04f5-4b72-b26e-c5ccc3bfed06-serving-cert\") pod \"console-operator-9d4b6777b-5hfh2\" (UID: \"e9453f8c-04f5-4b72-b26e-c5ccc3bfed06\") " pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2" Apr 22 19:59:39.404456 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.404437 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9453f8c-04f5-4b72-b26e-c5ccc3bfed06-config\") pod \"console-operator-9d4b6777b-5hfh2\" (UID: \"e9453f8c-04f5-4b72-b26e-c5ccc3bfed06\") " pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2" Apr 22 19:59:39.404596 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.404576 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e9453f8c-04f5-4b72-b26e-c5ccc3bfed06-trusted-ca\") pod \"console-operator-9d4b6777b-5hfh2\" (UID: \"e9453f8c-04f5-4b72-b26e-c5ccc3bfed06\") " pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2" Apr 22 19:59:39.406283 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.406265 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9453f8c-04f5-4b72-b26e-c5ccc3bfed06-serving-cert\") pod \"console-operator-9d4b6777b-5hfh2\" (UID: \"e9453f8c-04f5-4b72-b26e-c5ccc3bfed06\") " pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2" Apr 22 19:59:39.411192 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.411172 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nnrv\" (UniqueName: \"kubernetes.io/projected/e9453f8c-04f5-4b72-b26e-c5ccc3bfed06-kube-api-access-4nnrv\") pod \"console-operator-9d4b6777b-5hfh2\" (UID: \"e9453f8c-04f5-4b72-b26e-c5ccc3bfed06\") " pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2" Apr 22 19:59:39.486216 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.486198 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5jcmz" Apr 22 19:59:39.571648 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.571575 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2" Apr 22 19:59:39.594897 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.594870 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-5jcmz"] Apr 22 19:59:39.599427 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:59:39.599402 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f409ab1_ad8c_4f11_b4fa_cc97501e9dc9.slice/crio-6e28e259282f229cfc366ad39360340e98e8642231ae7dbd9aaf24f3ce463175 WatchSource:0}: Error finding container 6e28e259282f229cfc366ad39360340e98e8642231ae7dbd9aaf24f3ce463175: Status 404 returned error can't find the container with id 6e28e259282f229cfc366ad39360340e98e8642231ae7dbd9aaf24f3ce463175 Apr 22 19:59:39.687805 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.687769 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-5hfh2"] Apr 22 19:59:39.690347 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:59:39.690324 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9453f8c_04f5_4b72_b26e_c5ccc3bfed06.slice/crio-4474d3985833d48d744e110bf0d754f5621b62b57075d3e46b36aa9200ee58af WatchSource:0}: Error finding container 4474d3985833d48d744e110bf0d754f5621b62b57075d3e46b36aa9200ee58af: Status 404 returned error can't find the container with id 4474d3985833d48d744e110bf0d754f5621b62b57075d3e46b36aa9200ee58af Apr 22 19:59:39.706054 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.706035 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/be2244ac-77cc-4970-bda6-135ba736f55c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-c9bk2\" (UID: 
\"be2244ac-77cc-4970-bda6-135ba736f55c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c9bk2" Apr 22 19:59:39.706187 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:39.706170 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:59:39.706244 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:39.706223 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be2244ac-77cc-4970-bda6-135ba736f55c-cluster-monitoring-operator-tls podName:be2244ac-77cc-4970-bda6-135ba736f55c nodeName:}" failed. No retries permitted until 2026-04-22 19:59:40.706209357 +0000 UTC m=+73.341592974 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/be2244ac-77cc-4970-bda6-135ba736f55c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-c9bk2" (UID: "be2244ac-77cc-4970-bda6-135ba736f55c") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:59:39.806932 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:39.806903 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-tls\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx" Apr 22 19:59:39.807084 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:39.807025 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:59:39.807084 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:39.807036 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bbdd547d6-cdfxx: secret "image-registry-tls" not found Apr 22 19:59:39.807161 
ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:39.807089 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-tls podName:15904536-b225-4e88-a726-c2cc5e1d8dd9 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:40.807076082 +0000 UTC m=+73.442459700 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-tls") pod "image-registry-bbdd547d6-cdfxx" (UID: "15904536-b225-4e88-a726-c2cc5e1d8dd9") : secret "image-registry-tls" not found Apr 22 19:59:40.165586 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:40.165555 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5jcmz" event={"ID":"4f409ab1-ad8c-4f11-b4fa-cc97501e9dc9","Type":"ContainerStarted","Data":"5adff7b3d541dee911df74c4c9786688ed248dd44e9319cb73c6c85e1b847418"} Apr 22 19:59:40.165978 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:40.165591 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5jcmz" event={"ID":"4f409ab1-ad8c-4f11-b4fa-cc97501e9dc9","Type":"ContainerStarted","Data":"6e28e259282f229cfc366ad39360340e98e8642231ae7dbd9aaf24f3ce463175"} Apr 22 19:59:40.166581 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:40.166562 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2" event={"ID":"e9453f8c-04f5-4b72-b26e-c5ccc3bfed06","Type":"ContainerStarted","Data":"4474d3985833d48d744e110bf0d754f5621b62b57075d3e46b36aa9200ee58af"} Apr 22 19:59:40.180812 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:40.180772 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-5jcmz" podStartSLOduration=1.180760698 
podStartE2EDuration="1.180760698s" podCreationTimestamp="2026-04-22 19:59:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:59:40.180307372 +0000 UTC m=+72.815691011" watchObservedRunningTime="2026-04-22 19:59:40.180760698 +0000 UTC m=+72.816144368" Apr 22 19:59:40.713540 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:40.713499 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/be2244ac-77cc-4970-bda6-135ba736f55c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-c9bk2\" (UID: \"be2244ac-77cc-4970-bda6-135ba736f55c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c9bk2" Apr 22 19:59:40.713720 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:40.713659 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:59:40.713784 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:40.713738 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be2244ac-77cc-4970-bda6-135ba736f55c-cluster-monitoring-operator-tls podName:be2244ac-77cc-4970-bda6-135ba736f55c nodeName:}" failed. No retries permitted until 2026-04-22 19:59:42.713718245 +0000 UTC m=+75.349101865 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/be2244ac-77cc-4970-bda6-135ba736f55c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-c9bk2" (UID: "be2244ac-77cc-4970-bda6-135ba736f55c") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:59:40.814173 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:40.814145 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-tls\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx" Apr 22 19:59:40.814333 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:40.814314 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:59:40.814397 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:40.814335 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bbdd547d6-cdfxx: secret "image-registry-tls" not found Apr 22 19:59:40.814433 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:40.814398 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-tls podName:15904536-b225-4e88-a726-c2cc5e1d8dd9 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:42.814384213 +0000 UTC m=+75.449767833 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-tls") pod "image-registry-bbdd547d6-cdfxx" (UID: "15904536-b225-4e88-a726-c2cc5e1d8dd9") : secret "image-registry-tls" not found Apr 22 19:59:42.171841 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:42.171815 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/0.log" Apr 22 19:59:42.172158 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:42.171859 2574 generic.go:358] "Generic (PLEG): container finished" podID="e9453f8c-04f5-4b72-b26e-c5ccc3bfed06" containerID="ed021ab5805e2daf5886dcfe5dd4c8721ba1bf2a55230643048d78e73edc4d8e" exitCode=255 Apr 22 19:59:42.172158 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:42.171891 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2" event={"ID":"e9453f8c-04f5-4b72-b26e-c5ccc3bfed06","Type":"ContainerDied","Data":"ed021ab5805e2daf5886dcfe5dd4c8721ba1bf2a55230643048d78e73edc4d8e"} Apr 22 19:59:42.172158 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:42.172108 2574 scope.go:117] "RemoveContainer" containerID="ed021ab5805e2daf5886dcfe5dd4c8721ba1bf2a55230643048d78e73edc4d8e" Apr 22 19:59:42.725856 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:42.725829 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/be2244ac-77cc-4970-bda6-135ba736f55c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-c9bk2\" (UID: \"be2244ac-77cc-4970-bda6-135ba736f55c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c9bk2" Apr 22 19:59:42.725987 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:42.725958 2574 secret.go:189] Couldn't get secret 
openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 19:59:42.726027 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:42.726020 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be2244ac-77cc-4970-bda6-135ba736f55c-cluster-monitoring-operator-tls podName:be2244ac-77cc-4970-bda6-135ba736f55c nodeName:}" failed. No retries permitted until 2026-04-22 19:59:46.726005303 +0000 UTC m=+79.361388920 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/be2244ac-77cc-4970-bda6-135ba736f55c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-c9bk2" (UID: "be2244ac-77cc-4970-bda6-135ba736f55c") : secret "cluster-monitoring-operator-tls" not found Apr 22 19:59:42.826700 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:42.826676 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-tls\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx" Apr 22 19:59:42.826811 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:42.826798 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 19:59:42.826852 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:42.826813 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bbdd547d6-cdfxx: secret "image-registry-tls" not found Apr 22 19:59:42.826889 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:42.826857 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-tls 
podName:15904536-b225-4e88-a726-c2cc5e1d8dd9 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:46.826843925 +0000 UTC m=+79.462227542 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-tls") pod "image-registry-bbdd547d6-cdfxx" (UID: "15904536-b225-4e88-a726-c2cc5e1d8dd9") : secret "image-registry-tls" not found Apr 22 19:59:43.174985 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:43.174964 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/1.log" Apr 22 19:59:43.175347 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:43.175333 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/0.log" Apr 22 19:59:43.175416 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:43.175377 2574 generic.go:358] "Generic (PLEG): container finished" podID="e9453f8c-04f5-4b72-b26e-c5ccc3bfed06" containerID="b586e43f7e061d1e1527f60d53b9acec0df753fed01309ef5dda38ecf5c13e5d" exitCode=255 Apr 22 19:59:43.175416 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:43.175409 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2" event={"ID":"e9453f8c-04f5-4b72-b26e-c5ccc3bfed06","Type":"ContainerDied","Data":"b586e43f7e061d1e1527f60d53b9acec0df753fed01309ef5dda38ecf5c13e5d"} Apr 22 19:59:43.175482 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:43.175436 2574 scope.go:117] "RemoveContainer" containerID="ed021ab5805e2daf5886dcfe5dd4c8721ba1bf2a55230643048d78e73edc4d8e" Apr 22 19:59:43.175661 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:43.175643 2574 scope.go:117] "RemoveContainer" containerID="b586e43f7e061d1e1527f60d53b9acec0df753fed01309ef5dda38ecf5c13e5d" 
Apr 22 19:59:43.175831 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:43.175812 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-5hfh2_openshift-console-operator(e9453f8c-04f5-4b72-b26e-c5ccc3bfed06)\"" pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2" podUID="e9453f8c-04f5-4b72-b26e-c5ccc3bfed06" Apr 22 19:59:43.881405 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:43.881374 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-j65gx"] Apr 22 19:59:43.906526 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:43.906505 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-j65gx"] Apr 22 19:59:43.906680 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:43.906664 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-j65gx" Apr 22 19:59:43.909796 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:43.909778 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-ds2dg\"" Apr 22 19:59:43.909949 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:43.909934 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 19:59:43.910646 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:43.910631 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 19:59:43.935814 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:43.935793 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/421f7c2c-ae76-458c-aaf2-422f5b7a1f27-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-j65gx\" (UID: \"421f7c2c-ae76-458c-aaf2-422f5b7a1f27\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j65gx" Apr 22 19:59:43.935903 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:43.935839 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/421f7c2c-ae76-458c-aaf2-422f5b7a1f27-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-j65gx\" (UID: \"421f7c2c-ae76-458c-aaf2-422f5b7a1f27\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j65gx" Apr 22 19:59:44.036473 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:44.036447 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/421f7c2c-ae76-458c-aaf2-422f5b7a1f27-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-j65gx\" (UID: \"421f7c2c-ae76-458c-aaf2-422f5b7a1f27\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j65gx" Apr 22 19:59:44.036559 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:44.036517 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/421f7c2c-ae76-458c-aaf2-422f5b7a1f27-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-j65gx\" (UID: \"421f7c2c-ae76-458c-aaf2-422f5b7a1f27\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j65gx" Apr 22 19:59:44.036608 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:44.036578 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 19:59:44.036639 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:44.036632 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/421f7c2c-ae76-458c-aaf2-422f5b7a1f27-networking-console-plugin-cert podName:421f7c2c-ae76-458c-aaf2-422f5b7a1f27 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:44.536617376 +0000 UTC m=+77.172000993 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/421f7c2c-ae76-458c-aaf2-422f5b7a1f27-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-j65gx" (UID: "421f7c2c-ae76-458c-aaf2-422f5b7a1f27") : secret "networking-console-plugin-cert" not found
Apr 22 19:59:44.037000 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:44.036985 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/421f7c2c-ae76-458c-aaf2-422f5b7a1f27-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-j65gx\" (UID: \"421f7c2c-ae76-458c-aaf2-422f5b7a1f27\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j65gx"
Apr 22 19:59:44.178045 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:44.177995 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/1.log"
Apr 22 19:59:44.178303 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:44.178295 2574 scope.go:117] "RemoveContainer" containerID="b586e43f7e061d1e1527f60d53b9acec0df753fed01309ef5dda38ecf5c13e5d"
Apr 22 19:59:44.178474 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:44.178457 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-5hfh2_openshift-console-operator(e9453f8c-04f5-4b72-b26e-c5ccc3bfed06)\"" pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2" podUID="e9453f8c-04f5-4b72-b26e-c5ccc3bfed06"
Apr 22 19:59:44.540843 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:44.540770 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/421f7c2c-ae76-458c-aaf2-422f5b7a1f27-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-j65gx\" (UID: \"421f7c2c-ae76-458c-aaf2-422f5b7a1f27\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j65gx"
Apr 22 19:59:44.541042 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:44.540875 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 19:59:44.541042 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:44.540924 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/421f7c2c-ae76-458c-aaf2-422f5b7a1f27-networking-console-plugin-cert podName:421f7c2c-ae76-458c-aaf2-422f5b7a1f27 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:45.540911875 +0000 UTC m=+78.176295492 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/421f7c2c-ae76-458c-aaf2-422f5b7a1f27-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-j65gx" (UID: "421f7c2c-ae76-458c-aaf2-422f5b7a1f27") : secret "networking-console-plugin-cert" not found
Apr 22 19:59:45.547468 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:45.547430 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/421f7c2c-ae76-458c-aaf2-422f5b7a1f27-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-j65gx\" (UID: \"421f7c2c-ae76-458c-aaf2-422f5b7a1f27\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j65gx"
Apr 22 19:59:45.547872 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:45.547560 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 19:59:45.547872 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:45.547614 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/421f7c2c-ae76-458c-aaf2-422f5b7a1f27-networking-console-plugin-cert podName:421f7c2c-ae76-458c-aaf2-422f5b7a1f27 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:47.54760076 +0000 UTC m=+80.182984377 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/421f7c2c-ae76-458c-aaf2-422f5b7a1f27-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-j65gx" (UID: "421f7c2c-ae76-458c-aaf2-422f5b7a1f27") : secret "networking-console-plugin-cert" not found
Apr 22 19:59:46.754752 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:46.754627 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/be2244ac-77cc-4970-bda6-135ba736f55c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-c9bk2\" (UID: \"be2244ac-77cc-4970-bda6-135ba736f55c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c9bk2"
Apr 22 19:59:46.754752 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:46.754727 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 19:59:46.755140 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:46.754781 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be2244ac-77cc-4970-bda6-135ba736f55c-cluster-monitoring-operator-tls podName:be2244ac-77cc-4970-bda6-135ba736f55c nodeName:}" failed. No retries permitted until 2026-04-22 19:59:54.754765811 +0000 UTC m=+87.390149433 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/be2244ac-77cc-4970-bda6-135ba736f55c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-c9bk2" (UID: "be2244ac-77cc-4970-bda6-135ba736f55c") : secret "cluster-monitoring-operator-tls" not found
Apr 22 19:59:46.855915 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:46.855887 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-tls\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:46.856061 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:46.856047 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 19:59:46.856115 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:46.856068 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bbdd547d6-cdfxx: secret "image-registry-tls" not found
Apr 22 19:59:46.856166 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:46.856132 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-tls podName:15904536-b225-4e88-a726-c2cc5e1d8dd9 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:54.856111336 +0000 UTC m=+87.491494968 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-tls") pod "image-registry-bbdd547d6-cdfxx" (UID: "15904536-b225-4e88-a726-c2cc5e1d8dd9") : secret "image-registry-tls" not found
Apr 22 19:59:47.314604 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:47.314565 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-5kc67"]
Apr 22 19:59:47.317454 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:47.317430 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-5kc67"
Apr 22 19:59:47.320030 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:47.320009 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 22 19:59:47.320167 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:47.320145 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 22 19:59:47.320468 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:47.320451 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 22 19:59:47.321170 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:47.321154 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-587b9\""
Apr 22 19:59:47.321235 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:47.321170 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 22 19:59:47.325880 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:47.325859 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-5kc67"]
Apr 22 19:59:47.359012 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:47.358987 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lctf\" (UniqueName: \"kubernetes.io/projected/917d3e9d-e110-4c75-ba5a-b45334a9e95b-kube-api-access-4lctf\") pod \"service-ca-865cb79987-5kc67\" (UID: \"917d3e9d-e110-4c75-ba5a-b45334a9e95b\") " pod="openshift-service-ca/service-ca-865cb79987-5kc67"
Apr 22 19:59:47.359162 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:47.359031 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/917d3e9d-e110-4c75-ba5a-b45334a9e95b-signing-key\") pod \"service-ca-865cb79987-5kc67\" (UID: \"917d3e9d-e110-4c75-ba5a-b45334a9e95b\") " pod="openshift-service-ca/service-ca-865cb79987-5kc67"
Apr 22 19:59:47.359162 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:47.359053 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/917d3e9d-e110-4c75-ba5a-b45334a9e95b-signing-cabundle\") pod \"service-ca-865cb79987-5kc67\" (UID: \"917d3e9d-e110-4c75-ba5a-b45334a9e95b\") " pod="openshift-service-ca/service-ca-865cb79987-5kc67"
Apr 22 19:59:47.459804 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:47.459768 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4lctf\" (UniqueName: \"kubernetes.io/projected/917d3e9d-e110-4c75-ba5a-b45334a9e95b-kube-api-access-4lctf\") pod \"service-ca-865cb79987-5kc67\" (UID: \"917d3e9d-e110-4c75-ba5a-b45334a9e95b\") " pod="openshift-service-ca/service-ca-865cb79987-5kc67"
Apr 22 19:59:47.459909 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:47.459837 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/917d3e9d-e110-4c75-ba5a-b45334a9e95b-signing-key\") pod \"service-ca-865cb79987-5kc67\" (UID: \"917d3e9d-e110-4c75-ba5a-b45334a9e95b\") " pod="openshift-service-ca/service-ca-865cb79987-5kc67"
Apr 22 19:59:47.459909 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:47.459871 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/917d3e9d-e110-4c75-ba5a-b45334a9e95b-signing-cabundle\") pod \"service-ca-865cb79987-5kc67\" (UID: \"917d3e9d-e110-4c75-ba5a-b45334a9e95b\") " pod="openshift-service-ca/service-ca-865cb79987-5kc67"
Apr 22 19:59:47.460504 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:47.460485 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/917d3e9d-e110-4c75-ba5a-b45334a9e95b-signing-cabundle\") pod \"service-ca-865cb79987-5kc67\" (UID: \"917d3e9d-e110-4c75-ba5a-b45334a9e95b\") " pod="openshift-service-ca/service-ca-865cb79987-5kc67"
Apr 22 19:59:47.462211 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:47.462194 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/917d3e9d-e110-4c75-ba5a-b45334a9e95b-signing-key\") pod \"service-ca-865cb79987-5kc67\" (UID: \"917d3e9d-e110-4c75-ba5a-b45334a9e95b\") " pod="openshift-service-ca/service-ca-865cb79987-5kc67"
Apr 22 19:59:47.467457 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:47.467439 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lctf\" (UniqueName: \"kubernetes.io/projected/917d3e9d-e110-4c75-ba5a-b45334a9e95b-kube-api-access-4lctf\") pod \"service-ca-865cb79987-5kc67\" (UID: \"917d3e9d-e110-4c75-ba5a-b45334a9e95b\") " pod="openshift-service-ca/service-ca-865cb79987-5kc67"
Apr 22 19:59:47.561197 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:47.561177 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/421f7c2c-ae76-458c-aaf2-422f5b7a1f27-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-j65gx\" (UID: \"421f7c2c-ae76-458c-aaf2-422f5b7a1f27\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j65gx"
Apr 22 19:59:47.561290 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:47.561256 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 19:59:47.561336 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:47.561312 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/421f7c2c-ae76-458c-aaf2-422f5b7a1f27-networking-console-plugin-cert podName:421f7c2c-ae76-458c-aaf2-422f5b7a1f27 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:51.561299396 +0000 UTC m=+84.196683013 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/421f7c2c-ae76-458c-aaf2-422f5b7a1f27-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-j65gx" (UID: "421f7c2c-ae76-458c-aaf2-422f5b7a1f27") : secret "networking-console-plugin-cert" not found
Apr 22 19:59:47.626693 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:47.626642 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-5kc67"
Apr 22 19:59:47.740182 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:47.740157 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-5kc67"]
Apr 22 19:59:47.742561 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:59:47.742535 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod917d3e9d_e110_4c75_ba5a_b45334a9e95b.slice/crio-a9b2d72c5c97a5eeef9bc061dd46a69dc9c6ff00b56a2ba24e12baec44c81da6 WatchSource:0}: Error finding container a9b2d72c5c97a5eeef9bc061dd46a69dc9c6ff00b56a2ba24e12baec44c81da6: Status 404 returned error can't find the container with id a9b2d72c5c97a5eeef9bc061dd46a69dc9c6ff00b56a2ba24e12baec44c81da6
Apr 22 19:59:48.186595 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:48.186553 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-5kc67" event={"ID":"917d3e9d-e110-4c75-ba5a-b45334a9e95b","Type":"ContainerStarted","Data":"a9b2d72c5c97a5eeef9bc061dd46a69dc9c6ff00b56a2ba24e12baec44c81da6"}
Apr 22 19:59:48.448121 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:48.448033 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-s75m7_64949cb3-7087-4f51-8a7a-81b46c0895c9/dns-node-resolver/0.log"
Apr 22 19:59:49.572139 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:49.572111 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2"
Apr 22 19:59:49.572420 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:49.572152 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2"
Apr 22 19:59:49.572673 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:49.572658 2574 scope.go:117] "RemoveContainer" containerID="b586e43f7e061d1e1527f60d53b9acec0df753fed01309ef5dda38ecf5c13e5d"
Apr 22 19:59:49.572892 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:49.572872 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-5hfh2_openshift-console-operator(e9453f8c-04f5-4b72-b26e-c5ccc3bfed06)\"" pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2" podUID="e9453f8c-04f5-4b72-b26e-c5ccc3bfed06"
Apr 22 19:59:49.647757 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:49.647731 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-m58j4_77f8f3a1-aebf-4a43-97ef-0a217a8920be/node-ca/0.log"
Apr 22 19:59:50.191295 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:50.191264 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-5kc67" event={"ID":"917d3e9d-e110-4c75-ba5a-b45334a9e95b","Type":"ContainerStarted","Data":"e005fca30c193a73ab3953057f5c2f65710a6becf74c584444bd96d48bc517d3"}
Apr 22 19:59:50.208614 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:50.208567 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-5kc67" podStartSLOduration=1.4564388209999999 podStartE2EDuration="3.208554136s" podCreationTimestamp="2026-04-22 19:59:47 +0000 UTC" firstStartedPulling="2026-04-22 19:59:47.744651207 +0000 UTC m=+80.380034824" lastFinishedPulling="2026-04-22 19:59:49.496766509 +0000 UTC m=+82.132150139" observedRunningTime="2026-04-22 19:59:50.207597329 +0000 UTC m=+82.842980967" watchObservedRunningTime="2026-04-22 19:59:50.208554136 +0000 UTC m=+82.843937780"
Apr 22 19:59:51.593113 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:51.593080 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/421f7c2c-ae76-458c-aaf2-422f5b7a1f27-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-j65gx\" (UID: \"421f7c2c-ae76-458c-aaf2-422f5b7a1f27\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j65gx"
Apr 22 19:59:51.593489 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:51.593208 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 19:59:51.593489 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:51.593271 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/421f7c2c-ae76-458c-aaf2-422f5b7a1f27-networking-console-plugin-cert podName:421f7c2c-ae76-458c-aaf2-422f5b7a1f27 nodeName:}" failed. No retries permitted until 2026-04-22 19:59:59.593255801 +0000 UTC m=+92.228639417 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/421f7c2c-ae76-458c-aaf2-422f5b7a1f27-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-j65gx" (UID: "421f7c2c-ae76-458c-aaf2-422f5b7a1f27") : secret "networking-console-plugin-cert" not found
Apr 22 19:59:54.815719 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:54.815683 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/be2244ac-77cc-4970-bda6-135ba736f55c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-c9bk2\" (UID: \"be2244ac-77cc-4970-bda6-135ba736f55c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c9bk2"
Apr 22 19:59:54.816149 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:54.815809 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 19:59:54.816149 ip-10-0-128-61 kubenswrapper[2574]: E0422 19:59:54.815869 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/be2244ac-77cc-4970-bda6-135ba736f55c-cluster-monitoring-operator-tls podName:be2244ac-77cc-4970-bda6-135ba736f55c nodeName:}" failed. No retries permitted until 2026-04-22 20:00:10.815852579 +0000 UTC m=+103.451236207 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/be2244ac-77cc-4970-bda6-135ba736f55c-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-c9bk2" (UID: "be2244ac-77cc-4970-bda6-135ba736f55c") : secret "cluster-monitoring-operator-tls" not found
Apr 22 19:59:54.916727 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:54.916694 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-tls\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:54.918878 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:54.918851 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-tls\") pod \"image-registry-bbdd547d6-cdfxx\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") " pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:55.081823 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:55.081749 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:55.196283 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:55.196256 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-bbdd547d6-cdfxx"]
Apr 22 19:59:55.200117 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:59:55.200085 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15904536_b225_4e88_a726_c2cc5e1d8dd9.slice/crio-992d9448b40350cb7723a9fdf2024ebf976cd9bd350cfa848a57c53ed5c1d0a2 WatchSource:0}: Error finding container 992d9448b40350cb7723a9fdf2024ebf976cd9bd350cfa848a57c53ed5c1d0a2: Status 404 returned error can't find the container with id 992d9448b40350cb7723a9fdf2024ebf976cd9bd350cfa848a57c53ed5c1d0a2
Apr 22 19:59:56.204758 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:56.204723 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx" event={"ID":"15904536-b225-4e88-a726-c2cc5e1d8dd9","Type":"ContainerStarted","Data":"cde67e52ea8a1f6c842318018313c52a58bcd418eaa2cb5525526d63e24b2631"}
Apr 22 19:59:56.204758 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:56.204757 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx" event={"ID":"15904536-b225-4e88-a726-c2cc5e1d8dd9","Type":"ContainerStarted","Data":"992d9448b40350cb7723a9fdf2024ebf976cd9bd350cfa848a57c53ed5c1d0a2"}
Apr 22 19:59:56.205163 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:56.204857 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 19:59:56.224660 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:56.224622 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx" podStartSLOduration=17.224609357 podStartE2EDuration="17.224609357s" podCreationTimestamp="2026-04-22 19:59:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:59:56.223543608 +0000 UTC m=+88.858927249" watchObservedRunningTime="2026-04-22 19:59:56.224609357 +0000 UTC m=+88.859993026"
Apr 22 19:59:59.653146 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:59.653106 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/421f7c2c-ae76-458c-aaf2-422f5b7a1f27-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-j65gx\" (UID: \"421f7c2c-ae76-458c-aaf2-422f5b7a1f27\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j65gx"
Apr 22 19:59:59.655455 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:59.655433 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/421f7c2c-ae76-458c-aaf2-422f5b7a1f27-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-j65gx\" (UID: \"421f7c2c-ae76-458c-aaf2-422f5b7a1f27\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-j65gx"
Apr 22 19:59:59.814914 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:59.814877 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-j65gx"
Apr 22 19:59:59.928574 ip-10-0-128-61 kubenswrapper[2574]: I0422 19:59:59.928511 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-j65gx"]
Apr 22 19:59:59.931859 ip-10-0-128-61 kubenswrapper[2574]: W0422 19:59:59.931836 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod421f7c2c_ae76_458c_aaf2_422f5b7a1f27.slice/crio-4717ac8fbcb317489f1aafe36c83b1ba511159a9d6ad16f5f95c14fcac73c8c5 WatchSource:0}: Error finding container 4717ac8fbcb317489f1aafe36c83b1ba511159a9d6ad16f5f95c14fcac73c8c5: Status 404 returned error can't find the container with id 4717ac8fbcb317489f1aafe36c83b1ba511159a9d6ad16f5f95c14fcac73c8c5
Apr 22 20:00:00.216089 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:00.216009 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-j65gx" event={"ID":"421f7c2c-ae76-458c-aaf2-422f5b7a1f27","Type":"ContainerStarted","Data":"4717ac8fbcb317489f1aafe36c83b1ba511159a9d6ad16f5f95c14fcac73c8c5"}
Apr 22 20:00:01.219529 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:01.219497 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-j65gx" event={"ID":"421f7c2c-ae76-458c-aaf2-422f5b7a1f27","Type":"ContainerStarted","Data":"6acdd49dcc7493a8473dd3a362df37d02b1fb3dbe6f840349857cdd9e795bb79"}
Apr 22 20:00:01.234427 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:01.234389 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-j65gx" podStartSLOduration=17.050959047 podStartE2EDuration="18.234376656s" podCreationTimestamp="2026-04-22 19:59:43 +0000 UTC" firstStartedPulling="2026-04-22 19:59:59.934178502 +0000 UTC m=+92.569562120" lastFinishedPulling="2026-04-22 20:00:01.117596112 +0000 UTC m=+93.752979729" observedRunningTime="2026-04-22 20:00:01.233458876 +0000 UTC m=+93.868842515" watchObservedRunningTime="2026-04-22 20:00:01.234376656 +0000 UTC m=+93.869760289"
Apr 22 20:00:04.690350 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:04.690312 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52f5be1a-f9aa-4bf7-992f-277ec5922772-cert\") pod \"ingress-canary-s2p2q\" (UID: \"52f5be1a-f9aa-4bf7-992f-277ec5922772\") " pod="openshift-ingress-canary/ingress-canary-s2p2q"
Apr 22 20:00:04.690709 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:04.690405 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/41de400a-d8c0-4987-9f2a-ec97460903ec-metrics-tls\") pod \"dns-default-x2267\" (UID: \"41de400a-d8c0-4987-9f2a-ec97460903ec\") " pod="openshift-dns/dns-default-x2267"
Apr 22 20:00:04.692628 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:04.692607 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/41de400a-d8c0-4987-9f2a-ec97460903ec-metrics-tls\") pod \"dns-default-x2267\" (UID: \"41de400a-d8c0-4987-9f2a-ec97460903ec\") " pod="openshift-dns/dns-default-x2267"
Apr 22 20:00:04.692720 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:04.692607 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52f5be1a-f9aa-4bf7-992f-277ec5922772-cert\") pod \"ingress-canary-s2p2q\" (UID: \"52f5be1a-f9aa-4bf7-992f-277ec5922772\") " pod="openshift-ingress-canary/ingress-canary-s2p2q"
Apr 22 20:00:04.704012 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:04.703992 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-2xtr6\""
Apr 22 20:00:04.712267 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:04.712253 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-x2267"
Apr 22 20:00:04.717484 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:04.717462 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-g9zd7\""
Apr 22 20:00:04.725613 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:04.725596 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-s2p2q"
Apr 22 20:00:04.832071 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:04.832043 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x2267"]
Apr 22 20:00:04.835248 ip-10-0-128-61 kubenswrapper[2574]: W0422 20:00:04.835227 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41de400a_d8c0_4987_9f2a_ec97460903ec.slice/crio-e0a08bf8e3608f5595b197a6eb11a46ea206d7d517f1fe3ba1212c72e58cf941 WatchSource:0}: Error finding container e0a08bf8e3608f5595b197a6eb11a46ea206d7d517f1fe3ba1212c72e58cf941: Status 404 returned error can't find the container with id e0a08bf8e3608f5595b197a6eb11a46ea206d7d517f1fe3ba1212c72e58cf941
Apr 22 20:00:04.844420 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:04.844400 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-s2p2q"]
Apr 22 20:00:04.847331 ip-10-0-128-61 kubenswrapper[2574]: W0422 20:00:04.847310 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52f5be1a_f9aa_4bf7_992f_277ec5922772.slice/crio-6e4bb1f108902ed0a0fd48c8fedf129e6d9abdda064047c5b660eb72af72c581 WatchSource:0}: Error finding container 6e4bb1f108902ed0a0fd48c8fedf129e6d9abdda064047c5b660eb72af72c581: Status 404 returned error can't find the container with id 6e4bb1f108902ed0a0fd48c8fedf129e6d9abdda064047c5b660eb72af72c581
Apr 22 20:00:04.889232 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:04.889185 2574 scope.go:117] "RemoveContainer" containerID="b586e43f7e061d1e1527f60d53b9acec0df753fed01309ef5dda38ecf5c13e5d"
Apr 22 20:00:05.230245 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:05.230176 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log"
Apr 22 20:00:05.230653 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:05.230636 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/1.log"
Apr 22 20:00:05.230754 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:05.230669 2574 generic.go:358] "Generic (PLEG): container finished" podID="e9453f8c-04f5-4b72-b26e-c5ccc3bfed06" containerID="0c19aaa5b0f2bd5405ce99335d0051c5eb86938d057fd32f38f64e461c31e6c0" exitCode=255
Apr 22 20:00:05.230754 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:05.230742 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2" event={"ID":"e9453f8c-04f5-4b72-b26e-c5ccc3bfed06","Type":"ContainerDied","Data":"0c19aaa5b0f2bd5405ce99335d0051c5eb86938d057fd32f38f64e461c31e6c0"}
Apr 22 20:00:05.230829 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:05.230788 2574 scope.go:117] "RemoveContainer" containerID="b586e43f7e061d1e1527f60d53b9acec0df753fed01309ef5dda38ecf5c13e5d"
Apr 22 20:00:05.231124 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:05.231089 2574 scope.go:117] "RemoveContainer" containerID="0c19aaa5b0f2bd5405ce99335d0051c5eb86938d057fd32f38f64e461c31e6c0"
Apr 22 20:00:05.231290 ip-10-0-128-61 kubenswrapper[2574]: E0422 20:00:05.231271 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-5hfh2_openshift-console-operator(e9453f8c-04f5-4b72-b26e-c5ccc3bfed06)\"" pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2" podUID="e9453f8c-04f5-4b72-b26e-c5ccc3bfed06"
Apr 22 20:00:05.231815 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:05.231798 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x2267" event={"ID":"41de400a-d8c0-4987-9f2a-ec97460903ec","Type":"ContainerStarted","Data":"e0a08bf8e3608f5595b197a6eb11a46ea206d7d517f1fe3ba1212c72e58cf941"}
Apr 22 20:00:05.234954 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:05.234931 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-s2p2q" event={"ID":"52f5be1a-f9aa-4bf7-992f-277ec5922772","Type":"ContainerStarted","Data":"6e4bb1f108902ed0a0fd48c8fedf129e6d9abdda064047c5b660eb72af72c581"}
Apr 22 20:00:06.239255 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:06.239228 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log"
Apr 22 20:00:07.243561 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:07.243474 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x2267" event={"ID":"41de400a-d8c0-4987-9f2a-ec97460903ec","Type":"ContainerStarted","Data":"b3bcb9ee7382652a98ddce84b1dae925793b84226beac716ba70b1d06219c9ed"}
Apr 22 20:00:07.243561 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:07.243511 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x2267" event={"ID":"41de400a-d8c0-4987-9f2a-ec97460903ec","Type":"ContainerStarted","Data":"b924028355d47d7d450d56994238d1e56a6b8dd255237c523c7b0ea8820cc2a1"}
Apr 22 20:00:07.243561 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:07.243552 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-x2267"
Apr 22 20:00:07.244830 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:07.244806 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-s2p2q" event={"ID":"52f5be1a-f9aa-4bf7-992f-277ec5922772","Type":"ContainerStarted","Data":"2d8c93b8d4ec206ca1705ac54c22ee9ea96367f62b0aec6dc9fb5c1e02ea8f20"}
Apr 22 20:00:07.262127 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:07.262090 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-x2267" podStartSLOduration=65.206530995 podStartE2EDuration="1m7.262081066s" podCreationTimestamp="2026-04-22 19:59:00 +0000 UTC" firstStartedPulling="2026-04-22 20:00:04.837069127 +0000 UTC m=+97.472452757" lastFinishedPulling="2026-04-22 20:00:06.892619209 +0000 UTC m=+99.528002828" observedRunningTime="2026-04-22 20:00:07.261103851 +0000 UTC m=+99.896487489" watchObservedRunningTime="2026-04-22 20:00:07.262081066 +0000 UTC m=+99.897464705"
Apr 22 20:00:07.278466 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:07.278431 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-s2p2q" podStartSLOduration=65.231356822 podStartE2EDuration="1m7.278420954s" podCreationTimestamp="2026-04-22 19:59:00 +0000 UTC" firstStartedPulling="2026-04-22 20:00:04.849080694 +0000 UTC m=+97.484464311" lastFinishedPulling="2026-04-22 20:00:06.896144825 +0000 UTC m=+99.531528443" observedRunningTime="2026-04-22 20:00:07.277885397 +0000 UTC m=+99.913269049" watchObservedRunningTime="2026-04-22 20:00:07.278420954 +0000 UTC m=+99.913804589"
Apr 22 20:00:08.485842 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.485808 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-bbdd547d6-cdfxx"]
Apr 22 20:00:08.525948 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.525918 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-rjvgq"]
Apr 22 20:00:08.529543 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.529525 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rjvgq"
Apr 22 20:00:08.532931 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.532907 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 20:00:08.533175 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.533158 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 20:00:08.533464 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.533441 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 20:00:08.534441 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.534407 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rfw8m\""
Apr 22 20:00:08.541414 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.541398 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 20:00:08.549836 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.549814 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rjvgq"]
Apr 22 20:00:08.569095 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.569070 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-798f685885-kbmjw"]
Apr 22 20:00:08.572146 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.572129 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-798f685885-kbmjw"
Apr 22 20:00:08.592160 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.592131 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-798f685885-kbmjw"]
Apr 22 20:00:08.624555 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.624527 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5-crio-socket\") pod \"insights-runtime-extractor-rjvgq\" (UID: \"7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5\") " pod="openshift-insights/insights-runtime-extractor-rjvgq"
Apr 22 20:00:08.624695 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.624567 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rjvgq\" (UID: \"7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5\") " pod="openshift-insights/insights-runtime-extractor-rjvgq"
Apr 22 20:00:08.624695 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.624587 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rjvgq\" (UID: \"7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5\") " pod="openshift-insights/insights-runtime-extractor-rjvgq"
Apr 22 20:00:08.624695 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.624677 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vxvj\" (UniqueName: \"kubernetes.io/projected/7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5-kube-api-access-4vxvj\") pod \"insights-runtime-extractor-rjvgq\"
(UID: \"7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5\") " pod="openshift-insights/insights-runtime-extractor-rjvgq" Apr 22 20:00:08.624812 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.624711 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5-data-volume\") pod \"insights-runtime-extractor-rjvgq\" (UID: \"7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5\") " pod="openshift-insights/insights-runtime-extractor-rjvgq" Apr 22 20:00:08.725033 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.724999 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vxvj\" (UniqueName: \"kubernetes.io/projected/7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5-kube-api-access-4vxvj\") pod \"insights-runtime-extractor-rjvgq\" (UID: \"7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5\") " pod="openshift-insights/insights-runtime-extractor-rjvgq" Apr 22 20:00:08.725033 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.725033 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5-data-volume\") pod \"insights-runtime-extractor-rjvgq\" (UID: \"7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5\") " pod="openshift-insights/insights-runtime-extractor-rjvgq" Apr 22 20:00:08.725287 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.725056 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8e4a343e-1122-478c-8882-f7bdc03c0cb4-registry-certificates\") pod \"image-registry-798f685885-kbmjw\" (UID: \"8e4a343e-1122-478c-8882-f7bdc03c0cb4\") " pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:08.725287 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.725078 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fksqr\" (UniqueName: \"kubernetes.io/projected/8e4a343e-1122-478c-8882-f7bdc03c0cb4-kube-api-access-fksqr\") pod \"image-registry-798f685885-kbmjw\" (UID: \"8e4a343e-1122-478c-8882-f7bdc03c0cb4\") " pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:08.725287 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.725106 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e4a343e-1122-478c-8882-f7bdc03c0cb4-trusted-ca\") pod \"image-registry-798f685885-kbmjw\" (UID: \"8e4a343e-1122-478c-8882-f7bdc03c0cb4\") " pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:08.725287 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.725192 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8e4a343e-1122-478c-8882-f7bdc03c0cb4-image-registry-private-configuration\") pod \"image-registry-798f685885-kbmjw\" (UID: \"8e4a343e-1122-478c-8882-f7bdc03c0cb4\") " pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:08.725287 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.725241 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5-crio-socket\") pod \"insights-runtime-extractor-rjvgq\" (UID: \"7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5\") " pod="openshift-insights/insights-runtime-extractor-rjvgq" Apr 22 20:00:08.725287 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.725282 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8e4a343e-1122-478c-8882-f7bdc03c0cb4-ca-trust-extracted\") pod 
\"image-registry-798f685885-kbmjw\" (UID: \"8e4a343e-1122-478c-8882-f7bdc03c0cb4\") " pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:08.725570 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.725290 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5-crio-socket\") pod \"insights-runtime-extractor-rjvgq\" (UID: \"7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5\") " pod="openshift-insights/insights-runtime-extractor-rjvgq" Apr 22 20:00:08.725570 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.725313 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8e4a343e-1122-478c-8882-f7bdc03c0cb4-registry-tls\") pod \"image-registry-798f685885-kbmjw\" (UID: \"8e4a343e-1122-478c-8882-f7bdc03c0cb4\") " pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:08.725570 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.725342 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e4a343e-1122-478c-8882-f7bdc03c0cb4-bound-sa-token\") pod \"image-registry-798f685885-kbmjw\" (UID: \"8e4a343e-1122-478c-8882-f7bdc03c0cb4\") " pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:08.725570 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.725397 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rjvgq\" (UID: \"7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5\") " pod="openshift-insights/insights-runtime-extractor-rjvgq" Apr 22 20:00:08.725570 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.725425 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5-data-volume\") pod \"insights-runtime-extractor-rjvgq\" (UID: \"7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5\") " pod="openshift-insights/insights-runtime-extractor-rjvgq" Apr 22 20:00:08.725570 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.725428 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8e4a343e-1122-478c-8882-f7bdc03c0cb4-installation-pull-secrets\") pod \"image-registry-798f685885-kbmjw\" (UID: \"8e4a343e-1122-478c-8882-f7bdc03c0cb4\") " pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:08.725570 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.725480 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rjvgq\" (UID: \"7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5\") " pod="openshift-insights/insights-runtime-extractor-rjvgq" Apr 22 20:00:08.726292 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.726272 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-rjvgq\" (UID: \"7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5\") " pod="openshift-insights/insights-runtime-extractor-rjvgq" Apr 22 20:00:08.728065 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.728046 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-rjvgq\" (UID: 
\"7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5\") " pod="openshift-insights/insights-runtime-extractor-rjvgq" Apr 22 20:00:08.739593 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.739530 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vxvj\" (UniqueName: \"kubernetes.io/projected/7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5-kube-api-access-4vxvj\") pod \"insights-runtime-extractor-rjvgq\" (UID: \"7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5\") " pod="openshift-insights/insights-runtime-extractor-rjvgq" Apr 22 20:00:08.825871 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.825834 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8e4a343e-1122-478c-8882-f7bdc03c0cb4-registry-certificates\") pod \"image-registry-798f685885-kbmjw\" (UID: \"8e4a343e-1122-478c-8882-f7bdc03c0cb4\") " pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:08.825871 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.825871 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fksqr\" (UniqueName: \"kubernetes.io/projected/8e4a343e-1122-478c-8882-f7bdc03c0cb4-kube-api-access-fksqr\") pod \"image-registry-798f685885-kbmjw\" (UID: \"8e4a343e-1122-478c-8882-f7bdc03c0cb4\") " pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:08.826126 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.825890 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e4a343e-1122-478c-8882-f7bdc03c0cb4-trusted-ca\") pod \"image-registry-798f685885-kbmjw\" (UID: \"8e4a343e-1122-478c-8882-f7bdc03c0cb4\") " pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:08.826126 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.825934 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8e4a343e-1122-478c-8882-f7bdc03c0cb4-image-registry-private-configuration\") pod \"image-registry-798f685885-kbmjw\" (UID: \"8e4a343e-1122-478c-8882-f7bdc03c0cb4\") " pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:08.826126 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.825965 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8e4a343e-1122-478c-8882-f7bdc03c0cb4-ca-trust-extracted\") pod \"image-registry-798f685885-kbmjw\" (UID: \"8e4a343e-1122-478c-8882-f7bdc03c0cb4\") " pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:08.826126 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.825991 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8e4a343e-1122-478c-8882-f7bdc03c0cb4-registry-tls\") pod \"image-registry-798f685885-kbmjw\" (UID: \"8e4a343e-1122-478c-8882-f7bdc03c0cb4\") " pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:08.826126 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.826014 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e4a343e-1122-478c-8882-f7bdc03c0cb4-bound-sa-token\") pod \"image-registry-798f685885-kbmjw\" (UID: \"8e4a343e-1122-478c-8882-f7bdc03c0cb4\") " pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:08.826126 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.826042 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8e4a343e-1122-478c-8882-f7bdc03c0cb4-installation-pull-secrets\") pod \"image-registry-798f685885-kbmjw\" (UID: \"8e4a343e-1122-478c-8882-f7bdc03c0cb4\") " 
pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:08.826549 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.826522 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8e4a343e-1122-478c-8882-f7bdc03c0cb4-ca-trust-extracted\") pod \"image-registry-798f685885-kbmjw\" (UID: \"8e4a343e-1122-478c-8882-f7bdc03c0cb4\") " pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:08.826835 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.826808 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8e4a343e-1122-478c-8882-f7bdc03c0cb4-registry-certificates\") pod \"image-registry-798f685885-kbmjw\" (UID: \"8e4a343e-1122-478c-8882-f7bdc03c0cb4\") " pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:08.826963 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.826937 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e4a343e-1122-478c-8882-f7bdc03c0cb4-trusted-ca\") pod \"image-registry-798f685885-kbmjw\" (UID: \"8e4a343e-1122-478c-8882-f7bdc03c0cb4\") " pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:08.828622 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.828602 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8e4a343e-1122-478c-8882-f7bdc03c0cb4-registry-tls\") pod \"image-registry-798f685885-kbmjw\" (UID: \"8e4a343e-1122-478c-8882-f7bdc03c0cb4\") " pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:08.828708 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.828607 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8e4a343e-1122-478c-8882-f7bdc03c0cb4-installation-pull-secrets\") pod \"image-registry-798f685885-kbmjw\" (UID: \"8e4a343e-1122-478c-8882-f7bdc03c0cb4\") " pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:08.828708 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.828671 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8e4a343e-1122-478c-8882-f7bdc03c0cb4-image-registry-private-configuration\") pod \"image-registry-798f685885-kbmjw\" (UID: \"8e4a343e-1122-478c-8882-f7bdc03c0cb4\") " pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:08.833857 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.833837 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fksqr\" (UniqueName: \"kubernetes.io/projected/8e4a343e-1122-478c-8882-f7bdc03c0cb4-kube-api-access-fksqr\") pod \"image-registry-798f685885-kbmjw\" (UID: \"8e4a343e-1122-478c-8882-f7bdc03c0cb4\") " pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:08.833963 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.833944 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e4a343e-1122-478c-8882-f7bdc03c0cb4-bound-sa-token\") pod \"image-registry-798f685885-kbmjw\" (UID: \"8e4a343e-1122-478c-8882-f7bdc03c0cb4\") " pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:08.838829 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.838811 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-rjvgq" Apr 22 20:00:08.882298 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.882217 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:08.980512 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:08.980482 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-rjvgq"] Apr 22 20:00:08.983711 ip-10-0-128-61 kubenswrapper[2574]: W0422 20:00:08.983681 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a889f06_e0b0_4f8f_8b7b_5f1b94796ab5.slice/crio-eefd6055b0f73323482643be69b50c42c98e47f6b4cfeb2704d5c93a80fd7705 WatchSource:0}: Error finding container eefd6055b0f73323482643be69b50c42c98e47f6b4cfeb2704d5c93a80fd7705: Status 404 returned error can't find the container with id eefd6055b0f73323482643be69b50c42c98e47f6b4cfeb2704d5c93a80fd7705 Apr 22 20:00:09.024895 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:09.024867 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-798f685885-kbmjw"] Apr 22 20:00:09.028647 ip-10-0-128-61 kubenswrapper[2574]: W0422 20:00:09.028619 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e4a343e_1122_478c_8882_f7bdc03c0cb4.slice/crio-584dfef8cd5fdf92f1c7ab75d72b02a1bbf752c0774336f491c27b3e0495a149 WatchSource:0}: Error finding container 584dfef8cd5fdf92f1c7ab75d72b02a1bbf752c0774336f491c27b3e0495a149: Status 404 returned error can't find the container with id 584dfef8cd5fdf92f1c7ab75d72b02a1bbf752c0774336f491c27b3e0495a149 Apr 22 20:00:09.254699 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:09.254600 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-798f685885-kbmjw" event={"ID":"8e4a343e-1122-478c-8882-f7bdc03c0cb4","Type":"ContainerStarted","Data":"53a77632d90746ed6a47eb070fb6d5d7791b403eaf8b4d300d1f0fd690305886"} Apr 22 20:00:09.254699 ip-10-0-128-61 
kubenswrapper[2574]: I0422 20:00:09.254644 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-798f685885-kbmjw" event={"ID":"8e4a343e-1122-478c-8882-f7bdc03c0cb4","Type":"ContainerStarted","Data":"584dfef8cd5fdf92f1c7ab75d72b02a1bbf752c0774336f491c27b3e0495a149"} Apr 22 20:00:09.254699 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:09.254697 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-798f685885-kbmjw" Apr 22 20:00:09.255962 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:09.255941 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rjvgq" event={"ID":"7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5","Type":"ContainerStarted","Data":"eb069a08ff6fe55b1b89336a1c5085882740dfae29dcc532dcd417619659b6c6"} Apr 22 20:00:09.255962 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:09.255965 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rjvgq" event={"ID":"7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5","Type":"ContainerStarted","Data":"eefd6055b0f73323482643be69b50c42c98e47f6b4cfeb2704d5c93a80fd7705"} Apr 22 20:00:09.272890 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:09.272835 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-798f685885-kbmjw" podStartSLOduration=1.272818369 podStartE2EDuration="1.272818369s" podCreationTimestamp="2026-04-22 20:00:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:00:09.272163327 +0000 UTC m=+101.907546977" watchObservedRunningTime="2026-04-22 20:00:09.272818369 +0000 UTC m=+101.908202008" Apr 22 20:00:09.572114 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:09.572075 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2" Apr 22 20:00:09.572114 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:09.572115 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2" Apr 22 20:00:09.572671 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:09.572591 2574 scope.go:117] "RemoveContainer" containerID="0c19aaa5b0f2bd5405ce99335d0051c5eb86938d057fd32f38f64e461c31e6c0" Apr 22 20:00:09.572815 ip-10-0-128-61 kubenswrapper[2574]: E0422 20:00:09.572794 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-5hfh2_openshift-console-operator(e9453f8c-04f5-4b72-b26e-c5ccc3bfed06)\"" pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2" podUID="e9453f8c-04f5-4b72-b26e-c5ccc3bfed06" Apr 22 20:00:10.260346 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:10.260252 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rjvgq" event={"ID":"7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5","Type":"ContainerStarted","Data":"9186fe7169611136827990d96fef882e27233297f33ee6e706f99d0b3f699de1"} Apr 22 20:00:10.844065 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:10.844014 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/be2244ac-77cc-4970-bda6-135ba736f55c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-c9bk2\" (UID: \"be2244ac-77cc-4970-bda6-135ba736f55c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c9bk2" Apr 22 20:00:10.846409 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:10.846383 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/be2244ac-77cc-4970-bda6-135ba736f55c-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-c9bk2\" (UID: \"be2244ac-77cc-4970-bda6-135ba736f55c\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c9bk2" Apr 22 20:00:10.869332 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:10.869298 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c9bk2" Apr 22 20:00:11.003693 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:11.003661 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-c9bk2"] Apr 22 20:00:11.006543 ip-10-0-128-61 kubenswrapper[2574]: W0422 20:00:11.006508 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe2244ac_77cc_4970_bda6_135ba736f55c.slice/crio-5edfaf0829e601542a431e221540d7b4948809c71e96a69f5a28b570e9dadf54 WatchSource:0}: Error finding container 5edfaf0829e601542a431e221540d7b4948809c71e96a69f5a28b570e9dadf54: Status 404 returned error can't find the container with id 5edfaf0829e601542a431e221540d7b4948809c71e96a69f5a28b570e9dadf54 Apr 22 20:00:11.263954 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:11.263869 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c9bk2" event={"ID":"be2244ac-77cc-4970-bda6-135ba736f55c","Type":"ContainerStarted","Data":"5edfaf0829e601542a431e221540d7b4948809c71e96a69f5a28b570e9dadf54"} Apr 22 20:00:12.268308 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:12.268271 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-rjvgq" 
event={"ID":"7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5","Type":"ContainerStarted","Data":"f1b503004e4345bf2af5fae4b0fedc00b4d79cabd60490e7d4c4feb23686dbf8"} Apr 22 20:00:12.287287 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:12.287240 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-rjvgq" podStartSLOduration=1.8690269320000001 podStartE2EDuration="4.287225886s" podCreationTimestamp="2026-04-22 20:00:08 +0000 UTC" firstStartedPulling="2026-04-22 20:00:09.043018666 +0000 UTC m=+101.678402286" lastFinishedPulling="2026-04-22 20:00:11.461217615 +0000 UTC m=+104.096601240" observedRunningTime="2026-04-22 20:00:12.285825194 +0000 UTC m=+104.921208832" watchObservedRunningTime="2026-04-22 20:00:12.287225886 +0000 UTC m=+104.922609524" Apr 22 20:00:13.272203 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:13.272156 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c9bk2" event={"ID":"be2244ac-77cc-4970-bda6-135ba736f55c","Type":"ContainerStarted","Data":"e35e8d461515595eac6c24602fe5e1b5d51b74bb58e1a7ca96412325837aaa53"} Apr 22 20:00:13.289692 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:13.289637 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-c9bk2" podStartSLOduration=32.133840132 podStartE2EDuration="34.289619924s" podCreationTimestamp="2026-04-22 19:59:39 +0000 UTC" firstStartedPulling="2026-04-22 20:00:11.011622092 +0000 UTC m=+103.647005713" lastFinishedPulling="2026-04-22 20:00:13.167401873 +0000 UTC m=+105.802785505" observedRunningTime="2026-04-22 20:00:13.288930852 +0000 UTC m=+105.924314491" watchObservedRunningTime="2026-04-22 20:00:13.289619924 +0000 UTC m=+105.925003619" Apr 22 20:00:16.722304 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:16.722266 2574 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/prometheus-operator-5676c8c784-pct54"] Apr 22 20:00:16.724184 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:16.724165 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-pct54" Apr 22 20:00:16.726676 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:16.726650 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 22 20:00:16.726676 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:16.726664 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-8slp5\"" Apr 22 20:00:16.726871 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:16.726739 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 22 20:00:16.727970 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:16.727948 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 20:00:16.734036 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:16.734014 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-pct54"] Apr 22 20:00:16.791862 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:16.791816 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b301d43e-1b56-4056-b42e-684724967abb-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-pct54\" (UID: \"b301d43e-1b56-4056-b42e-684724967abb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pct54" Apr 22 20:00:16.791862 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:16.791863 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b301d43e-1b56-4056-b42e-684724967abb-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-pct54\" (UID: \"b301d43e-1b56-4056-b42e-684724967abb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pct54" Apr 22 20:00:16.792086 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:16.791960 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68srs\" (UniqueName: \"kubernetes.io/projected/b301d43e-1b56-4056-b42e-684724967abb-kube-api-access-68srs\") pod \"prometheus-operator-5676c8c784-pct54\" (UID: \"b301d43e-1b56-4056-b42e-684724967abb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pct54" Apr 22 20:00:16.792086 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:16.792017 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b301d43e-1b56-4056-b42e-684724967abb-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-pct54\" (UID: \"b301d43e-1b56-4056-b42e-684724967abb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pct54" Apr 22 20:00:16.892576 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:16.892533 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-68srs\" (UniqueName: \"kubernetes.io/projected/b301d43e-1b56-4056-b42e-684724967abb-kube-api-access-68srs\") pod \"prometheus-operator-5676c8c784-pct54\" (UID: \"b301d43e-1b56-4056-b42e-684724967abb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pct54" Apr 22 20:00:16.892576 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:16.892582 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/b301d43e-1b56-4056-b42e-684724967abb-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-pct54\" (UID: \"b301d43e-1b56-4056-b42e-684724967abb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pct54" Apr 22 20:00:16.892831 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:16.892641 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b301d43e-1b56-4056-b42e-684724967abb-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-pct54\" (UID: \"b301d43e-1b56-4056-b42e-684724967abb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pct54" Apr 22 20:00:16.892831 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:16.892659 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b301d43e-1b56-4056-b42e-684724967abb-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-pct54\" (UID: \"b301d43e-1b56-4056-b42e-684724967abb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pct54" Apr 22 20:00:16.892831 ip-10-0-128-61 kubenswrapper[2574]: E0422 20:00:16.892771 2574 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 22 20:00:16.892978 ip-10-0-128-61 kubenswrapper[2574]: E0422 20:00:16.892847 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b301d43e-1b56-4056-b42e-684724967abb-prometheus-operator-tls podName:b301d43e-1b56-4056-b42e-684724967abb nodeName:}" failed. No retries permitted until 2026-04-22 20:00:17.392831582 +0000 UTC m=+110.028215199 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/b301d43e-1b56-4056-b42e-684724967abb-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-pct54" (UID: "b301d43e-1b56-4056-b42e-684724967abb") : secret "prometheus-operator-tls" not found Apr 22 20:00:16.893297 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:16.893276 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b301d43e-1b56-4056-b42e-684724967abb-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-pct54\" (UID: \"b301d43e-1b56-4056-b42e-684724967abb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pct54" Apr 22 20:00:16.895184 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:16.895158 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b301d43e-1b56-4056-b42e-684724967abb-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-pct54\" (UID: \"b301d43e-1b56-4056-b42e-684724967abb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pct54" Apr 22 20:00:16.902214 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:16.902188 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-68srs\" (UniqueName: \"kubernetes.io/projected/b301d43e-1b56-4056-b42e-684724967abb-kube-api-access-68srs\") pod \"prometheus-operator-5676c8c784-pct54\" (UID: \"b301d43e-1b56-4056-b42e-684724967abb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pct54" Apr 22 20:00:17.251696 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:17.251665 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-x2267" Apr 22 20:00:17.395907 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:17.395865 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b301d43e-1b56-4056-b42e-684724967abb-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-pct54\" (UID: \"b301d43e-1b56-4056-b42e-684724967abb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pct54" Apr 22 20:00:17.398268 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:17.398247 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/b301d43e-1b56-4056-b42e-684724967abb-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-pct54\" (UID: \"b301d43e-1b56-4056-b42e-684724967abb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pct54" Apr 22 20:00:17.633364 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:17.633330 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-pct54" Apr 22 20:00:17.752625 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:17.752596 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-pct54"] Apr 22 20:00:17.755450 ip-10-0-128-61 kubenswrapper[2574]: W0422 20:00:17.755419 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb301d43e_1b56_4056_b42e_684724967abb.slice/crio-26d0ab1c78ecea0d7727d5f2ac792b28f09a69889c76213387ca1e93c8f4da52 WatchSource:0}: Error finding container 26d0ab1c78ecea0d7727d5f2ac792b28f09a69889c76213387ca1e93c8f4da52: Status 404 returned error can't find the container with id 26d0ab1c78ecea0d7727d5f2ac792b28f09a69889c76213387ca1e93c8f4da52 Apr 22 20:00:18.287458 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:18.287420 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-pct54" 
event={"ID":"b301d43e-1b56-4056-b42e-684724967abb","Type":"ContainerStarted","Data":"26d0ab1c78ecea0d7727d5f2ac792b28f09a69889c76213387ca1e93c8f4da52"} Apr 22 20:00:18.493227 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:18.493188 2574 patch_prober.go:28] interesting pod/image-registry-bbdd547d6-cdfxx container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 20:00:18.493420 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:18.493258 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx" podUID="15904536-b225-4e88-a726-c2cc5e1d8dd9" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 20:00:19.292892 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:19.292850 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-pct54" event={"ID":"b301d43e-1b56-4056-b42e-684724967abb","Type":"ContainerStarted","Data":"8fc52ca17b3bafa66da5aee5572819aefc7b8042c2a0b20b4a3baad019e9f884"} Apr 22 20:00:19.292892 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:19.292900 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-pct54" event={"ID":"b301d43e-1b56-4056-b42e-684724967abb","Type":"ContainerStarted","Data":"aa42e2c25034c823f56ef7f4d1ee394194ba41568b6873d4e7fe101274502ef9"} Apr 22 20:00:19.312284 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:19.312221 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-pct54" podStartSLOduration=1.956241408 podStartE2EDuration="3.312201622s" podCreationTimestamp="2026-04-22 20:00:16 +0000 UTC" firstStartedPulling="2026-04-22 
20:00:17.757332656 +0000 UTC m=+110.392716272" lastFinishedPulling="2026-04-22 20:00:19.113292866 +0000 UTC m=+111.748676486" observedRunningTime="2026-04-22 20:00:19.310695248 +0000 UTC m=+111.946078888" watchObservedRunningTime="2026-04-22 20:00:19.312201622 +0000 UTC m=+111.947585264" Apr 22 20:00:20.889097 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:20.889070 2574 scope.go:117] "RemoveContainer" containerID="0c19aaa5b0f2bd5405ce99335d0051c5eb86938d057fd32f38f64e461c31e6c0" Apr 22 20:00:20.889460 ip-10-0-128-61 kubenswrapper[2574]: E0422 20:00:20.889235 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-5hfh2_openshift-console-operator(e9453f8c-04f5-4b72-b26e-c5ccc3bfed06)\"" pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2" podUID="e9453f8c-04f5-4b72-b26e-c5ccc3bfed06" Apr 22 20:00:21.110442 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.110405 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-h4qgm"] Apr 22 20:00:21.112893 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.112873 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.113794 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.113773 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-pmwbs"] Apr 22 20:00:21.116061 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.116039 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" Apr 22 20:00:21.122372 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.122341 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 20:00:21.122527 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.122508 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 22 20:00:21.122600 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.122528 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-zvlk4\"" Apr 22 20:00:21.122660 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.122435 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 20:00:21.123199 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.123039 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 22 20:00:21.123823 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.123664 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 22 20:00:21.124107 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.124086 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 20:00:21.124369 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.124338 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-kl442\"" Apr 22 20:00:21.156292 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.156214 2574 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-pmwbs"] Apr 22 20:00:21.230778 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.230740 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7mng\" (UniqueName: \"kubernetes.io/projected/446e489c-3e3e-41c2-b640-067654480e5c-kube-api-access-t7mng\") pod \"kube-state-metrics-69db897b98-pmwbs\" (UID: \"446e489c-3e3e-41c2-b640-067654480e5c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" Apr 22 20:00:21.230962 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.230800 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/446e489c-3e3e-41c2-b640-067654480e5c-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-pmwbs\" (UID: \"446e489c-3e3e-41c2-b640-067654480e5c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" Apr 22 20:00:21.230962 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.230841 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/53cf1257-d4b4-4aa5-95ad-282a875175a9-node-exporter-wtmp\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.230962 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.230890 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/53cf1257-d4b4-4aa5-95ad-282a875175a9-node-exporter-accelerators-collector-config\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.230962 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.230942 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/446e489c-3e3e-41c2-b640-067654480e5c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-pmwbs\" (UID: \"446e489c-3e3e-41c2-b640-067654480e5c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" Apr 22 20:00:21.231171 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.231009 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/446e489c-3e3e-41c2-b640-067654480e5c-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-pmwbs\" (UID: \"446e489c-3e3e-41c2-b640-067654480e5c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" Apr 22 20:00:21.231171 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.231061 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/53cf1257-d4b4-4aa5-95ad-282a875175a9-metrics-client-ca\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.231171 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.231106 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/53cf1257-d4b4-4aa5-95ad-282a875175a9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.231171 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.231148 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/446e489c-3e3e-41c2-b640-067654480e5c-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-pmwbs\" (UID: \"446e489c-3e3e-41c2-b640-067654480e5c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" Apr 22 20:00:21.231450 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.231196 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/53cf1257-d4b4-4aa5-95ad-282a875175a9-sys\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.231450 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.231221 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g4rc\" (UniqueName: \"kubernetes.io/projected/53cf1257-d4b4-4aa5-95ad-282a875175a9-kube-api-access-5g4rc\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.231450 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.231244 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/53cf1257-d4b4-4aa5-95ad-282a875175a9-root\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.231450 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.231267 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/53cf1257-d4b4-4aa5-95ad-282a875175a9-node-exporter-textfile\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.231450 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.231297 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/53cf1257-d4b4-4aa5-95ad-282a875175a9-node-exporter-tls\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.231450 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.231342 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/446e489c-3e3e-41c2-b640-067654480e5c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-pmwbs\" (UID: \"446e489c-3e3e-41c2-b640-067654480e5c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" Apr 22 20:00:21.332478 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.332450 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/446e489c-3e3e-41c2-b640-067654480e5c-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-pmwbs\" (UID: \"446e489c-3e3e-41c2-b640-067654480e5c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" Apr 22 20:00:21.332646 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.332487 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/53cf1257-d4b4-4aa5-95ad-282a875175a9-sys\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.332646 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.332509 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5g4rc\" (UniqueName: \"kubernetes.io/projected/53cf1257-d4b4-4aa5-95ad-282a875175a9-kube-api-access-5g4rc\") pod \"node-exporter-h4qgm\" 
(UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.332646 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.332531 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/53cf1257-d4b4-4aa5-95ad-282a875175a9-root\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.332646 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.332554 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/53cf1257-d4b4-4aa5-95ad-282a875175a9-node-exporter-textfile\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.332646 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.332580 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/53cf1257-d4b4-4aa5-95ad-282a875175a9-node-exporter-tls\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.332646 ip-10-0-128-61 kubenswrapper[2574]: E0422 20:00:21.332609 2574 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 22 20:00:21.332646 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.332615 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/446e489c-3e3e-41c2-b640-067654480e5c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-pmwbs\" (UID: \"446e489c-3e3e-41c2-b640-067654480e5c\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" Apr 22 20:00:21.332646 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.332644 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7mng\" (UniqueName: \"kubernetes.io/projected/446e489c-3e3e-41c2-b640-067654480e5c-kube-api-access-t7mng\") pod \"kube-state-metrics-69db897b98-pmwbs\" (UID: \"446e489c-3e3e-41c2-b640-067654480e5c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" Apr 22 20:00:21.333105 ip-10-0-128-61 kubenswrapper[2574]: E0422 20:00:21.332670 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/446e489c-3e3e-41c2-b640-067654480e5c-kube-state-metrics-tls podName:446e489c-3e3e-41c2-b640-067654480e5c nodeName:}" failed. No retries permitted until 2026-04-22 20:00:21.832649567 +0000 UTC m=+114.468033198 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/446e489c-3e3e-41c2-b640-067654480e5c-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-pmwbs" (UID: "446e489c-3e3e-41c2-b640-067654480e5c") : secret "kube-state-metrics-tls" not found Apr 22 20:00:21.333105 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.332703 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/446e489c-3e3e-41c2-b640-067654480e5c-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-pmwbs\" (UID: \"446e489c-3e3e-41c2-b640-067654480e5c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" Apr 22 20:00:21.333105 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.332739 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/53cf1257-d4b4-4aa5-95ad-282a875175a9-node-exporter-wtmp\") pod \"node-exporter-h4qgm\" (UID: 
\"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.333105 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.332779 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/53cf1257-d4b4-4aa5-95ad-282a875175a9-node-exporter-accelerators-collector-config\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.333105 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.332805 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/446e489c-3e3e-41c2-b640-067654480e5c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-pmwbs\" (UID: \"446e489c-3e3e-41c2-b640-067654480e5c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" Apr 22 20:00:21.333105 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.332812 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/53cf1257-d4b4-4aa5-95ad-282a875175a9-root\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.333105 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.332855 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/446e489c-3e3e-41c2-b640-067654480e5c-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-pmwbs\" (UID: \"446e489c-3e3e-41c2-b640-067654480e5c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" Apr 22 20:00:21.333105 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.332887 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/53cf1257-d4b4-4aa5-95ad-282a875175a9-metrics-client-ca\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.333105 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.332916 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/53cf1257-d4b4-4aa5-95ad-282a875175a9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.333105 ip-10-0-128-61 kubenswrapper[2574]: E0422 20:00:21.332995 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 20:00:21.333105 ip-10-0-128-61 kubenswrapper[2574]: E0422 20:00:21.333052 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53cf1257-d4b4-4aa5-95ad-282a875175a9-node-exporter-tls podName:53cf1257-d4b4-4aa5-95ad-282a875175a9 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:21.833034227 +0000 UTC m=+114.468417844 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/53cf1257-d4b4-4aa5-95ad-282a875175a9-node-exporter-tls") pod "node-exporter-h4qgm" (UID: "53cf1257-d4b4-4aa5-95ad-282a875175a9") : secret "node-exporter-tls" not found Apr 22 20:00:21.333726 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.333114 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/53cf1257-d4b4-4aa5-95ad-282a875175a9-node-exporter-textfile\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.333726 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.333611 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/53cf1257-d4b4-4aa5-95ad-282a875175a9-node-exporter-accelerators-collector-config\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.333829 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.333764 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/53cf1257-d4b4-4aa5-95ad-282a875175a9-node-exporter-wtmp\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.333975 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.333958 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/53cf1257-d4b4-4aa5-95ad-282a875175a9-sys\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.334285 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.334256 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/446e489c-3e3e-41c2-b640-067654480e5c-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-pmwbs\" (UID: \"446e489c-3e3e-41c2-b640-067654480e5c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" Apr 22 20:00:21.334709 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.334691 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/53cf1257-d4b4-4aa5-95ad-282a875175a9-metrics-client-ca\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.336034 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.335326 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/53cf1257-d4b4-4aa5-95ad-282a875175a9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.336034 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.335643 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/446e489c-3e3e-41c2-b640-067654480e5c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-pmwbs\" (UID: \"446e489c-3e3e-41c2-b640-067654480e5c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" Apr 22 20:00:21.336240 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.336208 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/446e489c-3e3e-41c2-b640-067654480e5c-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-pmwbs\" 
(UID: \"446e489c-3e3e-41c2-b640-067654480e5c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" Apr 22 20:00:21.336603 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.336584 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/446e489c-3e3e-41c2-b640-067654480e5c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-pmwbs\" (UID: \"446e489c-3e3e-41c2-b640-067654480e5c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" Apr 22 20:00:21.341046 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.341019 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7mng\" (UniqueName: \"kubernetes.io/projected/446e489c-3e3e-41c2-b640-067654480e5c-kube-api-access-t7mng\") pod \"kube-state-metrics-69db897b98-pmwbs\" (UID: \"446e489c-3e3e-41c2-b640-067654480e5c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" Apr 22 20:00:21.341495 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.341477 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g4rc\" (UniqueName: \"kubernetes.io/projected/53cf1257-d4b4-4aa5-95ad-282a875175a9-kube-api-access-5g4rc\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.837224 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.837179 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/53cf1257-d4b4-4aa5-95ad-282a875175a9-node-exporter-tls\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:21.837383 ip-10-0-128-61 kubenswrapper[2574]: E0422 20:00:21.837316 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: 
secret "node-exporter-tls" not found Apr 22 20:00:21.837383 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.837309 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/446e489c-3e3e-41c2-b640-067654480e5c-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-pmwbs\" (UID: \"446e489c-3e3e-41c2-b640-067654480e5c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" Apr 22 20:00:21.837510 ip-10-0-128-61 kubenswrapper[2574]: E0422 20:00:21.837390 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53cf1257-d4b4-4aa5-95ad-282a875175a9-node-exporter-tls podName:53cf1257-d4b4-4aa5-95ad-282a875175a9 nodeName:}" failed. No retries permitted until 2026-04-22 20:00:22.837374697 +0000 UTC m=+115.472758332 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/53cf1257-d4b4-4aa5-95ad-282a875175a9-node-exporter-tls") pod "node-exporter-h4qgm" (UID: "53cf1257-d4b4-4aa5-95ad-282a875175a9") : secret "node-exporter-tls" not found Apr 22 20:00:21.839778 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:21.839748 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/446e489c-3e3e-41c2-b640-067654480e5c-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-pmwbs\" (UID: \"446e489c-3e3e-41c2-b640-067654480e5c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" Apr 22 20:00:22.030748 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:22.030727 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" Apr 22 20:00:22.186279 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:22.186251 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-pmwbs"] Apr 22 20:00:22.187773 ip-10-0-128-61 kubenswrapper[2574]: W0422 20:00:22.187747 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod446e489c_3e3e_41c2_b640_067654480e5c.slice/crio-00709b7a117a3c03a358e6d50b5f58c1a9e95357719c2419638a6cbf65d99abf WatchSource:0}: Error finding container 00709b7a117a3c03a358e6d50b5f58c1a9e95357719c2419638a6cbf65d99abf: Status 404 returned error can't find the container with id 00709b7a117a3c03a358e6d50b5f58c1a9e95357719c2419638a6cbf65d99abf Apr 22 20:00:22.302238 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:22.302201 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" event={"ID":"446e489c-3e3e-41c2-b640-067654480e5c","Type":"ContainerStarted","Data":"00709b7a117a3c03a358e6d50b5f58c1a9e95357719c2419638a6cbf65d99abf"} Apr 22 20:00:22.845830 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:22.845802 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/53cf1257-d4b4-4aa5-95ad-282a875175a9-node-exporter-tls\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:22.848535 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:22.848499 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/53cf1257-d4b4-4aa5-95ad-282a875175a9-node-exporter-tls\") pod \"node-exporter-h4qgm\" (UID: \"53cf1257-d4b4-4aa5-95ad-282a875175a9\") " pod="openshift-monitoring/node-exporter-h4qgm" Apr 
22 20:00:22.925437 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:22.925406 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-h4qgm" Apr 22 20:00:22.938498 ip-10-0-128-61 kubenswrapper[2574]: W0422 20:00:22.938470 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53cf1257_d4b4_4aa5_95ad_282a875175a9.slice/crio-566a8590fc5e944d0b057273e0998d21b4546fe582ae6aa60ffbbd4af64b9d9c WatchSource:0}: Error finding container 566a8590fc5e944d0b057273e0998d21b4546fe582ae6aa60ffbbd4af64b9d9c: Status 404 returned error can't find the container with id 566a8590fc5e944d0b057273e0998d21b4546fe582ae6aa60ffbbd4af64b9d9c Apr 22 20:00:23.081529 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.081493 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5559fc8748-cqjgv"] Apr 22 20:00:23.084879 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.084857 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.087624 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.087599 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 22 20:00:23.087724 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.087605 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 22 20:00:23.087873 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.087858 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 22 20:00:23.087941 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.087883 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 22 20:00:23.088032 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.087925 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 22 20:00:23.088032 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.088002 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-hs77b\"" Apr 22 20:00:23.088733 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.088713 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-6oma6vkcdk24\"" Apr 22 20:00:23.098191 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.098138 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5559fc8748-cqjgv"] Apr 22 20:00:23.249023 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.248989 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c1ea0334-75a8-451e-b739-774bae8cf624-secret-thanos-querier-tls\") pod \"thanos-querier-5559fc8748-cqjgv\" (UID: \"c1ea0334-75a8-451e-b739-774bae8cf624\") " pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.249178 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.249028 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c1ea0334-75a8-451e-b739-774bae8cf624-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5559fc8748-cqjgv\" (UID: \"c1ea0334-75a8-451e-b739-774bae8cf624\") " pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.249178 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.249050 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c1ea0334-75a8-451e-b739-774bae8cf624-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5559fc8748-cqjgv\" (UID: \"c1ea0334-75a8-451e-b739-774bae8cf624\") " pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.249178 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.249114 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c1ea0334-75a8-451e-b739-774bae8cf624-secret-grpc-tls\") pod \"thanos-querier-5559fc8748-cqjgv\" (UID: \"c1ea0334-75a8-451e-b739-774bae8cf624\") " pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.249178 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.249147 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/c1ea0334-75a8-451e-b739-774bae8cf624-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5559fc8748-cqjgv\" (UID: \"c1ea0334-75a8-451e-b739-774bae8cf624\") " pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.249378 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.249184 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c1ea0334-75a8-451e-b739-774bae8cf624-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5559fc8748-cqjgv\" (UID: \"c1ea0334-75a8-451e-b739-774bae8cf624\") " pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.249378 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.249211 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1ea0334-75a8-451e-b739-774bae8cf624-metrics-client-ca\") pod \"thanos-querier-5559fc8748-cqjgv\" (UID: \"c1ea0334-75a8-451e-b739-774bae8cf624\") " pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.249378 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.249236 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f54fw\" (UniqueName: \"kubernetes.io/projected/c1ea0334-75a8-451e-b739-774bae8cf624-kube-api-access-f54fw\") pod \"thanos-querier-5559fc8748-cqjgv\" (UID: \"c1ea0334-75a8-451e-b739-774bae8cf624\") " pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.306475 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.306436 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h4qgm" event={"ID":"53cf1257-d4b4-4aa5-95ad-282a875175a9","Type":"ContainerStarted","Data":"566a8590fc5e944d0b057273e0998d21b4546fe582ae6aa60ffbbd4af64b9d9c"} 
Apr 22 20:00:23.349891 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.349804 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c1ea0334-75a8-451e-b739-774bae8cf624-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5559fc8748-cqjgv\" (UID: \"c1ea0334-75a8-451e-b739-774bae8cf624\") " pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.349891 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.349867 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1ea0334-75a8-451e-b739-774bae8cf624-metrics-client-ca\") pod \"thanos-querier-5559fc8748-cqjgv\" (UID: \"c1ea0334-75a8-451e-b739-774bae8cf624\") " pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.350093 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.349905 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f54fw\" (UniqueName: \"kubernetes.io/projected/c1ea0334-75a8-451e-b739-774bae8cf624-kube-api-access-f54fw\") pod \"thanos-querier-5559fc8748-cqjgv\" (UID: \"c1ea0334-75a8-451e-b739-774bae8cf624\") " pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.350093 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.349963 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c1ea0334-75a8-451e-b739-774bae8cf624-secret-thanos-querier-tls\") pod \"thanos-querier-5559fc8748-cqjgv\" (UID: \"c1ea0334-75a8-451e-b739-774bae8cf624\") " pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.350093 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.349991 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c1ea0334-75a8-451e-b739-774bae8cf624-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5559fc8748-cqjgv\" (UID: \"c1ea0334-75a8-451e-b739-774bae8cf624\") " pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.350093 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.350022 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c1ea0334-75a8-451e-b739-774bae8cf624-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5559fc8748-cqjgv\" (UID: \"c1ea0334-75a8-451e-b739-774bae8cf624\") " pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.350093 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.350068 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c1ea0334-75a8-451e-b739-774bae8cf624-secret-grpc-tls\") pod \"thanos-querier-5559fc8748-cqjgv\" (UID: \"c1ea0334-75a8-451e-b739-774bae8cf624\") " pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.350469 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.350103 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c1ea0334-75a8-451e-b739-774bae8cf624-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5559fc8748-cqjgv\" (UID: \"c1ea0334-75a8-451e-b739-774bae8cf624\") " pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.350684 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.350657 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c1ea0334-75a8-451e-b739-774bae8cf624-metrics-client-ca\") pod \"thanos-querier-5559fc8748-cqjgv\" 
(UID: \"c1ea0334-75a8-451e-b739-774bae8cf624\") " pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.353047 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.352941 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c1ea0334-75a8-451e-b739-774bae8cf624-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5559fc8748-cqjgv\" (UID: \"c1ea0334-75a8-451e-b739-774bae8cf624\") " pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.353047 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.352994 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c1ea0334-75a8-451e-b739-774bae8cf624-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5559fc8748-cqjgv\" (UID: \"c1ea0334-75a8-451e-b739-774bae8cf624\") " pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.353278 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.353252 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c1ea0334-75a8-451e-b739-774bae8cf624-secret-grpc-tls\") pod \"thanos-querier-5559fc8748-cqjgv\" (UID: \"c1ea0334-75a8-451e-b739-774bae8cf624\") " pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.353441 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.353394 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/c1ea0334-75a8-451e-b739-774bae8cf624-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5559fc8748-cqjgv\" (UID: \"c1ea0334-75a8-451e-b739-774bae8cf624\") " pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.353593 ip-10-0-128-61 kubenswrapper[2574]: I0422 
20:00:23.353546 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/c1ea0334-75a8-451e-b739-774bae8cf624-secret-thanos-querier-tls\") pod \"thanos-querier-5559fc8748-cqjgv\" (UID: \"c1ea0334-75a8-451e-b739-774bae8cf624\") " pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.353913 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.353894 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/c1ea0334-75a8-451e-b739-774bae8cf624-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5559fc8748-cqjgv\" (UID: \"c1ea0334-75a8-451e-b739-774bae8cf624\") " pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.357816 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.357796 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f54fw\" (UniqueName: \"kubernetes.io/projected/c1ea0334-75a8-451e-b739-774bae8cf624-kube-api-access-f54fw\") pod \"thanos-querier-5559fc8748-cqjgv\" (UID: \"c1ea0334-75a8-451e-b739-774bae8cf624\") " pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.396668 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.396632 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:23.630913 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:23.630885 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5559fc8748-cqjgv"] Apr 22 20:00:23.634277 ip-10-0-128-61 kubenswrapper[2574]: W0422 20:00:23.634248 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1ea0334_75a8_451e_b739_774bae8cf624.slice/crio-1dc7b2cad17d8e3520b3464451c350e034457523fc05ebf30287c59bc6bc065b WatchSource:0}: Error finding container 1dc7b2cad17d8e3520b3464451c350e034457523fc05ebf30287c59bc6bc065b: Status 404 returned error can't find the container with id 1dc7b2cad17d8e3520b3464451c350e034457523fc05ebf30287c59bc6bc065b Apr 22 20:00:24.314578 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:24.314487 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" event={"ID":"446e489c-3e3e-41c2-b640-067654480e5c","Type":"ContainerStarted","Data":"a3e5903e57dee1ca0e0d9e6e459f9bf687773c10db74a914341686439f287a09"} Apr 22 20:00:24.314578 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:24.314529 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" event={"ID":"446e489c-3e3e-41c2-b640-067654480e5c","Type":"ContainerStarted","Data":"8e833776f3337dfcf9596936479653f294cec18ae157f299864a31354d293ae8"} Apr 22 20:00:24.314578 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:24.314544 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" event={"ID":"446e489c-3e3e-41c2-b640-067654480e5c","Type":"ContainerStarted","Data":"0c61988743876d8944c3fa6605332aff18e4e0fddac1e59e31536bc3357f17c5"} Apr 22 20:00:24.316183 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:24.316156 2574 generic.go:358] 
"Generic (PLEG): container finished" podID="53cf1257-d4b4-4aa5-95ad-282a875175a9" containerID="4a36cde0a21b2ca535deb2167c9fb5884bff336d7cd565ff1a9d6b33f05a8332" exitCode=0 Apr 22 20:00:24.316347 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:24.316227 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h4qgm" event={"ID":"53cf1257-d4b4-4aa5-95ad-282a875175a9","Type":"ContainerDied","Data":"4a36cde0a21b2ca535deb2167c9fb5884bff336d7cd565ff1a9d6b33f05a8332"} Apr 22 20:00:24.317496 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:24.317470 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" event={"ID":"c1ea0334-75a8-451e-b739-774bae8cf624","Type":"ContainerStarted","Data":"1dc7b2cad17d8e3520b3464451c350e034457523fc05ebf30287c59bc6bc065b"} Apr 22 20:00:24.333116 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:24.332662 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-pmwbs" podStartSLOduration=2.000172914 podStartE2EDuration="3.332647604s" podCreationTimestamp="2026-04-22 20:00:21 +0000 UTC" firstStartedPulling="2026-04-22 20:00:22.189907832 +0000 UTC m=+114.825291464" lastFinishedPulling="2026-04-22 20:00:23.522382533 +0000 UTC m=+116.157766154" observedRunningTime="2026-04-22 20:00:24.331930285 +0000 UTC m=+116.967313936" watchObservedRunningTime="2026-04-22 20:00:24.332647604 +0000 UTC m=+116.968031246" Apr 22 20:00:25.322167 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:25.322135 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h4qgm" event={"ID":"53cf1257-d4b4-4aa5-95ad-282a875175a9","Type":"ContainerStarted","Data":"c4547e8adb5b37a87b157f97f5d99080bd1a7573d920ebcb122030a551e2926a"} Apr 22 20:00:25.322167 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:25.322172 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-h4qgm" event={"ID":"53cf1257-d4b4-4aa5-95ad-282a875175a9","Type":"ContainerStarted","Data":"f5674798f313698712668e30c025853a3f1388b967a76bf58494b9c4546e9ecd"} Apr 22 20:00:25.342661 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:25.342602 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-h4qgm" podStartSLOduration=3.231049678 podStartE2EDuration="4.342584547s" podCreationTimestamp="2026-04-22 20:00:21 +0000 UTC" firstStartedPulling="2026-04-22 20:00:22.94056852 +0000 UTC m=+115.575952153" lastFinishedPulling="2026-04-22 20:00:24.052103405 +0000 UTC m=+116.687487022" observedRunningTime="2026-04-22 20:00:25.341586522 +0000 UTC m=+117.976970162" watchObservedRunningTime="2026-04-22 20:00:25.342584547 +0000 UTC m=+117.977968179" Apr 22 20:00:26.326800 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:26.326764 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" event={"ID":"c1ea0334-75a8-451e-b739-774bae8cf624","Type":"ContainerStarted","Data":"a0898ddc623b44c3180197673543ead42344a7c265ea7ac58e9d186a52698543"} Apr 22 20:00:26.326800 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:26.326802 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" event={"ID":"c1ea0334-75a8-451e-b739-774bae8cf624","Type":"ContainerStarted","Data":"3d199d37354cfa7920c690a76a4ddbd03da906d93c3dc26a0f6b1f8f23989021"} Apr 22 20:00:26.327196 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:26.326812 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" event={"ID":"c1ea0334-75a8-451e-b739-774bae8cf624","Type":"ContainerStarted","Data":"30be9d41249fd8ad8ad2d1218a916980aeaf6fa119b618f9a56e2a1c86a447e0"} Apr 22 20:00:27.332502 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.332466 2574 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" event={"ID":"c1ea0334-75a8-451e-b739-774bae8cf624","Type":"ContainerStarted","Data":"49b41d4496b595f7a3cbc9f7b88e35db68b9f332ba2cb4014549526feac09c44"} Apr 22 20:00:27.332502 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.332503 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" event={"ID":"c1ea0334-75a8-451e-b739-774bae8cf624","Type":"ContainerStarted","Data":"5a9be037f1fe3240b20185997bb030c45b1df106c34ac6eeaf06ce91f5954792"} Apr 22 20:00:27.332502 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.332513 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" event={"ID":"c1ea0334-75a8-451e-b739-774bae8cf624","Type":"ContainerStarted","Data":"21a5d36ee6a3151f2a00002708d51ef82ffea03954a3a0a574868ad51b76b3e6"} Apr 22 20:00:27.333062 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.332605 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" Apr 22 20:00:27.345451 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.345410 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 20:00:27.348715 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.348691 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:27.351296 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.351263 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 20:00:27.351296 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.351287 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 20:00:27.351488 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.351423 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 20:00:27.351597 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.351503 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 20:00:27.351597 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.351552 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 20:00:27.351597 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.351583 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 20:00:27.351976 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.351931 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 20:00:27.352143 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.352129 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 20:00:27.352143 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.352141 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-4c6hz\"" Apr 22 20:00:27.352317 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.352136 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-v36psmqluos2\"" Apr 22 20:00:27.352317 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.352276 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 20:00:27.352524 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.352509 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 20:00:27.352891 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.352863 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 20:00:27.355564 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.355543 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 20:00:27.358250 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.358225 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 20:00:27.360013 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.359199 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv" podStartSLOduration=1.121279657 podStartE2EDuration="4.359184833s" podCreationTimestamp="2026-04-22 20:00:23 +0000 UTC" firstStartedPulling="2026-04-22 20:00:23.636520837 +0000 UTC m=+116.271904457" lastFinishedPulling="2026-04-22 20:00:26.874426015 +0000 UTC m=+119.509809633" observedRunningTime="2026-04-22 20:00:27.357062267 +0000 UTC m=+119.992445919" watchObservedRunningTime="2026-04-22 
20:00:27.359184833 +0000 UTC m=+119.994568473"
Apr 22 20:00:27.366267 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.366231 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 20:00:27.487063 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.487034 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.487063 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.487067 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.487290 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.487106 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.487290 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.487123 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8b52\" (UniqueName: \"kubernetes.io/projected/467b9557-e696-45d7-b276-cebdce0098b9-kube-api-access-m8b52\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.487290 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.487173 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-web-config\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.487290 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.487217 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-config\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.487513 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.487290 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.487513 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.487323 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/467b9557-e696-45d7-b276-cebdce0098b9-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.487513 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.487379 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.487513 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.487460 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.487513 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.487497 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.487769 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.487533 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/467b9557-e696-45d7-b276-cebdce0098b9-config-out\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.487769 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.487597 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.487769 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.487652 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.487769 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.487670 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.487965 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.487794 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.487965 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.487858 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.487965 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.487915 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/467b9557-e696-45d7-b276-cebdce0098b9-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.588892 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.588815 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.588892 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.588851 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.588892 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.588867 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/467b9557-e696-45d7-b276-cebdce0098b9-config-out\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.589143 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.588894 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.589143 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.588930 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.589143 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.588956 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.589143 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.588991 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.589143 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.589015 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.589143 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.589042 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/467b9557-e696-45d7-b276-cebdce0098b9-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.589143 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.589084 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.589143 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.589115 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.589558 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.589161 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.589558 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.589187 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8b52\" (UniqueName: \"kubernetes.io/projected/467b9557-e696-45d7-b276-cebdce0098b9-kube-api-access-m8b52\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.589558 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.589214 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-web-config\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.589558 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.589240 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-config\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.589558 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.589276 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.589558 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.589302 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/467b9557-e696-45d7-b276-cebdce0098b9-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.589558 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.589331 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.589886 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.589853 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.590158 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.590133 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.593291 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.592241 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.593291 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.592537 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/467b9557-e696-45d7-b276-cebdce0098b9-config-out\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.593291 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.592584 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.593291 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.592968 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.594304 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.593718 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.594304 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.594007 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/467b9557-e696-45d7-b276-cebdce0098b9-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.594473 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.594451 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.594711 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.594652 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.594891 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.594867 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.595328 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.595299 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.595444 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.595419 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/467b9557-e696-45d7-b276-cebdce0098b9-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.595572 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.595552 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-config\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.595713 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.595689 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.597162 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.597140 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-web-config\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.597252 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.597150 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.625341 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.625319 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8b52\" (UniqueName: \"kubernetes.io/projected/467b9557-e696-45d7-b276-cebdce0098b9-kube-api-access-m8b52\") pod \"prometheus-k8s-0\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.661283 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.661254 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:00:27.806680 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:27.806631 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 20:00:27.808552 ip-10-0-128-61 kubenswrapper[2574]: W0422 20:00:27.808523 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod467b9557_e696_45d7_b276_cebdce0098b9.slice/crio-3878573bc01a12850f61af87a18fd4c6bda931764e926c546dd800b28d7ecdad WatchSource:0}: Error finding container 3878573bc01a12850f61af87a18fd4c6bda931764e926c546dd800b28d7ecdad: Status 404 returned error can't find the container with id 3878573bc01a12850f61af87a18fd4c6bda931764e926c546dd800b28d7ecdad
Apr 22 20:00:28.337462 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:28.337416 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"467b9557-e696-45d7-b276-cebdce0098b9","Type":"ContainerStarted","Data":"3878573bc01a12850f61af87a18fd4c6bda931764e926c546dd800b28d7ecdad"}
Apr 22 20:00:28.491430 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:28.491397 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 20:00:29.341294 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:29.341262 2574 generic.go:358] "Generic (PLEG): container finished" podID="467b9557-e696-45d7-b276-cebdce0098b9" containerID="13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2" exitCode=0
Apr 22 20:00:29.341677 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:29.341342 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"467b9557-e696-45d7-b276-cebdce0098b9","Type":"ContainerDied","Data":"13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2"}
Apr 22 20:00:30.264426 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:30.264398 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-798f685885-kbmjw"
Apr 22 20:00:33.343569 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.343543 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5559fc8748-cqjgv"
Apr 22 20:00:33.357851 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.357814 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"467b9557-e696-45d7-b276-cebdce0098b9","Type":"ContainerStarted","Data":"db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375"}
Apr 22 20:00:33.357851 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.357848 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"467b9557-e696-45d7-b276-cebdce0098b9","Type":"ContainerStarted","Data":"dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b"}
Apr 22 20:00:33.357992 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.357857 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"467b9557-e696-45d7-b276-cebdce0098b9","Type":"ContainerStarted","Data":"4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc"}
Apr 22 20:00:33.357992 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.357867 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"467b9557-e696-45d7-b276-cebdce0098b9","Type":"ContainerStarted","Data":"064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa"}
Apr 22 20:00:33.357992 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.357874 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"467b9557-e696-45d7-b276-cebdce0098b9","Type":"ContainerStarted","Data":"1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3"}
Apr 22 20:00:33.357992 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.357882 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"467b9557-e696-45d7-b276-cebdce0098b9","Type":"ContainerStarted","Data":"a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3"}
Apr 22 20:00:33.393417 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.393368 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.795339394 podStartE2EDuration="6.393341654s" podCreationTimestamp="2026-04-22 20:00:27 +0000 UTC" firstStartedPulling="2026-04-22 20:00:27.810648091 +0000 UTC m=+120.446031714" lastFinishedPulling="2026-04-22 20:00:32.408650344 +0000 UTC m=+125.044033974" observedRunningTime="2026-04-22 20:00:33.391814203 +0000 UTC m=+126.027197853" watchObservedRunningTime="2026-04-22 20:00:33.393341654 +0000 UTC m=+126.028725302"
Apr 22 20:00:33.504307 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.504268 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx" podUID="15904536-b225-4e88-a726-c2cc5e1d8dd9" containerName="registry" containerID="cri-o://cde67e52ea8a1f6c842318018313c52a58bcd418eaa2cb5525526d63e24b2631" gracePeriod=30
Apr 22 20:00:33.760028 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.760005 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 20:00:33.844645 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.844616 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/15904536-b225-4e88-a726-c2cc5e1d8dd9-image-registry-private-configuration\") pod \"15904536-b225-4e88-a726-c2cc5e1d8dd9\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") "
Apr 22 20:00:33.844756 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.844663 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-certificates\") pod \"15904536-b225-4e88-a726-c2cc5e1d8dd9\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") "
Apr 22 20:00:33.844756 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.844702 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-tls\") pod \"15904536-b225-4e88-a726-c2cc5e1d8dd9\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") "
Apr 22 20:00:33.844756 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.844731 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15904536-b225-4e88-a726-c2cc5e1d8dd9-trusted-ca\") pod \"15904536-b225-4e88-a726-c2cc5e1d8dd9\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") "
Apr 22 20:00:33.844756 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.844749 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/15904536-b225-4e88-a726-c2cc5e1d8dd9-ca-trust-extracted\") pod \"15904536-b225-4e88-a726-c2cc5e1d8dd9\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") "
Apr 22 20:00:33.844935 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.844769 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-bound-sa-token\") pod \"15904536-b225-4e88-a726-c2cc5e1d8dd9\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") "
Apr 22 20:00:33.844935 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.844794 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/15904536-b225-4e88-a726-c2cc5e1d8dd9-installation-pull-secrets\") pod \"15904536-b225-4e88-a726-c2cc5e1d8dd9\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") "
Apr 22 20:00:33.844935 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.844918 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcvlf\" (UniqueName: \"kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-kube-api-access-rcvlf\") pod \"15904536-b225-4e88-a726-c2cc5e1d8dd9\" (UID: \"15904536-b225-4e88-a726-c2cc5e1d8dd9\") "
Apr 22 20:00:33.845169 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.845147 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "15904536-b225-4e88-a726-c2cc5e1d8dd9" (UID: "15904536-b225-4e88-a726-c2cc5e1d8dd9"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 20:00:33.845267 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.845227 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-certificates\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\""
Apr 22 20:00:33.845665 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.845641 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15904536-b225-4e88-a726-c2cc5e1d8dd9-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "15904536-b225-4e88-a726-c2cc5e1d8dd9" (UID: "15904536-b225-4e88-a726-c2cc5e1d8dd9"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 20:00:33.847344 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.847300 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15904536-b225-4e88-a726-c2cc5e1d8dd9-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "15904536-b225-4e88-a726-c2cc5e1d8dd9" (UID: "15904536-b225-4e88-a726-c2cc5e1d8dd9"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 20:00:33.847466 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.847403 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15904536-b225-4e88-a726-c2cc5e1d8dd9-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "15904536-b225-4e88-a726-c2cc5e1d8dd9" (UID: "15904536-b225-4e88-a726-c2cc5e1d8dd9"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 20:00:33.847466 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.847407 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "15904536-b225-4e88-a726-c2cc5e1d8dd9" (UID: "15904536-b225-4e88-a726-c2cc5e1d8dd9"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 20:00:33.847580 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.847467 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-kube-api-access-rcvlf" (OuterVolumeSpecName: "kube-api-access-rcvlf") pod "15904536-b225-4e88-a726-c2cc5e1d8dd9" (UID: "15904536-b225-4e88-a726-c2cc5e1d8dd9"). InnerVolumeSpecName "kube-api-access-rcvlf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 20:00:33.847580 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.847553 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "15904536-b225-4e88-a726-c2cc5e1d8dd9" (UID: "15904536-b225-4e88-a726-c2cc5e1d8dd9"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 20:00:33.853177 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.853153 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15904536-b225-4e88-a726-c2cc5e1d8dd9-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "15904536-b225-4e88-a726-c2cc5e1d8dd9" (UID: "15904536-b225-4e88-a726-c2cc5e1d8dd9"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 20:00:33.889121 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.889073 2574 scope.go:117] "RemoveContainer" containerID="0c19aaa5b0f2bd5405ce99335d0051c5eb86938d057fd32f38f64e461c31e6c0"
Apr 22 20:00:33.946175 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.946150 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rcvlf\" (UniqueName: \"kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-kube-api-access-rcvlf\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\""
Apr 22 20:00:33.946304 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.946178 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/15904536-b225-4e88-a726-c2cc5e1d8dd9-image-registry-private-configuration\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\""
Apr 22 20:00:33.946304 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.946196 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-registry-tls\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\""
Apr 22 20:00:33.946304 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.946210 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15904536-b225-4e88-a726-c2cc5e1d8dd9-trusted-ca\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\""
Apr 22 20:00:33.946304 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.946223 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/15904536-b225-4e88-a726-c2cc5e1d8dd9-ca-trust-extracted\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\""
Apr 22 20:00:33.946304 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.946239 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/15904536-b225-4e88-a726-c2cc5e1d8dd9-bound-sa-token\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\""
Apr 22 20:00:33.946304 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:33.946253 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/15904536-b225-4e88-a726-c2cc5e1d8dd9-installation-pull-secrets\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\""
Apr 22 20:00:34.362707 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:34.362673 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log"
Apr 22 20:00:34.363155 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:34.362817 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2" event={"ID":"e9453f8c-04f5-4b72-b26e-c5ccc3bfed06","Type":"ContainerStarted","Data":"b284b6664e6e3b448baa27a31ef450b6f509d4af44ad733ed88a395003d94675"}
Apr 22 20:00:34.363403 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:34.363299 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2"
Apr 22 20:00:34.364583 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:34.364556 2574 generic.go:358] "Generic (PLEG): container finished" podID="15904536-b225-4e88-a726-c2cc5e1d8dd9" containerID="cde67e52ea8a1f6c842318018313c52a58bcd418eaa2cb5525526d63e24b2631" exitCode=0
Apr 22 20:00:34.364710 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:34.364643 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx"
Apr 22 20:00:34.364784 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:34.364642 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx" event={"ID":"15904536-b225-4e88-a726-c2cc5e1d8dd9","Type":"ContainerDied","Data":"cde67e52ea8a1f6c842318018313c52a58bcd418eaa2cb5525526d63e24b2631"}
Apr 22 20:00:34.364784 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:34.364761 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-bbdd547d6-cdfxx" event={"ID":"15904536-b225-4e88-a726-c2cc5e1d8dd9","Type":"ContainerDied","Data":"992d9448b40350cb7723a9fdf2024ebf976cd9bd350cfa848a57c53ed5c1d0a2"}
Apr 22 20:00:34.364853 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:34.364791 2574 scope.go:117] "RemoveContainer" containerID="cde67e52ea8a1f6c842318018313c52a58bcd418eaa2cb5525526d63e24b2631"
Apr 22 20:00:34.374105 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:34.374082 2574 scope.go:117] "RemoveContainer" containerID="cde67e52ea8a1f6c842318018313c52a58bcd418eaa2cb5525526d63e24b2631"
Apr 22 20:00:34.374417 ip-10-0-128-61 kubenswrapper[2574]: E0422 20:00:34.374383 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cde67e52ea8a1f6c842318018313c52a58bcd418eaa2cb5525526d63e24b2631\": container with ID starting with cde67e52ea8a1f6c842318018313c52a58bcd418eaa2cb5525526d63e24b2631 not found: ID does not exist" containerID="cde67e52ea8a1f6c842318018313c52a58bcd418eaa2cb5525526d63e24b2631"
Apr 22 20:00:34.374489 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:34.374430 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cde67e52ea8a1f6c842318018313c52a58bcd418eaa2cb5525526d63e24b2631"} err="failed to get container status
\"cde67e52ea8a1f6c842318018313c52a58bcd418eaa2cb5525526d63e24b2631\": rpc error: code = NotFound desc = could not find container \"cde67e52ea8a1f6c842318018313c52a58bcd418eaa2cb5525526d63e24b2631\": container with ID starting with cde67e52ea8a1f6c842318018313c52a58bcd418eaa2cb5525526d63e24b2631 not found: ID does not exist" Apr 22 20:00:34.380381 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:34.380313 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2" podStartSLOduration=53.389626079 podStartE2EDuration="55.380298164s" podCreationTimestamp="2026-04-22 19:59:39 +0000 UTC" firstStartedPulling="2026-04-22 19:59:39.692083586 +0000 UTC m=+72.327467204" lastFinishedPulling="2026-04-22 19:59:41.682755667 +0000 UTC m=+74.318139289" observedRunningTime="2026-04-22 20:00:34.378617738 +0000 UTC m=+127.014001404" watchObservedRunningTime="2026-04-22 20:00:34.380298164 +0000 UTC m=+127.015681809" Apr 22 20:00:34.390495 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:34.390468 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-bbdd547d6-cdfxx"] Apr 22 20:00:34.393946 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:34.393928 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-bbdd547d6-cdfxx"] Apr 22 20:00:34.828710 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:34.828683 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-5hfh2" Apr 22 20:00:35.893079 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:35.893039 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15904536-b225-4e88-a726-c2cc5e1d8dd9" path="/var/lib/kubelet/pods/15904536-b225-4e88-a726-c2cc5e1d8dd9/volumes" Apr 22 20:00:37.661452 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:37.661419 2574 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:00:37.678654 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:37.678614 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs\") pod \"network-metrics-daemon-4qs28\" (UID: \"6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4\") " pod="openshift-multus/network-metrics-daemon-4qs28" Apr 22 20:00:37.680951 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:37.680930 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4-metrics-certs\") pod \"network-metrics-daemon-4qs28\" (UID: \"6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4\") " pod="openshift-multus/network-metrics-daemon-4qs28" Apr 22 20:00:37.808845 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:37.808820 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vnzkf\"" Apr 22 20:00:37.816256 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:37.816238 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4qs28" Apr 22 20:00:37.935987 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:37.935959 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4qs28"] Apr 22 20:00:37.938394 ip-10-0-128-61 kubenswrapper[2574]: W0422 20:00:37.938346 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cde1b60_62ad_49fe_a3db_ffc6b90f6eb4.slice/crio-db2c9ef90fdb6df662db362484fac3b30cff9123b5b290cb0b3731993f182cc5 WatchSource:0}: Error finding container db2c9ef90fdb6df662db362484fac3b30cff9123b5b290cb0b3731993f182cc5: Status 404 returned error can't find the container with id db2c9ef90fdb6df662db362484fac3b30cff9123b5b290cb0b3731993f182cc5 Apr 22 20:00:38.377900 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:38.377872 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4qs28" event={"ID":"6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4","Type":"ContainerStarted","Data":"db2c9ef90fdb6df662db362484fac3b30cff9123b5b290cb0b3731993f182cc5"} Apr 22 20:00:40.385418 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:40.385377 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4qs28" event={"ID":"6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4","Type":"ContainerStarted","Data":"75186ad04821bdab0869d164d8e050141850ed2df87bf191df6bcb2f44e5f6e3"} Apr 22 20:00:40.385418 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:40.385422 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4qs28" event={"ID":"6cde1b60-62ad-49fe-a3db-ffc6b90f6eb4","Type":"ContainerStarted","Data":"c3e6875dbaffbfbf8b65f8bb73ba7836878246ba00ac22836a1f2cf38ec19b24"} Apr 22 20:00:40.406587 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:00:40.406542 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-4qs28" podStartSLOduration=131.945268915 podStartE2EDuration="2m13.406530775s" podCreationTimestamp="2026-04-22 19:58:27 +0000 UTC" firstStartedPulling="2026-04-22 20:00:37.940170641 +0000 UTC m=+130.575554258" lastFinishedPulling="2026-04-22 20:00:39.401432502 +0000 UTC m=+132.036816118" observedRunningTime="2026-04-22 20:00:40.405172975 +0000 UTC m=+133.040556615" watchObservedRunningTime="2026-04-22 20:00:40.406530775 +0000 UTC m=+133.041914414" Apr 22 20:01:27.662438 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:27.662340 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:27.680606 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:27.680584 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:28.546610 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:28.546585 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:45.599056 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.599018 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 20:01:45.599613 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.599491 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="467b9557-e696-45d7-b276-cebdce0098b9" containerName="prometheus" containerID="cri-o://a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3" gracePeriod=600 Apr 22 20:01:45.599613 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.599539 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="467b9557-e696-45d7-b276-cebdce0098b9" containerName="thanos-sidecar" 
containerID="cri-o://064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa" gracePeriod=600 Apr 22 20:01:45.599613 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.599592 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="467b9557-e696-45d7-b276-cebdce0098b9" containerName="kube-rbac-proxy-thanos" containerID="cri-o://db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375" gracePeriod=600 Apr 22 20:01:45.599613 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.599603 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="467b9557-e696-45d7-b276-cebdce0098b9" containerName="config-reloader" containerID="cri-o://1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3" gracePeriod=600 Apr 22 20:01:45.599826 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.599536 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="467b9557-e696-45d7-b276-cebdce0098b9" containerName="kube-rbac-proxy" containerID="cri-o://dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b" gracePeriod=600 Apr 22 20:01:45.599826 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.599553 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="467b9557-e696-45d7-b276-cebdce0098b9" containerName="kube-rbac-proxy-web" containerID="cri-o://4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc" gracePeriod=600 Apr 22 20:01:45.839977 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.839953 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:45.920153 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.920072 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8b52\" (UniqueName: \"kubernetes.io/projected/467b9557-e696-45d7-b276-cebdce0098b9-kube-api-access-m8b52\") pod \"467b9557-e696-45d7-b276-cebdce0098b9\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " Apr 22 20:01:45.920153 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.920118 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-configmap-kubelet-serving-ca-bundle\") pod \"467b9557-e696-45d7-b276-cebdce0098b9\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " Apr 22 20:01:45.920153 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.920154 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"467b9557-e696-45d7-b276-cebdce0098b9\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " Apr 22 20:01:45.920438 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.920170 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-prometheus-k8s-tls\") pod \"467b9557-e696-45d7-b276-cebdce0098b9\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " Apr 22 20:01:45.920438 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.920193 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-web-config\") pod \"467b9557-e696-45d7-b276-cebdce0098b9\" (UID: 
\"467b9557-e696-45d7-b276-cebdce0098b9\") " Apr 22 20:01:45.920438 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.920216 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"467b9557-e696-45d7-b276-cebdce0098b9\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " Apr 22 20:01:45.920627 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.920605 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-kube-rbac-proxy\") pod \"467b9557-e696-45d7-b276-cebdce0098b9\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " Apr 22 20:01:45.920688 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.920667 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-thanos-prometheus-http-client-file\") pod \"467b9557-e696-45d7-b276-cebdce0098b9\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " Apr 22 20:01:45.920740 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.920725 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/467b9557-e696-45d7-b276-cebdce0098b9-config-out\") pod \"467b9557-e696-45d7-b276-cebdce0098b9\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " Apr 22 20:01:45.920789 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.920758 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-grpc-tls\") pod \"467b9557-e696-45d7-b276-cebdce0098b9\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " Apr 
22 20:01:45.920789 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.920784 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-prometheus-k8s-rulefiles-0\") pod \"467b9557-e696-45d7-b276-cebdce0098b9\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " Apr 22 20:01:45.920892 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.920817 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-prometheus-trusted-ca-bundle\") pod \"467b9557-e696-45d7-b276-cebdce0098b9\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " Apr 22 20:01:45.920892 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.920848 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-configmap-serving-certs-ca-bundle\") pod \"467b9557-e696-45d7-b276-cebdce0098b9\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " Apr 22 20:01:45.920892 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.920873 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/467b9557-e696-45d7-b276-cebdce0098b9-tls-assets\") pod \"467b9557-e696-45d7-b276-cebdce0098b9\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " Apr 22 20:01:45.921031 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.920912 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-config\") pod \"467b9557-e696-45d7-b276-cebdce0098b9\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " Apr 22 20:01:45.921031 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.920945 
2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-configmap-metrics-client-ca\") pod \"467b9557-e696-45d7-b276-cebdce0098b9\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " Apr 22 20:01:45.921031 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.920977 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/467b9557-e696-45d7-b276-cebdce0098b9-prometheus-k8s-db\") pod \"467b9557-e696-45d7-b276-cebdce0098b9\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " Apr 22 20:01:45.921031 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.921006 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-metrics-client-certs\") pod \"467b9557-e696-45d7-b276-cebdce0098b9\" (UID: \"467b9557-e696-45d7-b276-cebdce0098b9\") " Apr 22 20:01:45.921031 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.920664 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "467b9557-e696-45d7-b276-cebdce0098b9" (UID: "467b9557-e696-45d7-b276-cebdce0098b9"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:01:45.921298 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.921263 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\"" Apr 22 20:01:45.923139 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.923109 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "467b9557-e696-45d7-b276-cebdce0098b9" (UID: "467b9557-e696-45d7-b276-cebdce0098b9"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:01:45.923783 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.923650 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "467b9557-e696-45d7-b276-cebdce0098b9" (UID: "467b9557-e696-45d7-b276-cebdce0098b9"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:01:45.923894 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.923839 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "467b9557-e696-45d7-b276-cebdce0098b9" (UID: "467b9557-e696-45d7-b276-cebdce0098b9"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:01:45.923978 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.923888 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "467b9557-e696-45d7-b276-cebdce0098b9" (UID: "467b9557-e696-45d7-b276-cebdce0098b9"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:01:45.924177 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.924117 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "467b9557-e696-45d7-b276-cebdce0098b9" (UID: "467b9557-e696-45d7-b276-cebdce0098b9"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:01:45.924550 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.924511 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "467b9557-e696-45d7-b276-cebdce0098b9" (UID: "467b9557-e696-45d7-b276-cebdce0098b9"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:01:45.924660 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.924614 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "467b9557-e696-45d7-b276-cebdce0098b9" (UID: "467b9557-e696-45d7-b276-cebdce0098b9"). 
InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:01:45.924729 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.924691 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "467b9557-e696-45d7-b276-cebdce0098b9" (UID: "467b9557-e696-45d7-b276-cebdce0098b9"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 20:01:45.924972 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.924936 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/467b9557-e696-45d7-b276-cebdce0098b9-kube-api-access-m8b52" (OuterVolumeSpecName: "kube-api-access-m8b52") pod "467b9557-e696-45d7-b276-cebdce0098b9" (UID: "467b9557-e696-45d7-b276-cebdce0098b9"). InnerVolumeSpecName "kube-api-access-m8b52". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:01:45.925179 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.925144 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "467b9557-e696-45d7-b276-cebdce0098b9" (UID: "467b9557-e696-45d7-b276-cebdce0098b9"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:01:45.925547 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.925460 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/467b9557-e696-45d7-b276-cebdce0098b9-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "467b9557-e696-45d7-b276-cebdce0098b9" (UID: "467b9557-e696-45d7-b276-cebdce0098b9"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:01:45.925650 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.925630 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/467b9557-e696-45d7-b276-cebdce0098b9-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "467b9557-e696-45d7-b276-cebdce0098b9" (UID: "467b9557-e696-45d7-b276-cebdce0098b9"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:01:45.925964 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.925942 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/467b9557-e696-45d7-b276-cebdce0098b9-config-out" (OuterVolumeSpecName: "config-out") pod "467b9557-e696-45d7-b276-cebdce0098b9" (UID: "467b9557-e696-45d7-b276-cebdce0098b9"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 20:01:45.926210 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.926188 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "467b9557-e696-45d7-b276-cebdce0098b9" (UID: "467b9557-e696-45d7-b276-cebdce0098b9"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:01:45.926562 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.926542 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "467b9557-e696-45d7-b276-cebdce0098b9" (UID: "467b9557-e696-45d7-b276-cebdce0098b9"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:01:45.926632 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.926584 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-config" (OuterVolumeSpecName: "config") pod "467b9557-e696-45d7-b276-cebdce0098b9" (UID: "467b9557-e696-45d7-b276-cebdce0098b9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:01:45.935070 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:45.935051 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-web-config" (OuterVolumeSpecName: "web-config") pod "467b9557-e696-45d7-b276-cebdce0098b9" (UID: "467b9557-e696-45d7-b276-cebdce0098b9"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 20:01:46.022165 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.022138 2574 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-grpc-tls\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\"" Apr 22 20:01:46.022165 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.022165 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\"" Apr 22 20:01:46.022331 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.022175 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-prometheus-trusted-ca-bundle\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\"" Apr 22 20:01:46.022331 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.022185 2574 
reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\"" Apr 22 20:01:46.022331 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.022195 2574 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/467b9557-e696-45d7-b276-cebdce0098b9-tls-assets\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\"" Apr 22 20:01:46.022331 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.022204 2574 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-config\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\"" Apr 22 20:01:46.022331 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.022213 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/467b9557-e696-45d7-b276-cebdce0098b9-configmap-metrics-client-ca\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\"" Apr 22 20:01:46.022331 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.022222 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/467b9557-e696-45d7-b276-cebdce0098b9-prometheus-k8s-db\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\"" Apr 22 20:01:46.022331 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.022230 2574 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-metrics-client-certs\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\"" Apr 22 20:01:46.022331 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.022239 2574 reconciler_common.go:299] "Volume detached for volume 
\"kube-api-access-m8b52\" (UniqueName: \"kubernetes.io/projected/467b9557-e696-45d7-b276-cebdce0098b9-kube-api-access-m8b52\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\"" Apr 22 20:01:46.022331 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.022248 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\"" Apr 22 20:01:46.022331 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.022257 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-prometheus-k8s-tls\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\"" Apr 22 20:01:46.022331 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.022266 2574 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-web-config\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\"" Apr 22 20:01:46.022331 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.022275 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\"" Apr 22 20:01:46.022331 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.022285 2574 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-secret-kube-rbac-proxy\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\"" Apr 22 20:01:46.022331 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.022295 2574 reconciler_common.go:299] "Volume detached for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/467b9557-e696-45d7-b276-cebdce0098b9-thanos-prometheus-http-client-file\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\"" Apr 22 20:01:46.022331 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.022303 2574 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/467b9557-e696-45d7-b276-cebdce0098b9-config-out\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\"" Apr 22 20:01:46.582812 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.582783 2574 generic.go:358] "Generic (PLEG): container finished" podID="467b9557-e696-45d7-b276-cebdce0098b9" containerID="db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375" exitCode=0 Apr 22 20:01:46.582812 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.582808 2574 generic.go:358] "Generic (PLEG): container finished" podID="467b9557-e696-45d7-b276-cebdce0098b9" containerID="dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b" exitCode=0 Apr 22 20:01:46.582812 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.582815 2574 generic.go:358] "Generic (PLEG): container finished" podID="467b9557-e696-45d7-b276-cebdce0098b9" containerID="4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc" exitCode=0 Apr 22 20:01:46.582812 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.582821 2574 generic.go:358] "Generic (PLEG): container finished" podID="467b9557-e696-45d7-b276-cebdce0098b9" containerID="064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa" exitCode=0 Apr 22 20:01:46.582812 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.582826 2574 generic.go:358] "Generic (PLEG): container finished" podID="467b9557-e696-45d7-b276-cebdce0098b9" containerID="1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3" exitCode=0 Apr 22 20:01:46.582812 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.582831 2574 generic.go:358] "Generic (PLEG): 
container finished" podID="467b9557-e696-45d7-b276-cebdce0098b9" containerID="a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3" exitCode=0 Apr 22 20:01:46.583156 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.582880 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.583156 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.582870 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"467b9557-e696-45d7-b276-cebdce0098b9","Type":"ContainerDied","Data":"db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375"} Apr 22 20:01:46.583156 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.582982 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"467b9557-e696-45d7-b276-cebdce0098b9","Type":"ContainerDied","Data":"dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b"} Apr 22 20:01:46.583156 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.582994 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"467b9557-e696-45d7-b276-cebdce0098b9","Type":"ContainerDied","Data":"4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc"} Apr 22 20:01:46.583156 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.583003 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"467b9557-e696-45d7-b276-cebdce0098b9","Type":"ContainerDied","Data":"064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa"} Apr 22 20:01:46.583156 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.583012 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"467b9557-e696-45d7-b276-cebdce0098b9","Type":"ContainerDied","Data":"1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3"} Apr 22 20:01:46.583156 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.583023 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"467b9557-e696-45d7-b276-cebdce0098b9","Type":"ContainerDied","Data":"a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3"} Apr 22 20:01:46.583156 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.583033 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"467b9557-e696-45d7-b276-cebdce0098b9","Type":"ContainerDied","Data":"3878573bc01a12850f61af87a18fd4c6bda931764e926c546dd800b28d7ecdad"} Apr 22 20:01:46.583156 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.583046 2574 scope.go:117] "RemoveContainer" containerID="db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375" Apr 22 20:01:46.590734 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.590658 2574 scope.go:117] "RemoveContainer" containerID="dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b" Apr 22 20:01:46.597146 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.597126 2574 scope.go:117] "RemoveContainer" containerID="4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc" Apr 22 20:01:46.603455 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.603271 2574 scope.go:117] "RemoveContainer" containerID="064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa" Apr 22 20:01:46.611142 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.611127 2574 scope.go:117] "RemoveContainer" containerID="1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3" Apr 22 20:01:46.619188 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.619166 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 20:01:46.627896 
ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.627870 2574 scope.go:117] "RemoveContainer" containerID="a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3" Apr 22 20:01:46.631841 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.631821 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 20:01:46.634908 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.634891 2574 scope.go:117] "RemoveContainer" containerID="13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2" Apr 22 20:01:46.640767 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.640751 2574 scope.go:117] "RemoveContainer" containerID="db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375" Apr 22 20:01:46.640997 ip-10-0-128-61 kubenswrapper[2574]: E0422 20:01:46.640980 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375\": container with ID starting with db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375 not found: ID does not exist" containerID="db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375" Apr 22 20:01:46.641055 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.641004 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375"} err="failed to get container status \"db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375\": rpc error: code = NotFound desc = could not find container \"db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375\": container with ID starting with db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375 not found: ID does not exist" Apr 22 20:01:46.641055 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.641021 2574 scope.go:117] "RemoveContainer" 
containerID="dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b" Apr 22 20:01:46.641210 ip-10-0-128-61 kubenswrapper[2574]: E0422 20:01:46.641195 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b\": container with ID starting with dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b not found: ID does not exist" containerID="dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b" Apr 22 20:01:46.641254 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.641213 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b"} err="failed to get container status \"dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b\": rpc error: code = NotFound desc = could not find container \"dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b\": container with ID starting with dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b not found: ID does not exist" Apr 22 20:01:46.641254 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.641225 2574 scope.go:117] "RemoveContainer" containerID="4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc" Apr 22 20:01:46.641474 ip-10-0-128-61 kubenswrapper[2574]: E0422 20:01:46.641455 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc\": container with ID starting with 4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc not found: ID does not exist" containerID="4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc" Apr 22 20:01:46.641552 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.641484 2574 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc"} err="failed to get container status \"4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc\": rpc error: code = NotFound desc = could not find container \"4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc\": container with ID starting with 4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc not found: ID does not exist" Apr 22 20:01:46.641552 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.641505 2574 scope.go:117] "RemoveContainer" containerID="064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa" Apr 22 20:01:46.641720 ip-10-0-128-61 kubenswrapper[2574]: E0422 20:01:46.641704 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa\": container with ID starting with 064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa not found: ID does not exist" containerID="064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa" Apr 22 20:01:46.641773 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.641730 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa"} err="failed to get container status \"064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa\": rpc error: code = NotFound desc = could not find container \"064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa\": container with ID starting with 064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa not found: ID does not exist" Apr 22 20:01:46.641773 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.641750 2574 scope.go:117] "RemoveContainer" containerID="1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3" Apr 22 20:01:46.641978 ip-10-0-128-61 
kubenswrapper[2574]: E0422 20:01:46.641963 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3\": container with ID starting with 1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3 not found: ID does not exist" containerID="1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3" Apr 22 20:01:46.642016 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.641981 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3"} err="failed to get container status \"1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3\": rpc error: code = NotFound desc = could not find container \"1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3\": container with ID starting with 1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3 not found: ID does not exist" Apr 22 20:01:46.642016 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.641994 2574 scope.go:117] "RemoveContainer" containerID="a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3" Apr 22 20:01:46.642170 ip-10-0-128-61 kubenswrapper[2574]: E0422 20:01:46.642156 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3\": container with ID starting with a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3 not found: ID does not exist" containerID="a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3" Apr 22 20:01:46.642209 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.642172 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3"} err="failed to 
get container status \"a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3\": rpc error: code = NotFound desc = could not find container \"a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3\": container with ID starting with a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3 not found: ID does not exist" Apr 22 20:01:46.642209 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.642183 2574 scope.go:117] "RemoveContainer" containerID="13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2" Apr 22 20:01:46.642380 ip-10-0-128-61 kubenswrapper[2574]: E0422 20:01:46.642347 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2\": container with ID starting with 13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2 not found: ID does not exist" containerID="13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2" Apr 22 20:01:46.642426 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.642387 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2"} err="failed to get container status \"13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2\": rpc error: code = NotFound desc = could not find container \"13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2\": container with ID starting with 13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2 not found: ID does not exist" Apr 22 20:01:46.642426 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.642402 2574 scope.go:117] "RemoveContainer" containerID="db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375" Apr 22 20:01:46.642616 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.642598 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375"} err="failed to get container status \"db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375\": rpc error: code = NotFound desc = could not find container \"db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375\": container with ID starting with db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375 not found: ID does not exist" Apr 22 20:01:46.642688 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.642617 2574 scope.go:117] "RemoveContainer" containerID="dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b" Apr 22 20:01:46.642814 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.642798 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b"} err="failed to get container status \"dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b\": rpc error: code = NotFound desc = could not find container \"dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b\": container with ID starting with dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b not found: ID does not exist" Apr 22 20:01:46.642855 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.642814 2574 scope.go:117] "RemoveContainer" containerID="4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc" Apr 22 20:01:46.643023 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.643005 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc"} err="failed to get container status \"4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc\": rpc error: code = NotFound desc = could not find container \"4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc\": container with ID starting with 
4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc not found: ID does not exist" Apr 22 20:01:46.643088 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.643025 2574 scope.go:117] "RemoveContainer" containerID="064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa" Apr 22 20:01:46.643236 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.643210 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa"} err="failed to get container status \"064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa\": rpc error: code = NotFound desc = could not find container \"064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa\": container with ID starting with 064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa not found: ID does not exist" Apr 22 20:01:46.643283 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.643236 2574 scope.go:117] "RemoveContainer" containerID="1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3" Apr 22 20:01:46.643448 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.643431 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3"} err="failed to get container status \"1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3\": rpc error: code = NotFound desc = could not find container \"1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3\": container with ID starting with 1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3 not found: ID does not exist" Apr 22 20:01:46.643492 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.643448 2574 scope.go:117] "RemoveContainer" containerID="a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3" Apr 22 20:01:46.643632 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.643614 2574 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3"} err="failed to get container status \"a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3\": rpc error: code = NotFound desc = could not find container \"a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3\": container with ID starting with a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3 not found: ID does not exist" Apr 22 20:01:46.643672 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.643633 2574 scope.go:117] "RemoveContainer" containerID="13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2" Apr 22 20:01:46.643805 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.643789 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2"} err="failed to get container status \"13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2\": rpc error: code = NotFound desc = could not find container \"13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2\": container with ID starting with 13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2 not found: ID does not exist" Apr 22 20:01:46.643847 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.643805 2574 scope.go:117] "RemoveContainer" containerID="db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375" Apr 22 20:01:46.643979 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.643964 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375"} err="failed to get container status \"db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375\": rpc error: code = NotFound desc = could not find container 
\"db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375\": container with ID starting with db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375 not found: ID does not exist" Apr 22 20:01:46.644026 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.643979 2574 scope.go:117] "RemoveContainer" containerID="dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b" Apr 22 20:01:46.644175 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.644157 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b"} err="failed to get container status \"dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b\": rpc error: code = NotFound desc = could not find container \"dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b\": container with ID starting with dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b not found: ID does not exist" Apr 22 20:01:46.644218 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.644176 2574 scope.go:117] "RemoveContainer" containerID="4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc" Apr 22 20:01:46.644375 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.644341 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc"} err="failed to get container status \"4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc\": rpc error: code = NotFound desc = could not find container \"4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc\": container with ID starting with 4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc not found: ID does not exist" Apr 22 20:01:46.644436 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.644402 2574 scope.go:117] "RemoveContainer" 
containerID="064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa" Apr 22 20:01:46.644599 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.644584 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa"} err="failed to get container status \"064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa\": rpc error: code = NotFound desc = could not find container \"064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa\": container with ID starting with 064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa not found: ID does not exist" Apr 22 20:01:46.644644 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.644598 2574 scope.go:117] "RemoveContainer" containerID="1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3" Apr 22 20:01:46.644791 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.644771 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3"} err="failed to get container status \"1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3\": rpc error: code = NotFound desc = could not find container \"1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3\": container with ID starting with 1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3 not found: ID does not exist" Apr 22 20:01:46.644836 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.644793 2574 scope.go:117] "RemoveContainer" containerID="a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3" Apr 22 20:01:46.644972 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.644956 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3"} err="failed to get container status 
\"a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3\": rpc error: code = NotFound desc = could not find container \"a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3\": container with ID starting with a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3 not found: ID does not exist" Apr 22 20:01:46.645018 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.644973 2574 scope.go:117] "RemoveContainer" containerID="13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2" Apr 22 20:01:46.645164 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.645149 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2"} err="failed to get container status \"13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2\": rpc error: code = NotFound desc = could not find container \"13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2\": container with ID starting with 13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2 not found: ID does not exist" Apr 22 20:01:46.645202 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.645164 2574 scope.go:117] "RemoveContainer" containerID="db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375" Apr 22 20:01:46.645346 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.645331 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375"} err="failed to get container status \"db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375\": rpc error: code = NotFound desc = could not find container \"db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375\": container with ID starting with db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375 not found: ID does not exist" Apr 22 20:01:46.645407 ip-10-0-128-61 
kubenswrapper[2574]: I0422 20:01:46.645345 2574 scope.go:117] "RemoveContainer" containerID="dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b" Apr 22 20:01:46.645545 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.645526 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b"} err="failed to get container status \"dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b\": rpc error: code = NotFound desc = could not find container \"dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b\": container with ID starting with dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b not found: ID does not exist" Apr 22 20:01:46.645584 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.645547 2574 scope.go:117] "RemoveContainer" containerID="4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc" Apr 22 20:01:46.645721 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.645707 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc"} err="failed to get container status \"4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc\": rpc error: code = NotFound desc = could not find container \"4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc\": container with ID starting with 4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc not found: ID does not exist" Apr 22 20:01:46.645766 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.645722 2574 scope.go:117] "RemoveContainer" containerID="064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa" Apr 22 20:01:46.645888 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.645873 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa"} err="failed to get container status \"064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa\": rpc error: code = NotFound desc = could not find container \"064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa\": container with ID starting with 064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa not found: ID does not exist" Apr 22 20:01:46.645930 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.645890 2574 scope.go:117] "RemoveContainer" containerID="1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3" Apr 22 20:01:46.646060 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.646043 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3"} err="failed to get container status \"1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3\": rpc error: code = NotFound desc = could not find container \"1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3\": container with ID starting with 1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3 not found: ID does not exist" Apr 22 20:01:46.646099 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.646059 2574 scope.go:117] "RemoveContainer" containerID="a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3" Apr 22 20:01:46.646222 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.646207 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3"} err="failed to get container status \"a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3\": rpc error: code = NotFound desc = could not find container \"a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3\": container with ID starting with 
a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3 not found: ID does not exist" Apr 22 20:01:46.646257 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.646224 2574 scope.go:117] "RemoveContainer" containerID="13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2" Apr 22 20:01:46.646403 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.646387 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2"} err="failed to get container status \"13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2\": rpc error: code = NotFound desc = could not find container \"13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2\": container with ID starting with 13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2 not found: ID does not exist" Apr 22 20:01:46.646441 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.646403 2574 scope.go:117] "RemoveContainer" containerID="db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375" Apr 22 20:01:46.646577 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.646557 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375"} err="failed to get container status \"db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375\": rpc error: code = NotFound desc = could not find container \"db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375\": container with ID starting with db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375 not found: ID does not exist" Apr 22 20:01:46.646616 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.646578 2574 scope.go:117] "RemoveContainer" containerID="dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b" Apr 22 20:01:46.646783 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.646762 2574 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b"} err="failed to get container status \"dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b\": rpc error: code = NotFound desc = could not find container \"dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b\": container with ID starting with dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b not found: ID does not exist" Apr 22 20:01:46.646830 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.646784 2574 scope.go:117] "RemoveContainer" containerID="4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc" Apr 22 20:01:46.646964 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.646950 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc"} err="failed to get container status \"4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc\": rpc error: code = NotFound desc = could not find container \"4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc\": container with ID starting with 4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc not found: ID does not exist" Apr 22 20:01:46.647002 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.646965 2574 scope.go:117] "RemoveContainer" containerID="064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa" Apr 22 20:01:46.647174 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.647153 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa"} err="failed to get container status \"064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa\": rpc error: code = NotFound desc = could not find container 
\"064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa\": container with ID starting with 064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa not found: ID does not exist" Apr 22 20:01:46.647239 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.647176 2574 scope.go:117] "RemoveContainer" containerID="1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3" Apr 22 20:01:46.647415 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.647397 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3"} err="failed to get container status \"1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3\": rpc error: code = NotFound desc = could not find container \"1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3\": container with ID starting with 1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3 not found: ID does not exist" Apr 22 20:01:46.647503 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.647416 2574 scope.go:117] "RemoveContainer" containerID="a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3" Apr 22 20:01:46.647606 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.647587 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3"} err="failed to get container status \"a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3\": rpc error: code = NotFound desc = could not find container \"a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3\": container with ID starting with a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3 not found: ID does not exist" Apr 22 20:01:46.647651 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.647607 2574 scope.go:117] "RemoveContainer" 
containerID="13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2" Apr 22 20:01:46.647774 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.647754 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2"} err="failed to get container status \"13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2\": rpc error: code = NotFound desc = could not find container \"13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2\": container with ID starting with 13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2 not found: ID does not exist" Apr 22 20:01:46.647774 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.647774 2574 scope.go:117] "RemoveContainer" containerID="db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375" Apr 22 20:01:46.647986 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.647969 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375"} err="failed to get container status \"db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375\": rpc error: code = NotFound desc = could not find container \"db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375\": container with ID starting with db5b1d8659d945c3caf89662199fa5cf3923660e103f82fa8515574a2bb62375 not found: ID does not exist" Apr 22 20:01:46.648055 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.647988 2574 scope.go:117] "RemoveContainer" containerID="dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b" Apr 22 20:01:46.648205 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.648189 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b"} err="failed to get container status 
\"dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b\": rpc error: code = NotFound desc = could not find container \"dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b\": container with ID starting with dce6e0016e80561d7173389545b1e0fec3a526aba2a58646b649909ca4882f2b not found: ID does not exist" Apr 22 20:01:46.648252 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.648205 2574 scope.go:117] "RemoveContainer" containerID="4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc" Apr 22 20:01:46.648400 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.648384 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc"} err="failed to get container status \"4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc\": rpc error: code = NotFound desc = could not find container \"4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc\": container with ID starting with 4fd97f6ab6c04874049af072582453b6baa2884aadfd33eeeae532883f4075fc not found: ID does not exist" Apr 22 20:01:46.648463 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.648402 2574 scope.go:117] "RemoveContainer" containerID="064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa" Apr 22 20:01:46.648597 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.648581 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa"} err="failed to get container status \"064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa\": rpc error: code = NotFound desc = could not find container \"064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa\": container with ID starting with 064f1359a5157d98647407a613a60ee059016c626fe98c6d12c153e651b388aa not found: ID does not exist" Apr 22 20:01:46.648652 ip-10-0-128-61 
kubenswrapper[2574]: I0422 20:01:46.648598 2574 scope.go:117] "RemoveContainer" containerID="1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3" Apr 22 20:01:46.648780 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.648762 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3"} err="failed to get container status \"1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3\": rpc error: code = NotFound desc = could not find container \"1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3\": container with ID starting with 1b8c8878839ed95733b640db60229ded11b946dc089b0b3a8e8505f3504aace3 not found: ID does not exist" Apr 22 20:01:46.648827 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.648780 2574 scope.go:117] "RemoveContainer" containerID="a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3" Apr 22 20:01:46.648965 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.648950 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3"} err="failed to get container status \"a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3\": rpc error: code = NotFound desc = could not find container \"a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3\": container with ID starting with a66de5cf15ae73c1a41f2fbd113d84b81fe8c6c18dc5ed48ff58e1a264ec49b3 not found: ID does not exist" Apr 22 20:01:46.649010 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.648965 2574 scope.go:117] "RemoveContainer" containerID="13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2" Apr 22 20:01:46.649134 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.649121 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2"} err="failed to get container status \"13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2\": rpc error: code = NotFound desc = could not find container \"13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2\": container with ID starting with 13a63c7ad753274eaf9c620f08587e4d972cac8f52bb09887138c4279580f0b2 not found: ID does not exist" Apr 22 20:01:46.673018 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.672996 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 20:01:46.673267 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.673256 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="467b9557-e696-45d7-b276-cebdce0098b9" containerName="prometheus" Apr 22 20:01:46.673305 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.673268 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="467b9557-e696-45d7-b276-cebdce0098b9" containerName="prometheus" Apr 22 20:01:46.673305 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.673278 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="467b9557-e696-45d7-b276-cebdce0098b9" containerName="config-reloader" Apr 22 20:01:46.673305 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.673283 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="467b9557-e696-45d7-b276-cebdce0098b9" containerName="config-reloader" Apr 22 20:01:46.673305 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.673290 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="467b9557-e696-45d7-b276-cebdce0098b9" containerName="kube-rbac-proxy-thanos" Apr 22 20:01:46.673305 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.673295 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="467b9557-e696-45d7-b276-cebdce0098b9" containerName="kube-rbac-proxy-thanos" Apr 22 
20:01:46.673305 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.673304 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="467b9557-e696-45d7-b276-cebdce0098b9" containerName="kube-rbac-proxy-web" Apr 22 20:01:46.673489 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.673309 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="467b9557-e696-45d7-b276-cebdce0098b9" containerName="kube-rbac-proxy-web" Apr 22 20:01:46.673489 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.673319 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15904536-b225-4e88-a726-c2cc5e1d8dd9" containerName="registry" Apr 22 20:01:46.673489 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.673324 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="15904536-b225-4e88-a726-c2cc5e1d8dd9" containerName="registry" Apr 22 20:01:46.673489 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.673331 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="467b9557-e696-45d7-b276-cebdce0098b9" containerName="thanos-sidecar" Apr 22 20:01:46.673489 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.673335 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="467b9557-e696-45d7-b276-cebdce0098b9" containerName="thanos-sidecar" Apr 22 20:01:46.673489 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.673342 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="467b9557-e696-45d7-b276-cebdce0098b9" containerName="kube-rbac-proxy" Apr 22 20:01:46.673489 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.673346 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="467b9557-e696-45d7-b276-cebdce0098b9" containerName="kube-rbac-proxy" Apr 22 20:01:46.673489 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.673353 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="467b9557-e696-45d7-b276-cebdce0098b9" 
containerName="init-config-reloader" Apr 22 20:01:46.673489 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.673370 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="467b9557-e696-45d7-b276-cebdce0098b9" containerName="init-config-reloader" Apr 22 20:01:46.673489 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.673410 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="15904536-b225-4e88-a726-c2cc5e1d8dd9" containerName="registry" Apr 22 20:01:46.673489 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.673417 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="467b9557-e696-45d7-b276-cebdce0098b9" containerName="config-reloader" Apr 22 20:01:46.673489 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.673423 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="467b9557-e696-45d7-b276-cebdce0098b9" containerName="prometheus" Apr 22 20:01:46.673489 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.673429 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="467b9557-e696-45d7-b276-cebdce0098b9" containerName="kube-rbac-proxy-web" Apr 22 20:01:46.673489 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.673437 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="467b9557-e696-45d7-b276-cebdce0098b9" containerName="kube-rbac-proxy-thanos" Apr 22 20:01:46.673489 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.673442 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="467b9557-e696-45d7-b276-cebdce0098b9" containerName="thanos-sidecar" Apr 22 20:01:46.673489 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.673448 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="467b9557-e696-45d7-b276-cebdce0098b9" containerName="kube-rbac-proxy" Apr 22 20:01:46.676927 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.676908 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.682650 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.682634 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 20:01:46.682929 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.682907 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 20:01:46.683050 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.683018 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-4c6hz\"" Apr 22 20:01:46.683423 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.683406 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 20:01:46.683642 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.683629 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 20:01:46.684337 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.683888 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 20:01:46.684337 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.683932 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 20:01:46.684337 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.683951 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 20:01:46.684337 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.683957 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 20:01:46.684337 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.683982 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 20:01:46.684337 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.684120 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 20:01:46.684337 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.684127 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 20:01:46.684718 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.684540 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-v36psmqluos2\"" Apr 22 20:01:46.688178 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.688160 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 20:01:46.690529 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.690513 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 20:01:46.703631 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.703612 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 20:01:46.727688 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.727662 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.727798 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.727700 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/abed7e62-f018-4bfc-97ca-4168a22c9b00-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.727798 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.727736 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/abed7e62-f018-4bfc-97ca-4168a22c9b00-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.727798 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.727791 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/abed7e62-f018-4bfc-97ca-4168a22c9b00-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.727894 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.727846 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/abed7e62-f018-4bfc-97ca-4168a22c9b00-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.727894 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.727873 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-config\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.727957 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.727895 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abed7e62-f018-4bfc-97ca-4168a22c9b00-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.727957 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.727913 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-web-config\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.727957 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.727935 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.728093 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.727956 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.728093 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.727994 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.728093 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.728037 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abed7e62-f018-4bfc-97ca-4168a22c9b00-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.728093 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.728082 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.728282 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.728114 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/abed7e62-f018-4bfc-97ca-4168a22c9b00-config-out\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.728282 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.728141 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: 
\"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.728282 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.728165 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.728282 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.728187 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf8cf\" (UniqueName: \"kubernetes.io/projected/abed7e62-f018-4bfc-97ca-4168a22c9b00-kube-api-access-hf8cf\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.728282 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.728237 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abed7e62-f018-4bfc-97ca-4168a22c9b00-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.829604 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.829575 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.829604 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.829608 2574 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/abed7e62-f018-4bfc-97ca-4168a22c9b00-config-out\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.829788 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.829624 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.829788 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.829641 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.829788 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.829749 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hf8cf\" (UniqueName: \"kubernetes.io/projected/abed7e62-f018-4bfc-97ca-4168a22c9b00-kube-api-access-hf8cf\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.829924 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.829796 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abed7e62-f018-4bfc-97ca-4168a22c9b00-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 20:01:46.829924 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.829838 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.829924 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.829888 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/abed7e62-f018-4bfc-97ca-4168a22c9b00-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.830077 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.829929 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/abed7e62-f018-4bfc-97ca-4168a22c9b00-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.830077 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.829961 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/abed7e62-f018-4bfc-97ca-4168a22c9b00-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.830077 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.830001 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/abed7e62-f018-4bfc-97ca-4168a22c9b00-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.830077 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.830030 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-config\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.830077 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.830061 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abed7e62-f018-4bfc-97ca-4168a22c9b00-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.830311 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.830085 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-web-config\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.830311 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.830123 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.830311 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.830155 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.830311 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.830183 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.830311 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.830211 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abed7e62-f018-4bfc-97ca-4168a22c9b00-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.831240 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.831213 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abed7e62-f018-4bfc-97ca-4168a22c9b00-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.831914 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.831536 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/abed7e62-f018-4bfc-97ca-4168a22c9b00-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.833783 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.832689 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.833783 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.832831 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abed7e62-f018-4bfc-97ca-4168a22c9b00-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.833783 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.832863 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/abed7e62-f018-4bfc-97ca-4168a22c9b00-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.833783 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.832880 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.833783 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.833104 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.833783 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.833435 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/abed7e62-f018-4bfc-97ca-4168a22c9b00-config-out\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.833783 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.833685 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/abed7e62-f018-4bfc-97ca-4168a22c9b00-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.834288 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.833848 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.834288 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.834158 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/abed7e62-f018-4bfc-97ca-4168a22c9b00-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.835040 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.835022 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-config\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.835152 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.835131 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/abed7e62-f018-4bfc-97ca-4168a22c9b00-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.835329 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.835309 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.835604 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.835589 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.835831 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.835809 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.835987 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.835969 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/abed7e62-f018-4bfc-97ca-4168a22c9b00-web-config\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.838736 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.838715 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf8cf\" (UniqueName: \"kubernetes.io/projected/abed7e62-f018-4bfc-97ca-4168a22c9b00-kube-api-access-hf8cf\") pod \"prometheus-k8s-0\" (UID: \"abed7e62-f018-4bfc-97ca-4168a22c9b00\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:46.987693 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:46.987670 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:01:47.122029 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:47.122003 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 20:01:47.123220 ip-10-0-128-61 kubenswrapper[2574]: W0422 20:01:47.123193 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabed7e62_f018_4bfc_97ca_4168a22c9b00.slice/crio-2ea93365afc7ce94f4656a5c3aef52d282ff6ddbd185bd96a4e9efd62d5f338c WatchSource:0}: Error finding container 2ea93365afc7ce94f4656a5c3aef52d282ff6ddbd185bd96a4e9efd62d5f338c: Status 404 returned error can't find the container with id 2ea93365afc7ce94f4656a5c3aef52d282ff6ddbd185bd96a4e9efd62d5f338c
Apr 22 20:01:47.586725 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:47.586694 2574 generic.go:358] "Generic (PLEG): container finished" podID="abed7e62-f018-4bfc-97ca-4168a22c9b00" containerID="eb8e673a4da420c3aa9bb6f04b35fbeaad74fefd05fbb30498e95cd289946953" exitCode=0
Apr 22 20:01:47.586882 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:47.586781 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"abed7e62-f018-4bfc-97ca-4168a22c9b00","Type":"ContainerDied","Data":"eb8e673a4da420c3aa9bb6f04b35fbeaad74fefd05fbb30498e95cd289946953"}
Apr 22 20:01:47.586882 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:47.586820 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"abed7e62-f018-4bfc-97ca-4168a22c9b00","Type":"ContainerStarted","Data":"2ea93365afc7ce94f4656a5c3aef52d282ff6ddbd185bd96a4e9efd62d5f338c"}
Apr 22 20:01:47.892714 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:47.892687 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="467b9557-e696-45d7-b276-cebdce0098b9" path="/var/lib/kubelet/pods/467b9557-e696-45d7-b276-cebdce0098b9/volumes"
Apr 22 20:01:48.593503 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:48.593466 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"abed7e62-f018-4bfc-97ca-4168a22c9b00","Type":"ContainerStarted","Data":"faf1fab5b605d2379d80b9ec39e8fd42330a42f078f713d90ad0f71c78dab6dd"}
Apr 22 20:01:48.593503 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:48.593503 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"abed7e62-f018-4bfc-97ca-4168a22c9b00","Type":"ContainerStarted","Data":"3cd13dd1f122a2dba8324bfc895b7a64e9932499b17df54255c347a847f5278d"}
Apr 22 20:01:48.593705 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:48.593515 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"abed7e62-f018-4bfc-97ca-4168a22c9b00","Type":"ContainerStarted","Data":"b03b2b105be9d7bdb30869c75ce03d81d879337513fce6b7a219527bbc25da50"}
Apr 22 20:01:48.593705 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:48.593526 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"abed7e62-f018-4bfc-97ca-4168a22c9b00","Type":"ContainerStarted","Data":"a38b80b6dad3a4e759a6f856e2b1c8b4862c010bcf03e40f59e9b15e5bc00b5f"}
Apr 22 20:01:48.593705 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:48.593537 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"abed7e62-f018-4bfc-97ca-4168a22c9b00","Type":"ContainerStarted","Data":"9a9b324c196b890f0d702f706338d1a657206a224eee0dc30b19d27011518dd0"}
Apr 22 20:01:48.593705 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:48.593550 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"abed7e62-f018-4bfc-97ca-4168a22c9b00","Type":"ContainerStarted","Data":"ce1ff16de16d808e70b78f7d226fb905d6ea1798ae1db2afa59062fca26f141b"}
Apr 22 20:01:48.630018 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:48.629957 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.6299405350000002 podStartE2EDuration="2.629940535s" podCreationTimestamp="2026-04-22 20:01:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:01:48.628318893 +0000 UTC m=+201.263702532" watchObservedRunningTime="2026-04-22 20:01:48.629940535 +0000 UTC m=+201.265324175"
Apr 22 20:01:51.988402 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:01:51.988370 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:02:46.988907 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:02:46.988822 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:02:47.003762 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:02:47.003732 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:02:47.777322 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:02:47.777298 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 20:03:27.787480 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:03:27.787449 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log"
Apr 22 20:03:27.792120 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:03:27.792099 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log"
Apr 22 20:05:51.396703 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:05:51.396623 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-tgmg8"]
Apr 22 20:05:51.398636 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:05:51.398618 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-tgmg8"
Apr 22 20:05:51.400920 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:05:51.400894 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 22 20:05:51.400920 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:05:51.400915 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 22 20:05:51.401122 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:05:51.400950 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-4hhk5\""
Apr 22 20:05:51.401918 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:05:51.401902 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 22 20:05:51.405877 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:05:51.405718 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-tgmg8"]
Apr 22 20:05:51.468551 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:05:51.468525 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwpwt\" (UniqueName: \"kubernetes.io/projected/67856b18-c766-4866-9021-63a1c17b4c5e-kube-api-access-hwpwt\") pod \"s3-init-tgmg8\" (UID: \"67856b18-c766-4866-9021-63a1c17b4c5e\") " pod="kserve/s3-init-tgmg8"
Apr 22 20:05:51.569840 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:05:51.569814 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwpwt\" (UniqueName: \"kubernetes.io/projected/67856b18-c766-4866-9021-63a1c17b4c5e-kube-api-access-hwpwt\") pod \"s3-init-tgmg8\" (UID: \"67856b18-c766-4866-9021-63a1c17b4c5e\") " pod="kserve/s3-init-tgmg8"
Apr 22 20:05:51.578506 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:05:51.578483 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwpwt\" (UniqueName: \"kubernetes.io/projected/67856b18-c766-4866-9021-63a1c17b4c5e-kube-api-access-hwpwt\") pod \"s3-init-tgmg8\" (UID: \"67856b18-c766-4866-9021-63a1c17b4c5e\") " pod="kserve/s3-init-tgmg8"
Apr 22 20:05:51.725223 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:05:51.725168 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-tgmg8"
Apr 22 20:05:51.842671 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:05:51.842647 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-tgmg8"]
Apr 22 20:05:51.845083 ip-10-0-128-61 kubenswrapper[2574]: W0422 20:05:51.845055 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67856b18_c766_4866_9021_63a1c17b4c5e.slice/crio-b1bef161d18825b9cd8fdf67d81d5dad550303679514130faade2099a8e21679 WatchSource:0}: Error finding container b1bef161d18825b9cd8fdf67d81d5dad550303679514130faade2099a8e21679: Status 404 returned error can't find the container with id b1bef161d18825b9cd8fdf67d81d5dad550303679514130faade2099a8e21679
Apr 22 20:05:51.847125 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:05:51.847106 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 20:05:52.278353 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:05:52.278321 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-tgmg8" event={"ID":"67856b18-c766-4866-9021-63a1c17b4c5e","Type":"ContainerStarted","Data":"b1bef161d18825b9cd8fdf67d81d5dad550303679514130faade2099a8e21679"}
Apr 22 20:05:57.295506 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:05:57.295466 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-tgmg8" event={"ID":"67856b18-c766-4866-9021-63a1c17b4c5e","Type":"ContainerStarted","Data":"31da29c89e05b61d4866b751bb352059a15b031721faf158219529780b62e1f3"}
Apr 22 20:05:57.309666 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:05:57.309613 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-tgmg8" podStartSLOduration=1.881659799 podStartE2EDuration="6.309597911s" podCreationTimestamp="2026-04-22 20:05:51 +0000 UTC" firstStartedPulling="2026-04-22 20:05:51.847228415 +0000 UTC m=+444.482612032" lastFinishedPulling="2026-04-22 20:05:56.275166525 +0000 UTC m=+448.910550144" observedRunningTime="2026-04-22 20:05:57.309309701 +0000 UTC m=+449.944693340" watchObservedRunningTime="2026-04-22 20:05:57.309597911 +0000 UTC m=+449.944981551"
Apr 22 20:05:59.302653 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:05:59.302620 2574 generic.go:358] "Generic (PLEG): container finished" podID="67856b18-c766-4866-9021-63a1c17b4c5e" containerID="31da29c89e05b61d4866b751bb352059a15b031721faf158219529780b62e1f3" exitCode=0
Apr 22 20:05:59.302945 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:05:59.302679 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-tgmg8" event={"ID":"67856b18-c766-4866-9021-63a1c17b4c5e","Type":"ContainerDied","Data":"31da29c89e05b61d4866b751bb352059a15b031721faf158219529780b62e1f3"}
Apr 22 20:06:00.434595 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:00.434574 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-tgmg8"
Apr 22 20:06:00.545867 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:00.545846 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwpwt\" (UniqueName: \"kubernetes.io/projected/67856b18-c766-4866-9021-63a1c17b4c5e-kube-api-access-hwpwt\") pod \"67856b18-c766-4866-9021-63a1c17b4c5e\" (UID: \"67856b18-c766-4866-9021-63a1c17b4c5e\") "
Apr 22 20:06:00.547861 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:00.547835 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67856b18-c766-4866-9021-63a1c17b4c5e-kube-api-access-hwpwt" (OuterVolumeSpecName: "kube-api-access-hwpwt") pod "67856b18-c766-4866-9021-63a1c17b4c5e" (UID: "67856b18-c766-4866-9021-63a1c17b4c5e"). InnerVolumeSpecName "kube-api-access-hwpwt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 20:06:00.646616 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:00.646593 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hwpwt\" (UniqueName: \"kubernetes.io/projected/67856b18-c766-4866-9021-63a1c17b4c5e-kube-api-access-hwpwt\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\""
Apr 22 20:06:01.308785 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:01.308753 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-tgmg8"
Apr 22 20:06:01.308945 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:01.308753 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-tgmg8" event={"ID":"67856b18-c766-4866-9021-63a1c17b4c5e","Type":"ContainerDied","Data":"b1bef161d18825b9cd8fdf67d81d5dad550303679514130faade2099a8e21679"}
Apr 22 20:06:01.308945 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:01.308861 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1bef161d18825b9cd8fdf67d81d5dad550303679514130faade2099a8e21679"
Apr 22 20:06:07.776026 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:07.775984 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-vwjw4"]
Apr 22 20:06:07.776432 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:07.776328 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67856b18-c766-4866-9021-63a1c17b4c5e" containerName="s3-init"
Apr 22 20:06:07.776432 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:07.776343 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="67856b18-c766-4866-9021-63a1c17b4c5e" containerName="s3-init"
Apr 22 20:06:07.776432 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:07.776413 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="67856b18-c766-4866-9021-63a1c17b4c5e" containerName="s3-init"
Apr 22 20:06:07.779632 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:07.779611 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-vwjw4"
Apr 22 20:06:07.782406 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:07.782326 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 22 20:06:07.783254 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:07.783231 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\""
Apr 22 20:06:07.783338 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:07.783253 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 22 20:06:07.783794 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:07.783776 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 22 20:06:07.783794 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:07.783784 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-4hhk5\""
Apr 22 20:06:07.787624 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:07.787602 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-vwjw4"]
Apr 22 20:06:07.897109 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:07.897075 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/4e880133-18d7-49aa-9d13-27b4d1faee15-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-vwjw4\" (UID: \"4e880133-18d7-49aa-9d13-27b4d1faee15\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-vwjw4"
Apr 22 20:06:07.897236 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:07.897123 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4e880133-18d7-49aa-9d13-27b4d1faee15-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-vwjw4\" (UID: \"4e880133-18d7-49aa-9d13-27b4d1faee15\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-vwjw4"
Apr 22 20:06:07.897236 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:07.897207 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62ckg\" (UniqueName: \"kubernetes.io/projected/4e880133-18d7-49aa-9d13-27b4d1faee15-kube-api-access-62ckg\") pod \"seaweedfs-tls-custom-5c88b85bb7-vwjw4\" (UID: \"4e880133-18d7-49aa-9d13-27b4d1faee15\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-vwjw4"
Apr 22 20:06:07.997742 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:07.997710 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/4e880133-18d7-49aa-9d13-27b4d1faee15-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-vwjw4\" (UID: \"4e880133-18d7-49aa-9d13-27b4d1faee15\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-vwjw4"
Apr 22 20:06:07.997878 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:07.997749 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4e880133-18d7-49aa-9d13-27b4d1faee15-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-vwjw4\" (UID: \"4e880133-18d7-49aa-9d13-27b4d1faee15\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-vwjw4"
Apr 22 20:06:07.997878 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:07.997777 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62ckg\" (UniqueName: \"kubernetes.io/projected/4e880133-18d7-49aa-9d13-27b4d1faee15-kube-api-access-62ckg\") pod \"seaweedfs-tls-custom-5c88b85bb7-vwjw4\" (UID: \"4e880133-18d7-49aa-9d13-27b4d1faee15\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-vwjw4"
Apr 22 20:06:07.998080 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:07.998061 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/4e880133-18d7-49aa-9d13-27b4d1faee15-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-vwjw4\" (UID: \"4e880133-18d7-49aa-9d13-27b4d1faee15\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-vwjw4"
Apr 22 20:06:08.000263 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:08.000234 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/4e880133-18d7-49aa-9d13-27b4d1faee15-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-vwjw4\" (UID: \"4e880133-18d7-49aa-9d13-27b4d1faee15\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-vwjw4"
Apr 22 20:06:08.006374 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:08.006341 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62ckg\" (UniqueName: \"kubernetes.io/projected/4e880133-18d7-49aa-9d13-27b4d1faee15-kube-api-access-62ckg\") pod \"seaweedfs-tls-custom-5c88b85bb7-vwjw4\" (UID: \"4e880133-18d7-49aa-9d13-27b4d1faee15\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-vwjw4"
Apr 22 20:06:08.089130 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:08.089094 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-vwjw4"
Apr 22 20:06:08.205856 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:08.205826 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-vwjw4"]
Apr 22 20:06:08.208755 ip-10-0-128-61 kubenswrapper[2574]: W0422 20:06:08.208723 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e880133_18d7_49aa_9d13_27b4d1faee15.slice/crio-2575ef97842da6b929b8caf49be0244732a53e0d32c404ad00189ed94046f20e WatchSource:0}: Error finding container 2575ef97842da6b929b8caf49be0244732a53e0d32c404ad00189ed94046f20e: Status 404 returned error can't find the container with id 2575ef97842da6b929b8caf49be0244732a53e0d32c404ad00189ed94046f20e
Apr 22 20:06:08.329126 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:08.329089 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-vwjw4" event={"ID":"4e880133-18d7-49aa-9d13-27b4d1faee15","Type":"ContainerStarted","Data":"2575ef97842da6b929b8caf49be0244732a53e0d32c404ad00189ed94046f20e"}
Apr 22 20:06:11.340605 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:11.340011 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-vwjw4" event={"ID":"4e880133-18d7-49aa-9d13-27b4d1faee15","Type":"ContainerStarted","Data":"20db4c572ac505f5225a619f52dd705ff6033bada46a28c0c31cb77f9707e0b7"}
Apr 22 20:06:11.357032 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:11.356979 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-vwjw4" podStartSLOduration=1.914546965 podStartE2EDuration="4.356963874s" podCreationTimestamp="2026-04-22 20:06:07 +0000 UTC" firstStartedPulling="2026-04-22 20:06:08.209955296 +0000 UTC m=+460.845338918" lastFinishedPulling="2026-04-22 20:06:10.65237221 +0000 UTC m=+463.287755827" observedRunningTime="2026-04-22 20:06:11.356457497 +0000 UTC m=+463.991841137" watchObservedRunningTime="2026-04-22 20:06:11.356963874 +0000 UTC m=+463.992347515"
Apr 22 20:06:11.637751 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:11.637662 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-cdkxx"]
Apr 22 20:06:11.642122 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:11.642098 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-cdkxx"
Apr 22 20:06:11.647164 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:11.647136 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-cdkxx"]
Apr 22 20:06:11.730329 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:11.730282 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cphn8\" (UniqueName: \"kubernetes.io/projected/15611ff2-e7f2-4520-a154-46d0fcb360b8-kube-api-access-cphn8\") pod \"s3-tls-init-custom-cdkxx\" (UID: \"15611ff2-e7f2-4520-a154-46d0fcb360b8\") " pod="kserve/s3-tls-init-custom-cdkxx"
Apr 22 20:06:11.831286 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:11.831247 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cphn8\" (UniqueName: \"kubernetes.io/projected/15611ff2-e7f2-4520-a154-46d0fcb360b8-kube-api-access-cphn8\") pod \"s3-tls-init-custom-cdkxx\" (UID: \"15611ff2-e7f2-4520-a154-46d0fcb360b8\") " pod="kserve/s3-tls-init-custom-cdkxx"
Apr 22 20:06:11.839457 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:11.839426 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cphn8\" (UniqueName: \"kubernetes.io/projected/15611ff2-e7f2-4520-a154-46d0fcb360b8-kube-api-access-cphn8\") pod \"s3-tls-init-custom-cdkxx\" (UID: \"15611ff2-e7f2-4520-a154-46d0fcb360b8\") " pod="kserve/s3-tls-init-custom-cdkxx"
Apr 22 20:06:11.965148 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:11.965047 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-cdkxx"
Apr 22 20:06:12.086509 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:12.086468 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-cdkxx"]
Apr 22 20:06:12.089175 ip-10-0-128-61 kubenswrapper[2574]: W0422 20:06:12.089150 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15611ff2_e7f2_4520_a154_46d0fcb360b8.slice/crio-1eb1068d947eec2ce0c061bc4cf81026dea4e54b866f557b5ecc9a585b3fb727 WatchSource:0}: Error finding container 1eb1068d947eec2ce0c061bc4cf81026dea4e54b866f557b5ecc9a585b3fb727: Status 404 returned error can't find the container with id 1eb1068d947eec2ce0c061bc4cf81026dea4e54b866f557b5ecc9a585b3fb727
Apr 22 20:06:12.344972 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:12.344938 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-cdkxx" event={"ID":"15611ff2-e7f2-4520-a154-46d0fcb360b8","Type":"ContainerStarted","Data":"d36709303a0a2d38d32d3ca33987c317afda96b4300555e5b98f956da78697b9"}
Apr 22 20:06:12.344972 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:12.344976 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-cdkxx" event={"ID":"15611ff2-e7f2-4520-a154-46d0fcb360b8","Type":"ContainerStarted","Data":"1eb1068d947eec2ce0c061bc4cf81026dea4e54b866f557b5ecc9a585b3fb727"}
Apr 22 20:06:12.360740 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:12.360683 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-cdkxx" podStartSLOduration=1.360667732 podStartE2EDuration="1.360667732s" podCreationTimestamp="2026-04-22 20:06:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:06:12.359279503 +0000 UTC m=+464.994663146" watchObservedRunningTime="2026-04-22 20:06:12.360667732 +0000 UTC m=+464.996051371"
Apr 22 20:06:17.361785 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:17.361744 2574 generic.go:358] "Generic (PLEG): container finished" podID="15611ff2-e7f2-4520-a154-46d0fcb360b8" containerID="d36709303a0a2d38d32d3ca33987c317afda96b4300555e5b98f956da78697b9" exitCode=0
Apr 22 20:06:17.362167 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:17.361819 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-cdkxx" event={"ID":"15611ff2-e7f2-4520-a154-46d0fcb360b8","Type":"ContainerDied","Data":"d36709303a0a2d38d32d3ca33987c317afda96b4300555e5b98f956da78697b9"}
Apr 22 20:06:18.492477 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:18.492447 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-cdkxx"
Apr 22 20:06:18.593606 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:18.593565 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cphn8\" (UniqueName: \"kubernetes.io/projected/15611ff2-e7f2-4520-a154-46d0fcb360b8-kube-api-access-cphn8\") pod \"15611ff2-e7f2-4520-a154-46d0fcb360b8\" (UID: \"15611ff2-e7f2-4520-a154-46d0fcb360b8\") "
Apr 22 20:06:18.595657 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:18.595630 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15611ff2-e7f2-4520-a154-46d0fcb360b8-kube-api-access-cphn8" (OuterVolumeSpecName: "kube-api-access-cphn8") pod "15611ff2-e7f2-4520-a154-46d0fcb360b8" (UID: "15611ff2-e7f2-4520-a154-46d0fcb360b8"). InnerVolumeSpecName "kube-api-access-cphn8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 20:06:18.694731 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:18.694641 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cphn8\" (UniqueName: \"kubernetes.io/projected/15611ff2-e7f2-4520-a154-46d0fcb360b8-kube-api-access-cphn8\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\""
Apr 22 20:06:19.369674 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:19.369647 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-cdkxx"
Apr 22 20:06:19.369846 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:19.369645 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-cdkxx" event={"ID":"15611ff2-e7f2-4520-a154-46d0fcb360b8","Type":"ContainerDied","Data":"1eb1068d947eec2ce0c061bc4cf81026dea4e54b866f557b5ecc9a585b3fb727"}
Apr 22 20:06:19.369846 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:19.369753 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1eb1068d947eec2ce0c061bc4cf81026dea4e54b866f557b5ecc9a585b3fb727"
Apr 22 20:06:22.275877 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:22.275841 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-nzstf"]
Apr 22 20:06:22.276290 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:22.276158 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15611ff2-e7f2-4520-a154-46d0fcb360b8" containerName="s3-tls-init-custom"
Apr 22 20:06:22.276290 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:22.276169 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="15611ff2-e7f2-4520-a154-46d0fcb360b8" containerName="s3-tls-init-custom"
Apr 22 20:06:22.276290 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:22.276222 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="15611ff2-e7f2-4520-a154-46d0fcb360b8" containerName="s3-tls-init-custom"
Apr 22
20:06:22.279208 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:22.279189 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-nzstf" Apr 22 20:06:22.281570 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:22.281547 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 22 20:06:22.284595 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:22.284570 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-nzstf"] Apr 22 20:06:22.425650 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:22.425594 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58gx4\" (UniqueName: \"kubernetes.io/projected/9a0d73ad-15bd-46d8-9ac3-ec2b3a7909e6-kube-api-access-58gx4\") pod \"s3-tls-init-serving-nzstf\" (UID: \"9a0d73ad-15bd-46d8-9ac3-ec2b3a7909e6\") " pod="kserve/s3-tls-init-serving-nzstf" Apr 22 20:06:22.526562 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:22.526471 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58gx4\" (UniqueName: \"kubernetes.io/projected/9a0d73ad-15bd-46d8-9ac3-ec2b3a7909e6-kube-api-access-58gx4\") pod \"s3-tls-init-serving-nzstf\" (UID: \"9a0d73ad-15bd-46d8-9ac3-ec2b3a7909e6\") " pod="kserve/s3-tls-init-serving-nzstf" Apr 22 20:06:22.537145 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:22.537119 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58gx4\" (UniqueName: \"kubernetes.io/projected/9a0d73ad-15bd-46d8-9ac3-ec2b3a7909e6-kube-api-access-58gx4\") pod \"s3-tls-init-serving-nzstf\" (UID: \"9a0d73ad-15bd-46d8-9ac3-ec2b3a7909e6\") " pod="kserve/s3-tls-init-serving-nzstf" Apr 22 20:06:22.600098 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:22.600060 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-nzstf" Apr 22 20:06:22.743800 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:22.743586 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-nzstf"] Apr 22 20:06:22.746407 ip-10-0-128-61 kubenswrapper[2574]: W0422 20:06:22.746378 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a0d73ad_15bd_46d8_9ac3_ec2b3a7909e6.slice/crio-3f7841176fa9eb38adf87669c310c93b66c84b1033bd0875631d3eb8b3b9e6e2 WatchSource:0}: Error finding container 3f7841176fa9eb38adf87669c310c93b66c84b1033bd0875631d3eb8b3b9e6e2: Status 404 returned error can't find the container with id 3f7841176fa9eb38adf87669c310c93b66c84b1033bd0875631d3eb8b3b9e6e2 Apr 22 20:06:23.383898 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:23.383859 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-nzstf" event={"ID":"9a0d73ad-15bd-46d8-9ac3-ec2b3a7909e6","Type":"ContainerStarted","Data":"22747ee034486ed4893ea5caf0d1816e7f7c5a205253265727743be9b5c306a3"} Apr 22 20:06:23.383898 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:23.383895 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-nzstf" event={"ID":"9a0d73ad-15bd-46d8-9ac3-ec2b3a7909e6","Type":"ContainerStarted","Data":"3f7841176fa9eb38adf87669c310c93b66c84b1033bd0875631d3eb8b3b9e6e2"} Apr 22 20:06:23.399919 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:23.399859 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-serving-nzstf" podStartSLOduration=1.399838235 podStartE2EDuration="1.399838235s" podCreationTimestamp="2026-04-22 20:06:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 20:06:23.397666013 +0000 UTC m=+476.033049647" watchObservedRunningTime="2026-04-22 
20:06:23.399838235 +0000 UTC m=+476.035221877" Apr 22 20:06:26.394427 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:26.394387 2574 generic.go:358] "Generic (PLEG): container finished" podID="9a0d73ad-15bd-46d8-9ac3-ec2b3a7909e6" containerID="22747ee034486ed4893ea5caf0d1816e7f7c5a205253265727743be9b5c306a3" exitCode=0 Apr 22 20:06:26.394755 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:26.394450 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-nzstf" event={"ID":"9a0d73ad-15bd-46d8-9ac3-ec2b3a7909e6","Type":"ContainerDied","Data":"22747ee034486ed4893ea5caf0d1816e7f7c5a205253265727743be9b5c306a3"} Apr 22 20:06:27.514611 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:27.514583 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-nzstf" Apr 22 20:06:27.668202 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:27.668127 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58gx4\" (UniqueName: \"kubernetes.io/projected/9a0d73ad-15bd-46d8-9ac3-ec2b3a7909e6-kube-api-access-58gx4\") pod \"9a0d73ad-15bd-46d8-9ac3-ec2b3a7909e6\" (UID: \"9a0d73ad-15bd-46d8-9ac3-ec2b3a7909e6\") " Apr 22 20:06:27.670104 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:27.670067 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a0d73ad-15bd-46d8-9ac3-ec2b3a7909e6-kube-api-access-58gx4" (OuterVolumeSpecName: "kube-api-access-58gx4") pod "9a0d73ad-15bd-46d8-9ac3-ec2b3a7909e6" (UID: "9a0d73ad-15bd-46d8-9ac3-ec2b3a7909e6"). InnerVolumeSpecName "kube-api-access-58gx4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 20:06:27.769303 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:27.769275 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-58gx4\" (UniqueName: \"kubernetes.io/projected/9a0d73ad-15bd-46d8-9ac3-ec2b3a7909e6-kube-api-access-58gx4\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\"" Apr 22 20:06:28.400451 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:28.400411 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-nzstf" event={"ID":"9a0d73ad-15bd-46d8-9ac3-ec2b3a7909e6","Type":"ContainerDied","Data":"3f7841176fa9eb38adf87669c310c93b66c84b1033bd0875631d3eb8b3b9e6e2"} Apr 22 20:06:28.400451 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:28.400461 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f7841176fa9eb38adf87669c310c93b66c84b1033bd0875631d3eb8b3b9e6e2" Apr 22 20:06:28.400451 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:06:28.400423 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-nzstf" Apr 22 20:08:27.807639 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:08:27.807609 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log" Apr 22 20:08:27.813034 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:08:27.813004 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log" Apr 22 20:13:27.827677 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:13:27.827599 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log" Apr 22 20:13:27.834048 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:13:27.834024 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log" Apr 22 20:18:27.848132 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:18:27.848102 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log" Apr 22 20:18:27.855235 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:18:27.855213 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log" Apr 22 20:23:27.869274 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:23:27.869242 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log" Apr 22 20:23:27.876571 ip-10-0-128-61 kubenswrapper[2574]: I0422 
20:23:27.876546 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log" Apr 22 20:28:27.889793 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:28:27.889766 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log" Apr 22 20:28:27.906750 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:28:27.906728 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log" Apr 22 20:33:27.917171 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:33:27.917146 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log" Apr 22 20:33:27.927990 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:33:27.927963 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log" Apr 22 20:38:27.938327 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:38:27.938302 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log" Apr 22 20:38:27.949700 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:38:27.949676 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log" Apr 22 20:43:27.959844 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:43:27.959761 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log" Apr 22 20:43:27.973420 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:43:27.973398 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log" Apr 22 20:48:27.980320 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:48:27.980292 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log" Apr 22 20:48:27.998940 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:48:27.998914 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log" Apr 22 20:53:28.001742 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:53:28.001711 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log" Apr 22 20:53:28.026848 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:53:28.026821 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log" Apr 22 20:58:28.032433 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:58:28.032402 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log" Apr 22 20:58:28.049968 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:58:28.049945 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log" Apr 22 20:59:30.591211 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:30.591177 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qn67f/must-gather-p8pgh"] Apr 22 20:59:30.591769 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:30.591514 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a0d73ad-15bd-46d8-9ac3-ec2b3a7909e6" containerName="s3-tls-init-serving" Apr 22 20:59:30.591769 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:30.591530 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a0d73ad-15bd-46d8-9ac3-ec2b3a7909e6" containerName="s3-tls-init-serving" Apr 22 20:59:30.591769 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:30.591584 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a0d73ad-15bd-46d8-9ac3-ec2b3a7909e6" containerName="s3-tls-init-serving" Apr 22 20:59:30.594473 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:30.594456 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qn67f/must-gather-p8pgh" Apr 22 20:59:30.596938 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:30.596916 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qn67f\"/\"kube-root-ca.crt\"" Apr 22 20:59:30.596938 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:30.596930 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qn67f\"/\"openshift-service-ca.crt\"" Apr 22 20:59:30.600906 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:30.600879 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qn67f/must-gather-p8pgh"] Apr 22 20:59:30.728758 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:30.728731 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58jmm\" (UniqueName: \"kubernetes.io/projected/8ffda091-c78c-454c-a956-bd7fb9751e25-kube-api-access-58jmm\") pod \"must-gather-p8pgh\" (UID: \"8ffda091-c78c-454c-a956-bd7fb9751e25\") " pod="openshift-must-gather-qn67f/must-gather-p8pgh" Apr 22 20:59:30.728851 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:30.728765 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8ffda091-c78c-454c-a956-bd7fb9751e25-must-gather-output\") pod \"must-gather-p8pgh\" (UID: \"8ffda091-c78c-454c-a956-bd7fb9751e25\") " pod="openshift-must-gather-qn67f/must-gather-p8pgh" Apr 22 20:59:30.830101 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:30.830079 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8ffda091-c78c-454c-a956-bd7fb9751e25-must-gather-output\") pod \"must-gather-p8pgh\" (UID: \"8ffda091-c78c-454c-a956-bd7fb9751e25\") " pod="openshift-must-gather-qn67f/must-gather-p8pgh" Apr 22 
20:59:30.830180 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:30.830140 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-58jmm\" (UniqueName: \"kubernetes.io/projected/8ffda091-c78c-454c-a956-bd7fb9751e25-kube-api-access-58jmm\") pod \"must-gather-p8pgh\" (UID: \"8ffda091-c78c-454c-a956-bd7fb9751e25\") " pod="openshift-must-gather-qn67f/must-gather-p8pgh" Apr 22 20:59:30.830480 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:30.830453 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8ffda091-c78c-454c-a956-bd7fb9751e25-must-gather-output\") pod \"must-gather-p8pgh\" (UID: \"8ffda091-c78c-454c-a956-bd7fb9751e25\") " pod="openshift-must-gather-qn67f/must-gather-p8pgh" Apr 22 20:59:30.838682 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:30.838658 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-58jmm\" (UniqueName: \"kubernetes.io/projected/8ffda091-c78c-454c-a956-bd7fb9751e25-kube-api-access-58jmm\") pod \"must-gather-p8pgh\" (UID: \"8ffda091-c78c-454c-a956-bd7fb9751e25\") " pod="openshift-must-gather-qn67f/must-gather-p8pgh" Apr 22 20:59:30.913226 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:30.913174 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qn67f/must-gather-p8pgh" Apr 22 20:59:31.030020 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:31.029997 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qn67f/must-gather-p8pgh"] Apr 22 20:59:31.032638 ip-10-0-128-61 kubenswrapper[2574]: W0422 20:59:31.032607 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ffda091_c78c_454c_a956_bd7fb9751e25.slice/crio-a42b0619d565846501801973bc3d45f169c0d2a417adb1731891bc11f9b226a3 WatchSource:0}: Error finding container a42b0619d565846501801973bc3d45f169c0d2a417adb1731891bc11f9b226a3: Status 404 returned error can't find the container with id a42b0619d565846501801973bc3d45f169c0d2a417adb1731891bc11f9b226a3 Apr 22 20:59:31.034227 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:31.034208 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 20:59:31.493444 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:31.493415 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qn67f/must-gather-p8pgh" event={"ID":"8ffda091-c78c-454c-a956-bd7fb9751e25","Type":"ContainerStarted","Data":"a42b0619d565846501801973bc3d45f169c0d2a417adb1731891bc11f9b226a3"} Apr 22 20:59:35.508791 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:35.508702 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qn67f/must-gather-p8pgh" event={"ID":"8ffda091-c78c-454c-a956-bd7fb9751e25","Type":"ContainerStarted","Data":"584cd63fca48cbbd38ae75bb8f698ff5c99d9aae5f46ea459574af20cfce3ea4"} Apr 22 20:59:35.508791 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:35.508741 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qn67f/must-gather-p8pgh" 
event={"ID":"8ffda091-c78c-454c-a956-bd7fb9751e25","Type":"ContainerStarted","Data":"45fab98f85be704646dd6430b9e74c9775fae3b8eb68d72f9d4be8504e9e13ea"} Apr 22 20:59:35.524083 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:35.524038 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qn67f/must-gather-p8pgh" podStartSLOduration=1.481342161 podStartE2EDuration="5.524024676s" podCreationTimestamp="2026-04-22 20:59:30 +0000 UTC" firstStartedPulling="2026-04-22 20:59:31.034396115 +0000 UTC m=+3663.669779731" lastFinishedPulling="2026-04-22 20:59:35.077078621 +0000 UTC m=+3667.712462246" observedRunningTime="2026-04-22 20:59:35.522568928 +0000 UTC m=+3668.157952578" watchObservedRunningTime="2026-04-22 20:59:35.524024676 +0000 UTC m=+3668.159408315" Apr 22 20:59:54.567182 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:54.567151 2574 generic.go:358] "Generic (PLEG): container finished" podID="8ffda091-c78c-454c-a956-bd7fb9751e25" containerID="45fab98f85be704646dd6430b9e74c9775fae3b8eb68d72f9d4be8504e9e13ea" exitCode=0 Apr 22 20:59:54.567588 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:54.567225 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qn67f/must-gather-p8pgh" event={"ID":"8ffda091-c78c-454c-a956-bd7fb9751e25","Type":"ContainerDied","Data":"45fab98f85be704646dd6430b9e74c9775fae3b8eb68d72f9d4be8504e9e13ea"} Apr 22 20:59:54.567588 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:54.567533 2574 scope.go:117] "RemoveContainer" containerID="45fab98f85be704646dd6430b9e74c9775fae3b8eb68d72f9d4be8504e9e13ea" Apr 22 20:59:55.009051 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:55.008979 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qn67f_must-gather-p8pgh_8ffda091-c78c-454c-a956-bd7fb9751e25/gather/0.log" Apr 22 20:59:58.433126 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:58.433092 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-kdnth_c2238649-8094-4f67-abfd-33276e6b9b3a/global-pull-secret-syncer/0.log" Apr 22 20:59:58.559456 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:58.559427 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-z2q6l_abb9a485-04a7-4c17-a721-ef4a0635e91f/konnectivity-agent/0.log" Apr 22 20:59:58.631051 ip-10-0-128-61 kubenswrapper[2574]: I0422 20:59:58.631023 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-61.ec2.internal_38ecb0e569cd2e23a0f7c4c92d20d963/haproxy/0.log" Apr 22 21:00:00.458015 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:00.457983 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qn67f/must-gather-p8pgh"] Apr 22 21:00:00.458388 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:00.458199 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-qn67f/must-gather-p8pgh" podUID="8ffda091-c78c-454c-a956-bd7fb9751e25" containerName="copy" containerID="cri-o://584cd63fca48cbbd38ae75bb8f698ff5c99d9aae5f46ea459574af20cfce3ea4" gracePeriod=2 Apr 22 21:00:00.462897 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:00.462875 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qn67f/must-gather-p8pgh"] Apr 22 21:00:00.591601 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:00.591577 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qn67f_must-gather-p8pgh_8ffda091-c78c-454c-a956-bd7fb9751e25/copy/0.log" Apr 22 21:00:00.591965 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:00.591933 2574 generic.go:358] "Generic (PLEG): container finished" podID="8ffda091-c78c-454c-a956-bd7fb9751e25" containerID="584cd63fca48cbbd38ae75bb8f698ff5c99d9aae5f46ea459574af20cfce3ea4" exitCode=143 Apr 22 21:00:00.681039 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:00.681020 2574 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-must-gather-qn67f_must-gather-p8pgh_8ffda091-c78c-454c-a956-bd7fb9751e25/copy/0.log" Apr 22 21:00:00.681415 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:00.681396 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qn67f/must-gather-p8pgh" Apr 22 21:00:00.683389 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:00.683366 2574 status_manager.go:895] "Failed to get status for pod" podUID="8ffda091-c78c-454c-a956-bd7fb9751e25" pod="openshift-must-gather-qn67f/must-gather-p8pgh" err="pods \"must-gather-p8pgh\" is forbidden: User \"system:node:ip-10-0-128-61.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-qn67f\": no relationship found between node 'ip-10-0-128-61.ec2.internal' and this object" Apr 22 21:00:00.792248 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:00.792189 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58jmm\" (UniqueName: \"kubernetes.io/projected/8ffda091-c78c-454c-a956-bd7fb9751e25-kube-api-access-58jmm\") pod \"8ffda091-c78c-454c-a956-bd7fb9751e25\" (UID: \"8ffda091-c78c-454c-a956-bd7fb9751e25\") " Apr 22 21:00:00.792248 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:00.792232 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8ffda091-c78c-454c-a956-bd7fb9751e25-must-gather-output\") pod \"8ffda091-c78c-454c-a956-bd7fb9751e25\" (UID: \"8ffda091-c78c-454c-a956-bd7fb9751e25\") " Apr 22 21:00:00.793721 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:00.793695 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ffda091-c78c-454c-a956-bd7fb9751e25-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "8ffda091-c78c-454c-a956-bd7fb9751e25" (UID: 
"8ffda091-c78c-454c-a956-bd7fb9751e25"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 21:00:00.794287 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:00.794262 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ffda091-c78c-454c-a956-bd7fb9751e25-kube-api-access-58jmm" (OuterVolumeSpecName: "kube-api-access-58jmm") pod "8ffda091-c78c-454c-a956-bd7fb9751e25" (UID: "8ffda091-c78c-454c-a956-bd7fb9751e25"). InnerVolumeSpecName "kube-api-access-58jmm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 21:00:00.892891 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:00.892866 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-58jmm\" (UniqueName: \"kubernetes.io/projected/8ffda091-c78c-454c-a956-bd7fb9751e25-kube-api-access-58jmm\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\""
Apr 22 21:00:00.892891 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:00.892889 2574 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8ffda091-c78c-454c-a956-bd7fb9751e25-must-gather-output\") on node \"ip-10-0-128-61.ec2.internal\" DevicePath \"\""
Apr 22 21:00:01.595992 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:01.595964 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qn67f_must-gather-p8pgh_8ffda091-c78c-454c-a956-bd7fb9751e25/copy/0.log"
Apr 22 21:00:01.596439 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:01.596262 2574 scope.go:117] "RemoveContainer" containerID="584cd63fca48cbbd38ae75bb8f698ff5c99d9aae5f46ea459574af20cfce3ea4"
Apr 22 21:00:01.596439 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:01.596282 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qn67f/must-gather-p8pgh"
Apr 22 21:00:01.598521 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:01.598489 2574 status_manager.go:895] "Failed to get status for pod" podUID="8ffda091-c78c-454c-a956-bd7fb9751e25" pod="openshift-must-gather-qn67f/must-gather-p8pgh" err="pods \"must-gather-p8pgh\" is forbidden: User \"system:node:ip-10-0-128-61.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-qn67f\": no relationship found between node 'ip-10-0-128-61.ec2.internal' and this object"
Apr 22 21:00:01.604298 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:01.604280 2574 scope.go:117] "RemoveContainer" containerID="45fab98f85be704646dd6430b9e74c9775fae3b8eb68d72f9d4be8504e9e13ea"
Apr 22 21:00:01.606559 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:01.606533 2574 status_manager.go:895] "Failed to get status for pod" podUID="8ffda091-c78c-454c-a956-bd7fb9751e25" pod="openshift-must-gather-qn67f/must-gather-p8pgh" err="pods \"must-gather-p8pgh\" is forbidden: User \"system:node:ip-10-0-128-61.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-qn67f\": no relationship found between node 'ip-10-0-128-61.ec2.internal' and this object"
Apr 22 21:00:01.892452 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:01.892380 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ffda091-c78c-454c-a956-bd7fb9751e25" path="/var/lib/kubelet/pods/8ffda091-c78c-454c-a956-bd7fb9751e25/volumes"
Apr 22 21:00:02.048425 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:02.048389 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-c9bk2_be2244ac-77cc-4970-bda6-135ba736f55c/cluster-monitoring-operator/0.log"
Apr 22 21:00:02.070062 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:02.070036 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-pmwbs_446e489c-3e3e-41c2-b640-067654480e5c/kube-state-metrics/0.log"
Apr 22 21:00:02.091127 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:02.091106 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-pmwbs_446e489c-3e3e-41c2-b640-067654480e5c/kube-rbac-proxy-main/0.log"
Apr 22 21:00:02.118333 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:02.118313 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-pmwbs_446e489c-3e3e-41c2-b640-067654480e5c/kube-rbac-proxy-self/0.log"
Apr 22 21:00:02.257553 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:02.257484 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h4qgm_53cf1257-d4b4-4aa5-95ad-282a875175a9/node-exporter/0.log"
Apr 22 21:00:02.276728 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:02.276705 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h4qgm_53cf1257-d4b4-4aa5-95ad-282a875175a9/kube-rbac-proxy/0.log"
Apr 22 21:00:02.294816 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:02.294799 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h4qgm_53cf1257-d4b4-4aa5-95ad-282a875175a9/init-textfile/0.log"
Apr 22 21:00:02.465470 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:02.465452 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_abed7e62-f018-4bfc-97ca-4168a22c9b00/prometheus/0.log"
Apr 22 21:00:02.482984 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:02.482965 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_abed7e62-f018-4bfc-97ca-4168a22c9b00/config-reloader/0.log"
Apr 22 21:00:02.502094 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:02.502077 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_abed7e62-f018-4bfc-97ca-4168a22c9b00/thanos-sidecar/0.log"
Apr 22 21:00:02.521750 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:02.521698 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_abed7e62-f018-4bfc-97ca-4168a22c9b00/kube-rbac-proxy-web/0.log"
Apr 22 21:00:02.542451 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:02.542432 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_abed7e62-f018-4bfc-97ca-4168a22c9b00/kube-rbac-proxy/0.log"
Apr 22 21:00:02.560287 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:02.560269 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_abed7e62-f018-4bfc-97ca-4168a22c9b00/kube-rbac-proxy-thanos/0.log"
Apr 22 21:00:02.578439 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:02.578425 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_abed7e62-f018-4bfc-97ca-4168a22c9b00/init-config-reloader/0.log"
Apr 22 21:00:02.602099 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:02.602083 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-pct54_b301d43e-1b56-4056-b42e-684724967abb/prometheus-operator/0.log"
Apr 22 21:00:02.619909 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:02.619891 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-pct54_b301d43e-1b56-4056-b42e-684724967abb/kube-rbac-proxy/0.log"
Apr 22 21:00:02.732953 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:02.732937 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5559fc8748-cqjgv_c1ea0334-75a8-451e-b739-774bae8cf624/thanos-query/0.log"
Apr 22 21:00:02.753029 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:02.753015 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5559fc8748-cqjgv_c1ea0334-75a8-451e-b739-774bae8cf624/kube-rbac-proxy-web/0.log"
Apr 22 21:00:02.774294 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:02.774243 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5559fc8748-cqjgv_c1ea0334-75a8-451e-b739-774bae8cf624/kube-rbac-proxy/0.log"
Apr 22 21:00:02.791822 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:02.791808 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5559fc8748-cqjgv_c1ea0334-75a8-451e-b739-774bae8cf624/prom-label-proxy/0.log"
Apr 22 21:00:02.809963 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:02.809946 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5559fc8748-cqjgv_c1ea0334-75a8-451e-b739-774bae8cf624/kube-rbac-proxy-rules/0.log"
Apr 22 21:00:02.827674 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:02.827660 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5559fc8748-cqjgv_c1ea0334-75a8-451e-b739-774bae8cf624/kube-rbac-proxy-metrics/0.log"
Apr 22 21:00:04.006165 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:04.006138 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-j65gx_421f7c2c-ae76-458c-aaf2-422f5b7a1f27/networking-console-plugin/0.log"
Apr 22 21:00:04.447133 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:04.447100 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/2.log"
Apr 22 21:00:04.450517 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:04.450497 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-5hfh2_e9453f8c-04f5-4b72-b26e-c5ccc3bfed06/console-operator/3.log"
Apr 22 21:00:05.937180 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:05.937145 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-x2267_41de400a-d8c0-4987-9f2a-ec97460903ec/dns/0.log"
Apr 22 21:00:05.954913 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:05.954891 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-x2267_41de400a-d8c0-4987-9f2a-ec97460903ec/kube-rbac-proxy/0.log"
Apr 22 21:00:05.973864 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:05.973840 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-s75m7_64949cb3-7087-4f51-8a7a-81b46c0895c9/dns-node-resolver/0.log"
Apr 22 21:00:06.355797 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.355760 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mrjvk/perf-node-gather-daemonset-x2bxk"]
Apr 22 21:00:06.356141 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.356123 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ffda091-c78c-454c-a956-bd7fb9751e25" containerName="copy"
Apr 22 21:00:06.356223 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.356143 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ffda091-c78c-454c-a956-bd7fb9751e25" containerName="copy"
Apr 22 21:00:06.356223 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.356159 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ffda091-c78c-454c-a956-bd7fb9751e25" containerName="gather"
Apr 22 21:00:06.356223 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.356166 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ffda091-c78c-454c-a956-bd7fb9751e25" containerName="gather"
Apr 22 21:00:06.356407 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.356258 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ffda091-c78c-454c-a956-bd7fb9751e25" containerName="copy"
Apr 22 21:00:06.356407 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.356271 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ffda091-c78c-454c-a956-bd7fb9751e25" containerName="gather"
Apr 22 21:00:06.359979 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.359958 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-x2bxk"
Apr 22 21:00:06.362395 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.362353 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mrjvk\"/\"openshift-service-ca.crt\""
Apr 22 21:00:06.364068 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.363568 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mrjvk\"/\"kube-root-ca.crt\""
Apr 22 21:00:06.364273 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.364254 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-mrjvk\"/\"default-dockercfg-l7rpd\""
Apr 22 21:00:06.365694 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.365674 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mrjvk/perf-node-gather-daemonset-x2bxk"]
Apr 22 21:00:06.430567 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.430546 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-798f685885-kbmjw_8e4a343e-1122-478c-8882-f7bdc03c0cb4/registry/0.log"
Apr 22 21:00:06.437197 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.437178 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4df3dffd-5ea3-414a-a73e-10820d09de6e-podres\") pod \"perf-node-gather-daemonset-x2bxk\" (UID: \"4df3dffd-5ea3-414a-a73e-10820d09de6e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-x2bxk"
Apr 22 21:00:06.437266 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.437232 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4df3dffd-5ea3-414a-a73e-10820d09de6e-sys\") pod \"perf-node-gather-daemonset-x2bxk\" (UID: \"4df3dffd-5ea3-414a-a73e-10820d09de6e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-x2bxk"
Apr 22 21:00:06.437266 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.437256 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pggx2\" (UniqueName: \"kubernetes.io/projected/4df3dffd-5ea3-414a-a73e-10820d09de6e-kube-api-access-pggx2\") pod \"perf-node-gather-daemonset-x2bxk\" (UID: \"4df3dffd-5ea3-414a-a73e-10820d09de6e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-x2bxk"
Apr 22 21:00:06.437337 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.437313 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4df3dffd-5ea3-414a-a73e-10820d09de6e-proc\") pod \"perf-node-gather-daemonset-x2bxk\" (UID: \"4df3dffd-5ea3-414a-a73e-10820d09de6e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-x2bxk"
Apr 22 21:00:06.437388 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.437340 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4df3dffd-5ea3-414a-a73e-10820d09de6e-lib-modules\") pod \"perf-node-gather-daemonset-x2bxk\" (UID: \"4df3dffd-5ea3-414a-a73e-10820d09de6e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-x2bxk"
Apr 22 21:00:06.537695 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.537670 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4df3dffd-5ea3-414a-a73e-10820d09de6e-sys\") pod \"perf-node-gather-daemonset-x2bxk\" (UID: \"4df3dffd-5ea3-414a-a73e-10820d09de6e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-x2bxk"
Apr 22 21:00:06.537695 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.537695 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-m58j4_77f8f3a1-aebf-4a43-97ef-0a217a8920be/node-ca/0.log"
Apr 22 21:00:06.537868 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.537705 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pggx2\" (UniqueName: \"kubernetes.io/projected/4df3dffd-5ea3-414a-a73e-10820d09de6e-kube-api-access-pggx2\") pod \"perf-node-gather-daemonset-x2bxk\" (UID: \"4df3dffd-5ea3-414a-a73e-10820d09de6e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-x2bxk"
Apr 22 21:00:06.537868 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.537764 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4df3dffd-5ea3-414a-a73e-10820d09de6e-proc\") pod \"perf-node-gather-daemonset-x2bxk\" (UID: \"4df3dffd-5ea3-414a-a73e-10820d09de6e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-x2bxk"
Apr 22 21:00:06.537868 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.537785 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4df3dffd-5ea3-414a-a73e-10820d09de6e-sys\") pod \"perf-node-gather-daemonset-x2bxk\" (UID: \"4df3dffd-5ea3-414a-a73e-10820d09de6e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-x2bxk"
Apr 22 21:00:06.537868 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.537789 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4df3dffd-5ea3-414a-a73e-10820d09de6e-lib-modules\") pod \"perf-node-gather-daemonset-x2bxk\" (UID: \"4df3dffd-5ea3-414a-a73e-10820d09de6e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-x2bxk"
Apr 22 21:00:06.537868 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.537826 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4df3dffd-5ea3-414a-a73e-10820d09de6e-proc\") pod \"perf-node-gather-daemonset-x2bxk\" (UID: \"4df3dffd-5ea3-414a-a73e-10820d09de6e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-x2bxk"
Apr 22 21:00:06.538088 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.537882 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4df3dffd-5ea3-414a-a73e-10820d09de6e-podres\") pod \"perf-node-gather-daemonset-x2bxk\" (UID: \"4df3dffd-5ea3-414a-a73e-10820d09de6e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-x2bxk"
Apr 22 21:00:06.538088 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.537951 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4df3dffd-5ea3-414a-a73e-10820d09de6e-lib-modules\") pod \"perf-node-gather-daemonset-x2bxk\" (UID: \"4df3dffd-5ea3-414a-a73e-10820d09de6e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-x2bxk"
Apr 22 21:00:06.538088 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.538023 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4df3dffd-5ea3-414a-a73e-10820d09de6e-podres\") pod \"perf-node-gather-daemonset-x2bxk\" (UID: \"4df3dffd-5ea3-414a-a73e-10820d09de6e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-x2bxk"
Apr 22 21:00:06.544888 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.544869 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pggx2\" (UniqueName: \"kubernetes.io/projected/4df3dffd-5ea3-414a-a73e-10820d09de6e-kube-api-access-pggx2\") pod \"perf-node-gather-daemonset-x2bxk\" (UID: \"4df3dffd-5ea3-414a-a73e-10820d09de6e\") " pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-x2bxk"
Apr 22 21:00:06.670322 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.670264 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-x2bxk"
Apr 22 21:00:06.988625 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:06.988454 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mrjvk/perf-node-gather-daemonset-x2bxk"]
Apr 22 21:00:06.990877 ip-10-0-128-61 kubenswrapper[2574]: W0422 21:00:06.990843 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4df3dffd_5ea3_414a_a73e_10820d09de6e.slice/crio-3a18b260d853ab383165407aef7130aa62a96a64509028541f3b31d204f55cdc WatchSource:0}: Error finding container 3a18b260d853ab383165407aef7130aa62a96a64509028541f3b31d204f55cdc: Status 404 returned error can't find the container with id 3a18b260d853ab383165407aef7130aa62a96a64509028541f3b31d204f55cdc
Apr 22 21:00:07.615249 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:07.615211 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-x2bxk" event={"ID":"4df3dffd-5ea3-414a-a73e-10820d09de6e","Type":"ContainerStarted","Data":"b44fce0936e883ff1228fa132a249ca14fea423c6d7f870575a94d673ea85662"}
Apr 22 21:00:07.615249 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:07.615249 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-x2bxk" event={"ID":"4df3dffd-5ea3-414a-a73e-10820d09de6e","Type":"ContainerStarted","Data":"3a18b260d853ab383165407aef7130aa62a96a64509028541f3b31d204f55cdc"}
Apr 22 21:00:07.615475 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:07.615272 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-x2bxk"
Apr 22 21:00:07.631750 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:07.631704 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-x2bxk" podStartSLOduration=1.631690498 podStartE2EDuration="1.631690498s" podCreationTimestamp="2026-04-22 21:00:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 21:00:07.630793184 +0000 UTC m=+3700.266176824" watchObservedRunningTime="2026-04-22 21:00:07.631690498 +0000 UTC m=+3700.267074136"
Apr 22 21:00:13.628502 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:13.628466 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-mrjvk/perf-node-gather-daemonset-x2bxk"
Apr 22 21:00:53.564536 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:53.564462 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-s2p2q_52f5be1a-f9aa-4bf7-992f-277ec5922772/serve-healthcheck-canary/0.log"
Apr 22 21:00:54.040327 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:54.040257 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rjvgq_7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5/kube-rbac-proxy/0.log"
Apr 22 21:00:54.057684 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:54.057664 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rjvgq_7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5/exporter/0.log"
Apr 22 21:00:54.075164 ip-10-0-128-61 kubenswrapper[2574]: I0422 21:00:54.075141 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-rjvgq_7a889f06-e0b0-4f8f-8b7b-5f1b94796ab5/extractor/0.log"