Apr 21 07:49:19.699307 ip-10-0-138-20 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 07:49:19.699320 ip-10-0-138-20 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 07:49:19.699329 ip-10-0-138-20 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 07:49:19.699692 ip-10-0-138-20 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 07:49:29.930032 ip-10-0-138-20 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 07:49:29.930052 ip-10-0-138-20 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot b22c141931074622b9ca7421374602ee --
Apr 21 07:51:53.479202 ip-10-0-138-20 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 07:51:53.869771 ip-10-0-138-20 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 07:51:53.869771 ip-10-0-138-20 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 07:51:53.869771 ip-10-0-138-20 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 07:51:53.869771 ip-10-0-138-20 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 07:51:53.869771 ip-10-0-138-20 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 07:51:53.872229 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.872083 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 07:51:53.876351 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876335 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 07:51:53.876351 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876351 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 07:51:53.876415 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876355 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 07:51:53.876415 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876358 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 07:51:53.876415 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876361 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 07:51:53.876415 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876364 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 07:51:53.876415 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876367 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 07:51:53.876415 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876369 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 07:51:53.876415 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876378 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 07:51:53.876415 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876380 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 07:51:53.876415 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876383 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 07:51:53.876415 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876385 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 07:51:53.876415 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876388 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 07:51:53.876415 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876390 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 07:51:53.876415 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876393 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 07:51:53.876415 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876395 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 07:51:53.876415 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876398 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 07:51:53.876415 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876400 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 07:51:53.876415 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876403 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 07:51:53.876415 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876405 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 07:51:53.876415 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876408 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 07:51:53.876415 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876410 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 07:51:53.876895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876414 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 07:51:53.876895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876416 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 07:51:53.876895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876419 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 07:51:53.876895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876421 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 07:51:53.876895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876424 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 07:51:53.876895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876427 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 07:51:53.876895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876429 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 07:51:53.876895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876432 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 07:51:53.876895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876434 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 07:51:53.876895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876436 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 07:51:53.876895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876439 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 07:51:53.876895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876441 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 07:51:53.876895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876444 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 07:51:53.876895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876448 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 07:51:53.876895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876452 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 07:51:53.876895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876454 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 07:51:53.876895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876457 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 07:51:53.876895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876460 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 07:51:53.876895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876462 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 07:51:53.877413 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876465 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 07:51:53.877413 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876467 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 07:51:53.877413 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876470 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 07:51:53.877413 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876472 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 07:51:53.877413 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876475 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 07:51:53.877413 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876478 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 07:51:53.877413 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876480 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 07:51:53.877413 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876482 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 07:51:53.877413 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876485 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 07:51:53.877413 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876487 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 07:51:53.877413 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876489 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 07:51:53.877413 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876492 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 07:51:53.877413 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876495 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 07:51:53.877413 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876497 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 07:51:53.877413 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876500 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 07:51:53.877413 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876503 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 07:51:53.877413 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876506 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 07:51:53.877413 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876508 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 07:51:53.877413 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876511 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 07:51:53.877413 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876514 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 07:51:53.877895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876517 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 07:51:53.877895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876520 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 07:51:53.877895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876522 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 07:51:53.877895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876525 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 07:51:53.877895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876527 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 07:51:53.877895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876531 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 07:51:53.877895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876535 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 07:51:53.877895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876538 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 07:51:53.877895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876540 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 07:51:53.877895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876543 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 07:51:53.877895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876545 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 07:51:53.877895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876548 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 07:51:53.877895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876550 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 07:51:53.877895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876553 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 07:51:53.877895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876555 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 07:51:53.877895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876558 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 07:51:53.877895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876560 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 07:51:53.877895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876563 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 07:51:53.877895 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876566 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 07:51:53.878374 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876568 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 07:51:53.878374 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876571 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 07:51:53.878374 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876574 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 07:51:53.878374 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876576 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 07:51:53.878374 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876579 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 07:51:53.878374 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876581 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 07:51:53.878374 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876991 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 07:51:53.878374 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.876998 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 07:51:53.878374 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877002 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 07:51:53.878374 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877004 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 07:51:53.878374 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877007 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 07:51:53.878374 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877010 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 07:51:53.878374 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877013 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 07:51:53.878374 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877016 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 07:51:53.878374 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877019 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 07:51:53.878374 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877022 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 07:51:53.878374 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877025 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 07:51:53.878374 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877028 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 07:51:53.878374 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877030 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 07:51:53.878374 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877032 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 07:51:53.878856 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877035 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 07:51:53.878856 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877037 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 07:51:53.878856 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877040 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 07:51:53.878856 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877042 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 07:51:53.878856 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877045 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 07:51:53.878856 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877047 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 07:51:53.878856 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877051 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 07:51:53.878856 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877053 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 07:51:53.878856 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877056 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 07:51:53.878856 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877059 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 07:51:53.878856 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877061 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 07:51:53.878856 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877064 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 07:51:53.878856 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877066 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 07:51:53.878856 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877068 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 07:51:53.878856 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877071 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 07:51:53.878856 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877073 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 07:51:53.878856 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877076 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 07:51:53.878856 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877078 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 07:51:53.878856 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877081 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 07:51:53.879336 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877084 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 07:51:53.879336 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877087 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 07:51:53.879336 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877089 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 07:51:53.879336 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877092 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 07:51:53.879336 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877094 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 07:51:53.879336 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877097 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 07:51:53.879336 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877099 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 07:51:53.879336 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877101 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 07:51:53.879336 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877104 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 07:51:53.879336 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877106 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 07:51:53.879336 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877109 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 07:51:53.879336 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877112 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 07:51:53.879336 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877116 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 07:51:53.879336 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877119 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 07:51:53.879336 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877121 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 07:51:53.879336 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877124 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 07:51:53.879336 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877126 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 07:51:53.879336 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877129 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 07:51:53.879336 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877132 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 07:51:53.879838 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877134 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 07:51:53.879838 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877137 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 07:51:53.879838 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877141 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 07:51:53.879838 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877144 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 07:51:53.879838 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877147 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 07:51:53.879838 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877150 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 07:51:53.879838 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877153 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 07:51:53.879838 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877155 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 07:51:53.879838 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877158 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 07:51:53.879838 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877160 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 07:51:53.879838 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877163 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 07:51:53.879838 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877166 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 07:51:53.879838 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877169 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 07:51:53.879838 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877172 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 07:51:53.879838 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877175 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 07:51:53.879838 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877178 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 07:51:53.879838 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877181 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 07:51:53.879838 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877183 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 07:51:53.879838 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877187 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 07:51:53.880323 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877190 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 07:51:53.880323 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877192 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 07:51:53.880323 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877195 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 07:51:53.880323 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877197 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 07:51:53.880323 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877199 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 07:51:53.880323 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877202 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 07:51:53.880323 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877204 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 07:51:53.880323 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877207 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 07:51:53.880323 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877209 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 07:51:53.880323 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877211 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 07:51:53.880323 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877214 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 07:51:53.880323 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877216 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 07:51:53.880323 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877219 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 07:51:53.880323 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877221 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 07:51:53.880323 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.877223 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 07:51:53.880323 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.877895 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 07:51:53.880323 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.877904 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 07:51:53.880323 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.877911 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 07:51:53.880323 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.877915 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 07:51:53.880323 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.877931 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 07:51:53.880323 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.877935 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 07:51:53.880853 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.877939 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 07:51:53.880853 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.877944 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 07:51:53.880853 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.877947 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 07:51:53.880853 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.877950 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 07:51:53.880853 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.877953 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 07:51:53.880853 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.877958 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 07:51:53.880853 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.877962 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 07:51:53.880853 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.877965 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 21 07:51:53.880853 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.877968 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 07:51:53.880853 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.877971 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 21 07:51:53.880853 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.877973 2574 flags.go:64] FLAG: --cloud-config=""
Apr 21 07:51:53.880853 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.877976 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 07:51:53.880853 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.877979 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 07:51:53.880853 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.877983 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 21 07:51:53.880853 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.877986 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 07:51:53.880853 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.877989 2574 flags.go:64] FLAG: --config-dir=""
Apr 21 07:51:53.880853 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.877992 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 07:51:53.880853 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.877995 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 07:51:53.880853 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.877999 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 07:51:53.880853 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878002 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 07:51:53.880853 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878005 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 07:51:53.880853 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878008 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 07:51:53.880853 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878011 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 07:51:53.880853 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878013 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 07:51:53.881473 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878016 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 07:51:53.881473 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878019 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 07:51:53.881473 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878022 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 07:51:53.881473 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878027 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 07:51:53.881473 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878030 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 07:51:53.881473 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878033 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 07:51:53.881473 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878035 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 07:51:53.881473 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878038 2574 flags.go:64] FLAG: --enable-server="true"
Apr 21 07:51:53.881473 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878041 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 07:51:53.881473 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878045 2574 flags.go:64] FLAG: --event-burst="100"
Apr 21 07:51:53.881473 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878049 2574 flags.go:64] FLAG: --event-qps="50"
Apr 21 07:51:53.881473 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878051 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 07:51:53.881473 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878054 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 07:51:53.881473 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878058 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 21 07:51:53.881473 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878061 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 07:51:53.881473 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878064 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 07:51:53.881473 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878067 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 07:51:53.881473 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878070 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 21 07:51:53.881473 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878073 2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 07:51:53.881473 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878075 2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 07:51:53.881473 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878079 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21
07:51:53.881473 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878082 2574 flags.go:64] FLAG: --experimental-mounter-path="" Apr 21 07:51:53.881473 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878084 2574 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 21 07:51:53.881473 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878087 2574 flags.go:64] FLAG: --fail-swap-on="true" Apr 21 07:51:53.881473 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878090 2574 flags.go:64] FLAG: --feature-gates="" Apr 21 07:51:53.882179 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878094 2574 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 07:51:53.882179 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878096 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 07:51:53.882179 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878099 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 07:51:53.882179 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878102 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 07:51:53.882179 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878105 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 21 07:51:53.882179 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878108 2574 flags.go:64] FLAG: --help="false" Apr 21 07:51:53.882179 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878111 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-138-20.ec2.internal" Apr 21 07:51:53.882179 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878114 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 07:51:53.882179 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878117 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 07:51:53.882179 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878119 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 07:51:53.882179 ip-10-0-138-20 kubenswrapper[2574]: 
I0421 07:51:53.878123 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 07:51:53.882179 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878127 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 07:51:53.882179 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878130 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 07:51:53.882179 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878132 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 07:51:53.882179 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878135 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 07:51:53.882179 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878138 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 07:51:53.882179 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878140 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 07:51:53.882179 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878143 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 07:51:53.882179 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878146 2574 flags.go:64] FLAG: --kube-reserved="" Apr 21 07:51:53.882179 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878148 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 07:51:53.882179 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878151 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 07:51:53.882179 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878154 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 07:51:53.882179 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878157 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 07:51:53.882179 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878160 2574 flags.go:64] FLAG: --lock-file="" Apr 21 07:51:53.882765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878163 2574 
flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 07:51:53.882765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878165 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 07:51:53.882765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878168 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 07:51:53.882765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878173 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 07:51:53.882765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878176 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 07:51:53.882765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878179 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 07:51:53.882765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878182 2574 flags.go:64] FLAG: --logging-format="text" Apr 21 07:51:53.882765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878185 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 07:51:53.882765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878188 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 07:51:53.882765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878190 2574 flags.go:64] FLAG: --manifest-url="" Apr 21 07:51:53.882765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878193 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 21 07:51:53.882765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878198 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 07:51:53.882765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878201 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 07:51:53.882765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878205 2574 flags.go:64] FLAG: --max-pods="110" Apr 21 07:51:53.882765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878208 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 07:51:53.882765 ip-10-0-138-20 
kubenswrapper[2574]: I0421 07:51:53.878211 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 07:51:53.882765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878213 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 07:51:53.882765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878216 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 07:51:53.882765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878219 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 07:51:53.882765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878222 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 07:51:53.882765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878225 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 07:51:53.882765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878232 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 07:51:53.882765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878235 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 07:51:53.882765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878238 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 07:51:53.882765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878241 2574 flags.go:64] FLAG: --pod-cidr="" Apr 21 07:51:53.883380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878244 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 07:51:53.883380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878249 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 07:51:53.883380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878251 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 07:51:53.883380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878254 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 
21 07:51:53.883380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878257 2574 flags.go:64] FLAG: --port="10250" Apr 21 07:51:53.883380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878260 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 07:51:53.883380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878263 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f85cda4624704c36" Apr 21 07:51:53.883380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878266 2574 flags.go:64] FLAG: --qos-reserved="" Apr 21 07:51:53.883380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878269 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 21 07:51:53.883380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878272 2574 flags.go:64] FLAG: --register-node="true" Apr 21 07:51:53.883380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878275 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 21 07:51:53.883380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878278 2574 flags.go:64] FLAG: --register-with-taints="" Apr 21 07:51:53.883380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878281 2574 flags.go:64] FLAG: --registry-burst="10" Apr 21 07:51:53.883380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878284 2574 flags.go:64] FLAG: --registry-qps="5" Apr 21 07:51:53.883380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878287 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 21 07:51:53.883380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878290 2574 flags.go:64] FLAG: --reserved-memory="" Apr 21 07:51:53.883380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878294 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 07:51:53.883380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878296 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 07:51:53.883380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878299 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 07:51:53.883380 
ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878302 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 07:51:53.883380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878305 2574 flags.go:64] FLAG: --runonce="false" Apr 21 07:51:53.883380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878308 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 07:51:53.883380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878310 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 07:51:53.883380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878313 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 21 07:51:53.883380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878316 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 07:51:53.884193 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878319 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 07:51:53.884193 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878321 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 07:51:53.884193 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878325 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 07:51:53.884193 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878328 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 07:51:53.884193 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878331 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 07:51:53.884193 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878334 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 07:51:53.884193 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878337 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 07:51:53.884193 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878340 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 07:51:53.884193 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878342 
2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 07:51:53.884193 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878345 2574 flags.go:64] FLAG: --system-cgroups="" Apr 21 07:51:53.884193 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878348 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 07:51:53.884193 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878353 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 07:51:53.884193 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878356 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 21 07:51:53.884193 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878359 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 07:51:53.884193 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878362 2574 flags.go:64] FLAG: --tls-min-version="" Apr 21 07:51:53.884193 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878365 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 07:51:53.884193 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878368 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 07:51:53.884193 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878371 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 07:51:53.884193 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878374 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 07:51:53.884193 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878377 2574 flags.go:64] FLAG: --v="2" Apr 21 07:51:53.884193 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878381 2574 flags.go:64] FLAG: --version="false" Apr 21 07:51:53.884193 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878385 2574 flags.go:64] FLAG: --vmodule="" Apr 21 07:51:53.884193 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878389 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 07:51:53.884193 ip-10-0-138-20 kubenswrapper[2574]: I0421 
07:51:53.878392 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 07:51:53.884193 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878478 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 07:51:53.885132 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878484 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 07:51:53.885132 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878487 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 07:51:53.885132 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878490 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 07:51:53.885132 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878493 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 07:51:53.885132 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878495 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 07:51:53.885132 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878498 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 07:51:53.885132 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878500 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 07:51:53.885132 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878503 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 07:51:53.885132 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878505 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 07:51:53.885132 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878508 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 07:51:53.885132 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878510 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 
07:51:53.885132 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878513 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 07:51:53.885132 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878516 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 07:51:53.885132 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878519 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 07:51:53.885132 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878521 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 07:51:53.885132 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878524 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 07:51:53.885132 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878527 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 07:51:53.885132 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878529 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 07:51:53.885132 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878535 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 07:51:53.885132 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878537 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 07:51:53.885677 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878540 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 07:51:53.885677 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878542 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 07:51:53.885677 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878545 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 07:51:53.885677 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878548 2574 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImages Apr 21 07:51:53.885677 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878550 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 07:51:53.885677 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878553 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 07:51:53.885677 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878555 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 07:51:53.885677 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878558 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 07:51:53.885677 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878560 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 07:51:53.885677 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878563 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 07:51:53.885677 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878565 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 07:51:53.885677 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878568 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 07:51:53.885677 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878572 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 07:51:53.885677 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878574 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 07:51:53.885677 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878577 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 07:51:53.885677 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878580 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 07:51:53.885677 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878584 2574 feature_gate.go:349] Setting deprecated 
feature gate KMSv1=true. It will be removed in a future release. Apr 21 07:51:53.885677 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878587 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 21 07:51:53.885677 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878590 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 07:51:53.885677 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878592 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 07:51:53.886186 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878596 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 07:51:53.886186 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878599 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 07:51:53.886186 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878603 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 07:51:53.886186 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878605 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 07:51:53.886186 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878608 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 07:51:53.886186 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878611 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 07:51:53.886186 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878614 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 07:51:53.886186 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878616 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 07:51:53.886186 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878619 2574 feature_gate.go:328] unrecognized feature gate: 
NewOLMWebhookProviderOpenshiftServiceCA Apr 21 07:51:53.886186 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878621 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 07:51:53.886186 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878625 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 07:51:53.886186 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878627 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 07:51:53.886186 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878630 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 07:51:53.886186 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878632 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 07:51:53.886186 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878635 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 07:51:53.886186 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878637 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 07:51:53.886186 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878640 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 07:51:53.886186 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878642 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 07:51:53.886186 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878645 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 07:51:53.886661 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878647 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 07:51:53.886661 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878650 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 07:51:53.886661 ip-10-0-138-20 
kubenswrapper[2574]: W0421 07:51:53.878652 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 07:51:53.886661 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878655 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 07:51:53.886661 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878658 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 07:51:53.886661 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878661 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 07:51:53.886661 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878664 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 07:51:53.886661 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878666 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 07:51:53.886661 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878669 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 07:51:53.886661 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878671 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 07:51:53.886661 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878674 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 07:51:53.886661 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878676 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 07:51:53.886661 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878678 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 07:51:53.886661 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878681 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 07:51:53.886661 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878684 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 
07:51:53.886661 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878686 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 07:51:53.886661 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878689 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 07:51:53.886661 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878691 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 07:51:53.886661 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878694 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 07:51:53.886661 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878697 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 07:51:53.887168 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878699 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 07:51:53.887168 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878701 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 07:51:53.887168 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878704 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 07:51:53.887168 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878708 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 07:51:53.887168 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878710 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 07:51:53.887168 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.878713 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 07:51:53.887168 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.878718 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false 
MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 07:51:53.887168 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.886089 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 21 07:51:53.887168 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.886106 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 21 07:51:53.887168 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886158 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 07:51:53.887168 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886163 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 07:51:53.887168 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886169 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 07:51:53.887168 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886173 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 07:51:53.887168 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886176 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 07:51:53.887168 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886180 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 07:51:53.887543 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886183 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 07:51:53.887543 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886186 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 07:51:53.887543 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886189 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 07:51:53.887543 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886192 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 07:51:53.887543 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886195 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 07:51:53.887543 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886198 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 07:51:53.887543 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886201 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 07:51:53.887543 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886204 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 07:51:53.887543 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886206 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 07:51:53.887543 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886209 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 07:51:53.887543 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886212 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 07:51:53.887543 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886214 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 07:51:53.887543 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886217 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 07:51:53.887543 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886219 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 07:51:53.887543 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886222 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 07:51:53.887543 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886224 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 07:51:53.887543 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886227 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 07:51:53.887543 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886229 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 07:51:53.887543 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886232 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 07:51:53.887543 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886234 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 07:51:53.888128 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886237 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 07:51:53.888128 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886239 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 07:51:53.888128 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886249 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 07:51:53.888128 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886251 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 07:51:53.888128 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886255 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 07:51:53.888128 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886258 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 07:51:53.888128 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886260 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 07:51:53.888128 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886263 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 07:51:53.888128 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886266 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 07:51:53.888128 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886268 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 07:51:53.888128 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886271 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 07:51:53.888128 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886274 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 07:51:53.888128 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886276 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 07:51:53.888128 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886279 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 07:51:53.888128 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886281 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 07:51:53.888128 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886284 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 07:51:53.888128 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886286 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 07:51:53.888128 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886288 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 07:51:53.888128 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886291 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 07:51:53.888128 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886293 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 07:51:53.888638 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886296 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 07:51:53.888638 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886298 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 07:51:53.888638 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886301 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 07:51:53.888638 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886303 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 07:51:53.888638 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886306 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 07:51:53.888638 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886309 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 07:51:53.888638 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886312 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 07:51:53.888638 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886314 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 07:51:53.888638 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886317 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 07:51:53.888638 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886320 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 07:51:53.888638 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886322 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 07:51:53.888638 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886325 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 07:51:53.888638 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886329 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 07:51:53.888638 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886332 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 07:51:53.888638 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886335 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 07:51:53.888638 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886338 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 07:51:53.888638 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886340 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 07:51:53.888638 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886343 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 07:51:53.888638 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886345 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 07:51:53.888638 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886348 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 07:51:53.889137 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886350 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 07:51:53.889137 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886353 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 07:51:53.889137 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886355 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 07:51:53.889137 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886358 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 07:51:53.889137 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886360 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 07:51:53.889137 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886363 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 07:51:53.889137 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886365 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 07:51:53.889137 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886367 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 07:51:53.889137 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886370 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 07:51:53.889137 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886372 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 07:51:53.889137 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886375 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 07:51:53.889137 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886378 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 07:51:53.889137 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886380 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 07:51:53.889137 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886383 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 07:51:53.889137 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886385 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 07:51:53.889137 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886388 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 07:51:53.889137 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886390 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 07:51:53.889137 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886393 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 07:51:53.889137 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886395 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 07:51:53.889628 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886397 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 07:51:53.889628 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.886402 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 07:51:53.889628 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886503 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 07:51:53.889628 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886508 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 07:51:53.889628 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886511 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 07:51:53.889628 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886514 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 07:51:53.889628 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886516 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 07:51:53.889628 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886519 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 07:51:53.889628 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886522 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 07:51:53.889628 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886524 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 07:51:53.889628 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886527 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 07:51:53.889628 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886529 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 07:51:53.889628 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886532 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 07:51:53.889628 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886534 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 07:51:53.889628 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886536 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 07:51:53.890013 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886539 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 07:51:53.890013 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886541 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 07:51:53.890013 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886544 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 07:51:53.890013 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886546 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 07:51:53.890013 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886548 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 07:51:53.890013 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886551 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 07:51:53.890013 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886553 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 07:51:53.890013 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886556 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 07:51:53.890013 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886558 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 07:51:53.890013 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886561 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 07:51:53.890013 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886563 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 07:51:53.890013 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886566 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 07:51:53.890013 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886568 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 07:51:53.890013 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886571 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 07:51:53.890013 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886573 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 07:51:53.890013 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886576 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 07:51:53.890013 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886578 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 07:51:53.890013 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886581 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 07:51:53.890013 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886584 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 07:51:53.890013 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886586 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 07:51:53.890493 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886589 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 07:51:53.890493 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886591 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 07:51:53.890493 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886594 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 07:51:53.890493 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886597 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 07:51:53.890493 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886599 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 07:51:53.890493 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886601 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 07:51:53.890493 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886604 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 07:51:53.890493 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886607 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 07:51:53.890493 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886609 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 07:51:53.890493 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886611 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 07:51:53.890493 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886614 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 07:51:53.890493 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886616 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 07:51:53.890493 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886619 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 07:51:53.890493 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886621 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 07:51:53.890493 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886623 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 07:51:53.890493 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886627 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 07:51:53.890493 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886630 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 07:51:53.890493 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886633 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 07:51:53.890493 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886636 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 07:51:53.890493 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886639 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 07:51:53.890998 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886642 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 07:51:53.890998 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886644 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 07:51:53.890998 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886648 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 07:51:53.890998 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886651 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 07:51:53.890998 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886653 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 07:51:53.890998 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886656 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 07:51:53.890998 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886659 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 07:51:53.890998 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886661 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 07:51:53.890998 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886664 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 07:51:53.890998 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886666 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 07:51:53.890998 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886669 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 07:51:53.890998 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886671 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 07:51:53.890998 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886674 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 07:51:53.890998 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886677 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 07:51:53.890998 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886679 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 07:51:53.890998 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886681 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 07:51:53.890998 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886684 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 07:51:53.890998 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886686 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 07:51:53.890998 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886688 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 07:51:53.890998 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886691 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 07:51:53.891477 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886694 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 07:51:53.891477 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886696 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 07:51:53.891477 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886699 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 07:51:53.891477 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886701 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 07:51:53.891477 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886704 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 07:51:53.891477 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886708 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 07:51:53.891477 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886711 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 07:51:53.891477 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886713 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 07:51:53.891477 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886715 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 07:51:53.891477 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886718 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 07:51:53.891477 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886720 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 07:51:53.891477 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886723 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 07:51:53.891477 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:53.886725 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 07:51:53.891477 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.886730 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 07:51:53.891477 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.887456 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 07:51:53.891840 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.890078 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 07:51:53.891840 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.890809 2574 server.go:1019] "Starting client certificate rotation"
Apr 21 07:51:53.891840 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.890900 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 07:51:53.891840 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.891036 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 07:51:53.912817 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.912799 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 07:51:53.917668 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.917644 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 07:51:53.929133 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.929112 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 21 07:51:53.933959 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.933943 2574 log.go:25] "Validated CRI v1 image API"
Apr 21 07:51:53.935688 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.935674 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 07:51:53.935914 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.935898 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 07:51:53.938487 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.938463 2574 fs.go:135] Filesystem UUIDs: map[5bacedc2-af17-4675-88fc-2232cdbdfb81:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 b93464c1-bf54-4cd9-bdf9-cd4efe9d6499:/dev/nvme0n1p3]
Apr 21 07:51:53.938567 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.938488 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 07:51:53.945792 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.945681 2574 manager.go:217] Machine: {Timestamp:2026-04-21 07:51:53.94374775 +0000 UTC m=+0.356532857 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3102907 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec247f848875b600c310706000bfa969 SystemUUID:ec247f84-8875-b600-c310-706000bfa969 BootID:b22c1419-3107-4622-b9ca-7421374602ee Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:75:2d:8c:77:39 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:75:2d:8c:77:39 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ca:cb:35:fe:fd:99 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 07:51:53.945792 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.945780 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 07:51:53.945973 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.945886 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 07:51:53.947844 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.947814 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 07:51:53.948022 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.947847 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-20.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"
CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 07:51:53.948102 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.948037 2574 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 07:51:53.948102 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.948050 2574 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 07:51:53.948102 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.948067 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 07:51:53.949647 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.949634 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 07:51:53.950863 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.950851 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 21 07:51:53.951003 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.950992 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 07:51:53.953338 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.953327 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 21 07:51:53.953395 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.953349 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 07:51:53.953395 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.953370 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 07:51:53.953395 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.953383 2574 kubelet.go:397] "Adding apiserver pod source" Apr 21 07:51:53.953395 ip-10-0-138-20 kubenswrapper[2574]: 
I0421 07:51:53.953395 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 21 07:51:53.954366 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.954353 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 07:51:53.954430 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.954376 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 07:51:53.956380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.956363 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gz25x" Apr 21 07:51:53.957239 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.957227 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 07:51:53.958855 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.958840 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 07:51:53.960018 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.960002 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 07:51:53.960060 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.960032 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 07:51:53.960060 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.960045 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 07:51:53.960060 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.960056 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 07:51:53.960147 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.960064 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 07:51:53.960147 
ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.960070 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 07:51:53.960147 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.960075 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 07:51:53.960147 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.960081 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 07:51:53.960147 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.960088 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 07:51:53.960147 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.960094 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 07:51:53.960147 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.960104 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 07:51:53.960147 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.960113 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 07:51:53.961026 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.961015 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 07:51:53.961070 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.961029 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 07:51:53.961395 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.961380 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gz25x" Apr 21 07:51:53.963475 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.963460 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-20.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 07:51:53.964502 
ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:53.964480 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-20.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 07:51:53.964549 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:53.964479 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 07:51:53.964865 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.964853 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 07:51:53.964899 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.964887 2574 server.go:1295] "Started kubelet" Apr 21 07:51:53.964986 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.964963 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 07:51:53.965038 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.964995 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 07:51:53.965069 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.965057 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 07:51:53.965729 ip-10-0-138-20 systemd[1]: Started Kubernetes Kubelet. 
Apr 21 07:51:53.966225 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.966127 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 07:51:53.967598 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.967583 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 21 07:51:53.971254 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.971232 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 07:51:53.971739 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.971726 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 07:51:53.972300 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.972281 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 07:51:53.972372 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.972317 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 07:51:53.972372 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.972334 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 07:51:53.972458 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.972425 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 21 07:51:53.972458 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.972434 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 21 07:51:53.972458 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.972442 2574 factory.go:55] Registering systemd factory Apr 21 07:51:53.972587 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.972462 2574 factory.go:223] Registration of the systemd container factory successfully Apr 21 07:51:53.972587 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:53.972523 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-20.ec2.internal\" not found" Apr 21 07:51:53.972854 ip-10-0-138-20 kubenswrapper[2574]: I0421 
07:51:53.972687 2574 factory.go:153] Registering CRI-O factory Apr 21 07:51:53.972854 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.972701 2574 factory.go:223] Registration of the crio container factory successfully Apr 21 07:51:53.972854 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.972759 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 07:51:53.972854 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.972790 2574 factory.go:103] Registering Raw factory Apr 21 07:51:53.972854 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.972806 2574 manager.go:1196] Started watching for new ooms in manager Apr 21 07:51:53.973949 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.973916 2574 manager.go:319] Starting recovery of all containers Apr 21 07:51:53.974211 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.974194 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:51:53.979398 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:53.979319 2574 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-138-20.ec2.internal\" not found" node="ip-10-0-138-20.ec2.internal" Apr 21 07:51:53.989363 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.989211 2574 manager.go:324] Recovery completed Apr 21 07:51:53.993519 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.993500 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:51:53.995815 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.995800 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:51:53.995860 ip-10-0-138-20 kubenswrapper[2574]: I0421 
07:51:53.995828 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:51:53.995860 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.995838 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:51:53.996299 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.996285 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 07:51:53.996299 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.996297 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 07:51:53.996372 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.996331 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 21 07:51:53.998224 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.998212 2574 policy_none.go:49] "None policy: Start" Apr 21 07:51:53.998259 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.998228 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 07:51:53.998259 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:53.998238 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 21 07:51:54.044505 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.044480 2574 manager.go:341] "Starting Device Plugin manager" Apr 21 07:51:54.044598 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:54.044525 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 07:51:54.044598 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.044540 2574 server.go:85] "Starting device plugin registration server" Apr 21 07:51:54.044813 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.044799 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 07:51:54.044882 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.044829 2574 container_log_manager.go:189] "Initializing container log 
rotate workers" workers=1 monitorPeriod="10s" Apr 21 07:51:54.045051 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.045024 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 07:51:54.045140 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.045119 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 07:51:54.045140 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.045131 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 07:51:54.045557 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:54.045538 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 07:51:54.045630 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:54.045579 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-20.ec2.internal\" not found" Apr 21 07:51:54.087856 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.087828 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 07:51:54.089174 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.089157 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 07:51:54.089239 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.089188 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 07:51:54.089239 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.089226 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 21 07:51:54.089239 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.089236 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 07:51:54.089340 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:54.089274 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 07:51:54.092831 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.092813 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:51:54.145823 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.145778 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:51:54.146537 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.146523 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:51:54.146614 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.146554 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:51:54.146614 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.146569 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:51:54.146614 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.146607 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-20.ec2.internal" Apr 21 07:51:54.154430 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.154416 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-20.ec2.internal" Apr 21 07:51:54.154495 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:54.154437 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-20.ec2.internal\": node \"ip-10-0-138-20.ec2.internal\" not found" Apr 21 07:51:54.169424 
ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:54.169406 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-20.ec2.internal\" not found" Apr 21 07:51:54.190339 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.190310 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-138-20.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal"] Apr 21 07:51:54.190416 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.190393 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:51:54.191665 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.191651 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:51:54.191706 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.191678 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:51:54.191706 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.191687 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:51:54.193039 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.193027 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:51:54.193190 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.193176 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-20.ec2.internal" Apr 21 07:51:54.193237 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.193204 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:51:54.193708 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.193695 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:51:54.193765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.193714 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:51:54.193765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.193723 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:51:54.193765 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.193752 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:51:54.193880 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.193771 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:51:54.193880 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.193802 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:51:54.194972 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.194959 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal" Apr 21 07:51:54.195013 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.194983 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:51:54.195614 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.195598 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:51:54.195696 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.195620 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:51:54.195696 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.195629 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:51:54.220543 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:54.220512 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-20.ec2.internal\" not found" node="ip-10-0-138-20.ec2.internal" Apr 21 07:51:54.224796 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:54.224777 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-20.ec2.internal\" not found" node="ip-10-0-138-20.ec2.internal" Apr 21 07:51:54.270070 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:54.270046 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-20.ec2.internal\" not found" Apr 21 07:51:54.370566 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:54.370535 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-20.ec2.internal\" not found" Apr 21 07:51:54.374871 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.374851 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2219300bfb8a5cea9f09d55fabfc69ab-config\") pod \"kube-apiserver-proxy-ip-10-0-138-20.ec2.internal\" (UID: \"2219300bfb8a5cea9f09d55fabfc69ab\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-20.ec2.internal" Apr 21 07:51:54.374916 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.374879 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/87b55359ad2fcf9ee78f0e3dd4c3711d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal\" (UID: \"87b55359ad2fcf9ee78f0e3dd4c3711d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal" Apr 21 07:51:54.374916 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.374897 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/87b55359ad2fcf9ee78f0e3dd4c3711d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal\" (UID: \"87b55359ad2fcf9ee78f0e3dd4c3711d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal" Apr 21 07:51:54.471351 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:54.471279 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-20.ec2.internal\" not found" Apr 21 07:51:54.475564 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.475547 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/87b55359ad2fcf9ee78f0e3dd4c3711d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal\" (UID: \"87b55359ad2fcf9ee78f0e3dd4c3711d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal" Apr 21 
07:51:54.475612 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.475572 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2219300bfb8a5cea9f09d55fabfc69ab-config\") pod \"kube-apiserver-proxy-ip-10-0-138-20.ec2.internal\" (UID: \"2219300bfb8a5cea9f09d55fabfc69ab\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-20.ec2.internal" Apr 21 07:51:54.475612 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.475589 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/87b55359ad2fcf9ee78f0e3dd4c3711d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal\" (UID: \"87b55359ad2fcf9ee78f0e3dd4c3711d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal" Apr 21 07:51:54.475706 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.475631 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/87b55359ad2fcf9ee78f0e3dd4c3711d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal\" (UID: \"87b55359ad2fcf9ee78f0e3dd4c3711d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal" Apr 21 07:51:54.475706 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.475642 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/87b55359ad2fcf9ee78f0e3dd4c3711d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal\" (UID: \"87b55359ad2fcf9ee78f0e3dd4c3711d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal" Apr 21 07:51:54.475706 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.475667 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/2219300bfb8a5cea9f09d55fabfc69ab-config\") pod \"kube-apiserver-proxy-ip-10-0-138-20.ec2.internal\" (UID: \"2219300bfb8a5cea9f09d55fabfc69ab\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-20.ec2.internal"
Apr 21 07:51:54.522714 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.522688 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-20.ec2.internal"
Apr 21 07:51:54.527261 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.527242 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal"
Apr 21 07:51:54.571754 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:54.571735 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-20.ec2.internal\" not found"
Apr 21 07:51:54.672173 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:54.672146 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-20.ec2.internal\" not found"
Apr 21 07:51:54.772578 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:54.772531 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-20.ec2.internal\" not found"
Apr 21 07:51:54.872913 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:54.872888 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-20.ec2.internal\" not found"
Apr 21 07:51:54.891186 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.891172 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 07:51:54.891321 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.891305 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 07:51:54.891358 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.891337 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 07:51:54.913995 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.913974 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 07:51:54.953694 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.953660 2574 apiserver.go:52] "Watching apiserver"
Apr 21 07:51:54.962526 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.962506 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 07:51:54.963470 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.963446 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 07:46:53 +0000 UTC" deadline="2028-01-28 01:12:41.588683802 +0000 UTC"
Apr 21 07:51:54.963526 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.963470 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15521h20m46.625215678s"
Apr 21 07:51:54.964121 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.964101 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-kl7xh","openshift-network-diagnostics/network-check-target-q86fn","openshift-ovn-kubernetes/ovnkube-node-452pv","openshift-multus/multus-hmr5g","openshift-multus/network-metrics-daemon-jjnl5","openshift-network-operator/iptables-alerter-59f92","kube-system/konnectivity-agent-l96hv","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s","openshift-cluster-node-tuning-operator/tuned-6dmvw","openshift-dns/node-resolver-zglz2","openshift-image-registry/node-ca-gktqp"]
Apr 21 07:51:54.966694 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.966675 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kl7xh"
Apr 21 07:51:54.966910 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.966859 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q86fn"
Apr 21 07:51:54.967027 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:54.966919 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q86fn" podUID="e5633692-d3ef-4f27-aec7-1ddc39fd1781"
Apr 21 07:51:54.967879 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.967858 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:54.968843 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.968823 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:54.969188 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.969170 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 07:51:54.969250 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.969205 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 07:51:54.969250 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.969172 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 21 07:51:54.969529 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.969516 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hv82z\""
Apr 21 07:51:54.969763 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.969746 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 07:51:54.970185 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.970165 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-6vth2\""
Apr 21 07:51:54.970264 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.970171 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 07:51:54.970508 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.970484 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 21 07:51:54.970581 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.970488 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 21 07:51:54.970635 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.970620 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 21 07:51:54.970671 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.970569 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 21 07:51:54.970781 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.970766 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 07:51:54.971105 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.971000 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjnl5"
Apr 21 07:51:54.971105 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:54.971065 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjnl5" podUID="40e690b0-0cf8-4414-b2e4-2f3c492f2196"
Apr 21 07:51:54.971105 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.971105 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 21 07:51:54.971305 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.971106 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qfw2d\""
Apr 21 07:51:54.971305 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.971106 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 21 07:51:54.971305 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.971285 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 07:51:54.971962 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.971910 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal"
Apr 21 07:51:54.972192 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.972176 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-59f92"
Apr 21 07:51:54.972296 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.972281 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-l96hv"
Apr 21 07:51:54.973552 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.973535 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s"
Apr 21 07:51:54.974184 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.974165 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-vdrb4\""
Apr 21 07:51:54.974266 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.974193 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 07:51:54.974266 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.974174 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 07:51:54.974402 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.974383 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 21 07:51:54.974585 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.974571 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 21 07:51:54.974640 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.974611 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 07:51:54.974640 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.974621 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-qmsx6\""
Apr 21 07:51:54.974799 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.974783 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6dmvw"
Apr 21 07:51:54.975512 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.975494 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 21 07:51:54.975618 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.975530 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-w28fq\""
Apr 21 07:51:54.975618 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.975533 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 21 07:51:54.975618 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.975610 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 21 07:51:54.975975 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.975957 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zglz2"
Apr 21 07:51:54.977495 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.977436 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-node-log\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:54.977595 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.977509 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-host-cni-bin\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:54.977657 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.977575 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-host-run-k8s-cni-cncf-io\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:54.977710 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.977691 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-host-var-lib-cni-multus\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:54.977761 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.977722 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 07:51:54.977761 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.977742 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5d721100-b3b1-4a7b-b896-0f8f1e63c33b-socket-dir\") pod \"aws-ebs-csi-driver-node-ppf7s\" (UID: \"5d721100-b3b1-4a7b-b896-0f8f1e63c33b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s"
Apr 21 07:51:54.977862 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.977796 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2d12ced6-a6eb-40bb-8087-c53f467d8c26-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kl7xh\" (UID: \"2d12ced6-a6eb-40bb-8087-c53f467d8c26\") " pod="openshift-multus/multus-additional-cni-plugins-kl7xh"
Apr 21 07:51:54.978014 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.977863 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5fed3a2e-1505-4288-acc9-d9b3aabca4d0-host-slash\") pod \"iptables-alerter-59f92\" (UID: \"5fed3a2e-1505-4288-acc9-d9b3aabca4d0\") " pod="openshift-network-operator/iptables-alerter-59f92"
Apr 21 07:51:54.978014 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.977889 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-run-openvswitch\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:54.978014 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.977943 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-host-slash\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:54.978014 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.977971 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-hostroot\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:54.978014 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978002 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-lpxf4\""
Apr 21 07:51:54.978271 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978002 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2d12ced6-a6eb-40bb-8087-c53f467d8c26-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kl7xh\" (UID: \"2d12ced6-a6eb-40bb-8087-c53f467d8c26\") " pod="openshift-multus/multus-additional-cni-plugins-kl7xh"
Apr 21 07:51:54.978271 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978211 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-var-lib-openvswitch\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:54.978271 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978244 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74d5z\" (UniqueName: \"kubernetes.io/projected/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-kube-api-access-74d5z\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:54.978415 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978278 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-multus-cni-dir\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:54.978415 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978296 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gwrj\" (UniqueName: \"kubernetes.io/projected/5fed3a2e-1505-4288-acc9-d9b3aabca4d0-kube-api-access-5gwrj\") pod \"iptables-alerter-59f92\" (UID: \"5fed3a2e-1505-4288-acc9-d9b3aabca4d0\") " pod="openshift-network-operator/iptables-alerter-59f92"
Apr 21 07:51:54.978415 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978327 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mppjl\" (UniqueName: \"kubernetes.io/projected/5d721100-b3b1-4a7b-b896-0f8f1e63c33b-kube-api-access-mppjl\") pod \"aws-ebs-csi-driver-node-ppf7s\" (UID: \"5d721100-b3b1-4a7b-b896-0f8f1e63c33b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s"
Apr 21 07:51:54.978415 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978360 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-host-run-ovn-kubernetes\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:54.978415 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978376 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2d12ced6-a6eb-40bb-8087-c53f467d8c26-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kl7xh\" (UID: \"2d12ced6-a6eb-40bb-8087-c53f467d8c26\") " pod="openshift-multus/multus-additional-cni-plugins-kl7xh"
Apr 21 07:51:54.978415 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978392 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-system-cni-dir\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:54.978694 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978442 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 07:51:54.978694 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978439 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-host-run-multus-certs\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:54.978694 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978514 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-etc-openvswitch\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:54.978694 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978553 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-env-overrides\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:54.978694 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978585 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-multus-socket-dir-parent\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:54.978694 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978615 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs\") pod \"network-metrics-daemon-jjnl5\" (UID: \"40e690b0-0cf8-4414-b2e4-2f3c492f2196\") " pod="openshift-multus/network-metrics-daemon-jjnl5"
Apr 21 07:51:54.978694 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978645 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5d721100-b3b1-4a7b-b896-0f8f1e63c33b-sys-fs\") pod \"aws-ebs-csi-driver-node-ppf7s\" (UID: \"5d721100-b3b1-4a7b-b896-0f8f1e63c33b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s"
Apr 21 07:51:54.978694 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978674 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4gjg\" (UniqueName: \"kubernetes.io/projected/e5633692-d3ef-4f27-aec7-1ddc39fd1781-kube-api-access-d4gjg\") pod \"network-check-target-q86fn\" (UID: \"e5633692-d3ef-4f27-aec7-1ddc39fd1781\") " pod="openshift-network-diagnostics/network-check-target-q86fn"
Apr 21 07:51:54.979081 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978722 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:54.979081 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978754 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-cnibin\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:54.979081 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978783 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c5afc987-f5b7-46b4-91c1-5f015f3b2010-multus-daemon-config\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:54.979081 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978815 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2d12ced6-a6eb-40bb-8087-c53f467d8c26-cni-binary-copy\") pod \"multus-additional-cni-plugins-kl7xh\" (UID: \"2d12ced6-a6eb-40bb-8087-c53f467d8c26\") " pod="openshift-multus/multus-additional-cni-plugins-kl7xh"
Apr 21 07:51:54.979081 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978848 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5fed3a2e-1505-4288-acc9-d9b3aabca4d0-iptables-alerter-script\") pod \"iptables-alerter-59f92\" (UID: \"5fed3a2e-1505-4288-acc9-d9b3aabca4d0\") " pod="openshift-network-operator/iptables-alerter-59f92"
Apr 21 07:51:54.979081 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978884 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a7cb393b-d3ae-4c28-9350-a79f012ee6c8-konnectivity-ca\") pod \"konnectivity-agent-l96hv\" (UID: \"a7cb393b-d3ae-4c28-9350-a79f012ee6c8\") " pod="kube-system/konnectivity-agent-l96hv"
Apr 21 07:51:54.979081 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978914 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-ovnkube-script-lib\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:54.979081 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978961 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-run-ovn\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:54.979081 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978990 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gktqp"
Apr 21 07:51:54.979081 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.978997 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-ovnkube-config\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:54.979081 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.979033 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2d12ced6-a6eb-40bb-8087-c53f467d8c26-cnibin\") pod \"multus-additional-cni-plugins-kl7xh\" (UID: \"2d12ced6-a6eb-40bb-8087-c53f467d8c26\") " pod="openshift-multus/multus-additional-cni-plugins-kl7xh"
Apr 21 07:51:54.979081 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.979071 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvv9r\" (UniqueName: \"kubernetes.io/projected/2d12ced6-a6eb-40bb-8087-c53f467d8c26-kube-api-access-xvv9r\") pod \"multus-additional-cni-plugins-kl7xh\" (UID: \"2d12ced6-a6eb-40bb-8087-c53f467d8c26\") " pod="openshift-multus/multus-additional-cni-plugins-kl7xh"
Apr 21 07:51:54.979627 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.979099 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-log-socket\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:54.979627 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.979135 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-multus-conf-dir\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:54.979627 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.979282 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8q9sp\""
Apr 21 07:51:54.979627 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.979372 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbg48\" (UniqueName: \"kubernetes.io/projected/40e690b0-0cf8-4414-b2e4-2f3c492f2196-kube-api-access-cbg48\") pod \"network-metrics-daemon-jjnl5\" (UID: \"40e690b0-0cf8-4414-b2e4-2f3c492f2196\") " pod="openshift-multus/network-metrics-daemon-jjnl5"
Apr 21 07:51:54.979812 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.979675 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5d721100-b3b1-4a7b-b896-0f8f1e63c33b-registration-dir\") pod \"aws-ebs-csi-driver-node-ppf7s\" (UID: \"5d721100-b3b1-4a7b-b896-0f8f1e63c33b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s"
Apr 21 07:51:54.979812 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.979728 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-systemd-units\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:54.979812 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.979752 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 21 07:51:54.979812 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.979774 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-run-systemd\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:54.980803 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.979840 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d12ced6-a6eb-40bb-8087-c53f467d8c26-system-cni-dir\") pod \"multus-additional-cni-plugins-kl7xh\" (UID: \"2d12ced6-a6eb-40bb-8087-c53f467d8c26\") " pod="openshift-multus/multus-additional-cni-plugins-kl7xh"
Apr 21 07:51:54.980803 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.980071 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-os-release\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:54.980803 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.980128 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-host-var-lib-kubelet\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:54.980803 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.980160 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-etc-kubernetes\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:54.980803 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.980186 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5d721100-b3b1-4a7b-b896-0f8f1e63c33b-device-dir\") pod \"aws-ebs-csi-driver-node-ppf7s\" (UID: \"5d721100-b3b1-4a7b-b896-0f8f1e63c33b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s"
Apr 21 07:51:54.980803 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.980251 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 21 07:51:54.980803 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.980301 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-host-kubelet\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:54.980803 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.980342 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-host-run-netns\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:54.980803 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.980454 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-host-cni-netd\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:54.980803 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.980537 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-ovn-node-metrics-cert\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:54.980803 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.980582 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5d721100-b3b1-4a7b-b896-0f8f1e63c33b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ppf7s\" (UID: \"5d721100-b3b1-4a7b-b896-0f8f1e63c33b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s"
Apr 21 07:51:54.980803 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.980644 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2d12ced6-a6eb-40bb-8087-c53f467d8c26-os-release\") pod \"multus-additional-cni-plugins-kl7xh\" (UID: \"2d12ced6-a6eb-40bb-8087-c53f467d8c26\") " pod="openshift-multus/multus-additional-cni-plugins-kl7xh"
Apr 21 07:51:54.980803 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.980684 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c5afc987-f5b7-46b4-91c1-5f015f3b2010-cni-binary-copy\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:54.981544 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.980772 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7jz7\" (UniqueName: \"kubernetes.io/projected/c5afc987-f5b7-46b4-91c1-5f015f3b2010-kube-api-access-w7jz7\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:54.981544 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.980891 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5d721100-b3b1-4a7b-b896-0f8f1e63c33b-etc-selinux\") pod \"aws-ebs-csi-driver-node-ppf7s\" (UID: \"5d721100-b3b1-4a7b-b896-0f8f1e63c33b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s"
Apr 21 07:51:54.981544 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.980956 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a7cb393b-d3ae-4c28-9350-a79f012ee6c8-agent-certs\") pod \"konnectivity-agent-l96hv\" (UID: \"a7cb393b-d3ae-4c28-9350-a79f012ee6c8\") " pod="kube-system/konnectivity-agent-l96hv"
Apr 21 07:51:54.981544 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.981022 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-host-run-netns\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:54.981544 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.981079 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-host-var-lib-cni-bin\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:54.981544 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.981230 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 21
07:51:54.981874 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.981854 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 07:51:54.981951 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.981903 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 07:51:54.981951 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.981859 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-jxjsm\"" Apr 21 07:51:54.985278 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.985257 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 07:51:54.985405 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.985326 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-20.ec2.internal" Apr 21 07:51:54.985769 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.985749 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal"] Apr 21 07:51:54.986403 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.986388 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 07:51:54.993298 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.993281 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 07:51:54.993413 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:54.993399 2574 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kube-system/kube-apiserver-proxy-ip-10-0-138-20.ec2.internal"] Apr 21 07:51:55.004476 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.004460 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-pp6sb" Apr 21 07:51:55.014073 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.014052 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-pp6sb" Apr 21 07:51:55.035143 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:55.035122 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2219300bfb8a5cea9f09d55fabfc69ab.slice/crio-82138a0fcd9e7672bbefb351b843ef33ed708f1927eb74a4ef0eadbac9246467 WatchSource:0}: Error finding container 82138a0fcd9e7672bbefb351b843ef33ed708f1927eb74a4ef0eadbac9246467: Status 404 returned error can't find the container with id 82138a0fcd9e7672bbefb351b843ef33ed708f1927eb74a4ef0eadbac9246467 Apr 21 07:51:55.035329 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:55.035308 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87b55359ad2fcf9ee78f0e3dd4c3711d.slice/crio-68b49720c26b1bb09270374edb6f71b77bb24ef080ad2ccec7123b54cefff9ee WatchSource:0}: Error finding container 68b49720c26b1bb09270374edb6f71b77bb24ef080ad2ccec7123b54cefff9ee: Status 404 returned error can't find the container with id 68b49720c26b1bb09270374edb6f71b77bb24ef080ad2ccec7123b54cefff9ee Apr 21 07:51:55.041141 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.041116 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 07:51:55.073819 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.073802 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 07:51:55.081342 ip-10-0-138-20 
kubenswrapper[2574]: I0421 07:51:55.081325 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5d721100-b3b1-4a7b-b896-0f8f1e63c33b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ppf7s\" (UID: \"5d721100-b3b1-4a7b-b896-0f8f1e63c33b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s" Apr 21 07:51:55.081419 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081350 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2d12ced6-a6eb-40bb-8087-c53f467d8c26-os-release\") pod \"multus-additional-cni-plugins-kl7xh\" (UID: \"2d12ced6-a6eb-40bb-8087-c53f467d8c26\") " pod="openshift-multus/multus-additional-cni-plugins-kl7xh" Apr 21 07:51:55.081419 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081375 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-etc-sysconfig\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.081484 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081435 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5d721100-b3b1-4a7b-b896-0f8f1e63c33b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-ppf7s\" (UID: \"5d721100-b3b1-4a7b-b896-0f8f1e63c33b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s" Apr 21 07:51:55.081484 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081455 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2d12ced6-a6eb-40bb-8087-c53f467d8c26-os-release\") pod \"multus-additional-cni-plugins-kl7xh\" (UID: \"2d12ced6-a6eb-40bb-8087-c53f467d8c26\") " 
pod="openshift-multus/multus-additional-cni-plugins-kl7xh" Apr 21 07:51:55.081484 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081457 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-etc-sysctl-conf\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.081566 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081495 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c5afc987-f5b7-46b4-91c1-5f015f3b2010-cni-binary-copy\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g" Apr 21 07:51:55.081566 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081514 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7jz7\" (UniqueName: \"kubernetes.io/projected/c5afc987-f5b7-46b4-91c1-5f015f3b2010-kube-api-access-w7jz7\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g" Apr 21 07:51:55.081622 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081607 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5d721100-b3b1-4a7b-b896-0f8f1e63c33b-etc-selinux\") pod \"aws-ebs-csi-driver-node-ppf7s\" (UID: \"5d721100-b3b1-4a7b-b896-0f8f1e63c33b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s" Apr 21 07:51:55.081653 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081630 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a7cb393b-d3ae-4c28-9350-a79f012ee6c8-agent-certs\") pod \"konnectivity-agent-l96hv\" (UID: 
\"a7cb393b-d3ae-4c28-9350-a79f012ee6c8\") " pod="kube-system/konnectivity-agent-l96hv" Apr 21 07:51:55.081653 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081648 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-etc-systemd\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.081761 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081663 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-run\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.081761 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081678 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-sys\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.081761 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081696 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-host-run-netns\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g" Apr 21 07:51:55.081761 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081740 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-host-var-lib-cni-bin\") pod \"multus-hmr5g\" (UID: 
\"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g" Apr 21 07:51:55.081975 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081764 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-var-lib-kubelet\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.081975 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081764 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5d721100-b3b1-4a7b-b896-0f8f1e63c33b-etc-selinux\") pod \"aws-ebs-csi-driver-node-ppf7s\" (UID: \"5d721100-b3b1-4a7b-b896-0f8f1e63c33b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s" Apr 21 07:51:55.081975 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081811 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff084e90-da5c-4aa6-97ea-239d7b8f0827-tmp\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.081975 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081850 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-node-log\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv" Apr 21 07:51:55.081975 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081814 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-host-var-lib-cni-bin\") pod 
\"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g" Apr 21 07:51:55.081975 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081862 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-host-run-netns\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g" Apr 21 07:51:55.081975 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081891 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-host-cni-bin\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv" Apr 21 07:51:55.081975 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081900 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-node-log\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv" Apr 21 07:51:55.081975 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081911 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-host-run-k8s-cni-cncf-io\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g" Apr 21 07:51:55.081975 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081914 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-host-cni-bin\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-452pv" Apr 21 07:51:55.081975 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081954 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-host-var-lib-cni-multus\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g" Apr 21 07:51:55.081975 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.081979 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5d721100-b3b1-4a7b-b896-0f8f1e63c33b-socket-dir\") pod \"aws-ebs-csi-driver-node-ppf7s\" (UID: \"5d721100-b3b1-4a7b-b896-0f8f1e63c33b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s" Apr 21 07:51:55.082512 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082010 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-host-run-k8s-cni-cncf-io\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g" Apr 21 07:51:55.082512 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082034 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2d12ced6-a6eb-40bb-8087-c53f467d8c26-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kl7xh\" (UID: \"2d12ced6-a6eb-40bb-8087-c53f467d8c26\") " pod="openshift-multus/multus-additional-cni-plugins-kl7xh" Apr 21 07:51:55.082512 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082051 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-host-var-lib-cni-multus\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g" Apr 21 07:51:55.082512 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082054 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c5afc987-f5b7-46b4-91c1-5f015f3b2010-cni-binary-copy\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g" Apr 21 07:51:55.082512 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082051 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 07:51:55.082512 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082078 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5fed3a2e-1505-4288-acc9-d9b3aabca4d0-host-slash\") pod \"iptables-alerter-59f92\" (UID: \"5fed3a2e-1505-4288-acc9-d9b3aabca4d0\") " pod="openshift-network-operator/iptables-alerter-59f92" Apr 21 07:51:55.082512 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082099 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-etc-modprobe-d\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.082512 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082106 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5d721100-b3b1-4a7b-b896-0f8f1e63c33b-socket-dir\") pod \"aws-ebs-csi-driver-node-ppf7s\" (UID: 
\"5d721100-b3b1-4a7b-b896-0f8f1e63c33b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s" Apr 21 07:51:55.082512 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082127 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5fed3a2e-1505-4288-acc9-d9b3aabca4d0-host-slash\") pod \"iptables-alerter-59f92\" (UID: \"5fed3a2e-1505-4288-acc9-d9b3aabca4d0\") " pod="openshift-network-operator/iptables-alerter-59f92" Apr 21 07:51:55.082512 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082151 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-run-openvswitch\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv" Apr 21 07:51:55.082512 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082184 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-host-slash\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv" Apr 21 07:51:55.082512 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082193 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-run-openvswitch\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv" Apr 21 07:51:55.082512 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082215 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-hostroot\") pod \"multus-hmr5g\" (UID: 
\"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g" Apr 21 07:51:55.082512 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082228 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-host-slash\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv" Apr 21 07:51:55.082512 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082269 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-hostroot\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g" Apr 21 07:51:55.082512 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082269 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2d12ced6-a6eb-40bb-8087-c53f467d8c26-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kl7xh\" (UID: \"2d12ced6-a6eb-40bb-8087-c53f467d8c26\") " pod="openshift-multus/multus-additional-cni-plugins-kl7xh" Apr 21 07:51:55.082512 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082310 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-etc-sysctl-d\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.082512 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082340 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-lib-modules\") pod \"tuned-6dmvw\" (UID: 
\"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.083186 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082363 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbmdw\" (UniqueName: \"kubernetes.io/projected/2e1e8c1e-08e4-4ee4-a7bf-e84dfd5e0fa1-kube-api-access-xbmdw\") pod \"node-resolver-zglz2\" (UID: \"2e1e8c1e-08e4-4ee4-a7bf-e84dfd5e0fa1\") " pod="openshift-dns/node-resolver-zglz2" Apr 21 07:51:55.083186 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082389 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-var-lib-openvswitch\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv" Apr 21 07:51:55.083186 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082421 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74d5z\" (UniqueName: \"kubernetes.io/projected/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-kube-api-access-74d5z\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv" Apr 21 07:51:55.083186 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082437 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-multus-cni-dir\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g" Apr 21 07:51:55.083186 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082461 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5gwrj\" (UniqueName: 
\"kubernetes.io/projected/5fed3a2e-1505-4288-acc9-d9b3aabca4d0-kube-api-access-5gwrj\") pod \"iptables-alerter-59f92\" (UID: \"5fed3a2e-1505-4288-acc9-d9b3aabca4d0\") " pod="openshift-network-operator/iptables-alerter-59f92" Apr 21 07:51:55.083186 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082482 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-var-lib-openvswitch\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv" Apr 21 07:51:55.083186 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082486 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mppjl\" (UniqueName: \"kubernetes.io/projected/5d721100-b3b1-4a7b-b896-0f8f1e63c33b-kube-api-access-mppjl\") pod \"aws-ebs-csi-driver-node-ppf7s\" (UID: \"5d721100-b3b1-4a7b-b896-0f8f1e63c33b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s" Apr 21 07:51:55.083186 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082513 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db4c65d3-fa4f-4575-b518-0d9e5c9215b9-host\") pod \"node-ca-gktqp\" (UID: \"db4c65d3-fa4f-4575-b518-0d9e5c9215b9\") " pod="openshift-image-registry/node-ca-gktqp" Apr 21 07:51:55.083186 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082541 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-host-run-ovn-kubernetes\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv" Apr 21 07:51:55.083186 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082546 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-multus-cni-dir\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:55.083186 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082567 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2d12ced6-a6eb-40bb-8087-c53f467d8c26-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kl7xh\" (UID: \"2d12ced6-a6eb-40bb-8087-c53f467d8c26\") " pod="openshift-multus/multus-additional-cni-plugins-kl7xh"
Apr 21 07:51:55.083186 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082592 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-system-cni-dir\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:55.083186 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082604 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2d12ced6-a6eb-40bb-8087-c53f467d8c26-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-kl7xh\" (UID: \"2d12ced6-a6eb-40bb-8087-c53f467d8c26\") " pod="openshift-multus/multus-additional-cni-plugins-kl7xh"
Apr 21 07:51:55.083186 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082615 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-host-run-multus-certs\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:55.083186 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082638 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2e1e8c1e-08e4-4ee4-a7bf-e84dfd5e0fa1-hosts-file\") pod \"node-resolver-zglz2\" (UID: \"2e1e8c1e-08e4-4ee4-a7bf-e84dfd5e0fa1\") " pod="openshift-dns/node-resolver-zglz2"
Apr 21 07:51:55.083186 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082663 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-etc-openvswitch\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.083186 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082694 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2d12ced6-a6eb-40bb-8087-c53f467d8c26-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kl7xh\" (UID: \"2d12ced6-a6eb-40bb-8087-c53f467d8c26\") " pod="openshift-multus/multus-additional-cni-plugins-kl7xh"
Apr 21 07:51:55.083895 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082695 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-host-run-multus-certs\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:55.083895 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082695 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-env-overrides\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.083895 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082707 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-etc-openvswitch\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.083895 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082742 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-multus-socket-dir-parent\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:55.083895 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082756 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-host-run-ovn-kubernetes\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.083895 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082772 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs\") pod \"network-metrics-daemon-jjnl5\" (UID: \"40e690b0-0cf8-4414-b2e4-2f3c492f2196\") " pod="openshift-multus/network-metrics-daemon-jjnl5"
Apr 21 07:51:55.083895 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082818 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-multus-socket-dir-parent\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:55.083895 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082771 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-system-cni-dir\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:55.083895 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:55.082866 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:51:55.083895 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.082775 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2d12ced6-a6eb-40bb-8087-c53f467d8c26-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kl7xh\" (UID: \"2d12ced6-a6eb-40bb-8087-c53f467d8c26\") " pod="openshift-multus/multus-additional-cni-plugins-kl7xh"
Apr 21 07:51:55.083895 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:55.082944 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs podName:40e690b0-0cf8-4414-b2e4-2f3c492f2196 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:55.582908312 +0000 UTC m=+1.995693404 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs") pod "network-metrics-daemon-jjnl5" (UID: "40e690b0-0cf8-4414-b2e4-2f3c492f2196") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:51:55.083895 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.083614 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5d721100-b3b1-4a7b-b896-0f8f1e63c33b-sys-fs\") pod \"aws-ebs-csi-driver-node-ppf7s\" (UID: \"5d721100-b3b1-4a7b-b896-0f8f1e63c33b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s"
Apr 21 07:51:55.083895 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.083678 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4gjg\" (UniqueName: \"kubernetes.io/projected/e5633692-d3ef-4f27-aec7-1ddc39fd1781-kube-api-access-d4gjg\") pod \"network-check-target-q86fn\" (UID: \"e5633692-d3ef-4f27-aec7-1ddc39fd1781\") " pod="openshift-network-diagnostics/network-check-target-q86fn"
Apr 21 07:51:55.083895 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.083694 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5d721100-b3b1-4a7b-b896-0f8f1e63c33b-sys-fs\") pod \"aws-ebs-csi-driver-node-ppf7s\" (UID: \"5d721100-b3b1-4a7b-b896-0f8f1e63c33b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s"
Apr 21 07:51:55.083895 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.083702 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-env-overrides\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.083895 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.083719 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.083895 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.083798 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.084687 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.083911 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-cnibin\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:55.084687 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.083991 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c5afc987-f5b7-46b4-91c1-5f015f3b2010-multus-daemon-config\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:55.084687 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.083998 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-cnibin\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:55.084687 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.084037 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2d12ced6-a6eb-40bb-8087-c53f467d8c26-cni-binary-copy\") pod \"multus-additional-cni-plugins-kl7xh\" (UID: \"2d12ced6-a6eb-40bb-8087-c53f467d8c26\") " pod="openshift-multus/multus-additional-cni-plugins-kl7xh"
Apr 21 07:51:55.084687 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.084070 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5fed3a2e-1505-4288-acc9-d9b3aabca4d0-iptables-alerter-script\") pod \"iptables-alerter-59f92\" (UID: \"5fed3a2e-1505-4288-acc9-d9b3aabca4d0\") " pod="openshift-network-operator/iptables-alerter-59f92"
Apr 21 07:51:55.084687 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.084633 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2d12ced6-a6eb-40bb-8087-c53f467d8c26-cni-binary-copy\") pod \"multus-additional-cni-plugins-kl7xh\" (UID: \"2d12ced6-a6eb-40bb-8087-c53f467d8c26\") " pod="openshift-multus/multus-additional-cni-plugins-kl7xh"
Apr 21 07:51:55.084687 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.084636 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c5afc987-f5b7-46b4-91c1-5f015f3b2010-multus-daemon-config\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:55.084687 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.084679 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5fed3a2e-1505-4288-acc9-d9b3aabca4d0-iptables-alerter-script\") pod \"iptables-alerter-59f92\" (UID: \"5fed3a2e-1505-4288-acc9-d9b3aabca4d0\") " pod="openshift-network-operator/iptables-alerter-59f92"
Apr 21 07:51:55.085062 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.084705 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a7cb393b-d3ae-4c28-9350-a79f012ee6c8-konnectivity-ca\") pod \"konnectivity-agent-l96hv\" (UID: \"a7cb393b-d3ae-4c28-9350-a79f012ee6c8\") " pod="kube-system/konnectivity-agent-l96hv"
Apr 21 07:51:55.085062 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.084768 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/db4c65d3-fa4f-4575-b518-0d9e5c9215b9-serviceca\") pod \"node-ca-gktqp\" (UID: \"db4c65d3-fa4f-4575-b518-0d9e5c9215b9\") " pod="openshift-image-registry/node-ca-gktqp"
Apr 21 07:51:55.085062 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.084884 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-ovnkube-script-lib\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.085062 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.084940 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-run-ovn\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.085062 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.084977 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-ovnkube-config\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.085062 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085000 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2d12ced6-a6eb-40bb-8087-c53f467d8c26-cnibin\") pod \"multus-additional-cni-plugins-kl7xh\" (UID: \"2d12ced6-a6eb-40bb-8087-c53f467d8c26\") " pod="openshift-multus/multus-additional-cni-plugins-kl7xh"
Apr 21 07:51:55.085310 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085062 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-run-ovn\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.085310 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085126 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvv9r\" (UniqueName: \"kubernetes.io/projected/2d12ced6-a6eb-40bb-8087-c53f467d8c26-kube-api-access-xvv9r\") pod \"multus-additional-cni-plugins-kl7xh\" (UID: \"2d12ced6-a6eb-40bb-8087-c53f467d8c26\") " pod="openshift-multus/multus-additional-cni-plugins-kl7xh"
Apr 21 07:51:55.085310 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085165 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-log-socket\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.085310 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085210 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-multus-conf-dir\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:55.085310 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085228 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbg48\" (UniqueName: \"kubernetes.io/projected/40e690b0-0cf8-4414-b2e4-2f3c492f2196-kube-api-access-cbg48\") pod \"network-metrics-daemon-jjnl5\" (UID: \"40e690b0-0cf8-4414-b2e4-2f3c492f2196\") " pod="openshift-multus/network-metrics-daemon-jjnl5"
Apr 21 07:51:55.085310 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085266 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5d721100-b3b1-4a7b-b896-0f8f1e63c33b-registration-dir\") pod \"aws-ebs-csi-driver-node-ppf7s\" (UID: \"5d721100-b3b1-4a7b-b896-0f8f1e63c33b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s"
Apr 21 07:51:55.085310 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085291 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-etc-kubernetes\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw"
Apr 21 07:51:55.085632 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085311 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-host\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw"
Apr 21 07:51:55.085632 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085343 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-systemd-units\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.085632 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085345 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a7cb393b-d3ae-4c28-9350-a79f012ee6c8-konnectivity-ca\") pod \"konnectivity-agent-l96hv\" (UID: \"a7cb393b-d3ae-4c28-9350-a79f012ee6c8\") " pod="kube-system/konnectivity-agent-l96hv"
Apr 21 07:51:55.085632 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085363 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-run-systemd\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.085632 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085383 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d12ced6-a6eb-40bb-8087-c53f467d8c26-system-cni-dir\") pod \"multus-additional-cni-plugins-kl7xh\" (UID: \"2d12ced6-a6eb-40bb-8087-c53f467d8c26\") " pod="openshift-multus/multus-additional-cni-plugins-kl7xh"
Apr 21 07:51:55.085632 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085405 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ff084e90-da5c-4aa6-97ea-239d7b8f0827-etc-tuned\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw"
Apr 21 07:51:55.085632 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085432 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t76q\" (UniqueName: \"kubernetes.io/projected/ff084e90-da5c-4aa6-97ea-239d7b8f0827-kube-api-access-8t76q\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw"
Apr 21 07:51:55.085632 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085438 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2d12ced6-a6eb-40bb-8087-c53f467d8c26-cnibin\") pod \"multus-additional-cni-plugins-kl7xh\" (UID: \"2d12ced6-a6eb-40bb-8087-c53f467d8c26\") " pod="openshift-multus/multus-additional-cni-plugins-kl7xh"
Apr 21 07:51:55.085632 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085454 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-os-release\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:55.085632 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085471 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-host-var-lib-kubelet\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:55.085632 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085492 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-etc-kubernetes\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:55.085632 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085515 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5d721100-b3b1-4a7b-b896-0f8f1e63c33b-device-dir\") pod \"aws-ebs-csi-driver-node-ppf7s\" (UID: \"5d721100-b3b1-4a7b-b896-0f8f1e63c33b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s"
Apr 21 07:51:55.085632 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085530 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-ovnkube-script-lib\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.085632 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085534 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2e1e8c1e-08e4-4ee4-a7bf-e84dfd5e0fa1-tmp-dir\") pod \"node-resolver-zglz2\" (UID: \"2e1e8c1e-08e4-4ee4-a7bf-e84dfd5e0fa1\") " pod="openshift-dns/node-resolver-zglz2"
Apr 21 07:51:55.085632 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085586 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5d721100-b3b1-4a7b-b896-0f8f1e63c33b-registration-dir\") pod \"aws-ebs-csi-driver-node-ppf7s\" (UID: \"5d721100-b3b1-4a7b-b896-0f8f1e63c33b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s"
Apr 21 07:51:55.085632 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085603 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qk9t\" (UniqueName: \"kubernetes.io/projected/db4c65d3-fa4f-4575-b518-0d9e5c9215b9-kube-api-access-5qk9t\") pod \"node-ca-gktqp\" (UID: \"db4c65d3-fa4f-4575-b518-0d9e5c9215b9\") " pod="openshift-image-registry/node-ca-gktqp"
Apr 21 07:51:55.085632 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085639 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-host-kubelet\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.086371 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085653 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a7cb393b-d3ae-4c28-9350-a79f012ee6c8-agent-certs\") pod \"konnectivity-agent-l96hv\" (UID: \"a7cb393b-d3ae-4c28-9350-a79f012ee6c8\") " pod="kube-system/konnectivity-agent-l96hv"
Apr 21 07:51:55.086371 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085678 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-run-systemd\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.086371 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085706 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-systemd-units\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.086371 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085747 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2d12ced6-a6eb-40bb-8087-c53f467d8c26-system-cni-dir\") pod \"multus-additional-cni-plugins-kl7xh\" (UID: \"2d12ced6-a6eb-40bb-8087-c53f467d8c26\") " pod="openshift-multus/multus-additional-cni-plugins-kl7xh"
Apr 21 07:51:55.086371 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085757 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-host-run-netns\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.086371 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085857 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-os-release\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:55.086371 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085918 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5d721100-b3b1-4a7b-b896-0f8f1e63c33b-device-dir\") pod \"aws-ebs-csi-driver-node-ppf7s\" (UID: \"5d721100-b3b1-4a7b-b896-0f8f1e63c33b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s"
Apr 21 07:51:55.086371 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085970 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-multus-conf-dir\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:55.086371 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085979 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-host-run-netns\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.086371 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.085992 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-host-var-lib-kubelet\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:55.086371 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.086026 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-log-socket\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.086371 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.086044 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-host-kubelet\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.086371 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.086064 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-host-cni-netd\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.086371 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.086084 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5afc987-f5b7-46b4-91c1-5f015f3b2010-etc-kubernetes\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:55.086371 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.086101 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-ovn-node-metrics-cert\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.086371 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.086136 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-host-cni-netd\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.086371 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.086297 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-ovnkube-config\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.088880 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.088862 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-ovn-node-metrics-cert\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.091153 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:55.091136 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 07:51:55.091253 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:55.091157 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 07:51:55.091253 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:55.091170 2574 projected.go:194] Error preparing data for projected volume kube-api-access-d4gjg for pod openshift-network-diagnostics/network-check-target-q86fn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:51:55.091253 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:55.091227 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5633692-d3ef-4f27-aec7-1ddc39fd1781-kube-api-access-d4gjg podName:e5633692-d3ef-4f27-aec7-1ddc39fd1781 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:55.591210366 +0000 UTC m=+2.003995461 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-d4gjg" (UniqueName: "kubernetes.io/projected/e5633692-d3ef-4f27-aec7-1ddc39fd1781-kube-api-access-d4gjg") pod "network-check-target-q86fn" (UID: "e5633692-d3ef-4f27-aec7-1ddc39fd1781") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:51:55.092895 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.092848 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-20.ec2.internal" event={"ID":"2219300bfb8a5cea9f09d55fabfc69ab","Type":"ContainerStarted","Data":"82138a0fcd9e7672bbefb351b843ef33ed708f1927eb74a4ef0eadbac9246467"}
Apr 21 07:51:55.094009 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.093973 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal" event={"ID":"87b55359ad2fcf9ee78f0e3dd4c3711d","Type":"ContainerStarted","Data":"68b49720c26b1bb09270374edb6f71b77bb24ef080ad2ccec7123b54cefff9ee"}
Apr 21 07:51:55.094129 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.094107 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mppjl\" (UniqueName: \"kubernetes.io/projected/5d721100-b3b1-4a7b-b896-0f8f1e63c33b-kube-api-access-mppjl\") pod \"aws-ebs-csi-driver-node-ppf7s\" (UID: \"5d721100-b3b1-4a7b-b896-0f8f1e63c33b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s"
Apr 21 07:51:55.094194 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.094149 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7jz7\" (UniqueName: \"kubernetes.io/projected/c5afc987-f5b7-46b4-91c1-5f015f3b2010-kube-api-access-w7jz7\") pod \"multus-hmr5g\" (UID: \"c5afc987-f5b7-46b4-91c1-5f015f3b2010\") " pod="openshift-multus/multus-hmr5g"
Apr 21 07:51:55.094461 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.094439 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbg48\" (UniqueName: \"kubernetes.io/projected/40e690b0-0cf8-4414-b2e4-2f3c492f2196-kube-api-access-cbg48\") pod \"network-metrics-daemon-jjnl5\" (UID: \"40e690b0-0cf8-4414-b2e4-2f3c492f2196\") " pod="openshift-multus/network-metrics-daemon-jjnl5"
Apr 21 07:51:55.094672 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.094654 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gwrj\" (UniqueName: \"kubernetes.io/projected/5fed3a2e-1505-4288-acc9-d9b3aabca4d0-kube-api-access-5gwrj\") pod \"iptables-alerter-59f92\" (UID: \"5fed3a2e-1505-4288-acc9-d9b3aabca4d0\") " pod="openshift-network-operator/iptables-alerter-59f92"
Apr 21 07:51:55.094961 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.094942 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvv9r\" (UniqueName: \"kubernetes.io/projected/2d12ced6-a6eb-40bb-8087-c53f467d8c26-kube-api-access-xvv9r\") pod \"multus-additional-cni-plugins-kl7xh\" (UID: \"2d12ced6-a6eb-40bb-8087-c53f467d8c26\") " pod="openshift-multus/multus-additional-cni-plugins-kl7xh"
Apr 21 07:51:55.095162 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.095145 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74d5z\" (UniqueName: \"kubernetes.io/projected/5ca5f8c6-71ae-48cf-87d8-4190acb7d09e-kube-api-access-74d5z\") pod \"ovnkube-node-452pv\" (UID: \"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e\") " pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:51:55.186780 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.186752 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/db4c65d3-fa4f-4575-b518-0d9e5c9215b9-serviceca\") pod \"node-ca-gktqp\" (UID: \"db4c65d3-fa4f-4575-b518-0d9e5c9215b9\") " pod="openshift-image-registry/node-ca-gktqp"
Apr 21 07:51:55.186780 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.186784 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-etc-kubernetes\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw"
Apr 21 07:51:55.187030 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.186798 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-host\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw"
Apr 21 07:51:55.187030 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.186815 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ff084e90-da5c-4aa6-97ea-239d7b8f0827-etc-tuned\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw"
Apr 21 07:51:55.187030 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.186830 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8t76q\" (UniqueName: \"kubernetes.io/projected/ff084e90-da5c-4aa6-97ea-239d7b8f0827-kube-api-access-8t76q\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw"
Apr 21 07:51:55.187030 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.186847 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2e1e8c1e-08e4-4ee4-a7bf-e84dfd5e0fa1-tmp-dir\") pod \"node-resolver-zglz2\" (UID: \"2e1e8c1e-08e4-4ee4-a7bf-e84dfd5e0fa1\") " pod="openshift-dns/node-resolver-zglz2"
Apr 21 07:51:55.187030 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.186862 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5qk9t\" (UniqueName: \"kubernetes.io/projected/db4c65d3-fa4f-4575-b518-0d9e5c9215b9-kube-api-access-5qk9t\") pod \"node-ca-gktqp\" (UID: \"db4c65d3-fa4f-4575-b518-0d9e5c9215b9\") " pod="openshift-image-registry/node-ca-gktqp"
Apr 21 07:51:55.187030 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.186881 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-etc-sysconfig\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw"
Apr 21 07:51:55.187030 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.186894 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-etc-kubernetes\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " 
pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.187030 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.186898 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-host\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.187030 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.186903 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-etc-sysctl-conf\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.187434 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.187056 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-etc-systemd\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.187434 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.187056 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-etc-sysctl-conf\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.187434 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.187069 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-etc-sysconfig\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " 
pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.187434 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.187077 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-run\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.187434 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.187110 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-etc-systemd\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.187434 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.187122 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-sys\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.187434 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.187127 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-run\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.187434 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.187155 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-var-lib-kubelet\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 
07:51:55.187434 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.187181 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff084e90-da5c-4aa6-97ea-239d7b8f0827-tmp\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.187434 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.187202 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-sys\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.187434 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.187213 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-etc-modprobe-d\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.187434 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.187235 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-var-lib-kubelet\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.187434 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.187240 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-etc-sysctl-d\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.187434 ip-10-0-138-20 kubenswrapper[2574]: 
I0421 07:51:55.187272 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-lib-modules\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.187434 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.187285 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2e1e8c1e-08e4-4ee4-a7bf-e84dfd5e0fa1-tmp-dir\") pod \"node-resolver-zglz2\" (UID: \"2e1e8c1e-08e4-4ee4-a7bf-e84dfd5e0fa1\") " pod="openshift-dns/node-resolver-zglz2" Apr 21 07:51:55.187434 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.187299 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbmdw\" (UniqueName: \"kubernetes.io/projected/2e1e8c1e-08e4-4ee4-a7bf-e84dfd5e0fa1-kube-api-access-xbmdw\") pod \"node-resolver-zglz2\" (UID: \"2e1e8c1e-08e4-4ee4-a7bf-e84dfd5e0fa1\") " pod="openshift-dns/node-resolver-zglz2" Apr 21 07:51:55.187434 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.187330 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db4c65d3-fa4f-4575-b518-0d9e5c9215b9-host\") pod \"node-ca-gktqp\" (UID: \"db4c65d3-fa4f-4575-b518-0d9e5c9215b9\") " pod="openshift-image-registry/node-ca-gktqp" Apr 21 07:51:55.187434 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.187338 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-etc-sysctl-d\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.188038 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.187297 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/db4c65d3-fa4f-4575-b518-0d9e5c9215b9-serviceca\") pod \"node-ca-gktqp\" (UID: \"db4c65d3-fa4f-4575-b518-0d9e5c9215b9\") " pod="openshift-image-registry/node-ca-gktqp" Apr 21 07:51:55.188038 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.187360 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-etc-modprobe-d\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.188038 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.187377 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/db4c65d3-fa4f-4575-b518-0d9e5c9215b9-host\") pod \"node-ca-gktqp\" (UID: \"db4c65d3-fa4f-4575-b518-0d9e5c9215b9\") " pod="openshift-image-registry/node-ca-gktqp" Apr 21 07:51:55.188038 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.187387 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2e1e8c1e-08e4-4ee4-a7bf-e84dfd5e0fa1-hosts-file\") pod \"node-resolver-zglz2\" (UID: \"2e1e8c1e-08e4-4ee4-a7bf-e84dfd5e0fa1\") " pod="openshift-dns/node-resolver-zglz2" Apr 21 07:51:55.188038 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.187405 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ff084e90-da5c-4aa6-97ea-239d7b8f0827-lib-modules\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.188038 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.187468 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/2e1e8c1e-08e4-4ee4-a7bf-e84dfd5e0fa1-hosts-file\") pod \"node-resolver-zglz2\" (UID: \"2e1e8c1e-08e4-4ee4-a7bf-e84dfd5e0fa1\") " pod="openshift-dns/node-resolver-zglz2" Apr 21 07:51:55.189111 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.189093 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ff084e90-da5c-4aa6-97ea-239d7b8f0827-etc-tuned\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.189199 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.189184 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff084e90-da5c-4aa6-97ea-239d7b8f0827-tmp\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.196777 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.196757 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbmdw\" (UniqueName: \"kubernetes.io/projected/2e1e8c1e-08e4-4ee4-a7bf-e84dfd5e0fa1-kube-api-access-xbmdw\") pod \"node-resolver-zglz2\" (UID: \"2e1e8c1e-08e4-4ee4-a7bf-e84dfd5e0fa1\") " pod="openshift-dns/node-resolver-zglz2" Apr 21 07:51:55.197108 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.197089 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t76q\" (UniqueName: \"kubernetes.io/projected/ff084e90-da5c-4aa6-97ea-239d7b8f0827-kube-api-access-8t76q\") pod \"tuned-6dmvw\" (UID: \"ff084e90-da5c-4aa6-97ea-239d7b8f0827\") " pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.197193 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.197092 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qk9t\" (UniqueName: 
\"kubernetes.io/projected/db4c65d3-fa4f-4575-b518-0d9e5c9215b9-kube-api-access-5qk9t\") pod \"node-ca-gktqp\" (UID: \"db4c65d3-fa4f-4575-b518-0d9e5c9215b9\") " pod="openshift-image-registry/node-ca-gktqp" Apr 21 07:51:55.269538 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.269509 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:51:55.298566 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.298519 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kl7xh" Apr 21 07:51:55.303157 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.303136 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-452pv" Apr 21 07:51:55.305076 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:55.305052 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d12ced6_a6eb_40bb_8087_c53f467d8c26.slice/crio-cf911412b4b5b9390830e83975da612929e56daf7339932960824e4ae8102a67 WatchSource:0}: Error finding container cf911412b4b5b9390830e83975da612929e56daf7339932960824e4ae8102a67: Status 404 returned error can't find the container with id cf911412b4b5b9390830e83975da612929e56daf7339932960824e4ae8102a67 Apr 21 07:51:55.310070 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:55.310052 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ca5f8c6_71ae_48cf_87d8_4190acb7d09e.slice/crio-8a153b8e64a31c4df5770ab4e7912d9ae84702b2c9cbdfc8b14ceb35e516ac39 WatchSource:0}: Error finding container 8a153b8e64a31c4df5770ab4e7912d9ae84702b2c9cbdfc8b14ceb35e516ac39: Status 404 returned error can't find the container with id 8a153b8e64a31c4df5770ab4e7912d9ae84702b2c9cbdfc8b14ceb35e516ac39 Apr 21 07:51:55.323416 ip-10-0-138-20 kubenswrapper[2574]: 
I0421 07:51:55.323394 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hmr5g" Apr 21 07:51:55.329328 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:55.329311 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5afc987_f5b7_46b4_91c1_5f015f3b2010.slice/crio-aff791fdc9b6fceb9cc25e2e5a6bb1c54ea25841e98854e59fcba9b5a726caf7 WatchSource:0}: Error finding container aff791fdc9b6fceb9cc25e2e5a6bb1c54ea25841e98854e59fcba9b5a726caf7: Status 404 returned error can't find the container with id aff791fdc9b6fceb9cc25e2e5a6bb1c54ea25841e98854e59fcba9b5a726caf7 Apr 21 07:51:55.342906 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.342890 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-59f92" Apr 21 07:51:55.348414 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.348388 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-l96hv" Apr 21 07:51:55.349387 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:55.349366 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fed3a2e_1505_4288_acc9_d9b3aabca4d0.slice/crio-7e0f945a189b512b17d307b12e793d485cd5b9ecbb3a77493223fe2574a4a384 WatchSource:0}: Error finding container 7e0f945a189b512b17d307b12e793d485cd5b9ecbb3a77493223fe2574a4a384: Status 404 returned error can't find the container with id 7e0f945a189b512b17d307b12e793d485cd5b9ecbb3a77493223fe2574a4a384 Apr 21 07:51:55.353967 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.353945 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s" Apr 21 07:51:55.356837 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:55.355003 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7cb393b_d3ae_4c28_9350_a79f012ee6c8.slice/crio-8112684a24837fc06addbcf25eb69168c498f6f93bd14f7fe0dadbc4a08addde WatchSource:0}: Error finding container 8112684a24837fc06addbcf25eb69168c498f6f93bd14f7fe0dadbc4a08addde: Status 404 returned error can't find the container with id 8112684a24837fc06addbcf25eb69168c498f6f93bd14f7fe0dadbc4a08addde Apr 21 07:51:55.358762 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.358740 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" Apr 21 07:51:55.361968 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:55.361941 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d721100_b3b1_4a7b_b896_0f8f1e63c33b.slice/crio-9a94d1e2143079dc91067db0148ec9b50bb50b202ab20a5662a2d977b19ee6a9 WatchSource:0}: Error finding container 9a94d1e2143079dc91067db0148ec9b50bb50b202ab20a5662a2d977b19ee6a9: Status 404 returned error can't find the container with id 9a94d1e2143079dc91067db0148ec9b50bb50b202ab20a5662a2d977b19ee6a9 Apr 21 07:51:55.364362 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.364022 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zglz2" Apr 21 07:51:55.368147 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.368128 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-gktqp" Apr 21 07:51:55.369398 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:55.369376 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff084e90_da5c_4aa6_97ea_239d7b8f0827.slice/crio-f11e76e39826f030c86c8ca8f69da64b3b89fa77b24c2b2b50dab3f748b7ab71 WatchSource:0}: Error finding container f11e76e39826f030c86c8ca8f69da64b3b89fa77b24c2b2b50dab3f748b7ab71: Status 404 returned error can't find the container with id f11e76e39826f030c86c8ca8f69da64b3b89fa77b24c2b2b50dab3f748b7ab71 Apr 21 07:51:55.372815 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:55.372794 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e1e8c1e_08e4_4ee4_a7bf_e84dfd5e0fa1.slice/crio-34ab4e1530546c3fd03dbbaa9ede539831f4cbabb10a871b4f422d49fbbbdf19 WatchSource:0}: Error finding container 34ab4e1530546c3fd03dbbaa9ede539831f4cbabb10a871b4f422d49fbbbdf19: Status 404 returned error can't find the container with id 34ab4e1530546c3fd03dbbaa9ede539831f4cbabb10a871b4f422d49fbbbdf19 Apr 21 07:51:55.376770 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:51:55.376736 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb4c65d3_fa4f_4575_b518_0d9e5c9215b9.slice/crio-12897ea6f8d13a6f2031b7cfb320c3f8a7d33bbe53b8e1d6d64eebb18928d785 WatchSource:0}: Error finding container 12897ea6f8d13a6f2031b7cfb320c3f8a7d33bbe53b8e1d6d64eebb18928d785: Status 404 returned error can't find the container with id 12897ea6f8d13a6f2031b7cfb320c3f8a7d33bbe53b8e1d6d64eebb18928d785 Apr 21 07:51:55.591337 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.590818 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs\") pod \"network-metrics-daemon-jjnl5\" (UID: \"40e690b0-0cf8-4414-b2e4-2f3c492f2196\") " pod="openshift-multus/network-metrics-daemon-jjnl5" Apr 21 07:51:55.591337 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:55.590983 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:51:55.591337 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:55.591042 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs podName:40e690b0-0cf8-4414-b2e4-2f3c492f2196 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:56.591024108 +0000 UTC m=+3.003809206 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs") pod "network-metrics-daemon-jjnl5" (UID: "40e690b0-0cf8-4414-b2e4-2f3c492f2196") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:51:55.691441 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.691411 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4gjg\" (UniqueName: \"kubernetes.io/projected/e5633692-d3ef-4f27-aec7-1ddc39fd1781-kube-api-access-d4gjg\") pod \"network-check-target-q86fn\" (UID: \"e5633692-d3ef-4f27-aec7-1ddc39fd1781\") " pod="openshift-network-diagnostics/network-check-target-q86fn" Apr 21 07:51:55.691594 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:55.691551 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:51:55.691594 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:55.691570 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:51:55.691594 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:55.691582 2574 projected.go:194] Error preparing data for projected volume kube-api-access-d4gjg for pod openshift-network-diagnostics/network-check-target-q86fn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:51:55.691767 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:55.691635 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5633692-d3ef-4f27-aec7-1ddc39fd1781-kube-api-access-d4gjg podName:e5633692-d3ef-4f27-aec7-1ddc39fd1781 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:56.691616608 +0000 UTC m=+3.104401704 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-d4gjg" (UniqueName: "kubernetes.io/projected/e5633692-d3ef-4f27-aec7-1ddc39fd1781-kube-api-access-d4gjg") pod "network-check-target-q86fn" (UID: "e5633692-d3ef-4f27-aec7-1ddc39fd1781") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:51:55.765168 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:55.764650 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:51:56.015597 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:56.015513 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 07:46:55 +0000 UTC" deadline="2027-10-20 21:47:38.017807154 +0000 UTC" Apr 21 07:51:56.015597 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:56.015550 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13141h55m42.002262272s" Apr 21 
07:51:56.091196 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:56.091168 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q86fn" Apr 21 07:51:56.091351 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:56.091290 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q86fn" podUID="e5633692-d3ef-4f27-aec7-1ddc39fd1781" Apr 21 07:51:56.105005 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:56.104347 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" event={"ID":"ff084e90-da5c-4aa6-97ea-239d7b8f0827","Type":"ContainerStarted","Data":"f11e76e39826f030c86c8ca8f69da64b3b89fa77b24c2b2b50dab3f748b7ab71"} Apr 21 07:51:56.112660 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:56.112633 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s" event={"ID":"5d721100-b3b1-4a7b-b896-0f8f1e63c33b","Type":"ContainerStarted","Data":"9a94d1e2143079dc91067db0148ec9b50bb50b202ab20a5662a2d977b19ee6a9"} Apr 21 07:51:56.144798 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:56.144769 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kl7xh" event={"ID":"2d12ced6-a6eb-40bb-8087-c53f467d8c26","Type":"ContainerStarted","Data":"cf911412b4b5b9390830e83975da612929e56daf7339932960824e4ae8102a67"} Apr 21 07:51:56.148764 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:56.148729 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-l96hv" 
event={"ID":"a7cb393b-d3ae-4c28-9350-a79f012ee6c8","Type":"ContainerStarted","Data":"8112684a24837fc06addbcf25eb69168c498f6f93bd14f7fe0dadbc4a08addde"}
Apr 21 07:51:56.157631 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:56.156730 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-59f92" event={"ID":"5fed3a2e-1505-4288-acc9-d9b3aabca4d0","Type":"ContainerStarted","Data":"7e0f945a189b512b17d307b12e793d485cd5b9ecbb3a77493223fe2574a4a384"}
Apr 21 07:51:56.169985 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:56.169958 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hmr5g" event={"ID":"c5afc987-f5b7-46b4-91c1-5f015f3b2010","Type":"ContainerStarted","Data":"aff791fdc9b6fceb9cc25e2e5a6bb1c54ea25841e98854e59fcba9b5a726caf7"}
Apr 21 07:51:56.181700 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:56.181667 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-452pv" event={"ID":"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e","Type":"ContainerStarted","Data":"8a153b8e64a31c4df5770ab4e7912d9ae84702b2c9cbdfc8b14ceb35e516ac39"}
Apr 21 07:51:56.194653 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:56.192763 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gktqp" event={"ID":"db4c65d3-fa4f-4575-b518-0d9e5c9215b9","Type":"ContainerStarted","Data":"12897ea6f8d13a6f2031b7cfb320c3f8a7d33bbe53b8e1d6d64eebb18928d785"}
Apr 21 07:51:56.197461 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:56.197436 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zglz2" event={"ID":"2e1e8c1e-08e4-4ee4-a7bf-e84dfd5e0fa1","Type":"ContainerStarted","Data":"34ab4e1530546c3fd03dbbaa9ede539831f4cbabb10a871b4f422d49fbbbdf19"}
Apr 21 07:51:56.248051 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:56.248025 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 07:51:56.598829 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:56.598791 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs\") pod \"network-metrics-daemon-jjnl5\" (UID: \"40e690b0-0cf8-4414-b2e4-2f3c492f2196\") " pod="openshift-multus/network-metrics-daemon-jjnl5"
Apr 21 07:51:56.611690 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:56.608045 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:51:56.611690 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:56.608131 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs podName:40e690b0-0cf8-4414-b2e4-2f3c492f2196 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:58.608109579 +0000 UTC m=+5.020894689 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs") pod "network-metrics-daemon-jjnl5" (UID: "40e690b0-0cf8-4414-b2e4-2f3c492f2196") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:51:56.699476 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:56.699407 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4gjg\" (UniqueName: \"kubernetes.io/projected/e5633692-d3ef-4f27-aec7-1ddc39fd1781-kube-api-access-d4gjg\") pod \"network-check-target-q86fn\" (UID: \"e5633692-d3ef-4f27-aec7-1ddc39fd1781\") " pod="openshift-network-diagnostics/network-check-target-q86fn"
Apr 21 07:51:56.699635 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:56.699553 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 07:51:56.699635 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:56.699574 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 07:51:56.699635 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:56.699586 2574 projected.go:194] Error preparing data for projected volume kube-api-access-d4gjg for pod openshift-network-diagnostics/network-check-target-q86fn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:51:56.699811 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:56.699644 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5633692-d3ef-4f27-aec7-1ddc39fd1781-kube-api-access-d4gjg podName:e5633692-d3ef-4f27-aec7-1ddc39fd1781 nodeName:}" failed. No retries permitted until 2026-04-21 07:51:58.699627079 +0000 UTC m=+5.112412182 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-d4gjg" (UniqueName: "kubernetes.io/projected/e5633692-d3ef-4f27-aec7-1ddc39fd1781-kube-api-access-d4gjg") pod "network-check-target-q86fn" (UID: "e5633692-d3ef-4f27-aec7-1ddc39fd1781") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:51:57.016698 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:57.016591 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 07:46:55 +0000 UTC" deadline="2027-10-12 05:08:41.815700418 +0000 UTC"
Apr 21 07:51:57.016698 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:57.016650 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12933h16m44.799054354s"
Apr 21 07:51:57.090352 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:57.090323 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjnl5"
Apr 21 07:51:57.090500 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:57.090450 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjnl5" podUID="40e690b0-0cf8-4414-b2e4-2f3c492f2196"
Apr 21 07:51:58.092462 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:58.092433 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q86fn"
Apr 21 07:51:58.092940 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:58.092534 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q86fn" podUID="e5633692-d3ef-4f27-aec7-1ddc39fd1781"
Apr 21 07:51:58.615890 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:58.615819 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs\") pod \"network-metrics-daemon-jjnl5\" (UID: \"40e690b0-0cf8-4414-b2e4-2f3c492f2196\") " pod="openshift-multus/network-metrics-daemon-jjnl5"
Apr 21 07:51:58.616085 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:58.615984 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:51:58.616085 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:58.616058 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs podName:40e690b0-0cf8-4414-b2e4-2f3c492f2196 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:02.616039659 +0000 UTC m=+9.028824764 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs") pod "network-metrics-daemon-jjnl5" (UID: "40e690b0-0cf8-4414-b2e4-2f3c492f2196") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:51:58.717244 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:58.717207 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4gjg\" (UniqueName: \"kubernetes.io/projected/e5633692-d3ef-4f27-aec7-1ddc39fd1781-kube-api-access-d4gjg\") pod \"network-check-target-q86fn\" (UID: \"e5633692-d3ef-4f27-aec7-1ddc39fd1781\") " pod="openshift-network-diagnostics/network-check-target-q86fn"
Apr 21 07:51:58.717423 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:58.717404 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 07:51:58.717482 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:58.717433 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 07:51:58.717482 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:58.717447 2574 projected.go:194] Error preparing data for projected volume kube-api-access-d4gjg for pod openshift-network-diagnostics/network-check-target-q86fn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:51:58.717579 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:58.717510 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5633692-d3ef-4f27-aec7-1ddc39fd1781-kube-api-access-d4gjg podName:e5633692-d3ef-4f27-aec7-1ddc39fd1781 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:02.717487896 +0000 UTC m=+9.130272993 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-d4gjg" (UniqueName: "kubernetes.io/projected/e5633692-d3ef-4f27-aec7-1ddc39fd1781-kube-api-access-d4gjg") pod "network-check-target-q86fn" (UID: "e5633692-d3ef-4f27-aec7-1ddc39fd1781") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:51:59.089887 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:51:59.089613 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjnl5"
Apr 21 07:51:59.089887 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:51:59.089749 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjnl5" podUID="40e690b0-0cf8-4414-b2e4-2f3c492f2196"
Apr 21 07:52:00.090184 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:00.090148 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q86fn"
Apr 21 07:52:00.090644 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:00.090268 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q86fn" podUID="e5633692-d3ef-4f27-aec7-1ddc39fd1781"
Apr 21 07:52:01.089885 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:01.089849 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjnl5"
Apr 21 07:52:01.090061 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:01.090016 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjnl5" podUID="40e690b0-0cf8-4414-b2e4-2f3c492f2196"
Apr 21 07:52:02.094124 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:02.093965 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q86fn"
Apr 21 07:52:02.096874 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:02.094986 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q86fn" podUID="e5633692-d3ef-4f27-aec7-1ddc39fd1781"
Apr 21 07:52:02.649907 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:02.649870 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs\") pod \"network-metrics-daemon-jjnl5\" (UID: \"40e690b0-0cf8-4414-b2e4-2f3c492f2196\") " pod="openshift-multus/network-metrics-daemon-jjnl5"
Apr 21 07:52:02.650100 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:02.650068 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:52:02.650163 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:02.650137 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs podName:40e690b0-0cf8-4414-b2e4-2f3c492f2196 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:10.650115932 +0000 UTC m=+17.062901038 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs") pod "network-metrics-daemon-jjnl5" (UID: "40e690b0-0cf8-4414-b2e4-2f3c492f2196") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:52:02.751345 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:02.751293 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4gjg\" (UniqueName: \"kubernetes.io/projected/e5633692-d3ef-4f27-aec7-1ddc39fd1781-kube-api-access-d4gjg\") pod \"network-check-target-q86fn\" (UID: \"e5633692-d3ef-4f27-aec7-1ddc39fd1781\") " pod="openshift-network-diagnostics/network-check-target-q86fn"
Apr 21 07:52:02.751518 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:02.751459 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 07:52:02.751518 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:02.751480 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 07:52:02.751518 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:02.751492 2574 projected.go:194] Error preparing data for projected volume kube-api-access-d4gjg for pod openshift-network-diagnostics/network-check-target-q86fn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:52:02.751685 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:02.751547 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5633692-d3ef-4f27-aec7-1ddc39fd1781-kube-api-access-d4gjg podName:e5633692-d3ef-4f27-aec7-1ddc39fd1781 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:10.7515294 +0000 UTC m=+17.164314504 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-d4gjg" (UniqueName: "kubernetes.io/projected/e5633692-d3ef-4f27-aec7-1ddc39fd1781-kube-api-access-d4gjg") pod "network-check-target-q86fn" (UID: "e5633692-d3ef-4f27-aec7-1ddc39fd1781") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:52:03.089498 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:03.089464 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjnl5"
Apr 21 07:52:03.089685 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:03.089605 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjnl5" podUID="40e690b0-0cf8-4414-b2e4-2f3c492f2196"
Apr 21 07:52:04.090669 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:04.090642 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q86fn"
Apr 21 07:52:04.091122 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:04.090734 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q86fn" podUID="e5633692-d3ef-4f27-aec7-1ddc39fd1781"
Apr 21 07:52:05.090302 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:05.090082 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjnl5"
Apr 21 07:52:05.090467 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:05.090440 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjnl5" podUID="40e690b0-0cf8-4414-b2e4-2f3c492f2196"
Apr 21 07:52:06.090074 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:06.090033 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q86fn"
Apr 21 07:52:06.090527 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:06.090144 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q86fn" podUID="e5633692-d3ef-4f27-aec7-1ddc39fd1781"
Apr 21 07:52:07.089740 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:07.089699 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjnl5"
Apr 21 07:52:07.089898 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:07.089815 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjnl5" podUID="40e690b0-0cf8-4414-b2e4-2f3c492f2196"
Apr 21 07:52:08.090160 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:08.090115 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q86fn"
Apr 21 07:52:08.090595 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:08.090232 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q86fn" podUID="e5633692-d3ef-4f27-aec7-1ddc39fd1781"
Apr 21 07:52:09.089790 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:09.089759 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjnl5"
Apr 21 07:52:09.089989 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:09.089889 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjnl5" podUID="40e690b0-0cf8-4414-b2e4-2f3c492f2196"
Apr 21 07:52:10.090405 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:10.090364 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q86fn"
Apr 21 07:52:10.090901 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:10.090471 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q86fn" podUID="e5633692-d3ef-4f27-aec7-1ddc39fd1781"
Apr 21 07:52:10.710605 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:10.710567 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs\") pod \"network-metrics-daemon-jjnl5\" (UID: \"40e690b0-0cf8-4414-b2e4-2f3c492f2196\") " pod="openshift-multus/network-metrics-daemon-jjnl5"
Apr 21 07:52:10.710773 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:10.710713 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:52:10.710773 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:10.710773 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs podName:40e690b0-0cf8-4414-b2e4-2f3c492f2196 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:26.710759086 +0000 UTC m=+33.123544178 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs") pod "network-metrics-daemon-jjnl5" (UID: "40e690b0-0cf8-4414-b2e4-2f3c492f2196") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:52:10.811577 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:10.811539 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4gjg\" (UniqueName: \"kubernetes.io/projected/e5633692-d3ef-4f27-aec7-1ddc39fd1781-kube-api-access-d4gjg\") pod \"network-check-target-q86fn\" (UID: \"e5633692-d3ef-4f27-aec7-1ddc39fd1781\") " pod="openshift-network-diagnostics/network-check-target-q86fn"
Apr 21 07:52:10.811750 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:10.811680 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 07:52:10.811750 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:10.811702 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 07:52:10.811750 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:10.811714 2574 projected.go:194] Error preparing data for projected volume kube-api-access-d4gjg for pod openshift-network-diagnostics/network-check-target-q86fn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:52:10.811898 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:10.811778 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5633692-d3ef-4f27-aec7-1ddc39fd1781-kube-api-access-d4gjg podName:e5633692-d3ef-4f27-aec7-1ddc39fd1781 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:26.811759643 +0000 UTC m=+33.224544754 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-d4gjg" (UniqueName: "kubernetes.io/projected/e5633692-d3ef-4f27-aec7-1ddc39fd1781-kube-api-access-d4gjg") pod "network-check-target-q86fn" (UID: "e5633692-d3ef-4f27-aec7-1ddc39fd1781") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:52:11.089821 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:11.089750 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjnl5"
Apr 21 07:52:11.090053 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:11.089871 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjnl5" podUID="40e690b0-0cf8-4414-b2e4-2f3c492f2196"
Apr 21 07:52:12.090156 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:12.090123 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q86fn"
Apr 21 07:52:12.090660 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:12.090253 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q86fn" podUID="e5633692-d3ef-4f27-aec7-1ddc39fd1781"
Apr 21 07:52:13.089682 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:13.089649 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjnl5"
Apr 21 07:52:13.089840 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:13.089749 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjnl5" podUID="40e690b0-0cf8-4414-b2e4-2f3c492f2196"
Apr 21 07:52:14.094187 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:14.094159 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q86fn"
Apr 21 07:52:14.094510 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:14.094252 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q86fn" podUID="e5633692-d3ef-4f27-aec7-1ddc39fd1781"
Apr 21 07:52:14.232025 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:14.231694 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-20.ec2.internal" event={"ID":"2219300bfb8a5cea9f09d55fabfc69ab","Type":"ContainerStarted","Data":"2abae42f056cbcec26816e8bf03c2b8494e6ffdcaeb561c3c0a52e34d36f847f"}
Apr 21 07:52:14.237767 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:14.237591 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" event={"ID":"ff084e90-da5c-4aa6-97ea-239d7b8f0827","Type":"ContainerStarted","Data":"4581d474cd565931cefbc4d7eafc0f2dae673f4db0aa807bc5443dbd944795e7"}
Apr 21 07:52:14.244396 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:14.244354 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hmr5g" event={"ID":"c5afc987-f5b7-46b4-91c1-5f015f3b2010","Type":"ContainerStarted","Data":"177d5eed20bdede40ad5c2080afeaee44a4485607c88a76fbf819589492d1e59"}
Apr 21 07:52:14.245887 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:14.245666 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-20.ec2.internal" podStartSLOduration=20.245651666 podStartE2EDuration="20.245651666s" podCreationTimestamp="2026-04-21 07:51:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:52:14.244808042 +0000 UTC m=+20.657593156" watchObservedRunningTime="2026-04-21 07:52:14.245651666 +0000 UTC m=+20.658436780"
Apr 21 07:52:14.247565 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:14.247539 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-452pv" event={"ID":"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e","Type":"ContainerStarted","Data":"5a14ee72e120d60d20761e0234c8e3231c333916a0060031304066fdffeca34c"}
Apr 21 07:52:14.247642 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:14.247575 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-452pv" event={"ID":"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e","Type":"ContainerStarted","Data":"7b2e5365575d8a945f1380f4d17c397ce5fe88963ebe01f3011747fd92b95be6"}
Apr 21 07:52:14.247642 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:14.247586 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-452pv" event={"ID":"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e","Type":"ContainerStarted","Data":"0d81602846fa61834b96777e213a4a85b163a1af8ca29bd63e9185d1cc80d694"}
Apr 21 07:52:14.281341 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:14.281298 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-6dmvw" podStartSLOduration=1.801963199 podStartE2EDuration="20.281284844s" podCreationTimestamp="2026-04-21 07:51:54 +0000 UTC" firstStartedPulling="2026-04-21 07:51:55.371547647 +0000 UTC m=+1.784332742" lastFinishedPulling="2026-04-21 07:52:13.85086928 +0000 UTC m=+20.263654387" observedRunningTime="2026-04-21 07:52:14.261490692 +0000 UTC m=+20.674275797" watchObservedRunningTime="2026-04-21 07:52:14.281284844 +0000 UTC m=+20.694069957"
Apr 21 07:52:14.281684 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:14.281648 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hmr5g" podStartSLOduration=1.6471525360000001 podStartE2EDuration="20.281638494s" podCreationTimestamp="2026-04-21 07:51:54 +0000 UTC" firstStartedPulling="2026-04-21 07:51:55.330983016 +0000 UTC m=+1.743768111" lastFinishedPulling="2026-04-21 07:52:13.965468963 +0000 UTC m=+20.378254069" observedRunningTime="2026-04-21 07:52:14.281032724 +0000 UTC m=+20.693817838" watchObservedRunningTime="2026-04-21 07:52:14.281638494 +0000 UTC m=+20.694423609"
Apr 21 07:52:15.089803 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:15.089609 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjnl5"
Apr 21 07:52:15.089938 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:15.089902 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjnl5" podUID="40e690b0-0cf8-4414-b2e4-2f3c492f2196"
Apr 21 07:52:15.250572 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:15.250538 2574 generic.go:358] "Generic (PLEG): container finished" podID="87b55359ad2fcf9ee78f0e3dd4c3711d" containerID="b6a6ceabbb54c4a4dfdfc867c153bbd3a235f9a2f8ce90ed4911c5b87af2d0b0" exitCode=0
Apr 21 07:52:15.251324 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:15.250610 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal" event={"ID":"87b55359ad2fcf9ee78f0e3dd4c3711d","Type":"ContainerDied","Data":"b6a6ceabbb54c4a4dfdfc867c153bbd3a235f9a2f8ce90ed4911c5b87af2d0b0"}
Apr 21 07:52:15.251973 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:15.251952 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gktqp" event={"ID":"db4c65d3-fa4f-4575-b518-0d9e5c9215b9","Type":"ContainerStarted","Data":"0aace4994716c221cc7ddd775486aecedf5c8eede8ede27f1161f21bd5eda4cb"}
Apr 21 07:52:15.253401 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:15.253377 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zglz2" event={"ID":"2e1e8c1e-08e4-4ee4-a7bf-e84dfd5e0fa1","Type":"ContainerStarted","Data":"3e82f089229c7167be439e7fd4af124bb27303cb4dbeed2d3b1fd8097f504843"}
Apr 21 07:52:15.254763 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:15.254743 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s" event={"ID":"5d721100-b3b1-4a7b-b896-0f8f1e63c33b","Type":"ContainerStarted","Data":"86b5608f33793b03f6c5496bc0c4df22b78faf39022dab2d7b800c855ae789a4"}
Apr 21 07:52:15.256390 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:15.256356 2574 generic.go:358] "Generic (PLEG): container finished" podID="2d12ced6-a6eb-40bb-8087-c53f467d8c26" containerID="2ca2375b6591495e5f7389883abff1079c5d147c2f714a2fd63fefb265ea1347" exitCode=0
Apr 21 07:52:15.256685 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:15.256630 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kl7xh" event={"ID":"2d12ced6-a6eb-40bb-8087-c53f467d8c26","Type":"ContainerDied","Data":"2ca2375b6591495e5f7389883abff1079c5d147c2f714a2fd63fefb265ea1347"}
Apr 21 07:52:15.259018 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:15.258997 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-l96hv" event={"ID":"a7cb393b-d3ae-4c28-9350-a79f012ee6c8","Type":"ContainerStarted","Data":"4a925825b724782c57d53322db1d8ff414ac3f220892c8f169a538c4365fc023"}
Apr 21 07:52:15.260253 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:15.260230 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-59f92" event={"ID":"5fed3a2e-1505-4288-acc9-d9b3aabca4d0","Type":"ContainerStarted","Data":"666dc11cc1d216c1bb7594ce7da36d753fe91d9d96d982d259a8459e08dcb0b8"}
Apr 21 07:52:15.262848 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:15.262828 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-452pv_5ca5f8c6-71ae-48cf-87d8-4190acb7d09e/ovn-acl-logging/0.log"
Apr 21 07:52:15.263194 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:15.263172 2574 generic.go:358] "Generic (PLEG): container finished" podID="5ca5f8c6-71ae-48cf-87d8-4190acb7d09e" containerID="7b2e5365575d8a945f1380f4d17c397ce5fe88963ebe01f3011747fd92b95be6" exitCode=1
Apr 21 07:52:15.263319 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:15.263295 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-452pv" event={"ID":"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e","Type":"ContainerDied","Data":"7b2e5365575d8a945f1380f4d17c397ce5fe88963ebe01f3011747fd92b95be6"}
Apr 21 07:52:15.263380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:15.263329 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-452pv" event={"ID":"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e","Type":"ContainerStarted","Data":"8434538f190954069ae7b71b29f7bb023261d99656943fa4caf06ae5d0a0bdcf"}
Apr 21 07:52:15.263380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:15.263343 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-452pv" event={"ID":"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e","Type":"ContainerStarted","Data":"7e8e1d78c266eea743f8e4942b2d230a1727cb82b0fc45b8e318321a99585ee7"}
Apr 21 07:52:15.263380 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:15.263355 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-452pv" event={"ID":"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e","Type":"ContainerStarted","Data":"b4af26772b8766c8665080990dc89d443f6aec48c8307fd81b8fe8bfca35cc69"}
Apr 21 07:52:15.294331 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:15.294292 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-59f92" podStartSLOduration=2.81905435 podStartE2EDuration="21.294281048s" podCreationTimestamp="2026-04-21 07:51:54 +0000 UTC" firstStartedPulling="2026-04-21 07:51:55.351406057 +0000 UTC m=+1.764191151" lastFinishedPulling="2026-04-21 07:52:13.826632743 +0000 UTC m=+20.239417849" observedRunningTime="2026-04-21 07:52:15.294210462 +0000 UTC m=+21.706995579" watchObservedRunningTime="2026-04-21 07:52:15.294281048 +0000 UTC m=+21.707066161"
Apr 21 07:52:15.306515 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:15.306479 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-l96hv" podStartSLOduration=2.838282825 podStartE2EDuration="21.306469171s" podCreationTimestamp="2026-04-21 07:51:54 +0000 UTC" firstStartedPulling="2026-04-21 07:51:55.358438552 +0000 UTC m=+1.771223659" lastFinishedPulling="2026-04-21 07:52:13.826624913 +0000 UTC m=+20.239410005" observedRunningTime="2026-04-21 07:52:15.306124103 +0000 UTC m=+21.718909216" watchObservedRunningTime="2026-04-21 07:52:15.306469171 +0000 UTC m=+21.719254357"
Apr 21 07:52:15.318230 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:15.318182 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zglz2" podStartSLOduration=2.8668912669999997 podStartE2EDuration="21.318170456s" podCreationTimestamp="2026-04-21 07:51:54 +0000 UTC" firstStartedPulling="2026-04-21 07:51:55.375351557 +0000 UTC m=+1.788136657" lastFinishedPulling="2026-04-21 07:52:13.826630754 +0000 UTC m=+20.239415846" observedRunningTime="2026-04-21 07:52:15.317901154 +0000 UTC m=+21.730686267" watchObservedRunningTime="2026-04-21 07:52:15.318170456 +0000 UTC m=+21.730955569"
Apr 21 07:52:15.331141 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:15.331068 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gktqp" podStartSLOduration=11.63314253 podStartE2EDuration="21.331058372s" podCreationTimestamp="2026-04-21 07:51:54 +0000 UTC"
firstStartedPulling="2026-04-21 07:51:55.380350613 +0000 UTC m=+1.793135719" lastFinishedPulling="2026-04-21 07:52:05.078266469 +0000 UTC m=+11.491051561" observedRunningTime="2026-04-21 07:52:15.330843345 +0000 UTC m=+21.743628458" watchObservedRunningTime="2026-04-21 07:52:15.331058372 +0000 UTC m=+21.743843491" Apr 21 07:52:15.602153 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:15.602133 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 07:52:15.867157 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:15.867085 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-l96hv" Apr 21 07:52:15.868257 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:15.868225 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-l96hv" Apr 21 07:52:16.058112 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:16.057850 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T07:52:15.60215023Z","UUID":"aaa42400-33f3-45bf-b7fd-f2c264e342e3","Handler":null,"Name":"","Endpoint":""} Apr 21 07:52:16.059712 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:16.059676 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 07:52:16.059840 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:16.059722 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 07:52:16.089758 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:16.089720 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q86fn" Apr 21 07:52:16.089904 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:16.089835 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q86fn" podUID="e5633692-d3ef-4f27-aec7-1ddc39fd1781" Apr 21 07:52:16.268364 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:16.268332 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s" event={"ID":"5d721100-b3b1-4a7b-b896-0f8f1e63c33b","Type":"ContainerStarted","Data":"f70fa1369a1ea781cbcf5d0952276eaf3f2ffa2a2194226c96d923082e31d0b0"} Apr 21 07:52:16.271062 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:16.271032 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal" event={"ID":"87b55359ad2fcf9ee78f0e3dd4c3711d","Type":"ContainerStarted","Data":"a320f42a7af31ea3545d506cdabd4824cec12590dc50518a73e406d3f014fb71"} Apr 21 07:52:16.284289 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:16.284255 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-20.ec2.internal" podStartSLOduration=22.284243428 podStartE2EDuration="22.284243428s" podCreationTimestamp="2026-04-21 07:51:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:52:16.284126291 +0000 UTC m=+22.696911406" watchObservedRunningTime="2026-04-21 07:52:16.284243428 +0000 UTC m=+22.697028541" Apr 21 07:52:17.090221 ip-10-0-138-20 kubenswrapper[2574]: I0421 
07:52:17.090186 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjnl5" Apr 21 07:52:17.090387 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:17.090292 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjnl5" podUID="40e690b0-0cf8-4414-b2e4-2f3c492f2196" Apr 21 07:52:17.273973 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:17.273882 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s" event={"ID":"5d721100-b3b1-4a7b-b896-0f8f1e63c33b","Type":"ContainerStarted","Data":"b8b2f17b2dfad875940f06fbeac1903db8bdcea356dee527c71dcfa481823ee2"} Apr 21 07:52:17.277052 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:17.277027 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-452pv_5ca5f8c6-71ae-48cf-87d8-4190acb7d09e/ovn-acl-logging/0.log" Apr 21 07:52:17.277511 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:17.277479 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-452pv" event={"ID":"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e","Type":"ContainerStarted","Data":"4f56f70497475d7c4e65a4b0022d419711f7ab6db3bbc68484698d28a237aedc"} Apr 21 07:52:17.277619 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:17.277523 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 21 07:52:17.779874 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:17.779832 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-l96hv" Apr 21 07:52:17.780496 ip-10-0-138-20 kubenswrapper[2574]: I0421 
07:52:17.780470 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-l96hv" Apr 21 07:52:17.794289 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:17.794239 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-ppf7s" podStartSLOduration=2.694674962 podStartE2EDuration="23.794198143s" podCreationTimestamp="2026-04-21 07:51:54 +0000 UTC" firstStartedPulling="2026-04-21 07:51:55.364256178 +0000 UTC m=+1.777041285" lastFinishedPulling="2026-04-21 07:52:16.463779356 +0000 UTC m=+22.876564466" observedRunningTime="2026-04-21 07:52:17.288978278 +0000 UTC m=+23.701763394" watchObservedRunningTime="2026-04-21 07:52:17.794198143 +0000 UTC m=+24.206983258" Apr 21 07:52:18.090394 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:18.090321 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q86fn" Apr 21 07:52:18.090529 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:18.090431 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q86fn" podUID="e5633692-d3ef-4f27-aec7-1ddc39fd1781" Apr 21 07:52:19.090480 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:19.090262 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjnl5" Apr 21 07:52:19.090820 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:19.090564 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjnl5" podUID="40e690b0-0cf8-4414-b2e4-2f3c492f2196" Apr 21 07:52:20.090345 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:20.090170 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q86fn" Apr 21 07:52:20.090551 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:20.090425 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-q86fn" podUID="e5633692-d3ef-4f27-aec7-1ddc39fd1781" Apr 21 07:52:20.285190 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:20.285157 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-452pv_5ca5f8c6-71ae-48cf-87d8-4190acb7d09e/ovn-acl-logging/0.log" Apr 21 07:52:20.285546 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:20.285523 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-452pv" event={"ID":"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e","Type":"ContainerStarted","Data":"bdf4de7e57788861cb0196fa49d7c4efaf840c9bc3f7f3cb520895d2c74e5b6e"} Apr 21 07:52:20.285841 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:20.285805 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-452pv" Apr 21 07:52:20.286057 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:20.286039 2574 scope.go:117] "RemoveContainer" containerID="7b2e5365575d8a945f1380f4d17c397ce5fe88963ebe01f3011747fd92b95be6" Apr 21 07:52:20.287235 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:20.287213 2574 generic.go:358] "Generic (PLEG): container finished" podID="2d12ced6-a6eb-40bb-8087-c53f467d8c26" containerID="3c8afaf4467ff5733e5ebea7da3db30cb4378711a51045214dda1921b2553e42" exitCode=0 Apr 21 07:52:20.287332 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:20.287246 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kl7xh" event={"ID":"2d12ced6-a6eb-40bb-8087-c53f467d8c26","Type":"ContainerDied","Data":"3c8afaf4467ff5733e5ebea7da3db30cb4378711a51045214dda1921b2553e42"} Apr 21 07:52:20.301265 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:20.301245 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-452pv" Apr 21 07:52:21.090046 ip-10-0-138-20 kubenswrapper[2574]: I0421 
07:52:21.090017 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjnl5" Apr 21 07:52:21.090204 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:21.090148 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjnl5" podUID="40e690b0-0cf8-4414-b2e4-2f3c492f2196" Apr 21 07:52:21.291395 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:21.291317 2574 generic.go:358] "Generic (PLEG): container finished" podID="2d12ced6-a6eb-40bb-8087-c53f467d8c26" containerID="4ec96234db09eda73df51d1444ea557b5b1df6b5314d69a7ad524c83a6b09dfe" exitCode=0 Apr 21 07:52:21.291830 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:21.291400 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kl7xh" event={"ID":"2d12ced6-a6eb-40bb-8087-c53f467d8c26","Type":"ContainerDied","Data":"4ec96234db09eda73df51d1444ea557b5b1df6b5314d69a7ad524c83a6b09dfe"} Apr 21 07:52:21.294758 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:21.294732 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-452pv_5ca5f8c6-71ae-48cf-87d8-4190acb7d09e/ovn-acl-logging/0.log" Apr 21 07:52:21.295092 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:21.295072 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-452pv" event={"ID":"5ca5f8c6-71ae-48cf-87d8-4190acb7d09e","Type":"ContainerStarted","Data":"735ee609677ee4818e1daec42392a08bfb1ccc6936a4a2298cc7a3fbb1bd39d2"} Apr 21 07:52:21.295293 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:21.295275 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-452pv" Apr 21 07:52:21.295390 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:21.295298 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-452pv" Apr 21 07:52:21.310383 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:21.310362 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-452pv" Apr 21 07:52:21.330021 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:21.329994 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-q86fn"] Apr 21 07:52:21.330143 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:21.330113 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q86fn" Apr 21 07:52:21.330235 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:21.330201 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q86fn" podUID="e5633692-d3ef-4f27-aec7-1ddc39fd1781" Apr 21 07:52:21.330656 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:21.330636 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jjnl5"] Apr 21 07:52:21.330727 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:21.330717 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjnl5" Apr 21 07:52:21.330825 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:21.330803 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjnl5" podUID="40e690b0-0cf8-4414-b2e4-2f3c492f2196" Apr 21 07:52:21.334073 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:21.334035 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-452pv" podStartSLOduration=8.750710325 podStartE2EDuration="27.334021488s" podCreationTimestamp="2026-04-21 07:51:54 +0000 UTC" firstStartedPulling="2026-04-21 07:51:55.311574211 +0000 UTC m=+1.724359305" lastFinishedPulling="2026-04-21 07:52:13.894885364 +0000 UTC m=+20.307670468" observedRunningTime="2026-04-21 07:52:21.333974962 +0000 UTC m=+27.746760075" watchObservedRunningTime="2026-04-21 07:52:21.334021488 +0000 UTC m=+27.746806603" Apr 21 07:52:22.298761 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:22.298727 2574 generic.go:358] "Generic (PLEG): container finished" podID="2d12ced6-a6eb-40bb-8087-c53f467d8c26" containerID="2eb12ba70ec269d184ac1592754ee3a1e08f9dbc80fd26f6029a0dfaca7bfd01" exitCode=0 Apr 21 07:52:22.299231 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:22.298813 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kl7xh" event={"ID":"2d12ced6-a6eb-40bb-8087-c53f467d8c26","Type":"ContainerDied","Data":"2eb12ba70ec269d184ac1592754ee3a1e08f9dbc80fd26f6029a0dfaca7bfd01"} Apr 21 07:52:23.089829 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:23.089798 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjnl5" Apr 21 07:52:23.090030 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:23.089798 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q86fn" Apr 21 07:52:23.090030 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:23.089945 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjnl5" podUID="40e690b0-0cf8-4414-b2e4-2f3c492f2196" Apr 21 07:52:23.090030 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:23.090012 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q86fn" podUID="e5633692-d3ef-4f27-aec7-1ddc39fd1781" Apr 21 07:52:25.089795 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:25.089725 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjnl5" Apr 21 07:52:25.090175 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:25.089726 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q86fn" Apr 21 07:52:25.090175 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:25.089864 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jjnl5" podUID="40e690b0-0cf8-4414-b2e4-2f3c492f2196" Apr 21 07:52:25.090175 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:25.089943 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-q86fn" podUID="e5633692-d3ef-4f27-aec7-1ddc39fd1781" Apr 21 07:52:26.742221 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:26.742185 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs\") pod \"network-metrics-daemon-jjnl5\" (UID: \"40e690b0-0cf8-4414-b2e4-2f3c492f2196\") " pod="openshift-multus/network-metrics-daemon-jjnl5" Apr 21 07:52:26.742626 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:26.742328 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:52:26.742626 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:26.742395 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs podName:40e690b0-0cf8-4414-b2e4-2f3c492f2196 nodeName:}" failed. 
No retries permitted until 2026-04-21 07:52:58.742378634 +0000 UTC m=+65.155163726 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs") pod "network-metrics-daemon-jjnl5" (UID: "40e690b0-0cf8-4414-b2e4-2f3c492f2196") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:52:26.842674 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:26.842638 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4gjg\" (UniqueName: \"kubernetes.io/projected/e5633692-d3ef-4f27-aec7-1ddc39fd1781-kube-api-access-d4gjg\") pod \"network-check-target-q86fn\" (UID: \"e5633692-d3ef-4f27-aec7-1ddc39fd1781\") " pod="openshift-network-diagnostics/network-check-target-q86fn" Apr 21 07:52:26.842835 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:26.842798 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:52:26.842835 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:26.842819 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:52:26.842835 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:26.842829 2574 projected.go:194] Error preparing data for projected volume kube-api-access-d4gjg for pod openshift-network-diagnostics/network-check-target-q86fn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:52:26.843001 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:26.842885 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5633692-d3ef-4f27-aec7-1ddc39fd1781-kube-api-access-d4gjg 
podName:e5633692-d3ef-4f27-aec7-1ddc39fd1781 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:58.842867886 +0000 UTC m=+65.255652995 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-d4gjg" (UniqueName: "kubernetes.io/projected/e5633692-d3ef-4f27-aec7-1ddc39fd1781-kube-api-access-d4gjg") pod "network-check-target-q86fn" (UID: "e5633692-d3ef-4f27-aec7-1ddc39fd1781") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:52:26.890707 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:26.890680 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-20.ec2.internal" event="NodeReady" Apr 21 07:52:26.890850 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:26.890822 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 07:52:26.935031 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:26.935004 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-xmj7g"] Apr 21 07:52:26.963742 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:26.963707 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-fw8gv"] Apr 21 07:52:26.963888 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:26.963775 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-xmj7g" Apr 21 07:52:26.966146 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:26.966120 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4xjgf\"" Apr 21 07:52:26.966265 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:26.966170 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 07:52:26.966265 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:26.966170 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 07:52:26.978374 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:26.978353 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fw8gv"] Apr 21 07:52:26.978500 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:26.978379 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xmj7g"] Apr 21 07:52:26.978500 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:26.978474 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fw8gv" Apr 21 07:52:26.980631 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:26.980608 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 07:52:26.980745 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:26.980640 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pcpsl\"" Apr 21 07:52:26.980745 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:26.980641 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 07:52:26.980864 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:26.980617 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 07:52:27.090358 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:27.090326 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjnl5" Apr 21 07:52:27.090514 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:27.090328 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q86fn"
Apr 21 07:52:27.093065 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:27.093040 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 07:52:27.093191 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:27.093111 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-f27q5\""
Apr 21 07:52:27.093689 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:27.093312 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 07:52:27.093689 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:27.093512 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9jd8h\""
Apr 21 07:52:27.093689 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:27.093543 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 07:52:27.144472 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:27.144454 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a3e6d1e-7020-4e01-b08c-6965f9908a29-config-volume\") pod \"dns-default-xmj7g\" (UID: \"9a3e6d1e-7020-4e01-b08c-6965f9908a29\") " pod="openshift-dns/dns-default-xmj7g"
Apr 21 07:52:27.144612 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:27.144523 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9a3e6d1e-7020-4e01-b08c-6965f9908a29-tmp-dir\") pod \"dns-default-xmj7g\" (UID: \"9a3e6d1e-7020-4e01-b08c-6965f9908a29\") " pod="openshift-dns/dns-default-xmj7g"
Apr 21 07:52:27.144666 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:27.144620 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnmfk\" (UniqueName: \"kubernetes.io/projected/cc6f988e-e651-47c1-b9ef-5edf69838385-kube-api-access-cnmfk\") pod \"ingress-canary-fw8gv\" (UID: \"cc6f988e-e651-47c1-b9ef-5edf69838385\") " pod="openshift-ingress-canary/ingress-canary-fw8gv"
Apr 21 07:52:27.144715 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:27.144663 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndtlk\" (UniqueName: \"kubernetes.io/projected/9a3e6d1e-7020-4e01-b08c-6965f9908a29-kube-api-access-ndtlk\") pod \"dns-default-xmj7g\" (UID: \"9a3e6d1e-7020-4e01-b08c-6965f9908a29\") " pod="openshift-dns/dns-default-xmj7g"
Apr 21 07:52:27.144715 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:27.144694 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert\") pod \"ingress-canary-fw8gv\" (UID: \"cc6f988e-e651-47c1-b9ef-5edf69838385\") " pod="openshift-ingress-canary/ingress-canary-fw8gv"
Apr 21 07:52:27.144804 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:27.144743 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls\") pod \"dns-default-xmj7g\" (UID: \"9a3e6d1e-7020-4e01-b08c-6965f9908a29\") " pod="openshift-dns/dns-default-xmj7g"
Apr 21 07:52:27.245284 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:27.245221 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert\") pod \"ingress-canary-fw8gv\" (UID: \"cc6f988e-e651-47c1-b9ef-5edf69838385\") " pod="openshift-ingress-canary/ingress-canary-fw8gv"
Apr 21 07:52:27.245284 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:27.245257 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls\") pod \"dns-default-xmj7g\" (UID: \"9a3e6d1e-7020-4e01-b08c-6965f9908a29\") " pod="openshift-dns/dns-default-xmj7g"
Apr 21 07:52:27.245284 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:27.245283 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a3e6d1e-7020-4e01-b08c-6965f9908a29-config-volume\") pod \"dns-default-xmj7g\" (UID: \"9a3e6d1e-7020-4e01-b08c-6965f9908a29\") " pod="openshift-dns/dns-default-xmj7g"
Apr 21 07:52:27.245506 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:27.245354 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9a3e6d1e-7020-4e01-b08c-6965f9908a29-tmp-dir\") pod \"dns-default-xmj7g\" (UID: \"9a3e6d1e-7020-4e01-b08c-6965f9908a29\") " pod="openshift-dns/dns-default-xmj7g"
Apr 21 07:52:27.245506 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:27.245385 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 07:52:27.245506 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:27.245399 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnmfk\" (UniqueName: \"kubernetes.io/projected/cc6f988e-e651-47c1-b9ef-5edf69838385-kube-api-access-cnmfk\") pod \"ingress-canary-fw8gv\" (UID: \"cc6f988e-e651-47c1-b9ef-5edf69838385\") " pod="openshift-ingress-canary/ingress-canary-fw8gv"
Apr 21 07:52:27.245506 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:27.245436 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndtlk\" (UniqueName: \"kubernetes.io/projected/9a3e6d1e-7020-4e01-b08c-6965f9908a29-kube-api-access-ndtlk\") pod \"dns-default-xmj7g\" (UID: \"9a3e6d1e-7020-4e01-b08c-6965f9908a29\") " pod="openshift-dns/dns-default-xmj7g"
Apr 21 07:52:27.245506 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:27.245452 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls podName:9a3e6d1e-7020-4e01-b08c-6965f9908a29 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:27.745430053 +0000 UTC m=+34.158215145 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls") pod "dns-default-xmj7g" (UID: "9a3e6d1e-7020-4e01-b08c-6965f9908a29") : secret "dns-default-metrics-tls" not found
Apr 21 07:52:27.245506 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:27.245387 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 07:52:27.245775 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:27.245524 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert podName:cc6f988e-e651-47c1-b9ef-5edf69838385 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:27.745512179 +0000 UTC m=+34.158297272 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert") pod "ingress-canary-fw8gv" (UID: "cc6f988e-e651-47c1-b9ef-5edf69838385") : secret "canary-serving-cert" not found
Apr 21 07:52:27.246091 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:27.246065 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9a3e6d1e-7020-4e01-b08c-6965f9908a29-tmp-dir\") pod \"dns-default-xmj7g\" (UID: \"9a3e6d1e-7020-4e01-b08c-6965f9908a29\") " pod="openshift-dns/dns-default-xmj7g"
Apr 21 07:52:27.246191 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:27.246156 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9a3e6d1e-7020-4e01-b08c-6965f9908a29-config-volume\") pod \"dns-default-xmj7g\" (UID: \"9a3e6d1e-7020-4e01-b08c-6965f9908a29\") " pod="openshift-dns/dns-default-xmj7g"
Apr 21 07:52:27.257787 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:27.257768 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnmfk\" (UniqueName: \"kubernetes.io/projected/cc6f988e-e651-47c1-b9ef-5edf69838385-kube-api-access-cnmfk\") pod \"ingress-canary-fw8gv\" (UID: \"cc6f988e-e651-47c1-b9ef-5edf69838385\") " pod="openshift-ingress-canary/ingress-canary-fw8gv"
Apr 21 07:52:27.258444 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:27.258421 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndtlk\" (UniqueName: \"kubernetes.io/projected/9a3e6d1e-7020-4e01-b08c-6965f9908a29-kube-api-access-ndtlk\") pod \"dns-default-xmj7g\" (UID: \"9a3e6d1e-7020-4e01-b08c-6965f9908a29\") " pod="openshift-dns/dns-default-xmj7g"
Apr 21 07:52:27.749720 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:27.749687 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert\") pod \"ingress-canary-fw8gv\" (UID: \"cc6f988e-e651-47c1-b9ef-5edf69838385\") " pod="openshift-ingress-canary/ingress-canary-fw8gv"
Apr 21 07:52:27.750374 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:27.749729 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls\") pod \"dns-default-xmj7g\" (UID: \"9a3e6d1e-7020-4e01-b08c-6965f9908a29\") " pod="openshift-dns/dns-default-xmj7g"
Apr 21 07:52:27.750374 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:27.749865 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 07:52:27.750374 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:27.749955 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert podName:cc6f988e-e651-47c1-b9ef-5edf69838385 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:28.749915299 +0000 UTC m=+35.162700392 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert") pod "ingress-canary-fw8gv" (UID: "cc6f988e-e651-47c1-b9ef-5edf69838385") : secret "canary-serving-cert" not found
Apr 21 07:52:27.750374 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:27.750023 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 07:52:27.750374 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:27.750082 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls podName:9a3e6d1e-7020-4e01-b08c-6965f9908a29 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:28.750070242 +0000 UTC m=+35.162855334 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls") pod "dns-default-xmj7g" (UID: "9a3e6d1e-7020-4e01-b08c-6965f9908a29") : secret "dns-default-metrics-tls" not found
Apr 21 07:52:28.756051 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:28.755979 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert\") pod \"ingress-canary-fw8gv\" (UID: \"cc6f988e-e651-47c1-b9ef-5edf69838385\") " pod="openshift-ingress-canary/ingress-canary-fw8gv"
Apr 21 07:52:28.756051 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:28.756017 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls\") pod \"dns-default-xmj7g\" (UID: \"9a3e6d1e-7020-4e01-b08c-6965f9908a29\") " pod="openshift-dns/dns-default-xmj7g"
Apr 21 07:52:28.756525 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:28.756129 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 07:52:28.756525 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:28.756148 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 07:52:28.756525 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:28.756196 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert podName:cc6f988e-e651-47c1-b9ef-5edf69838385 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:30.756180786 +0000 UTC m=+37.168965878 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert") pod "ingress-canary-fw8gv" (UID: "cc6f988e-e651-47c1-b9ef-5edf69838385") : secret "canary-serving-cert" not found
Apr 21 07:52:28.756525 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:28.756211 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls podName:9a3e6d1e-7020-4e01-b08c-6965f9908a29 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:30.756204668 +0000 UTC m=+37.168989759 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls") pod "dns-default-xmj7g" (UID: "9a3e6d1e-7020-4e01-b08c-6965f9908a29") : secret "dns-default-metrics-tls" not found
Apr 21 07:52:29.314469 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:29.314426 2574 generic.go:358] "Generic (PLEG): container finished" podID="2d12ced6-a6eb-40bb-8087-c53f467d8c26" containerID="66293ebbdc63317af81ecf478619088882dcc0be43f22d1c1e8cad50720268ef" exitCode=0
Apr 21 07:52:29.314626 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:29.314488 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kl7xh" event={"ID":"2d12ced6-a6eb-40bb-8087-c53f467d8c26","Type":"ContainerDied","Data":"66293ebbdc63317af81ecf478619088882dcc0be43f22d1c1e8cad50720268ef"}
Apr 21 07:52:30.318940 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:30.318884 2574 generic.go:358] "Generic (PLEG): container finished" podID="2d12ced6-a6eb-40bb-8087-c53f467d8c26" containerID="24c0786dcbb5028eebb33453e833ea0a25738b76d28fae06308ebd9f84d99b9e" exitCode=0
Apr 21 07:52:30.319463 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:30.318980 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kl7xh" event={"ID":"2d12ced6-a6eb-40bb-8087-c53f467d8c26","Type":"ContainerDied","Data":"24c0786dcbb5028eebb33453e833ea0a25738b76d28fae06308ebd9f84d99b9e"}
Apr 21 07:52:30.771908 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:30.771733 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert\") pod \"ingress-canary-fw8gv\" (UID: \"cc6f988e-e651-47c1-b9ef-5edf69838385\") " pod="openshift-ingress-canary/ingress-canary-fw8gv"
Apr 21 07:52:30.772068 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:30.771916 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls\") pod \"dns-default-xmj7g\" (UID: \"9a3e6d1e-7020-4e01-b08c-6965f9908a29\") " pod="openshift-dns/dns-default-xmj7g"
Apr 21 07:52:30.772068 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:30.771874 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 07:52:30.772068 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:30.772029 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 07:52:30.772068 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:30.772032 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert podName:cc6f988e-e651-47c1-b9ef-5edf69838385 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:34.772014689 +0000 UTC m=+41.184799793 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert") pod "ingress-canary-fw8gv" (UID: "cc6f988e-e651-47c1-b9ef-5edf69838385") : secret "canary-serving-cert" not found
Apr 21 07:52:30.772201 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:30.772073 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls podName:9a3e6d1e-7020-4e01-b08c-6965f9908a29 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:34.772062571 +0000 UTC m=+41.184847663 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls") pod "dns-default-xmj7g" (UID: "9a3e6d1e-7020-4e01-b08c-6965f9908a29") : secret "dns-default-metrics-tls" not found
Apr 21 07:52:31.323939 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:31.323900 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kl7xh" event={"ID":"2d12ced6-a6eb-40bb-8087-c53f467d8c26","Type":"ContainerStarted","Data":"22ec64d9a007dc79e77b85266d57bceb95322a2347ecf5146042df30d6f65356"}
Apr 21 07:52:31.344294 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:31.344245 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kl7xh" podStartSLOduration=4.413487006 podStartE2EDuration="37.344232037s" podCreationTimestamp="2026-04-21 07:51:54 +0000 UTC" firstStartedPulling="2026-04-21 07:51:55.306623642 +0000 UTC m=+1.719408737" lastFinishedPulling="2026-04-21 07:52:28.237368664 +0000 UTC m=+34.650153768" observedRunningTime="2026-04-21 07:52:31.342760603 +0000 UTC m=+37.755545720" watchObservedRunningTime="2026-04-21 07:52:31.344232037 +0000 UTC m=+37.757017151"
Apr 21 07:52:34.799975 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:34.799939 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls\") pod \"dns-default-xmj7g\" (UID: \"9a3e6d1e-7020-4e01-b08c-6965f9908a29\") " pod="openshift-dns/dns-default-xmj7g"
Apr 21 07:52:34.800383 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:34.800053 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert\") pod \"ingress-canary-fw8gv\" (UID: \"cc6f988e-e651-47c1-b9ef-5edf69838385\") " pod="openshift-ingress-canary/ingress-canary-fw8gv"
Apr 21 07:52:34.800383 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:34.800078 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 07:52:34.800383 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:34.800126 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 07:52:34.800383 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:34.800157 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls podName:9a3e6d1e-7020-4e01-b08c-6965f9908a29 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:42.80013759 +0000 UTC m=+49.212922683 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls") pod "dns-default-xmj7g" (UID: "9a3e6d1e-7020-4e01-b08c-6965f9908a29") : secret "dns-default-metrics-tls" not found
Apr 21 07:52:34.800383 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:34.800173 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert podName:cc6f988e-e651-47c1-b9ef-5edf69838385 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:42.800165876 +0000 UTC m=+49.212950968 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert") pod "ingress-canary-fw8gv" (UID: "cc6f988e-e651-47c1-b9ef-5edf69838385") : secret "canary-serving-cert" not found
Apr 21 07:52:42.849345 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:42.849308 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert\") pod \"ingress-canary-fw8gv\" (UID: \"cc6f988e-e651-47c1-b9ef-5edf69838385\") " pod="openshift-ingress-canary/ingress-canary-fw8gv"
Apr 21 07:52:42.849345 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:42.849346 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls\") pod \"dns-default-xmj7g\" (UID: \"9a3e6d1e-7020-4e01-b08c-6965f9908a29\") " pod="openshift-dns/dns-default-xmj7g"
Apr 21 07:52:42.849730 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:42.849457 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 07:52:42.849730 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:42.849462 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 07:52:42.849730 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:42.849508 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls podName:9a3e6d1e-7020-4e01-b08c-6965f9908a29 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:58.849495147 +0000 UTC m=+65.262280239 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls") pod "dns-default-xmj7g" (UID: "9a3e6d1e-7020-4e01-b08c-6965f9908a29") : secret "dns-default-metrics-tls" not found
Apr 21 07:52:42.849730 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:42.849522 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert podName:cc6f988e-e651-47c1-b9ef-5edf69838385 nodeName:}" failed. No retries permitted until 2026-04-21 07:52:58.849515216 +0000 UTC m=+65.262300308 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert") pod "ingress-canary-fw8gv" (UID: "cc6f988e-e651-47c1-b9ef-5edf69838385") : secret "canary-serving-cert" not found
Apr 21 07:52:53.311957 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:53.311908 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-452pv"
Apr 21 07:52:58.754907 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:58.754861 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs\") pod \"network-metrics-daemon-jjnl5\" (UID: \"40e690b0-0cf8-4414-b2e4-2f3c492f2196\") " pod="openshift-multus/network-metrics-daemon-jjnl5"
Apr 21 07:52:58.757552 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:58.757533 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 07:52:58.765348 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:58.765328 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 07:52:58.765417 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:58.765407 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs podName:40e690b0-0cf8-4414-b2e4-2f3c492f2196 nodeName:}" failed. No retries permitted until 2026-04-21 07:54:02.765385754 +0000 UTC m=+129.178170847 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs") pod "network-metrics-daemon-jjnl5" (UID: "40e690b0-0cf8-4414-b2e4-2f3c492f2196") : secret "metrics-daemon-secret" not found
Apr 21 07:52:58.855197 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:58.855171 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert\") pod \"ingress-canary-fw8gv\" (UID: \"cc6f988e-e651-47c1-b9ef-5edf69838385\") " pod="openshift-ingress-canary/ingress-canary-fw8gv"
Apr 21 07:52:58.855254 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:58.855205 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls\") pod \"dns-default-xmj7g\" (UID: \"9a3e6d1e-7020-4e01-b08c-6965f9908a29\") " pod="openshift-dns/dns-default-xmj7g"
Apr 21 07:52:58.855254 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:58.855240 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4gjg\" (UniqueName: \"kubernetes.io/projected/e5633692-d3ef-4f27-aec7-1ddc39fd1781-kube-api-access-d4gjg\") pod \"network-check-target-q86fn\" (UID: \"e5633692-d3ef-4f27-aec7-1ddc39fd1781\") " pod="openshift-network-diagnostics/network-check-target-q86fn"
Apr 21 07:52:58.855346 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:58.855323 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 07:52:58.855408 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:58.855383 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 07:52:58.855472 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:58.855389 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert podName:cc6f988e-e651-47c1-b9ef-5edf69838385 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:30.855375006 +0000 UTC m=+97.268160112 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert") pod "ingress-canary-fw8gv" (UID: "cc6f988e-e651-47c1-b9ef-5edf69838385") : secret "canary-serving-cert" not found
Apr 21 07:52:58.855530 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:52:58.855496 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls podName:9a3e6d1e-7020-4e01-b08c-6965f9908a29 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:30.855473783 +0000 UTC m=+97.268258886 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls") pod "dns-default-xmj7g" (UID: "9a3e6d1e-7020-4e01-b08c-6965f9908a29") : secret "dns-default-metrics-tls" not found
Apr 21 07:52:58.857392 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:58.857376 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 07:52:58.867151 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:58.867134 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 07:52:58.879129 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:58.879112 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4gjg\" (UniqueName: \"kubernetes.io/projected/e5633692-d3ef-4f27-aec7-1ddc39fd1781-kube-api-access-d4gjg\") pod \"network-check-target-q86fn\" (UID: \"e5633692-d3ef-4f27-aec7-1ddc39fd1781\") " pod="openshift-network-diagnostics/network-check-target-q86fn"
Apr 21 07:52:58.911140 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:58.911113 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-f27q5\""
Apr 21 07:52:58.919855 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:58.919835 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-q86fn"
Apr 21 07:52:59.111881 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:59.111852 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-q86fn"]
Apr 21 07:52:59.116527 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:52:59.116501 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5633692_d3ef_4f27_aec7_1ddc39fd1781.slice/crio-77c5191303313d52547c6dc2477d7704bfc2f334b3c6f83027546351f0b45099 WatchSource:0}: Error finding container 77c5191303313d52547c6dc2477d7704bfc2f334b3c6f83027546351f0b45099: Status 404 returned error can't find the container with id 77c5191303313d52547c6dc2477d7704bfc2f334b3c6f83027546351f0b45099
Apr 21 07:52:59.373705 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:52:59.373621 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-q86fn" event={"ID":"e5633692-d3ef-4f27-aec7-1ddc39fd1781","Type":"ContainerStarted","Data":"77c5191303313d52547c6dc2477d7704bfc2f334b3c6f83027546351f0b45099"}
Apr 21 07:53:02.380438 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:02.380350 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-q86fn" event={"ID":"e5633692-d3ef-4f27-aec7-1ddc39fd1781","Type":"ContainerStarted","Data":"67840fa630c55411dc82b99d55737804d789e60ebcdd7fa1c17760f093c4b2c7"}
Apr 21 07:53:02.380869 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:02.380471 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-q86fn"
Apr 21 07:53:02.395497 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:02.395440 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-q86fn" podStartSLOduration=65.572142718 podStartE2EDuration="1m8.395427828s" podCreationTimestamp="2026-04-21 07:51:54 +0000 UTC" firstStartedPulling="2026-04-21 07:52:59.118218681 +0000 UTC m=+65.531003777" lastFinishedPulling="2026-04-21 07:53:01.941503793 +0000 UTC m=+68.354288887" observedRunningTime="2026-04-21 07:53:02.395024036 +0000 UTC m=+68.807809167" watchObservedRunningTime="2026-04-21 07:53:02.395427828 +0000 UTC m=+68.808212968"
Apr 21 07:53:22.672014 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.671854 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-8jlgf"]
Apr 21 07:53:22.687508 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.687471 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-78688"]
Apr 21 07:53:22.687661 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.687631 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jlgf"
Apr 21 07:53:22.690139 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.690118 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-78688"
Apr 21 07:53:22.690782 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.690761 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 21 07:53:22.690999 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.690984 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-mtsdp\""
Apr 21 07:53:22.691504 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.691487 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 21 07:53:22.692972 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.692948 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 21 07:53:22.693065 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.692948 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 21 07:53:22.693141 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.693122 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 21 07:53:22.693219 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.693204 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 21 07:53:22.693611 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.693592 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-km62d\""
Apr 21 07:53:22.693813 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.693768 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 21 07:53:22.693903 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.693864 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 21 07:53:22.700518 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.700500 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 21 07:53:22.708844 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.706301 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-8jlgf"]
Apr 21 07:53:22.708844 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.707037 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-78688"]
Apr 21 07:53:22.816406 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.816376 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c63570c8-68d2-427e-8592-5b3ee57b7d7a-tmp\") pod \"insights-operator-585dfdc468-78688\" (UID: \"c63570c8-68d2-427e-8592-5b3ee57b7d7a\") " pod="openshift-insights/insights-operator-585dfdc468-78688"
Apr 21 07:53:22.816520 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.816413 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c63570c8-68d2-427e-8592-5b3ee57b7d7a-service-ca-bundle\") pod \"insights-operator-585dfdc468-78688\" (UID: \"c63570c8-68d2-427e-8592-5b3ee57b7d7a\") " pod="openshift-insights/insights-operator-585dfdc468-78688"
Apr 21 07:53:22.816520 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.816437 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mslj\" (UniqueName: \"kubernetes.io/projected/c63570c8-68d2-427e-8592-5b3ee57b7d7a-kube-api-access-4mslj\") pod \"insights-operator-585dfdc468-78688\" (UID: \"c63570c8-68d2-427e-8592-5b3ee57b7d7a\") " pod="openshift-insights/insights-operator-585dfdc468-78688"
Apr 21 07:53:22.816520 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.816461 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhgfk\" (UniqueName: \"kubernetes.io/projected/60f40b44-8487-4f5b-af99-1681ffa94740-kube-api-access-qhgfk\") pod \"cluster-monitoring-operator-75587bd455-8jlgf\" (UID: \"60f40b44-8487-4f5b-af99-1681ffa94740\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jlgf"
Apr 21 07:53:22.816666 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.816524 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c63570c8-68d2-427e-8592-5b3ee57b7d7a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-78688\" (UID: \"c63570c8-68d2-427e-8592-5b3ee57b7d7a\") " pod="openshift-insights/insights-operator-585dfdc468-78688"
Apr 21 07:53:22.816666 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.816591 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/60f40b44-8487-4f5b-af99-1681ffa94740-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-8jlgf\" (UID: \"60f40b44-8487-4f5b-af99-1681ffa94740\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jlgf"
Apr 21 07:53:22.816666 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.816659 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/60f40b44-8487-4f5b-af99-1681ffa94740-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8jlgf\" (UID: \"60f40b44-8487-4f5b-af99-1681ffa94740\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jlgf"
Apr 21 07:53:22.816813 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.816704 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c63570c8-68d2-427e-8592-5b3ee57b7d7a-serving-cert\") pod \"insights-operator-585dfdc468-78688\" (UID: \"c63570c8-68d2-427e-8592-5b3ee57b7d7a\") " pod="openshift-insights/insights-operator-585dfdc468-78688"
Apr 21 07:53:22.816813 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.816744 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c63570c8-68d2-427e-8592-5b3ee57b7d7a-snapshots\") pod \"insights-operator-585dfdc468-78688\" (UID: \"c63570c8-68d2-427e-8592-5b3ee57b7d7a\") " pod="openshift-insights/insights-operator-585dfdc468-78688"
Apr 21 07:53:22.918009 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.917975 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c63570c8-68d2-427e-8592-5b3ee57b7d7a-snapshots\") pod \"insights-operator-585dfdc468-78688\" (UID: \"c63570c8-68d2-427e-8592-5b3ee57b7d7a\") " pod="openshift-insights/insights-operator-585dfdc468-78688"
Apr 21 07:53:22.918090 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.918022 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c63570c8-68d2-427e-8592-5b3ee57b7d7a-tmp\") pod \"insights-operator-585dfdc468-78688\" (UID: \"c63570c8-68d2-427e-8592-5b3ee57b7d7a\") " pod="openshift-insights/insights-operator-585dfdc468-78688"
Apr 21 07:53:22.918090 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.918039 2574 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c63570c8-68d2-427e-8592-5b3ee57b7d7a-service-ca-bundle\") pod \"insights-operator-585dfdc468-78688\" (UID: \"c63570c8-68d2-427e-8592-5b3ee57b7d7a\") " pod="openshift-insights/insights-operator-585dfdc468-78688" Apr 21 07:53:22.918090 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.918056 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mslj\" (UniqueName: \"kubernetes.io/projected/c63570c8-68d2-427e-8592-5b3ee57b7d7a-kube-api-access-4mslj\") pod \"insights-operator-585dfdc468-78688\" (UID: \"c63570c8-68d2-427e-8592-5b3ee57b7d7a\") " pod="openshift-insights/insights-operator-585dfdc468-78688" Apr 21 07:53:22.918090 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.918073 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhgfk\" (UniqueName: \"kubernetes.io/projected/60f40b44-8487-4f5b-af99-1681ffa94740-kube-api-access-qhgfk\") pod \"cluster-monitoring-operator-75587bd455-8jlgf\" (UID: \"60f40b44-8487-4f5b-af99-1681ffa94740\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jlgf" Apr 21 07:53:22.918237 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.918197 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c63570c8-68d2-427e-8592-5b3ee57b7d7a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-78688\" (UID: \"c63570c8-68d2-427e-8592-5b3ee57b7d7a\") " pod="openshift-insights/insights-operator-585dfdc468-78688" Apr 21 07:53:22.918292 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.918251 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/60f40b44-8487-4f5b-af99-1681ffa94740-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-8jlgf\" (UID: 
\"60f40b44-8487-4f5b-af99-1681ffa94740\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jlgf" Apr 21 07:53:22.918454 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.918431 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/60f40b44-8487-4f5b-af99-1681ffa94740-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8jlgf\" (UID: \"60f40b44-8487-4f5b-af99-1681ffa94740\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jlgf" Apr 21 07:53:22.918546 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.918490 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c63570c8-68d2-427e-8592-5b3ee57b7d7a-serving-cert\") pod \"insights-operator-585dfdc468-78688\" (UID: \"c63570c8-68d2-427e-8592-5b3ee57b7d7a\") " pod="openshift-insights/insights-operator-585dfdc468-78688" Apr 21 07:53:22.918591 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:22.918563 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 07:53:22.918666 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:22.918653 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60f40b44-8487-4f5b-af99-1681ffa94740-cluster-monitoring-operator-tls podName:60f40b44-8487-4f5b-af99-1681ffa94740 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:23.418631582 +0000 UTC m=+89.831416675 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/60f40b44-8487-4f5b-af99-1681ffa94740-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8jlgf" (UID: "60f40b44-8487-4f5b-af99-1681ffa94740") : secret "cluster-monitoring-operator-tls" not found Apr 21 07:53:22.918724 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.918668 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c63570c8-68d2-427e-8592-5b3ee57b7d7a-tmp\") pod \"insights-operator-585dfdc468-78688\" (UID: \"c63570c8-68d2-427e-8592-5b3ee57b7d7a\") " pod="openshift-insights/insights-operator-585dfdc468-78688" Apr 21 07:53:22.918834 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.918809 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c63570c8-68d2-427e-8592-5b3ee57b7d7a-service-ca-bundle\") pod \"insights-operator-585dfdc468-78688\" (UID: \"c63570c8-68d2-427e-8592-5b3ee57b7d7a\") " pod="openshift-insights/insights-operator-585dfdc468-78688" Apr 21 07:53:22.919079 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.919053 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/60f40b44-8487-4f5b-af99-1681ffa94740-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-8jlgf\" (UID: \"60f40b44-8487-4f5b-af99-1681ffa94740\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jlgf" Apr 21 07:53:22.919434 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.919415 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c63570c8-68d2-427e-8592-5b3ee57b7d7a-snapshots\") pod \"insights-operator-585dfdc468-78688\" (UID: \"c63570c8-68d2-427e-8592-5b3ee57b7d7a\") " 
pod="openshift-insights/insights-operator-585dfdc468-78688" Apr 21 07:53:22.919716 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.919699 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c63570c8-68d2-427e-8592-5b3ee57b7d7a-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-78688\" (UID: \"c63570c8-68d2-427e-8592-5b3ee57b7d7a\") " pod="openshift-insights/insights-operator-585dfdc468-78688" Apr 21 07:53:22.921150 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.921128 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c63570c8-68d2-427e-8592-5b3ee57b7d7a-serving-cert\") pod \"insights-operator-585dfdc468-78688\" (UID: \"c63570c8-68d2-427e-8592-5b3ee57b7d7a\") " pod="openshift-insights/insights-operator-585dfdc468-78688" Apr 21 07:53:22.926713 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.926659 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhgfk\" (UniqueName: \"kubernetes.io/projected/60f40b44-8487-4f5b-af99-1681ffa94740-kube-api-access-qhgfk\") pod \"cluster-monitoring-operator-75587bd455-8jlgf\" (UID: \"60f40b44-8487-4f5b-af99-1681ffa94740\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jlgf" Apr 21 07:53:22.929298 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:22.929276 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mslj\" (UniqueName: \"kubernetes.io/projected/c63570c8-68d2-427e-8592-5b3ee57b7d7a-kube-api-access-4mslj\") pod \"insights-operator-585dfdc468-78688\" (UID: \"c63570c8-68d2-427e-8592-5b3ee57b7d7a\") " pod="openshift-insights/insights-operator-585dfdc468-78688" Apr 21 07:53:23.006500 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:23.006467 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-78688" Apr 21 07:53:23.116219 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:23.116186 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-78688"] Apr 21 07:53:23.119643 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:53:23.119617 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc63570c8_68d2_427e_8592_5b3ee57b7d7a.slice/crio-8f2a1638a0b8435e91c457c10fe72e089e58d3dd89fa1008436c11f6b2187ea7 WatchSource:0}: Error finding container 8f2a1638a0b8435e91c457c10fe72e089e58d3dd89fa1008436c11f6b2187ea7: Status 404 returned error can't find the container with id 8f2a1638a0b8435e91c457c10fe72e089e58d3dd89fa1008436c11f6b2187ea7 Apr 21 07:53:23.421573 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:23.421545 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-78688" event={"ID":"c63570c8-68d2-427e-8592-5b3ee57b7d7a","Type":"ContainerStarted","Data":"8f2a1638a0b8435e91c457c10fe72e089e58d3dd89fa1008436c11f6b2187ea7"} Apr 21 07:53:23.422819 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:23.422804 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/60f40b44-8487-4f5b-af99-1681ffa94740-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8jlgf\" (UID: \"60f40b44-8487-4f5b-af99-1681ffa94740\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jlgf" Apr 21 07:53:23.422938 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:23.422912 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 07:53:23.423003 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:23.422994 2574 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60f40b44-8487-4f5b-af99-1681ffa94740-cluster-monitoring-operator-tls podName:60f40b44-8487-4f5b-af99-1681ffa94740 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:24.422980454 +0000 UTC m=+90.835765546 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/60f40b44-8487-4f5b-af99-1681ffa94740-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8jlgf" (UID: "60f40b44-8487-4f5b-af99-1681ffa94740") : secret "cluster-monitoring-operator-tls" not found Apr 21 07:53:24.430743 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:24.430703 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/60f40b44-8487-4f5b-af99-1681ffa94740-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8jlgf\" (UID: \"60f40b44-8487-4f5b-af99-1681ffa94740\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jlgf" Apr 21 07:53:24.431223 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:24.430856 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 07:53:24.431223 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:24.430939 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60f40b44-8487-4f5b-af99-1681ffa94740-cluster-monitoring-operator-tls podName:60f40b44-8487-4f5b-af99-1681ffa94740 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:26.430904972 +0000 UTC m=+92.843690064 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/60f40b44-8487-4f5b-af99-1681ffa94740-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8jlgf" (UID: "60f40b44-8487-4f5b-af99-1681ffa94740") : secret "cluster-monitoring-operator-tls" not found Apr 21 07:53:26.429257 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:26.429217 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-78688" event={"ID":"c63570c8-68d2-427e-8592-5b3ee57b7d7a","Type":"ContainerStarted","Data":"5b5f435604c6e883453e1e5ac76f3475b24e17dd958a8e7bfe7ca9ada08c8cf3"} Apr 21 07:53:26.446744 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:26.444391 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-78688" podStartSLOduration=2.168481216 podStartE2EDuration="4.44437426s" podCreationTimestamp="2026-04-21 07:53:22 +0000 UTC" firstStartedPulling="2026-04-21 07:53:23.121247188 +0000 UTC m=+89.534032284" lastFinishedPulling="2026-04-21 07:53:25.397140234 +0000 UTC m=+91.809925328" observedRunningTime="2026-04-21 07:53:26.443178894 +0000 UTC m=+92.855964014" watchObservedRunningTime="2026-04-21 07:53:26.44437426 +0000 UTC m=+92.857159375" Apr 21 07:53:26.446744 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:26.444676 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/60f40b44-8487-4f5b-af99-1681ffa94740-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8jlgf\" (UID: \"60f40b44-8487-4f5b-af99-1681ffa94740\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jlgf" Apr 21 07:53:26.446744 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:26.444839 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret 
"cluster-monitoring-operator-tls" not found Apr 21 07:53:26.446744 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:26.444905 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60f40b44-8487-4f5b-af99-1681ffa94740-cluster-monitoring-operator-tls podName:60f40b44-8487-4f5b-af99-1681ffa94740 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:30.444887885 +0000 UTC m=+96.857672978 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/60f40b44-8487-4f5b-af99-1681ffa94740-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8jlgf" (UID: "60f40b44-8487-4f5b-af99-1681ffa94740") : secret "cluster-monitoring-operator-tls" not found Apr 21 07:53:27.656764 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:27.656732 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-n27nl"] Apr 21 07:53:27.659612 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:27.659596 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-n27nl" Apr 21 07:53:27.661790 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:27.661759 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 21 07:53:27.662621 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:27.662604 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-xc7gr\"" Apr 21 07:53:27.662692 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:27.662621 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 21 07:53:27.667862 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:27.667837 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-n27nl"] Apr 21 07:53:27.752251 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:27.752223 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs5jh\" (UniqueName: \"kubernetes.io/projected/d9a5643a-cb31-4bd1-aab6-1b040f232ad8-kube-api-access-rs5jh\") pod \"volume-data-source-validator-7c6cbb6c87-n27nl\" (UID: \"d9a5643a-cb31-4bd1-aab6-1b040f232ad8\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-n27nl" Apr 21 07:53:27.852900 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:27.852873 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rs5jh\" (UniqueName: \"kubernetes.io/projected/d9a5643a-cb31-4bd1-aab6-1b040f232ad8-kube-api-access-rs5jh\") pod \"volume-data-source-validator-7c6cbb6c87-n27nl\" (UID: \"d9a5643a-cb31-4bd1-aab6-1b040f232ad8\") " 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-n27nl" Apr 21 07:53:27.859790 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:27.859766 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs5jh\" (UniqueName: \"kubernetes.io/projected/d9a5643a-cb31-4bd1-aab6-1b040f232ad8-kube-api-access-rs5jh\") pod \"volume-data-source-validator-7c6cbb6c87-n27nl\" (UID: \"d9a5643a-cb31-4bd1-aab6-1b040f232ad8\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-n27nl" Apr 21 07:53:27.969502 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:27.969476 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-n27nl" Apr 21 07:53:28.078298 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:28.078263 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-n27nl"] Apr 21 07:53:28.082635 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:53:28.082608 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9a5643a_cb31_4bd1_aab6_1b040f232ad8.slice/crio-2958dc771181b2d8db799cd8d7c04dade508d7ae675c1dc45bf49418dff94cb2 WatchSource:0}: Error finding container 2958dc771181b2d8db799cd8d7c04dade508d7ae675c1dc45bf49418dff94cb2: Status 404 returned error can't find the container with id 2958dc771181b2d8db799cd8d7c04dade508d7ae675c1dc45bf49418dff94cb2 Apr 21 07:53:28.432796 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:28.432755 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zglz2_2e1e8c1e-08e4-4ee4-a7bf-e84dfd5e0fa1/dns-node-resolver/0.log" Apr 21 07:53:28.433194 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:28.433172 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-n27nl" event={"ID":"d9a5643a-cb31-4bd1-aab6-1b040f232ad8","Type":"ContainerStarted","Data":"2958dc771181b2d8db799cd8d7c04dade508d7ae675c1dc45bf49418dff94cb2"} Apr 21 07:53:28.832943 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:28.832850 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gktqp_db4c65d3-fa4f-4575-b518-0d9e5c9215b9/node-ca/0.log" Apr 21 07:53:29.436913 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:29.436877 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-n27nl" event={"ID":"d9a5643a-cb31-4bd1-aab6-1b040f232ad8","Type":"ContainerStarted","Data":"28f2de591ba9b37d7cdbeffd175978b78766ff9cf0d2cf75e7a7caf262195dcc"} Apr 21 07:53:30.473767 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:30.473733 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/60f40b44-8487-4f5b-af99-1681ffa94740-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8jlgf\" (UID: \"60f40b44-8487-4f5b-af99-1681ffa94740\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jlgf" Apr 21 07:53:30.474245 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:30.473873 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 07:53:30.474245 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:30.473958 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60f40b44-8487-4f5b-af99-1681ffa94740-cluster-monitoring-operator-tls podName:60f40b44-8487-4f5b-af99-1681ffa94740 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:38.473939297 +0000 UTC m=+104.886724404 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/60f40b44-8487-4f5b-af99-1681ffa94740-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8jlgf" (UID: "60f40b44-8487-4f5b-af99-1681ffa94740") : secret "cluster-monitoring-operator-tls" not found Apr 21 07:53:30.876055 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:30.876028 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert\") pod \"ingress-canary-fw8gv\" (UID: \"cc6f988e-e651-47c1-b9ef-5edf69838385\") " pod="openshift-ingress-canary/ingress-canary-fw8gv" Apr 21 07:53:30.876163 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:30.876065 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls\") pod \"dns-default-xmj7g\" (UID: \"9a3e6d1e-7020-4e01-b08c-6965f9908a29\") " pod="openshift-dns/dns-default-xmj7g" Apr 21 07:53:30.876223 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:30.876188 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:53:30.876223 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:30.876188 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:53:30.876296 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:30.876246 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls podName:9a3e6d1e-7020-4e01-b08c-6965f9908a29 nodeName:}" failed. No retries permitted until 2026-04-21 07:54:34.876229555 +0000 UTC m=+161.289014646 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls") pod "dns-default-xmj7g" (UID: "9a3e6d1e-7020-4e01-b08c-6965f9908a29") : secret "dns-default-metrics-tls" not found Apr 21 07:53:30.876296 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:30.876260 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert podName:cc6f988e-e651-47c1-b9ef-5edf69838385 nodeName:}" failed. No retries permitted until 2026-04-21 07:54:34.876254003 +0000 UTC m=+161.289039095 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert") pod "ingress-canary-fw8gv" (UID: "cc6f988e-e651-47c1-b9ef-5edf69838385") : secret "canary-serving-cert" not found Apr 21 07:53:32.753080 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:32.753031 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-n27nl" podStartSLOduration=4.470893442 podStartE2EDuration="5.75301563s" podCreationTimestamp="2026-04-21 07:53:27 +0000 UTC" firstStartedPulling="2026-04-21 07:53:28.084396283 +0000 UTC m=+94.497181375" lastFinishedPulling="2026-04-21 07:53:29.366518451 +0000 UTC m=+95.779303563" observedRunningTime="2026-04-21 07:53:29.44989605 +0000 UTC m=+95.862681163" watchObservedRunningTime="2026-04-21 07:53:32.75301563 +0000 UTC m=+99.165800744" Apr 21 07:53:32.753452 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:32.753267 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pz5qt"] Apr 21 07:53:32.757236 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:32.757220 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pz5qt" Apr 21 07:53:32.759519 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:32.759486 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 21 07:53:32.759633 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:32.759454 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 21 07:53:32.759633 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:32.759454 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 21 07:53:32.760308 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:32.760295 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 21 07:53:32.760364 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:32.760295 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-ljfq6\"" Apr 21 07:53:32.763899 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:32.763789 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pz5qt"] Apr 21 07:53:32.890236 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:32.890211 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9qjq\" (UniqueName: \"kubernetes.io/projected/62f93d32-f152-4b0d-a51e-3e51bdd7183b-kube-api-access-w9qjq\") pod \"service-ca-operator-d6fc45fc5-pz5qt\" (UID: \"62f93d32-f152-4b0d-a51e-3e51bdd7183b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pz5qt" Apr 21 07:53:32.890366 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:32.890258 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62f93d32-f152-4b0d-a51e-3e51bdd7183b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-pz5qt\" (UID: \"62f93d32-f152-4b0d-a51e-3e51bdd7183b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pz5qt" Apr 21 07:53:32.890436 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:32.890364 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62f93d32-f152-4b0d-a51e-3e51bdd7183b-config\") pod \"service-ca-operator-d6fc45fc5-pz5qt\" (UID: \"62f93d32-f152-4b0d-a51e-3e51bdd7183b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pz5qt" Apr 21 07:53:32.990768 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:32.990728 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62f93d32-f152-4b0d-a51e-3e51bdd7183b-config\") pod \"service-ca-operator-d6fc45fc5-pz5qt\" (UID: \"62f93d32-f152-4b0d-a51e-3e51bdd7183b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pz5qt" Apr 21 07:53:32.990913 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:32.990788 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9qjq\" (UniqueName: \"kubernetes.io/projected/62f93d32-f152-4b0d-a51e-3e51bdd7183b-kube-api-access-w9qjq\") pod \"service-ca-operator-d6fc45fc5-pz5qt\" (UID: \"62f93d32-f152-4b0d-a51e-3e51bdd7183b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pz5qt" Apr 21 07:53:32.990913 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:32.990832 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62f93d32-f152-4b0d-a51e-3e51bdd7183b-serving-cert\") pod 
\"service-ca-operator-d6fc45fc5-pz5qt\" (UID: \"62f93d32-f152-4b0d-a51e-3e51bdd7183b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pz5qt" Apr 21 07:53:32.991344 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:32.991321 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62f93d32-f152-4b0d-a51e-3e51bdd7183b-config\") pod \"service-ca-operator-d6fc45fc5-pz5qt\" (UID: \"62f93d32-f152-4b0d-a51e-3e51bdd7183b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pz5qt" Apr 21 07:53:32.993105 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:32.993085 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62f93d32-f152-4b0d-a51e-3e51bdd7183b-serving-cert\") pod \"service-ca-operator-d6fc45fc5-pz5qt\" (UID: \"62f93d32-f152-4b0d-a51e-3e51bdd7183b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pz5qt" Apr 21 07:53:32.998036 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:32.998017 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9qjq\" (UniqueName: \"kubernetes.io/projected/62f93d32-f152-4b0d-a51e-3e51bdd7183b-kube-api-access-w9qjq\") pod \"service-ca-operator-d6fc45fc5-pz5qt\" (UID: \"62f93d32-f152-4b0d-a51e-3e51bdd7183b\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pz5qt" Apr 21 07:53:33.066142 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:33.066076 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pz5qt" Apr 21 07:53:33.182653 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:33.182625 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pz5qt"] Apr 21 07:53:33.186257 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:53:33.186219 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62f93d32_f152_4b0d_a51e_3e51bdd7183b.slice/crio-130af094614d0f8fc6a4096d7c3980c16b186479aeb26116a081e8b2ec85a58e WatchSource:0}: Error finding container 130af094614d0f8fc6a4096d7c3980c16b186479aeb26116a081e8b2ec85a58e: Status 404 returned error can't find the container with id 130af094614d0f8fc6a4096d7c3980c16b186479aeb26116a081e8b2ec85a58e Apr 21 07:53:33.385191 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:33.385112 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-q86fn" Apr 21 07:53:33.449321 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:33.449292 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pz5qt" event={"ID":"62f93d32-f152-4b0d-a51e-3e51bdd7183b","Type":"ContainerStarted","Data":"130af094614d0f8fc6a4096d7c3980c16b186479aeb26116a081e8b2ec85a58e"} Apr 21 07:53:36.311056 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:36.311021 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-xm8hm"] Apr 21 07:53:36.314005 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:36.313989 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xm8hm" Apr 21 07:53:36.316244 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:36.316222 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 21 07:53:36.316372 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:36.316254 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-9glz5\"" Apr 21 07:53:36.317091 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:36.317076 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 21 07:53:36.322916 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:36.322887 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-xm8hm"] Apr 21 07:53:36.415370 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:36.415337 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4d6h\" (UniqueName: \"kubernetes.io/projected/0f1ecc94-e328-44b0-8fc4-cb27c09c3a99-kube-api-access-l4d6h\") pod \"migrator-74bb7799d9-xm8hm\" (UID: \"0f1ecc94-e328-44b0-8fc4-cb27c09c3a99\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xm8hm" Apr 21 07:53:36.456060 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:36.456028 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pz5qt" event={"ID":"62f93d32-f152-4b0d-a51e-3e51bdd7183b","Type":"ContainerStarted","Data":"d6cd1627f24090fc211d74f8724ad23ab71e3b5421c4c25fb904fbf746afdd06"} Apr 21 07:53:36.469170 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:36.469120 2574 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pz5qt" podStartSLOduration=2.074802542 podStartE2EDuration="4.469106453s" podCreationTimestamp="2026-04-21 07:53:32 +0000 UTC" firstStartedPulling="2026-04-21 07:53:33.188001883 +0000 UTC m=+99.600786977" lastFinishedPulling="2026-04-21 07:53:35.582305796 +0000 UTC m=+101.995090888" observedRunningTime="2026-04-21 07:53:36.468804709 +0000 UTC m=+102.881589804" watchObservedRunningTime="2026-04-21 07:53:36.469106453 +0000 UTC m=+102.881891596" Apr 21 07:53:36.516769 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:36.516740 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4d6h\" (UniqueName: \"kubernetes.io/projected/0f1ecc94-e328-44b0-8fc4-cb27c09c3a99-kube-api-access-l4d6h\") pod \"migrator-74bb7799d9-xm8hm\" (UID: \"0f1ecc94-e328-44b0-8fc4-cb27c09c3a99\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xm8hm" Apr 21 07:53:36.524241 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:36.524216 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4d6h\" (UniqueName: \"kubernetes.io/projected/0f1ecc94-e328-44b0-8fc4-cb27c09c3a99-kube-api-access-l4d6h\") pod \"migrator-74bb7799d9-xm8hm\" (UID: \"0f1ecc94-e328-44b0-8fc4-cb27c09c3a99\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xm8hm" Apr 21 07:53:36.624272 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:36.624201 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xm8hm" Apr 21 07:53:36.740110 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:36.740074 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-xm8hm"] Apr 21 07:53:36.743722 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:53:36.743692 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f1ecc94_e328_44b0_8fc4_cb27c09c3a99.slice/crio-c5f464dbd79bc6fd546faa78c510d28a363ac9339a5f5887c0d0acac30fd6eb0 WatchSource:0}: Error finding container c5f464dbd79bc6fd546faa78c510d28a363ac9339a5f5887c0d0acac30fd6eb0: Status 404 returned error can't find the container with id c5f464dbd79bc6fd546faa78c510d28a363ac9339a5f5887c0d0acac30fd6eb0 Apr 21 07:53:37.458712 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:37.458677 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xm8hm" event={"ID":"0f1ecc94-e328-44b0-8fc4-cb27c09c3a99","Type":"ContainerStarted","Data":"c5f464dbd79bc6fd546faa78c510d28a363ac9339a5f5887c0d0acac30fd6eb0"} Apr 21 07:53:38.462750 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:38.462714 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xm8hm" event={"ID":"0f1ecc94-e328-44b0-8fc4-cb27c09c3a99","Type":"ContainerStarted","Data":"7aeb6633d642b89b5041c779526b2447a14bd43aecc03d0833d61363aafb0884"} Apr 21 07:53:38.462750 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:38.462750 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xm8hm" event={"ID":"0f1ecc94-e328-44b0-8fc4-cb27c09c3a99","Type":"ContainerStarted","Data":"320d62ba3d5fb0e1227d2a81299cdea2344206a008dca22227925786af121f70"} Apr 21 07:53:38.476711 ip-10-0-138-20 
kubenswrapper[2574]: I0421 07:53:38.476665 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xm8hm" podStartSLOduration=1.326690795 podStartE2EDuration="2.4766507s" podCreationTimestamp="2026-04-21 07:53:36 +0000 UTC" firstStartedPulling="2026-04-21 07:53:36.745638964 +0000 UTC m=+103.158424057" lastFinishedPulling="2026-04-21 07:53:37.895598857 +0000 UTC m=+104.308383962" observedRunningTime="2026-04-21 07:53:38.475870126 +0000 UTC m=+104.888655241" watchObservedRunningTime="2026-04-21 07:53:38.4766507 +0000 UTC m=+104.889435814" Apr 21 07:53:38.532356 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:38.532320 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/60f40b44-8487-4f5b-af99-1681ffa94740-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8jlgf\" (UID: \"60f40b44-8487-4f5b-af99-1681ffa94740\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jlgf" Apr 21 07:53:38.532485 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:38.532441 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 21 07:53:38.532524 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:38.532496 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60f40b44-8487-4f5b-af99-1681ffa94740-cluster-monitoring-operator-tls podName:60f40b44-8487-4f5b-af99-1681ffa94740 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:54.532482369 +0000 UTC m=+120.945267464 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/60f40b44-8487-4f5b-af99-1681ffa94740-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-8jlgf" (UID: "60f40b44-8487-4f5b-af99-1681ffa94740") : secret "cluster-monitoring-operator-tls" not found Apr 21 07:53:39.665646 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:39.665610 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-fm5jh"] Apr 21 07:53:39.668844 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:39.668821 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fm5jh" Apr 21 07:53:39.671330 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:39.671310 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 07:53:39.671440 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:39.671402 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vjklh\"" Apr 21 07:53:39.671950 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:39.671915 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 07:53:39.682074 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:39.682053 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fm5jh"] Apr 21 07:53:39.740685 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:39.740656 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-data-volume\") pod \"insights-runtime-extractor-fm5jh\" (UID: \"4f44f401-0bd8-4a90-870d-3e4ab5afe97d\") " 
pod="openshift-insights/insights-runtime-extractor-fm5jh" Apr 21 07:53:39.740685 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:39.740697 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-crio-socket\") pod \"insights-runtime-extractor-fm5jh\" (UID: \"4f44f401-0bd8-4a90-870d-3e4ab5afe97d\") " pod="openshift-insights/insights-runtime-extractor-fm5jh" Apr 21 07:53:39.740959 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:39.740804 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fm5jh\" (UID: \"4f44f401-0bd8-4a90-870d-3e4ab5afe97d\") " pod="openshift-insights/insights-runtime-extractor-fm5jh" Apr 21 07:53:39.740959 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:39.740881 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xczwf\" (UniqueName: \"kubernetes.io/projected/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-kube-api-access-xczwf\") pod \"insights-runtime-extractor-fm5jh\" (UID: \"4f44f401-0bd8-4a90-870d-3e4ab5afe97d\") " pod="openshift-insights/insights-runtime-extractor-fm5jh" Apr 21 07:53:39.741130 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:39.740976 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fm5jh\" (UID: \"4f44f401-0bd8-4a90-870d-3e4ab5afe97d\") " pod="openshift-insights/insights-runtime-extractor-fm5jh" Apr 21 07:53:39.841539 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:39.841503 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xczwf\" (UniqueName: \"kubernetes.io/projected/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-kube-api-access-xczwf\") pod \"insights-runtime-extractor-fm5jh\" (UID: \"4f44f401-0bd8-4a90-870d-3e4ab5afe97d\") " pod="openshift-insights/insights-runtime-extractor-fm5jh" Apr 21 07:53:39.841758 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:39.841556 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fm5jh\" (UID: \"4f44f401-0bd8-4a90-870d-3e4ab5afe97d\") " pod="openshift-insights/insights-runtime-extractor-fm5jh" Apr 21 07:53:39.841758 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:39.841650 2574 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 07:53:39.841758 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:39.841701 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-insights-runtime-extractor-tls podName:4f44f401-0bd8-4a90-870d-3e4ab5afe97d nodeName:}" failed. No retries permitted until 2026-04-21 07:53:40.34168638 +0000 UTC m=+106.754471472 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-insights-runtime-extractor-tls") pod "insights-runtime-extractor-fm5jh" (UID: "4f44f401-0bd8-4a90-870d-3e4ab5afe97d") : secret "insights-runtime-extractor-tls" not found Apr 21 07:53:39.841758 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:39.841729 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-data-volume\") pod \"insights-runtime-extractor-fm5jh\" (UID: \"4f44f401-0bd8-4a90-870d-3e4ab5afe97d\") " pod="openshift-insights/insights-runtime-extractor-fm5jh" Apr 21 07:53:39.842004 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:39.841775 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-crio-socket\") pod \"insights-runtime-extractor-fm5jh\" (UID: \"4f44f401-0bd8-4a90-870d-3e4ab5afe97d\") " pod="openshift-insights/insights-runtime-extractor-fm5jh" Apr 21 07:53:39.842004 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:39.841833 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fm5jh\" (UID: \"4f44f401-0bd8-4a90-870d-3e4ab5afe97d\") " pod="openshift-insights/insights-runtime-extractor-fm5jh" Apr 21 07:53:39.842004 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:39.841884 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-crio-socket\") pod \"insights-runtime-extractor-fm5jh\" (UID: \"4f44f401-0bd8-4a90-870d-3e4ab5afe97d\") " pod="openshift-insights/insights-runtime-extractor-fm5jh" Apr 
21 07:53:39.842105 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:39.842089 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-data-volume\") pod \"insights-runtime-extractor-fm5jh\" (UID: \"4f44f401-0bd8-4a90-870d-3e4ab5afe97d\") " pod="openshift-insights/insights-runtime-extractor-fm5jh" Apr 21 07:53:39.842290 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:39.842272 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fm5jh\" (UID: \"4f44f401-0bd8-4a90-870d-3e4ab5afe97d\") " pod="openshift-insights/insights-runtime-extractor-fm5jh" Apr 21 07:53:39.849011 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:39.848991 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xczwf\" (UniqueName: \"kubernetes.io/projected/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-kube-api-access-xczwf\") pod \"insights-runtime-extractor-fm5jh\" (UID: \"4f44f401-0bd8-4a90-870d-3e4ab5afe97d\") " pod="openshift-insights/insights-runtime-extractor-fm5jh" Apr 21 07:53:40.172004 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:40.171966 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-r927t"] Apr 21 07:53:40.175150 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:40.175128 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-r927t" Apr 21 07:53:40.177654 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:40.177566 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-b92w9\"" Apr 21 07:53:40.177777 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:40.177736 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 21 07:53:40.177842 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:40.177799 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 21 07:53:40.177892 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:40.177837 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 21 07:53:40.177892 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:40.177850 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 21 07:53:40.184050 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:40.184031 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-r927t"] Apr 21 07:53:40.244449 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:40.244426 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d62814c4-6729-4bbd-9170-e3bb1249bce1-signing-key\") pod \"service-ca-865cb79987-r927t\" (UID: \"d62814c4-6729-4bbd-9170-e3bb1249bce1\") " pod="openshift-service-ca/service-ca-865cb79987-r927t" Apr 21 07:53:40.244566 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:40.244461 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fwpq\" (UniqueName: 
\"kubernetes.io/projected/d62814c4-6729-4bbd-9170-e3bb1249bce1-kube-api-access-7fwpq\") pod \"service-ca-865cb79987-r927t\" (UID: \"d62814c4-6729-4bbd-9170-e3bb1249bce1\") " pod="openshift-service-ca/service-ca-865cb79987-r927t" Apr 21 07:53:40.244566 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:40.244549 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d62814c4-6729-4bbd-9170-e3bb1249bce1-signing-cabundle\") pod \"service-ca-865cb79987-r927t\" (UID: \"d62814c4-6729-4bbd-9170-e3bb1249bce1\") " pod="openshift-service-ca/service-ca-865cb79987-r927t" Apr 21 07:53:40.345122 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:40.345086 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d62814c4-6729-4bbd-9170-e3bb1249bce1-signing-key\") pod \"service-ca-865cb79987-r927t\" (UID: \"d62814c4-6729-4bbd-9170-e3bb1249bce1\") " pod="openshift-service-ca/service-ca-865cb79987-r927t" Apr 21 07:53:40.345289 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:40.345129 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fwpq\" (UniqueName: \"kubernetes.io/projected/d62814c4-6729-4bbd-9170-e3bb1249bce1-kube-api-access-7fwpq\") pod \"service-ca-865cb79987-r927t\" (UID: \"d62814c4-6729-4bbd-9170-e3bb1249bce1\") " pod="openshift-service-ca/service-ca-865cb79987-r927t" Apr 21 07:53:40.345359 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:40.345295 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d62814c4-6729-4bbd-9170-e3bb1249bce1-signing-cabundle\") pod \"service-ca-865cb79987-r927t\" (UID: \"d62814c4-6729-4bbd-9170-e3bb1249bce1\") " pod="openshift-service-ca/service-ca-865cb79987-r927t" Apr 21 07:53:40.345413 ip-10-0-138-20 kubenswrapper[2574]: I0421 
07:53:40.345358 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fm5jh\" (UID: \"4f44f401-0bd8-4a90-870d-3e4ab5afe97d\") " pod="openshift-insights/insights-runtime-extractor-fm5jh" Apr 21 07:53:40.345465 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:40.345455 2574 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 07:53:40.345527 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:40.345515 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-insights-runtime-extractor-tls podName:4f44f401-0bd8-4a90-870d-3e4ab5afe97d nodeName:}" failed. No retries permitted until 2026-04-21 07:53:41.345498106 +0000 UTC m=+107.758283198 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-insights-runtime-extractor-tls") pod "insights-runtime-extractor-fm5jh" (UID: "4f44f401-0bd8-4a90-870d-3e4ab5afe97d") : secret "insights-runtime-extractor-tls" not found Apr 21 07:53:40.345973 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:40.345952 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d62814c4-6729-4bbd-9170-e3bb1249bce1-signing-cabundle\") pod \"service-ca-865cb79987-r927t\" (UID: \"d62814c4-6729-4bbd-9170-e3bb1249bce1\") " pod="openshift-service-ca/service-ca-865cb79987-r927t" Apr 21 07:53:40.347618 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:40.347600 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d62814c4-6729-4bbd-9170-e3bb1249bce1-signing-key\") pod \"service-ca-865cb79987-r927t\" (UID: \"d62814c4-6729-4bbd-9170-e3bb1249bce1\") " pod="openshift-service-ca/service-ca-865cb79987-r927t" Apr 21 07:53:40.351861 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:40.351841 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fwpq\" (UniqueName: \"kubernetes.io/projected/d62814c4-6729-4bbd-9170-e3bb1249bce1-kube-api-access-7fwpq\") pod \"service-ca-865cb79987-r927t\" (UID: \"d62814c4-6729-4bbd-9170-e3bb1249bce1\") " pod="openshift-service-ca/service-ca-865cb79987-r927t" Apr 21 07:53:40.484579 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:40.484486 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-r927t" Apr 21 07:53:40.595848 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:40.595803 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-r927t"] Apr 21 07:53:40.599060 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:53:40.599030 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd62814c4_6729_4bbd_9170_e3bb1249bce1.slice/crio-0b9b18db593c8198216d56cd89b87c6fe1bf87452ee91b83732b0143e0da70cb WatchSource:0}: Error finding container 0b9b18db593c8198216d56cd89b87c6fe1bf87452ee91b83732b0143e0da70cb: Status 404 returned error can't find the container with id 0b9b18db593c8198216d56cd89b87c6fe1bf87452ee91b83732b0143e0da70cb Apr 21 07:53:41.352053 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:41.352021 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fm5jh\" (UID: \"4f44f401-0bd8-4a90-870d-3e4ab5afe97d\") " pod="openshift-insights/insights-runtime-extractor-fm5jh" Apr 21 07:53:41.352508 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:41.352165 2574 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 07:53:41.352508 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:41.352226 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-insights-runtime-extractor-tls podName:4f44f401-0bd8-4a90-870d-3e4ab5afe97d nodeName:}" failed. No retries permitted until 2026-04-21 07:53:43.352211049 +0000 UTC m=+109.764996146 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-insights-runtime-extractor-tls") pod "insights-runtime-extractor-fm5jh" (UID: "4f44f401-0bd8-4a90-870d-3e4ab5afe97d") : secret "insights-runtime-extractor-tls" not found Apr 21 07:53:41.473337 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:41.473301 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-r927t" event={"ID":"d62814c4-6729-4bbd-9170-e3bb1249bce1","Type":"ContainerStarted","Data":"ca6dc9301cb711e3f155738a2745eb6532ea8ac11089e47d67e12bb2465aebb6"} Apr 21 07:53:41.473337 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:41.473339 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-r927t" event={"ID":"d62814c4-6729-4bbd-9170-e3bb1249bce1","Type":"ContainerStarted","Data":"0b9b18db593c8198216d56cd89b87c6fe1bf87452ee91b83732b0143e0da70cb"} Apr 21 07:53:41.490009 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:41.489960 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-r927t" podStartSLOduration=1.489944995 podStartE2EDuration="1.489944995s" podCreationTimestamp="2026-04-21 07:53:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:53:41.48958846 +0000 UTC m=+107.902373586" watchObservedRunningTime="2026-04-21 07:53:41.489944995 +0000 UTC m=+107.902730107" Apr 21 07:53:43.367257 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:43.367221 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fm5jh\" (UID: \"4f44f401-0bd8-4a90-870d-3e4ab5afe97d\") " 
pod="openshift-insights/insights-runtime-extractor-fm5jh" Apr 21 07:53:43.367624 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:43.367374 2574 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 07:53:43.367624 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:43.367436 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-insights-runtime-extractor-tls podName:4f44f401-0bd8-4a90-870d-3e4ab5afe97d nodeName:}" failed. No retries permitted until 2026-04-21 07:53:47.367418756 +0000 UTC m=+113.780203849 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-insights-runtime-extractor-tls") pod "insights-runtime-extractor-fm5jh" (UID: "4f44f401-0bd8-4a90-870d-3e4ab5afe97d") : secret "insights-runtime-extractor-tls" not found Apr 21 07:53:47.402468 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:47.402422 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fm5jh\" (UID: \"4f44f401-0bd8-4a90-870d-3e4ab5afe97d\") " pod="openshift-insights/insights-runtime-extractor-fm5jh" Apr 21 07:53:47.402991 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:47.402590 2574 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 07:53:47.402991 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:53:47.402693 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-insights-runtime-extractor-tls podName:4f44f401-0bd8-4a90-870d-3e4ab5afe97d nodeName:}" failed. 
No retries permitted until 2026-04-21 07:53:55.40266959 +0000 UTC m=+121.815454682 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-insights-runtime-extractor-tls") pod "insights-runtime-extractor-fm5jh" (UID: "4f44f401-0bd8-4a90-870d-3e4ab5afe97d") : secret "insights-runtime-extractor-tls" not found Apr 21 07:53:54.555813 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:54.555772 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/60f40b44-8487-4f5b-af99-1681ffa94740-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8jlgf\" (UID: \"60f40b44-8487-4f5b-af99-1681ffa94740\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jlgf" Apr 21 07:53:54.558295 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:54.558274 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/60f40b44-8487-4f5b-af99-1681ffa94740-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-8jlgf\" (UID: \"60f40b44-8487-4f5b-af99-1681ffa94740\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jlgf" Apr 21 07:53:54.801867 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:54.801837 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-mtsdp\"" Apr 21 07:53:54.810638 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:54.810592 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jlgf" Apr 21 07:53:54.921974 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:54.921937 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-8jlgf"] Apr 21 07:53:54.926764 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:53:54.926735 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60f40b44_8487_4f5b_af99_1681ffa94740.slice/crio-8118613c267fc8a656622b57e5c4d6c354170ca57fdb6c5bc8dee1da36d12dfb WatchSource:0}: Error finding container 8118613c267fc8a656622b57e5c4d6c354170ca57fdb6c5bc8dee1da36d12dfb: Status 404 returned error can't find the container with id 8118613c267fc8a656622b57e5c4d6c354170ca57fdb6c5bc8dee1da36d12dfb Apr 21 07:53:55.461485 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:55.461448 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fm5jh\" (UID: \"4f44f401-0bd8-4a90-870d-3e4ab5afe97d\") " pod="openshift-insights/insights-runtime-extractor-fm5jh" Apr 21 07:53:55.464186 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:55.464158 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/4f44f401-0bd8-4a90-870d-3e4ab5afe97d-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fm5jh\" (UID: \"4f44f401-0bd8-4a90-870d-3e4ab5afe97d\") " pod="openshift-insights/insights-runtime-extractor-fm5jh" Apr 21 07:53:55.510077 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:55.510046 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jlgf" 
event={"ID":"60f40b44-8487-4f5b-af99-1681ffa94740","Type":"ContainerStarted","Data":"8118613c267fc8a656622b57e5c4d6c354170ca57fdb6c5bc8dee1da36d12dfb"} Apr 21 07:53:55.579158 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:55.579126 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vjklh\"" Apr 21 07:53:55.587837 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:55.587816 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fm5jh" Apr 21 07:53:55.718112 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:55.718038 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fm5jh"] Apr 21 07:53:55.721451 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:53:55.721423 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f44f401_0bd8_4a90_870d_3e4ab5afe97d.slice/crio-ba55dc2253f12303069fe50d4a4250cd302d94581cef8cb38aed5506e64143b7 WatchSource:0}: Error finding container ba55dc2253f12303069fe50d4a4250cd302d94581cef8cb38aed5506e64143b7: Status 404 returned error can't find the container with id ba55dc2253f12303069fe50d4a4250cd302d94581cef8cb38aed5506e64143b7 Apr 21 07:53:56.514486 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:56.514434 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fm5jh" event={"ID":"4f44f401-0bd8-4a90-870d-3e4ab5afe97d","Type":"ContainerStarted","Data":"ccaf13a360d52fc328f37b97b91e0289dd716ad004a95237e0633004d0c17e54"} Apr 21 07:53:56.514486 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:56.514482 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fm5jh" 
event={"ID":"4f44f401-0bd8-4a90-870d-3e4ab5afe97d","Type":"ContainerStarted","Data":"ba55dc2253f12303069fe50d4a4250cd302d94581cef8cb38aed5506e64143b7"} Apr 21 07:53:57.518063 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:57.518029 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jlgf" event={"ID":"60f40b44-8487-4f5b-af99-1681ffa94740","Type":"ContainerStarted","Data":"7d8fec140501f11d12ca18c5bd12f34a1190b31a20094901364a04291120104d"} Apr 21 07:53:57.519595 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:57.519572 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fm5jh" event={"ID":"4f44f401-0bd8-4a90-870d-3e4ab5afe97d","Type":"ContainerStarted","Data":"6fc53c270f76ecfa72c01ff5dad6814b6be9878ff3a15ee888ef881b1941cd3d"} Apr 21 07:53:57.532882 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:57.532836 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-8jlgf" podStartSLOduration=33.702112047 podStartE2EDuration="35.532824552s" podCreationTimestamp="2026-04-21 07:53:22 +0000 UTC" firstStartedPulling="2026-04-21 07:53:54.928608609 +0000 UTC m=+121.341393704" lastFinishedPulling="2026-04-21 07:53:56.759321111 +0000 UTC m=+123.172106209" observedRunningTime="2026-04-21 07:53:57.532420904 +0000 UTC m=+123.945206018" watchObservedRunningTime="2026-04-21 07:53:57.532824552 +0000 UTC m=+123.945609666" Apr 21 07:53:57.880265 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:57.880190 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ddc4h"] Apr 21 07:53:57.883462 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:57.883432 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ddc4h" Apr 21 07:53:57.885888 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:57.885868 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-w2j4q\"" Apr 21 07:53:57.886282 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:57.886263 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 21 07:53:57.893790 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:57.893749 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ddc4h"] Apr 21 07:53:57.983134 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:57.983085 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/31c3668a-6489-488c-87ac-270ec5e7bc32-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-ddc4h\" (UID: \"31c3668a-6489-488c-87ac-270ec5e7bc32\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ddc4h" Apr 21 07:53:58.083823 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:58.083785 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/31c3668a-6489-488c-87ac-270ec5e7bc32-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-ddc4h\" (UID: \"31c3668a-6489-488c-87ac-270ec5e7bc32\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ddc4h" Apr 21 07:53:58.086676 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:58.086643 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: 
\"kubernetes.io/secret/31c3668a-6489-488c-87ac-270ec5e7bc32-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-ddc4h\" (UID: \"31c3668a-6489-488c-87ac-270ec5e7bc32\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ddc4h" Apr 21 07:53:58.195519 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:58.195486 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ddc4h" Apr 21 07:53:58.313481 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:58.313450 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ddc4h"] Apr 21 07:53:58.316913 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:53:58.316890 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31c3668a_6489_488c_87ac_270ec5e7bc32.slice/crio-97bd9bdddbe5870902d94ff2ae730db607beb46ab3917a473381abbda5241e98 WatchSource:0}: Error finding container 97bd9bdddbe5870902d94ff2ae730db607beb46ab3917a473381abbda5241e98: Status 404 returned error can't find the container with id 97bd9bdddbe5870902d94ff2ae730db607beb46ab3917a473381abbda5241e98 Apr 21 07:53:58.527106 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:58.527031 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ddc4h" event={"ID":"31c3668a-6489-488c-87ac-270ec5e7bc32","Type":"ContainerStarted","Data":"97bd9bdddbe5870902d94ff2ae730db607beb46ab3917a473381abbda5241e98"} Apr 21 07:53:58.528688 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:58.528665 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fm5jh" 
event={"ID":"4f44f401-0bd8-4a90-870d-3e4ab5afe97d","Type":"ContainerStarted","Data":"328ac3997f5fb9aa81e254b9b5bf1dfde4e2a475d63beb1dbd3e4a28c33aa63e"} Apr 21 07:53:58.545628 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:58.545585 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-fm5jh" podStartSLOduration=17.134455103 podStartE2EDuration="19.545573389s" podCreationTimestamp="2026-04-21 07:53:39 +0000 UTC" firstStartedPulling="2026-04-21 07:53:55.794114938 +0000 UTC m=+122.206900033" lastFinishedPulling="2026-04-21 07:53:58.205233211 +0000 UTC m=+124.618018319" observedRunningTime="2026-04-21 07:53:58.544811671 +0000 UTC m=+124.957596797" watchObservedRunningTime="2026-04-21 07:53:58.545573389 +0000 UTC m=+124.958358537" Apr 21 07:53:59.532196 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:59.532161 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ddc4h" event={"ID":"31c3668a-6489-488c-87ac-270ec5e7bc32","Type":"ContainerStarted","Data":"e5667d79a7efecf6fe54a342708104189786608b3e66011093153e86083453e6"} Apr 21 07:53:59.546288 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:53:59.546242 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ddc4h" podStartSLOduration=1.509150076 podStartE2EDuration="2.546229417s" podCreationTimestamp="2026-04-21 07:53:57 +0000 UTC" firstStartedPulling="2026-04-21 07:53:58.31915059 +0000 UTC m=+124.731935686" lastFinishedPulling="2026-04-21 07:53:59.356229933 +0000 UTC m=+125.769015027" observedRunningTime="2026-04-21 07:53:59.545435104 +0000 UTC m=+125.958220220" watchObservedRunningTime="2026-04-21 07:53:59.546229417 +0000 UTC m=+125.959014530" Apr 21 07:54:00.536326 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:00.536295 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ddc4h" Apr 21 07:54:00.540695 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:00.540673 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ddc4h" Apr 21 07:54:01.298186 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:01.298157 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-js2pk"] Apr 21 07:54:01.300196 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:01.300177 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-js2pk" Apr 21 07:54:01.302236 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:01.302212 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 21 07:54:01.303198 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:01.303179 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 07:54:01.303309 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:01.303232 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 21 07:54:01.303309 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:01.303262 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-qqclh\"" Apr 21 07:54:01.308528 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:01.308509 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-js2pk"] Apr 21 07:54:01.406165 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:01.406129 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e42f657-c2c4-4850-8aee-6981435fe148-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-js2pk\" (UID: \"5e42f657-c2c4-4850-8aee-6981435fe148\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-js2pk" Apr 21 07:54:01.406327 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:01.406203 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8tcg\" (UniqueName: \"kubernetes.io/projected/5e42f657-c2c4-4850-8aee-6981435fe148-kube-api-access-v8tcg\") pod \"prometheus-operator-5676c8c784-js2pk\" (UID: \"5e42f657-c2c4-4850-8aee-6981435fe148\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-js2pk" Apr 21 07:54:01.406327 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:01.406252 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e42f657-c2c4-4850-8aee-6981435fe148-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-js2pk\" (UID: \"5e42f657-c2c4-4850-8aee-6981435fe148\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-js2pk" Apr 21 07:54:01.406327 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:01.406274 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5e42f657-c2c4-4850-8aee-6981435fe148-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-js2pk\" (UID: \"5e42f657-c2c4-4850-8aee-6981435fe148\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-js2pk" Apr 21 07:54:01.506641 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:01.506597 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5e42f657-c2c4-4850-8aee-6981435fe148-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-js2pk\" (UID: \"5e42f657-c2c4-4850-8aee-6981435fe148\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-js2pk" Apr 21 07:54:01.506837 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:01.506673 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8tcg\" (UniqueName: \"kubernetes.io/projected/5e42f657-c2c4-4850-8aee-6981435fe148-kube-api-access-v8tcg\") pod \"prometheus-operator-5676c8c784-js2pk\" (UID: \"5e42f657-c2c4-4850-8aee-6981435fe148\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-js2pk" Apr 21 07:54:01.506837 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:01.506716 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e42f657-c2c4-4850-8aee-6981435fe148-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-js2pk\" (UID: \"5e42f657-c2c4-4850-8aee-6981435fe148\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-js2pk" Apr 21 07:54:01.506837 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:01.506749 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5e42f657-c2c4-4850-8aee-6981435fe148-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-js2pk\" (UID: \"5e42f657-c2c4-4850-8aee-6981435fe148\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-js2pk" Apr 21 07:54:01.507274 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:01.507254 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e42f657-c2c4-4850-8aee-6981435fe148-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-js2pk\" (UID: \"5e42f657-c2c4-4850-8aee-6981435fe148\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-js2pk" Apr 21 07:54:01.509341 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:01.509320 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5e42f657-c2c4-4850-8aee-6981435fe148-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-js2pk\" (UID: \"5e42f657-c2c4-4850-8aee-6981435fe148\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-js2pk" Apr 21 07:54:01.509449 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:01.509432 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5e42f657-c2c4-4850-8aee-6981435fe148-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-js2pk\" (UID: \"5e42f657-c2c4-4850-8aee-6981435fe148\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-js2pk" Apr 21 07:54:01.514460 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:01.514433 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8tcg\" (UniqueName: \"kubernetes.io/projected/5e42f657-c2c4-4850-8aee-6981435fe148-kube-api-access-v8tcg\") pod \"prometheus-operator-5676c8c784-js2pk\" (UID: \"5e42f657-c2c4-4850-8aee-6981435fe148\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-js2pk" Apr 21 07:54:01.611121 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:01.611025 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-js2pk" Apr 21 07:54:01.724889 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:01.724864 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-js2pk"] Apr 21 07:54:01.727125 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:54:01.727098 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e42f657_c2c4_4850_8aee_6981435fe148.slice/crio-b5e7707813352be00892e138aa58d7fd3c67f0a8319ea055affbce74d21e32bf WatchSource:0}: Error finding container b5e7707813352be00892e138aa58d7fd3c67f0a8319ea055affbce74d21e32bf: Status 404 returned error can't find the container with id b5e7707813352be00892e138aa58d7fd3c67f0a8319ea055affbce74d21e32bf Apr 21 07:54:02.542710 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:02.542629 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-js2pk" event={"ID":"5e42f657-c2c4-4850-8aee-6981435fe148","Type":"ContainerStarted","Data":"b5e7707813352be00892e138aa58d7fd3c67f0a8319ea055affbce74d21e32bf"} Apr 21 07:54:02.817432 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:02.817354 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs\") pod \"network-metrics-daemon-jjnl5\" (UID: \"40e690b0-0cf8-4414-b2e4-2f3c492f2196\") " pod="openshift-multus/network-metrics-daemon-jjnl5" Apr 21 07:54:02.820036 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:02.820010 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/40e690b0-0cf8-4414-b2e4-2f3c492f2196-metrics-certs\") pod \"network-metrics-daemon-jjnl5\" (UID: \"40e690b0-0cf8-4414-b2e4-2f3c492f2196\") " 
pod="openshift-multus/network-metrics-daemon-jjnl5" Apr 21 07:54:03.110940 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:03.105228 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9jd8h\"" Apr 21 07:54:03.114494 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:03.113269 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jjnl5" Apr 21 07:54:03.244499 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:03.244443 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jjnl5"] Apr 21 07:54:03.251060 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:54:03.250536 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40e690b0_0cf8_4414_b2e4_2f3c492f2196.slice/crio-3deb86612001884707f997f7c272f9d6c3db1721e15425daabdd9f0857d4fb0b WatchSource:0}: Error finding container 3deb86612001884707f997f7c272f9d6c3db1721e15425daabdd9f0857d4fb0b: Status 404 returned error can't find the container with id 3deb86612001884707f997f7c272f9d6c3db1721e15425daabdd9f0857d4fb0b Apr 21 07:54:03.545484 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:03.545448 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jjnl5" event={"ID":"40e690b0-0cf8-4414-b2e4-2f3c492f2196","Type":"ContainerStarted","Data":"3deb86612001884707f997f7c272f9d6c3db1721e15425daabdd9f0857d4fb0b"} Apr 21 07:54:03.546819 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:03.546797 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-js2pk" event={"ID":"5e42f657-c2c4-4850-8aee-6981435fe148","Type":"ContainerStarted","Data":"a23c12797352a07711ee05e61afb5505b8ccc1235da19a50ef5aea8481671ef6"} Apr 21 07:54:03.546819 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:03.546822 
2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-js2pk" event={"ID":"5e42f657-c2c4-4850-8aee-6981435fe148","Type":"ContainerStarted","Data":"88d593f0491c95757efb0a57af49d544fc393d32b5d1b3969326649542d2772c"} Apr 21 07:54:03.567260 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:03.564746 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-js2pk" podStartSLOduration=1.260605929 podStartE2EDuration="2.564729913s" podCreationTimestamp="2026-04-21 07:54:01 +0000 UTC" firstStartedPulling="2026-04-21 07:54:01.729004702 +0000 UTC m=+128.141789798" lastFinishedPulling="2026-04-21 07:54:03.033128689 +0000 UTC m=+129.445913782" observedRunningTime="2026-04-21 07:54:03.562917595 +0000 UTC m=+129.975702708" watchObservedRunningTime="2026-04-21 07:54:03.564729913 +0000 UTC m=+129.977515028" Apr 21 07:54:04.550720 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:04.550686 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jjnl5" event={"ID":"40e690b0-0cf8-4414-b2e4-2f3c492f2196","Type":"ContainerStarted","Data":"07cf6be78203422df4aca8bd8aa8d26fcd6c7f89e37ed6022373e120ab188e2a"} Apr 21 07:54:04.550720 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:04.550726 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jjnl5" event={"ID":"40e690b0-0cf8-4414-b2e4-2f3c492f2196","Type":"ContainerStarted","Data":"401321a462e1ab6f6a31939c27c0abd8449a2c795e496f21945bc2553e70e253"} Apr 21 07:54:04.566232 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:04.566185 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jjnl5" podStartSLOduration=129.659570472 podStartE2EDuration="2m10.566170957s" podCreationTimestamp="2026-04-21 07:51:54 +0000 UTC" firstStartedPulling="2026-04-21 07:54:03.253213694 +0000 
UTC m=+129.665998788" lastFinishedPulling="2026-04-21 07:54:04.159814181 +0000 UTC m=+130.572599273" observedRunningTime="2026-04-21 07:54:04.564893087 +0000 UTC m=+130.977678201" watchObservedRunningTime="2026-04-21 07:54:04.566170957 +0000 UTC m=+130.978956071" Apr 21 07:54:05.661776 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.661745 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-t4dg6"] Apr 21 07:54:05.664002 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.663979 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-t4dg6" Apr 21 07:54:05.666233 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.666209 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 21 07:54:05.667169 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.667117 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 21 07:54:05.667169 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.667138 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-vqddp\"" Apr 21 07:54:05.667348 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.667128 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 21 07:54:05.676575 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.675908 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zvf59"] Apr 21 07:54:05.678301 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.678278 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-t4dg6"] Apr 21 07:54:05.678426 
ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.678412 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.680797 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.680775 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 07:54:05.681043 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.680823 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 07:54:05.681124 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.681060 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 07:54:05.681284 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.681270 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-wdcf8\"" Apr 21 07:54:05.740000 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.739970 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-metrics-client-ca\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.740119 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.740044 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-root\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.740119 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.740073 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-node-exporter-textfile\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.740119 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.740100 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-sys\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.740242 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.740127 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-node-exporter-wtmp\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.740242 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.740152 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-node-exporter-tls\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.740242 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.740175 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfp84\" (UniqueName: \"kubernetes.io/projected/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-kube-api-access-lfp84\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " 
pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.740242 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.740203 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/bc770555-e352-4582-b1e8-001e6f487668-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-t4dg6\" (UID: \"bc770555-e352-4582-b1e8-001e6f487668\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t4dg6" Apr 21 07:54:05.740242 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.740239 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc770555-e352-4582-b1e8-001e6f487668-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-t4dg6\" (UID: \"bc770555-e352-4582-b1e8-001e6f487668\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t4dg6" Apr 21 07:54:05.740438 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.740281 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.740438 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.740314 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-node-exporter-accelerators-collector-config\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.740438 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.740337 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bc770555-e352-4582-b1e8-001e6f487668-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-t4dg6\" (UID: \"bc770555-e352-4582-b1e8-001e6f487668\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t4dg6" Apr 21 07:54:05.740438 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.740362 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/bc770555-e352-4582-b1e8-001e6f487668-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-t4dg6\" (UID: \"bc770555-e352-4582-b1e8-001e6f487668\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t4dg6" Apr 21 07:54:05.740438 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.740426 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bc770555-e352-4582-b1e8-001e6f487668-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-t4dg6\" (UID: \"bc770555-e352-4582-b1e8-001e6f487668\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t4dg6" Apr 21 07:54:05.740633 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.740495 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvxgs\" (UniqueName: \"kubernetes.io/projected/bc770555-e352-4582-b1e8-001e6f487668-kube-api-access-gvxgs\") pod \"kube-state-metrics-69db897b98-t4dg6\" (UID: \"bc770555-e352-4582-b1e8-001e6f487668\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t4dg6" Apr 21 07:54:05.841598 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.841558 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-node-exporter-tls\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.841773 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.841604 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfp84\" (UniqueName: \"kubernetes.io/projected/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-kube-api-access-lfp84\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.841773 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.841745 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/bc770555-e352-4582-b1e8-001e6f487668-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-t4dg6\" (UID: \"bc770555-e352-4582-b1e8-001e6f487668\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t4dg6" Apr 21 07:54:05.841887 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.841808 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc770555-e352-4582-b1e8-001e6f487668-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-t4dg6\" (UID: \"bc770555-e352-4582-b1e8-001e6f487668\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t4dg6" Apr 21 07:54:05.841970 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.841877 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " 
pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.841970 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.841961 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-node-exporter-accelerators-collector-config\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.842077 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.841990 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bc770555-e352-4582-b1e8-001e6f487668-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-t4dg6\" (UID: \"bc770555-e352-4582-b1e8-001e6f487668\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t4dg6" Apr 21 07:54:05.842077 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.842020 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/bc770555-e352-4582-b1e8-001e6f487668-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-t4dg6\" (UID: \"bc770555-e352-4582-b1e8-001e6f487668\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t4dg6" Apr 21 07:54:05.842077 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.842060 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bc770555-e352-4582-b1e8-001e6f487668-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-t4dg6\" (UID: \"bc770555-e352-4582-b1e8-001e6f487668\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t4dg6" Apr 21 07:54:05.842209 ip-10-0-138-20 kubenswrapper[2574]: 
I0421 07:54:05.842094 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvxgs\" (UniqueName: \"kubernetes.io/projected/bc770555-e352-4582-b1e8-001e6f487668-kube-api-access-gvxgs\") pod \"kube-state-metrics-69db897b98-t4dg6\" (UID: \"bc770555-e352-4582-b1e8-001e6f487668\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t4dg6" Apr 21 07:54:05.842209 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.842132 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/bc770555-e352-4582-b1e8-001e6f487668-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-t4dg6\" (UID: \"bc770555-e352-4582-b1e8-001e6f487668\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t4dg6" Apr 21 07:54:05.842209 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.842136 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-metrics-client-ca\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.842359 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.842237 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-root\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.842359 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.842267 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-node-exporter-textfile\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " 
pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.842359 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.842297 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-sys\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.842359 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.842328 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-node-exporter-wtmp\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.842565 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.842480 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-node-exporter-wtmp\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.842565 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.842527 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-root\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.842565 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.842528 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc770555-e352-4582-b1e8-001e6f487668-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-t4dg6\" (UID: \"bc770555-e352-4582-b1e8-001e6f487668\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-t4dg6" Apr 21 07:54:05.842704 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.842560 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-node-exporter-accelerators-collector-config\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.842704 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.842579 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-sys\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.842704 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.842611 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-metrics-client-ca\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.842846 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.842739 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-node-exporter-textfile\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.843207 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.843182 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/bc770555-e352-4582-b1e8-001e6f487668-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-t4dg6\" (UID: \"bc770555-e352-4582-b1e8-001e6f487668\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t4dg6" Apr 21 07:54:05.844800 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.844779 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.844964 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.844940 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-node-exporter-tls\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.845040 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.845024 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bc770555-e352-4582-b1e8-001e6f487668-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-t4dg6\" (UID: \"bc770555-e352-4582-b1e8-001e6f487668\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t4dg6" Apr 21 07:54:05.845103 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.845089 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bc770555-e352-4582-b1e8-001e6f487668-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-t4dg6\" (UID: \"bc770555-e352-4582-b1e8-001e6f487668\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-t4dg6" Apr 21 07:54:05.852075 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.852052 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfp84\" (UniqueName: \"kubernetes.io/projected/78cf98b5-cf16-43e9-9ffb-27d7f8a58917-kube-api-access-lfp84\") pod \"node-exporter-zvf59\" (UID: \"78cf98b5-cf16-43e9-9ffb-27d7f8a58917\") " pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.852185 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.852145 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvxgs\" (UniqueName: \"kubernetes.io/projected/bc770555-e352-4582-b1e8-001e6f487668-kube-api-access-gvxgs\") pod \"kube-state-metrics-69db897b98-t4dg6\" (UID: \"bc770555-e352-4582-b1e8-001e6f487668\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-t4dg6" Apr 21 07:54:05.975157 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.975088 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-t4dg6" Apr 21 07:54:05.989700 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:05.989672 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-zvf59" Apr 21 07:54:05.999952 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:54:05.999886 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78cf98b5_cf16_43e9_9ffb_27d7f8a58917.slice/crio-28d7287ef98b8d7825981eda68be897ddc6492de55a17807bd623ce0f5f5df64 WatchSource:0}: Error finding container 28d7287ef98b8d7825981eda68be897ddc6492de55a17807bd623ce0f5f5df64: Status 404 returned error can't find the container with id 28d7287ef98b8d7825981eda68be897ddc6492de55a17807bd623ce0f5f5df64 Apr 21 07:54:06.119461 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:06.119431 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-t4dg6"] Apr 21 07:54:06.122570 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:54:06.122542 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc770555_e352_4582_b1e8_001e6f487668.slice/crio-3956659850af53cc8be6de1730e7e6d18c93156d4614fa4bb86c76ee974b53d1 WatchSource:0}: Error finding container 3956659850af53cc8be6de1730e7e6d18c93156d4614fa4bb86c76ee974b53d1: Status 404 returned error can't find the container with id 3956659850af53cc8be6de1730e7e6d18c93156d4614fa4bb86c76ee974b53d1 Apr 21 07:54:06.557339 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:06.557301 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-t4dg6" event={"ID":"bc770555-e352-4582-b1e8-001e6f487668","Type":"ContainerStarted","Data":"3956659850af53cc8be6de1730e7e6d18c93156d4614fa4bb86c76ee974b53d1"} Apr 21 07:54:06.558487 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:06.558452 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zvf59" 
event={"ID":"78cf98b5-cf16-43e9-9ffb-27d7f8a58917","Type":"ContainerStarted","Data":"28d7287ef98b8d7825981eda68be897ddc6492de55a17807bd623ce0f5f5df64"} Apr 21 07:54:07.562580 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.562548 2574 generic.go:358] "Generic (PLEG): container finished" podID="78cf98b5-cf16-43e9-9ffb-27d7f8a58917" containerID="329ec1857f22967f2c6079a79d36aa1933753d7c260a1891b9c50222c5b29d6b" exitCode=0 Apr 21 07:54:07.563048 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.562637 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zvf59" event={"ID":"78cf98b5-cf16-43e9-9ffb-27d7f8a58917","Type":"ContainerDied","Data":"329ec1857f22967f2c6079a79d36aa1933753d7c260a1891b9c50222c5b29d6b"} Apr 21 07:54:07.630617 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.630592 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7fd6765fd-sch2j"] Apr 21 07:54:07.635301 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.635280 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:07.637905 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.637883 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 21 07:54:07.638027 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.637938 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 21 07:54:07.638027 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.637951 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 21 07:54:07.638146 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.638055 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-4gh6v\"" Apr 21 07:54:07.638146 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.638094 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 21 07:54:07.638262 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.638162 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-4mfr1dd3jmqlq\"" Apr 21 07:54:07.638376 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.638358 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 21 07:54:07.648003 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.647983 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7fd6765fd-sch2j"] Apr 21 07:54:07.656249 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.656174 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/bbdc7510-ab52-49ea-a9cb-4ed5a39e6614-secret-grpc-tls\") pod \"thanos-querier-7fd6765fd-sch2j\" (UID: \"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614\") " pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:07.656351 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.656290 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bbdc7510-ab52-49ea-a9cb-4ed5a39e6614-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7fd6765fd-sch2j\" (UID: \"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614\") " pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:07.656351 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.656321 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bbdc7510-ab52-49ea-a9cb-4ed5a39e6614-metrics-client-ca\") pod \"thanos-querier-7fd6765fd-sch2j\" (UID: \"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614\") " pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:07.656646 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.656401 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bbdc7510-ab52-49ea-a9cb-4ed5a39e6614-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7fd6765fd-sch2j\" (UID: \"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614\") " pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:07.656646 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.656434 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/bbdc7510-ab52-49ea-a9cb-4ed5a39e6614-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7fd6765fd-sch2j\" (UID: \"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614\") " pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:07.656646 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.656460 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/bbdc7510-ab52-49ea-a9cb-4ed5a39e6614-secret-thanos-querier-tls\") pod \"thanos-querier-7fd6765fd-sch2j\" (UID: \"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614\") " pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:07.656646 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.656497 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wdzx\" (UniqueName: \"kubernetes.io/projected/bbdc7510-ab52-49ea-a9cb-4ed5a39e6614-kube-api-access-9wdzx\") pod \"thanos-querier-7fd6765fd-sch2j\" (UID: \"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614\") " pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:07.656646 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.656529 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/bbdc7510-ab52-49ea-a9cb-4ed5a39e6614-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7fd6765fd-sch2j\" (UID: \"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614\") " pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:07.757760 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.757721 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/bbdc7510-ab52-49ea-a9cb-4ed5a39e6614-secret-grpc-tls\") pod \"thanos-querier-7fd6765fd-sch2j\" (UID: 
\"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614\") " pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:07.757873 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.757792 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bbdc7510-ab52-49ea-a9cb-4ed5a39e6614-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7fd6765fd-sch2j\" (UID: \"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614\") " pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:07.757873 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.757809 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bbdc7510-ab52-49ea-a9cb-4ed5a39e6614-metrics-client-ca\") pod \"thanos-querier-7fd6765fd-sch2j\" (UID: \"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614\") " pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:07.757873 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.757844 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bbdc7510-ab52-49ea-a9cb-4ed5a39e6614-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7fd6765fd-sch2j\" (UID: \"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614\") " pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:07.757873 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.757862 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/bbdc7510-ab52-49ea-a9cb-4ed5a39e6614-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7fd6765fd-sch2j\" (UID: \"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614\") " pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:07.758119 ip-10-0-138-20 
kubenswrapper[2574]: I0421 07:54:07.757877 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/bbdc7510-ab52-49ea-a9cb-4ed5a39e6614-secret-thanos-querier-tls\") pod \"thanos-querier-7fd6765fd-sch2j\" (UID: \"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614\") " pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:07.758119 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.757900 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wdzx\" (UniqueName: \"kubernetes.io/projected/bbdc7510-ab52-49ea-a9cb-4ed5a39e6614-kube-api-access-9wdzx\") pod \"thanos-querier-7fd6765fd-sch2j\" (UID: \"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614\") " pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:07.758119 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.757956 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/bbdc7510-ab52-49ea-a9cb-4ed5a39e6614-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7fd6765fd-sch2j\" (UID: \"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614\") " pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:07.759887 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.759600 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bbdc7510-ab52-49ea-a9cb-4ed5a39e6614-metrics-client-ca\") pod \"thanos-querier-7fd6765fd-sch2j\" (UID: \"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614\") " pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:07.762177 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.762129 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: 
\"kubernetes.io/secret/bbdc7510-ab52-49ea-a9cb-4ed5a39e6614-secret-thanos-querier-tls\") pod \"thanos-querier-7fd6765fd-sch2j\" (UID: \"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614\") " pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:07.762758 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.762700 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bbdc7510-ab52-49ea-a9cb-4ed5a39e6614-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7fd6765fd-sch2j\" (UID: \"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614\") " pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:07.763325 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.763104 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/bbdc7510-ab52-49ea-a9cb-4ed5a39e6614-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7fd6765fd-sch2j\" (UID: \"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614\") " pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:07.763325 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.763288 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bbdc7510-ab52-49ea-a9cb-4ed5a39e6614-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7fd6765fd-sch2j\" (UID: \"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614\") " pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:07.763744 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.763698 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/bbdc7510-ab52-49ea-a9cb-4ed5a39e6614-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7fd6765fd-sch2j\" (UID: 
\"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614\") " pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:07.765697 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.765672 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/bbdc7510-ab52-49ea-a9cb-4ed5a39e6614-secret-grpc-tls\") pod \"thanos-querier-7fd6765fd-sch2j\" (UID: \"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614\") " pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:07.768165 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.768098 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wdzx\" (UniqueName: \"kubernetes.io/projected/bbdc7510-ab52-49ea-a9cb-4ed5a39e6614-kube-api-access-9wdzx\") pod \"thanos-querier-7fd6765fd-sch2j\" (UID: \"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614\") " pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:07.947653 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:07.947617 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:08.080500 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:08.080473 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7fd6765fd-sch2j"] Apr 21 07:54:08.082888 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:54:08.082860 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbdc7510_ab52_49ea_a9cb_4ed5a39e6614.slice/crio-c097d980601e3e3952995e3022f2396b59691f8f63eefead04053e783828f404 WatchSource:0}: Error finding container c097d980601e3e3952995e3022f2396b59691f8f63eefead04053e783828f404: Status 404 returned error can't find the container with id c097d980601e3e3952995e3022f2396b59691f8f63eefead04053e783828f404 Apr 21 07:54:08.570522 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:08.570483 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zvf59" event={"ID":"78cf98b5-cf16-43e9-9ffb-27d7f8a58917","Type":"ContainerStarted","Data":"e55c05768398d14c36c65455763f17be119d260a343f2eae7f5d4b33f8b7dbe5"} Apr 21 07:54:08.570915 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:08.570528 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zvf59" event={"ID":"78cf98b5-cf16-43e9-9ffb-27d7f8a58917","Type":"ContainerStarted","Data":"9fa333e08e2b2210f5b7becfdad4f2615b1f28ab1b8f19dfc542a981f0ec0fa6"} Apr 21 07:54:08.571700 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:08.571668 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" event={"ID":"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614","Type":"ContainerStarted","Data":"c097d980601e3e3952995e3022f2396b59691f8f63eefead04053e783828f404"} Apr 21 07:54:08.573449 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:08.573426 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/kube-state-metrics-69db897b98-t4dg6" event={"ID":"bc770555-e352-4582-b1e8-001e6f487668","Type":"ContainerStarted","Data":"81e0aac3bc6e7053974ce6c2a299e7076792779b7434d22c71e0438e4e95d3a9"} Apr 21 07:54:08.573551 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:08.573454 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-t4dg6" event={"ID":"bc770555-e352-4582-b1e8-001e6f487668","Type":"ContainerStarted","Data":"8b8c1e7dea4e22e7945358e2a6fbafdb6466bdb359897d59556c86d70014bdf8"} Apr 21 07:54:08.573551 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:08.573467 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-t4dg6" event={"ID":"bc770555-e352-4582-b1e8-001e6f487668","Type":"ContainerStarted","Data":"db22aed5088185469f9e8369e8f8bbdbb1ddf757cf23aea809ef8fe5f598e901"} Apr 21 07:54:08.591152 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:08.591103 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zvf59" podStartSLOduration=2.7679217080000003 podStartE2EDuration="3.591088453s" podCreationTimestamp="2026-04-21 07:54:05 +0000 UTC" firstStartedPulling="2026-04-21 07:54:06.003097473 +0000 UTC m=+132.415882564" lastFinishedPulling="2026-04-21 07:54:06.8262642 +0000 UTC m=+133.239049309" observedRunningTime="2026-04-21 07:54:08.588613831 +0000 UTC m=+135.001398957" watchObservedRunningTime="2026-04-21 07:54:08.591088453 +0000 UTC m=+135.003873568" Apr 21 07:54:08.604514 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:08.604442 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-t4dg6" podStartSLOduration=2.143813724 podStartE2EDuration="3.604428675s" podCreationTimestamp="2026-04-21 07:54:05 +0000 UTC" firstStartedPulling="2026-04-21 07:54:06.124239029 +0000 UTC m=+132.537024125" 
lastFinishedPulling="2026-04-21 07:54:07.584853984 +0000 UTC m=+133.997639076" observedRunningTime="2026-04-21 07:54:08.603703256 +0000 UTC m=+135.016488399" watchObservedRunningTime="2026-04-21 07:54:08.604428675 +0000 UTC m=+135.017213791" Apr 21 07:54:10.581738 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.581653 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" event={"ID":"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614","Type":"ContainerStarted","Data":"bceb3ebe6ba35ce1fec4bac0a072bf0877a469afd02ac04bcad792e299acb46f"} Apr 21 07:54:10.581738 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.581694 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" event={"ID":"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614","Type":"ContainerStarted","Data":"853b9b56279a17e3436185ead683c408087447ef9a12dd00e2540585364a3c28"} Apr 21 07:54:10.581738 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.581706 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" event={"ID":"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614","Type":"ContainerStarted","Data":"1fee0d160827e12ec5ff6167b209acc25379f4cf35b840cc16657ab0afa68218"} Apr 21 07:54:10.859993 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.859901 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-75679f49d-lrpkx"] Apr 21 07:54:10.862393 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.862373 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:10.864691 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.864664 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 21 07:54:10.864691 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.864671 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 21 07:54:10.864893 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.864692 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 21 07:54:10.864893 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.864691 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 21 07:54:10.864893 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.864762 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-4bhp5\"" Apr 21 07:54:10.864893 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.864789 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 21 07:54:10.870249 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.870228 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 21 07:54:10.872481 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.872464 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-75679f49d-lrpkx"] Apr 21 07:54:10.883829 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.883812 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79ec319c-106c-4f6b-8565-b2204159d925-telemeter-trusted-ca-bundle\") pod \"telemeter-client-75679f49d-lrpkx\" (UID: \"79ec319c-106c-4f6b-8565-b2204159d925\") " pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:10.883957 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.883842 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/79ec319c-106c-4f6b-8565-b2204159d925-telemeter-client-tls\") pod \"telemeter-client-75679f49d-lrpkx\" (UID: \"79ec319c-106c-4f6b-8565-b2204159d925\") " pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:10.883957 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.883863 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/79ec319c-106c-4f6b-8565-b2204159d925-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-75679f49d-lrpkx\" (UID: \"79ec319c-106c-4f6b-8565-b2204159d925\") " pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:10.883957 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.883912 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79ec319c-106c-4f6b-8565-b2204159d925-serving-certs-ca-bundle\") pod \"telemeter-client-75679f49d-lrpkx\" (UID: \"79ec319c-106c-4f6b-8565-b2204159d925\") " pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:10.884075 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.883997 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx5ht\" (UniqueName: 
\"kubernetes.io/projected/79ec319c-106c-4f6b-8565-b2204159d925-kube-api-access-vx5ht\") pod \"telemeter-client-75679f49d-lrpkx\" (UID: \"79ec319c-106c-4f6b-8565-b2204159d925\") " pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:10.884075 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.884031 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/79ec319c-106c-4f6b-8565-b2204159d925-federate-client-tls\") pod \"telemeter-client-75679f49d-lrpkx\" (UID: \"79ec319c-106c-4f6b-8565-b2204159d925\") " pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:10.884148 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.884081 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/79ec319c-106c-4f6b-8565-b2204159d925-secret-telemeter-client\") pod \"telemeter-client-75679f49d-lrpkx\" (UID: \"79ec319c-106c-4f6b-8565-b2204159d925\") " pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:10.884148 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.884115 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/79ec319c-106c-4f6b-8565-b2204159d925-metrics-client-ca\") pod \"telemeter-client-75679f49d-lrpkx\" (UID: \"79ec319c-106c-4f6b-8565-b2204159d925\") " pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:10.985289 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.985259 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/79ec319c-106c-4f6b-8565-b2204159d925-telemeter-client-tls\") pod \"telemeter-client-75679f49d-lrpkx\" (UID: \"79ec319c-106c-4f6b-8565-b2204159d925\") " 
pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:10.985463 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.985299 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/79ec319c-106c-4f6b-8565-b2204159d925-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-75679f49d-lrpkx\" (UID: \"79ec319c-106c-4f6b-8565-b2204159d925\") " pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:10.985463 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.985333 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79ec319c-106c-4f6b-8565-b2204159d925-serving-certs-ca-bundle\") pod \"telemeter-client-75679f49d-lrpkx\" (UID: \"79ec319c-106c-4f6b-8565-b2204159d925\") " pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:10.985463 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.985363 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vx5ht\" (UniqueName: \"kubernetes.io/projected/79ec319c-106c-4f6b-8565-b2204159d925-kube-api-access-vx5ht\") pod \"telemeter-client-75679f49d-lrpkx\" (UID: \"79ec319c-106c-4f6b-8565-b2204159d925\") " pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:10.985463 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.985389 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/79ec319c-106c-4f6b-8565-b2204159d925-federate-client-tls\") pod \"telemeter-client-75679f49d-lrpkx\" (UID: \"79ec319c-106c-4f6b-8565-b2204159d925\") " pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:10.985463 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.985439 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/79ec319c-106c-4f6b-8565-b2204159d925-secret-telemeter-client\") pod \"telemeter-client-75679f49d-lrpkx\" (UID: \"79ec319c-106c-4f6b-8565-b2204159d925\") " pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:10.985696 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.985468 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/79ec319c-106c-4f6b-8565-b2204159d925-metrics-client-ca\") pod \"telemeter-client-75679f49d-lrpkx\" (UID: \"79ec319c-106c-4f6b-8565-b2204159d925\") " pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:10.985696 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.985521 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79ec319c-106c-4f6b-8565-b2204159d925-telemeter-trusted-ca-bundle\") pod \"telemeter-client-75679f49d-lrpkx\" (UID: \"79ec319c-106c-4f6b-8565-b2204159d925\") " pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:10.986302 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.986247 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79ec319c-106c-4f6b-8565-b2204159d925-serving-certs-ca-bundle\") pod \"telemeter-client-75679f49d-lrpkx\" (UID: \"79ec319c-106c-4f6b-8565-b2204159d925\") " pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:10.986431 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.986389 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/79ec319c-106c-4f6b-8565-b2204159d925-metrics-client-ca\") pod \"telemeter-client-75679f49d-lrpkx\" (UID: 
\"79ec319c-106c-4f6b-8565-b2204159d925\") " pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:10.986546 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.986433 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79ec319c-106c-4f6b-8565-b2204159d925-telemeter-trusted-ca-bundle\") pod \"telemeter-client-75679f49d-lrpkx\" (UID: \"79ec319c-106c-4f6b-8565-b2204159d925\") " pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:10.990746 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.988839 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/79ec319c-106c-4f6b-8565-b2204159d925-secret-telemeter-client\") pod \"telemeter-client-75679f49d-lrpkx\" (UID: \"79ec319c-106c-4f6b-8565-b2204159d925\") " pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:10.990746 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.989053 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/79ec319c-106c-4f6b-8565-b2204159d925-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-75679f49d-lrpkx\" (UID: \"79ec319c-106c-4f6b-8565-b2204159d925\") " pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:10.990746 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.989257 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/79ec319c-106c-4f6b-8565-b2204159d925-telemeter-client-tls\") pod \"telemeter-client-75679f49d-lrpkx\" (UID: \"79ec319c-106c-4f6b-8565-b2204159d925\") " pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:10.990746 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.989412 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/79ec319c-106c-4f6b-8565-b2204159d925-federate-client-tls\") pod \"telemeter-client-75679f49d-lrpkx\" (UID: \"79ec319c-106c-4f6b-8565-b2204159d925\") " pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:10.994338 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:10.994315 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx5ht\" (UniqueName: \"kubernetes.io/projected/79ec319c-106c-4f6b-8565-b2204159d925-kube-api-access-vx5ht\") pod \"telemeter-client-75679f49d-lrpkx\" (UID: \"79ec319c-106c-4f6b-8565-b2204159d925\") " pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:11.173019 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.172989 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" Apr 21 07:54:11.303662 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.303642 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-75679f49d-lrpkx"] Apr 21 07:54:11.310490 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:54:11.310459 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79ec319c_106c_4f6b_8565_b2204159d925.slice/crio-d80d9130ac41c0cd13fc187e4e7ef448080aeb10a6b06c5f647c0e466dd233ee WatchSource:0}: Error finding container d80d9130ac41c0cd13fc187e4e7ef448080aeb10a6b06c5f647c0e466dd233ee: Status 404 returned error can't find the container with id d80d9130ac41c0cd13fc187e4e7ef448080aeb10a6b06c5f647c0e466dd233ee Apr 21 07:54:11.587225 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.587145 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" 
event={"ID":"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614","Type":"ContainerStarted","Data":"503377c7b38d1ac55dd9180acd329b1fc2b8b4b15a848805abddebd3fc51025b"} Apr 21 07:54:11.587225 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.587190 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" event={"ID":"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614","Type":"ContainerStarted","Data":"a5296722586c5019d90cfb8b8acb4cee87730a6ca210e42fb717adb03c0263fa"} Apr 21 07:54:11.587225 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.587208 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" event={"ID":"bbdc7510-ab52-49ea-a9cb-4ed5a39e6614","Type":"ContainerStarted","Data":"3de7bbc85ffb5c5453e6101efe4e1474592d6e3a1af6330052b15d9a5379a01f"} Apr 21 07:54:11.587679 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.587360 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:11.588379 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.588357 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" event={"ID":"79ec319c-106c-4f6b-8565-b2204159d925","Type":"ContainerStarted","Data":"d80d9130ac41c0cd13fc187e4e7ef448080aeb10a6b06c5f647c0e466dd233ee"} Apr 21 07:54:11.607989 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.607945 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" podStartSLOduration=1.510353334 podStartE2EDuration="4.607910997s" podCreationTimestamp="2026-04-21 07:54:07 +0000 UTC" firstStartedPulling="2026-04-21 07:54:08.084684597 +0000 UTC m=+134.497469703" lastFinishedPulling="2026-04-21 07:54:11.182242273 +0000 UTC m=+137.595027366" observedRunningTime="2026-04-21 07:54:11.60771357 +0000 UTC m=+138.020498686" 
watchObservedRunningTime="2026-04-21 07:54:11.607910997 +0000 UTC m=+138.020696113" Apr 21 07:54:11.821967 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.821911 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 07:54:11.825334 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.825302 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.827635 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.827609 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 07:54:11.827767 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.827613 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 07:54:11.827767 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.827668 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 07:54:11.828161 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.828143 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 07:54:11.828381 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.828361 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 07:54:11.828636 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.828152 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 07:54:11.828881 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.828847 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 07:54:11.829073 ip-10-0-138-20 
kubenswrapper[2574]: I0421 07:54:11.828647 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 07:54:11.829271 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.829248 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 07:54:11.829440 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.829421 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 07:54:11.829510 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.829456 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 07:54:11.830181 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.829751 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-urvvqtsb9eh4\"" Apr 21 07:54:11.830181 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.829771 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 07:54:11.830181 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.829913 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-gj5j8\"" Apr 21 07:54:11.831361 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.831342 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 07:54:11.840405 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.840349 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 07:54:11.892991 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.892963 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.893143 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.892998 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.893143 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.893026 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.893143 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.893048 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz9vz\" (UniqueName: \"kubernetes.io/projected/037c36c7-08ed-4a06-ab93-55d7b251322e-kube-api-access-nz9vz\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.893143 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.893086 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.893143 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.893115 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-config\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.893311 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.893154 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.893311 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.893210 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/037c36c7-08ed-4a06-ab93-55d7b251322e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.893311 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.893236 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.893311 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.893256 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/037c36c7-08ed-4a06-ab93-55d7b251322e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.893451 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.893309 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.893451 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.893379 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/037c36c7-08ed-4a06-ab93-55d7b251322e-config-out\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.893451 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.893418 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.893535 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.893457 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" 
(UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.893535 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.893477 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.893535 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.893492 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.893535 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.893509 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.893829 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.893567 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-web-config\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.994475 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.994438 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" 
(UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.994630 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.994491 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.994630 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.994519 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.994630 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.994543 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.994630 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.994568 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.994630 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.994621 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-web-config\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.994890 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.994657 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.994890 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.994688 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.994890 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.994718 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.994890 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.994743 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nz9vz\" (UniqueName: \"kubernetes.io/projected/037c36c7-08ed-4a06-ab93-55d7b251322e-kube-api-access-nz9vz\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 
07:54:11.994890 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.994782 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.994890 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.994816 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-config\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.994890 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.994850 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.994890 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.994901 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/037c36c7-08ed-4a06-ab93-55d7b251322e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.995349 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.994951 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.995349 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.994979 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/037c36c7-08ed-4a06-ab93-55d7b251322e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.995349 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.995010 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.995349 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.995051 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/037c36c7-08ed-4a06-ab93-55d7b251322e-config-out\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.995517 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.995482 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.995517 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.995481 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-prometheus-trusted-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.996593 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.996559 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.997881 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.997854 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/037c36c7-08ed-4a06-ab93-55d7b251322e-config-out\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:11.998184 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.998158 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/037c36c7-08ed-4a06-ab93-55d7b251322e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:12.000030 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.998764 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:12.000030 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.998862 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:12.000030 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.998954 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:12.000030 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.999225 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-web-config\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:12.000030 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.999578 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:12.000030 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:11.999896 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:12.001277 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:12.001219 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:12.001668 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:12.001619 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:12.001902 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:12.001858 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:12.002145 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:12.002126 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-config\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:12.002693 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:12.002589 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/037c36c7-08ed-4a06-ab93-55d7b251322e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:12.004265 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:12.004205 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" 
(UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:12.005295 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:12.005272 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz9vz\" (UniqueName: \"kubernetes.io/projected/037c36c7-08ed-4a06-ab93-55d7b251322e-kube-api-access-nz9vz\") pod \"prometheus-k8s-0\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:12.138329 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:12.138250 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:12.298620 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:12.298586 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 07:54:12.302917 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:54:12.302884 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod037c36c7_08ed_4a06_ab93_55d7b251322e.slice/crio-285a3a7a6d7bfd2a6c8f59b044bcc1ab8e251715eb16564a18f3245ede2cb41b WatchSource:0}: Error finding container 285a3a7a6d7bfd2a6c8f59b044bcc1ab8e251715eb16564a18f3245ede2cb41b: Status 404 returned error can't find the container with id 285a3a7a6d7bfd2a6c8f59b044bcc1ab8e251715eb16564a18f3245ede2cb41b Apr 21 07:54:12.592955 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:12.592897 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"037c36c7-08ed-4a06-ab93-55d7b251322e","Type":"ContainerStarted","Data":"285a3a7a6d7bfd2a6c8f59b044bcc1ab8e251715eb16564a18f3245ede2cb41b"} Apr 21 07:54:14.600232 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:14.600190 2574 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" event={"ID":"79ec319c-106c-4f6b-8565-b2204159d925","Type":"ContainerStarted","Data":"90541c13c2f21f3145fb4b32d97a365df9f7c6a72bbf016ef5244feb26b4d225"} Apr 21 07:54:14.600659 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:14.600238 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" event={"ID":"79ec319c-106c-4f6b-8565-b2204159d925","Type":"ContainerStarted","Data":"caf5ebce0b513fa03cb2a3f2c3a8f53c377f757fec8f517c31dff1cb368f77b1"} Apr 21 07:54:14.600659 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:14.600252 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" event={"ID":"79ec319c-106c-4f6b-8565-b2204159d925","Type":"ContainerStarted","Data":"4831b45754e3f0a9bdbb412b1142a37bc0e1a770afb2a9880862e70e1227600f"} Apr 21 07:54:14.601528 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:14.601499 2574 generic.go:358] "Generic (PLEG): container finished" podID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerID="207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a" exitCode=0 Apr 21 07:54:14.601635 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:14.601584 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"037c36c7-08ed-4a06-ab93-55d7b251322e","Type":"ContainerDied","Data":"207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a"} Apr 21 07:54:14.621692 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:14.621654 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-75679f49d-lrpkx" podStartSLOduration=2.336967555 podStartE2EDuration="4.621642117s" podCreationTimestamp="2026-04-21 07:54:10 +0000 UTC" firstStartedPulling="2026-04-21 07:54:11.311994971 +0000 UTC m=+137.724780063" 
lastFinishedPulling="2026-04-21 07:54:13.596669523 +0000 UTC m=+140.009454625" observedRunningTime="2026-04-21 07:54:14.619471817 +0000 UTC m=+141.032256944" watchObservedRunningTime="2026-04-21 07:54:14.621642117 +0000 UTC m=+141.034427231" Apr 21 07:54:17.605787 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:17.605740 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7fd6765fd-sch2j" Apr 21 07:54:17.616858 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:17.616804 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"037c36c7-08ed-4a06-ab93-55d7b251322e","Type":"ContainerStarted","Data":"346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f"} Apr 21 07:54:17.616988 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:17.616883 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"037c36c7-08ed-4a06-ab93-55d7b251322e","Type":"ContainerStarted","Data":"a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d"} Apr 21 07:54:17.616988 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:17.616900 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"037c36c7-08ed-4a06-ab93-55d7b251322e","Type":"ContainerStarted","Data":"8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab"} Apr 21 07:54:18.624509 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:18.624468 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"037c36c7-08ed-4a06-ab93-55d7b251322e","Type":"ContainerStarted","Data":"9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a"} Apr 21 07:54:18.624873 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:18.624513 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"037c36c7-08ed-4a06-ab93-55d7b251322e","Type":"ContainerStarted","Data":"e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a"} Apr 21 07:54:18.624873 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:18.624543 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"037c36c7-08ed-4a06-ab93-55d7b251322e","Type":"ContainerStarted","Data":"0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582"} Apr 21 07:54:18.651474 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:18.651407 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.581669454 podStartE2EDuration="7.651386185s" podCreationTimestamp="2026-04-21 07:54:11 +0000 UTC" firstStartedPulling="2026-04-21 07:54:12.305504228 +0000 UTC m=+138.718289320" lastFinishedPulling="2026-04-21 07:54:17.37522096 +0000 UTC m=+143.788006051" observedRunningTime="2026-04-21 07:54:18.649052697 +0000 UTC m=+145.061837812" watchObservedRunningTime="2026-04-21 07:54:18.651386185 +0000 UTC m=+145.064171300" Apr 21 07:54:22.138344 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:22.138308 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:54:29.974966 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:54:29.974902 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-xmj7g" podUID="9a3e6d1e-7020-4e01-b08c-6965f9908a29" Apr 21 07:54:29.988066 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:54:29.988025 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-fw8gv" 
podUID="cc6f988e-e651-47c1-b9ef-5edf69838385" Apr 21 07:54:30.664191 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:30.664151 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xmj7g" Apr 21 07:54:34.896314 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:34.896282 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert\") pod \"ingress-canary-fw8gv\" (UID: \"cc6f988e-e651-47c1-b9ef-5edf69838385\") " pod="openshift-ingress-canary/ingress-canary-fw8gv" Apr 21 07:54:34.896684 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:34.896335 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls\") pod \"dns-default-xmj7g\" (UID: \"9a3e6d1e-7020-4e01-b08c-6965f9908a29\") " pod="openshift-dns/dns-default-xmj7g" Apr 21 07:54:34.898898 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:34.898868 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9a3e6d1e-7020-4e01-b08c-6965f9908a29-metrics-tls\") pod \"dns-default-xmj7g\" (UID: \"9a3e6d1e-7020-4e01-b08c-6965f9908a29\") " pod="openshift-dns/dns-default-xmj7g" Apr 21 07:54:34.899111 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:34.899089 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc6f988e-e651-47c1-b9ef-5edf69838385-cert\") pod \"ingress-canary-fw8gv\" (UID: \"cc6f988e-e651-47c1-b9ef-5edf69838385\") " pod="openshift-ingress-canary/ingress-canary-fw8gv" Apr 21 07:54:35.167401 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:35.167327 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4xjgf\"" Apr 21 07:54:35.176117 ip-10-0-138-20 
kubenswrapper[2574]: I0421 07:54:35.176089 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xmj7g" Apr 21 07:54:35.298119 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:35.298090 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xmj7g"] Apr 21 07:54:35.300511 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:54:35.300487 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a3e6d1e_7020_4e01_b08c_6965f9908a29.slice/crio-bde69bbfbbe4bbb022c4aa16bddecdc92dfcf1540aab471572c5f941400689a8 WatchSource:0}: Error finding container bde69bbfbbe4bbb022c4aa16bddecdc92dfcf1540aab471572c5f941400689a8: Status 404 returned error can't find the container with id bde69bbfbbe4bbb022c4aa16bddecdc92dfcf1540aab471572c5f941400689a8 Apr 21 07:54:35.679948 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:35.679899 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xmj7g" event={"ID":"9a3e6d1e-7020-4e01-b08c-6965f9908a29","Type":"ContainerStarted","Data":"bde69bbfbbe4bbb022c4aa16bddecdc92dfcf1540aab471572c5f941400689a8"} Apr 21 07:54:36.686817 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:36.686779 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xmj7g" event={"ID":"9a3e6d1e-7020-4e01-b08c-6965f9908a29","Type":"ContainerStarted","Data":"869eec537c3f9d650ce9f8fa7e9af1da17712e9d7088f7180681df37a452e913"} Apr 21 07:54:37.691251 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:37.691210 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xmj7g" event={"ID":"9a3e6d1e-7020-4e01-b08c-6965f9908a29","Type":"ContainerStarted","Data":"224742d800c15586a739fe6fe3ae9d6018008edd1f88db60a9862c6c1527424e"} Apr 21 07:54:37.691699 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:37.691285 2574 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-dns/dns-default-xmj7g" Apr 21 07:54:37.709516 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:37.709470 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-xmj7g" podStartSLOduration=130.469895514 podStartE2EDuration="2m11.709455802s" podCreationTimestamp="2026-04-21 07:52:26 +0000 UTC" firstStartedPulling="2026-04-21 07:54:35.302472431 +0000 UTC m=+161.715257538" lastFinishedPulling="2026-04-21 07:54:36.542032733 +0000 UTC m=+162.954817826" observedRunningTime="2026-04-21 07:54:37.708323051 +0000 UTC m=+164.121108181" watchObservedRunningTime="2026-04-21 07:54:37.709455802 +0000 UTC m=+164.122240915" Apr 21 07:54:41.704496 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:41.704458 2574 generic.go:358] "Generic (PLEG): container finished" podID="c63570c8-68d2-427e-8592-5b3ee57b7d7a" containerID="5b5f435604c6e883453e1e5ac76f3475b24e17dd958a8e7bfe7ca9ada08c8cf3" exitCode=0 Apr 21 07:54:41.704868 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:41.704533 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-78688" event={"ID":"c63570c8-68d2-427e-8592-5b3ee57b7d7a","Type":"ContainerDied","Data":"5b5f435604c6e883453e1e5ac76f3475b24e17dd958a8e7bfe7ca9ada08c8cf3"} Apr 21 07:54:41.704913 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:41.704891 2574 scope.go:117] "RemoveContainer" containerID="5b5f435604c6e883453e1e5ac76f3475b24e17dd958a8e7bfe7ca9ada08c8cf3" Apr 21 07:54:42.709287 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:42.709246 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-78688" event={"ID":"c63570c8-68d2-427e-8592-5b3ee57b7d7a","Type":"ContainerStarted","Data":"ca1e227beaa6ce9c8238b761ee382589ef1fd5f13b4444646067501b38390bf3"} Apr 21 07:54:44.091345 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:44.091319 2574 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fw8gv" Apr 21 07:54:44.093511 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:44.093491 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pcpsl\"" Apr 21 07:54:44.102134 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:44.102120 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fw8gv" Apr 21 07:54:44.220905 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:44.220879 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fw8gv"] Apr 21 07:54:44.223327 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:54:44.223297 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc6f988e_e651_47c1_b9ef_5edf69838385.slice/crio-e056083649feb117e698079e4c3d810a53951ef69992e9a7e9ab4f47e96b8a49 WatchSource:0}: Error finding container e056083649feb117e698079e4c3d810a53951ef69992e9a7e9ab4f47e96b8a49: Status 404 returned error can't find the container with id e056083649feb117e698079e4c3d810a53951ef69992e9a7e9ab4f47e96b8a49 Apr 21 07:54:44.718487 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:44.718450 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fw8gv" event={"ID":"cc6f988e-e651-47c1-b9ef-5edf69838385","Type":"ContainerStarted","Data":"e056083649feb117e698079e4c3d810a53951ef69992e9a7e9ab4f47e96b8a49"} Apr 21 07:54:45.508775 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:45.508740 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-8jlgf_60f40b44-8487-4f5b-af99-1681ffa94740/cluster-monitoring-operator/0.log" Apr 21 07:54:45.707848 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:45.707813 2574 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-t4dg6_bc770555-e352-4582-b1e8-001e6f487668/kube-state-metrics/0.log" Apr 21 07:54:45.906556 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:45.906529 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-t4dg6_bc770555-e352-4582-b1e8-001e6f487668/kube-rbac-proxy-main/0.log" Apr 21 07:54:46.107194 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:46.107120 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-t4dg6_bc770555-e352-4582-b1e8-001e6f487668/kube-rbac-proxy-self/0.log" Apr 21 07:54:46.725973 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:46.725913 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fw8gv" event={"ID":"cc6f988e-e651-47c1-b9ef-5edf69838385","Type":"ContainerStarted","Data":"6cf3c41ce73ec4c1b074c99ea5f6a0a37f406ea1add8207658a6d258fce31666"} Apr 21 07:54:46.727276 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:46.727255 2574 generic.go:358] "Generic (PLEG): container finished" podID="62f93d32-f152-4b0d-a51e-3e51bdd7183b" containerID="d6cd1627f24090fc211d74f8724ad23ab71e3b5421c4c25fb904fbf746afdd06" exitCode=0 Apr 21 07:54:46.727372 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:46.727321 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pz5qt" event={"ID":"62f93d32-f152-4b0d-a51e-3e51bdd7183b","Type":"ContainerDied","Data":"d6cd1627f24090fc211d74f8724ad23ab71e3b5421c4c25fb904fbf746afdd06"} Apr 21 07:54:46.727556 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:46.727543 2574 scope.go:117] "RemoveContainer" containerID="d6cd1627f24090fc211d74f8724ad23ab71e3b5421c4c25fb904fbf746afdd06" Apr 21 07:54:46.741053 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:46.741006 2574 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-ingress-canary/ingress-canary-fw8gv" podStartSLOduration=139.210381835 podStartE2EDuration="2m20.740994311s" podCreationTimestamp="2026-04-21 07:52:26 +0000 UTC" firstStartedPulling="2026-04-21 07:54:44.225188333 +0000 UTC m=+170.637973433" lastFinishedPulling="2026-04-21 07:54:45.7558008 +0000 UTC m=+172.168585909" observedRunningTime="2026-04-21 07:54:46.738905984 +0000 UTC m=+173.151691098" watchObservedRunningTime="2026-04-21 07:54:46.740994311 +0000 UTC m=+173.153779425"
Apr 21 07:54:47.696960 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:47.696901 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-xmj7g"
Apr 21 07:54:47.732547 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:47.732512 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pz5qt" event={"ID":"62f93d32-f152-4b0d-a51e-3e51bdd7183b","Type":"ContainerStarted","Data":"e0dd0ef0d62dfe30d0b75fbdc9ecf4d0ff59372afbc25552f25d99c9562c278b"}
Apr 21 07:54:47.909339 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:47.909307 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zvf59_78cf98b5-cf16-43e9-9ffb-27d7f8a58917/init-textfile/0.log"
Apr 21 07:54:48.107138 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:48.107058 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zvf59_78cf98b5-cf16-43e9-9ffb-27d7f8a58917/node-exporter/0.log"
Apr 21 07:54:48.306321 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:48.306292 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zvf59_78cf98b5-cf16-43e9-9ffb-27d7f8a58917/kube-rbac-proxy/0.log"
Apr 21 07:54:49.106900 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:49.106871 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_037c36c7-08ed-4a06-ab93-55d7b251322e/init-config-reloader/0.log"
Apr 21 07:54:49.307711 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:49.307683 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_037c36c7-08ed-4a06-ab93-55d7b251322e/prometheus/0.log"
Apr 21 07:54:49.506760 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:49.506734 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_037c36c7-08ed-4a06-ab93-55d7b251322e/config-reloader/0.log"
Apr 21 07:54:49.707342 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:49.707314 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_037c36c7-08ed-4a06-ab93-55d7b251322e/thanos-sidecar/0.log"
Apr 21 07:54:49.909146 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:49.909069 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_037c36c7-08ed-4a06-ab93-55d7b251322e/kube-rbac-proxy-web/0.log"
Apr 21 07:54:50.107147 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:50.107117 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_037c36c7-08ed-4a06-ab93-55d7b251322e/kube-rbac-proxy/0.log"
Apr 21 07:54:50.306711 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:50.306684 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_037c36c7-08ed-4a06-ab93-55d7b251322e/kube-rbac-proxy-thanos/0.log"
Apr 21 07:54:50.508889 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:50.508859 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-js2pk_5e42f657-c2c4-4850-8aee-6981435fe148/prometheus-operator/0.log"
Apr 21 07:54:50.706445 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:50.706419 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-js2pk_5e42f657-c2c4-4850-8aee-6981435fe148/kube-rbac-proxy/0.log"
Apr 21 07:54:50.906880 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:50.906849 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-ddc4h_31c3668a-6489-488c-87ac-270ec5e7bc32/prometheus-operator-admission-webhook/0.log"
Apr 21 07:54:51.106840 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:51.106763 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-75679f49d-lrpkx_79ec319c-106c-4f6b-8565-b2204159d925/telemeter-client/0.log"
Apr 21 07:54:51.308101 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:51.308066 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-75679f49d-lrpkx_79ec319c-106c-4f6b-8565-b2204159d925/reload/0.log"
Apr 21 07:54:51.506844 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:51.506812 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-75679f49d-lrpkx_79ec319c-106c-4f6b-8565-b2204159d925/kube-rbac-proxy/0.log"
Apr 21 07:54:51.707270 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:51.707238 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd6765fd-sch2j_bbdc7510-ab52-49ea-a9cb-4ed5a39e6614/thanos-query/0.log"
Apr 21 07:54:51.907217 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:51.907136 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd6765fd-sch2j_bbdc7510-ab52-49ea-a9cb-4ed5a39e6614/kube-rbac-proxy-web/0.log"
Apr 21 07:54:52.106547 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:52.106521 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd6765fd-sch2j_bbdc7510-ab52-49ea-a9cb-4ed5a39e6614/kube-rbac-proxy/0.log"
Apr 21 07:54:52.307728 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:52.307688 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd6765fd-sch2j_bbdc7510-ab52-49ea-a9cb-4ed5a39e6614/prom-label-proxy/0.log"
Apr 21 07:54:52.507205 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:52.507171 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd6765fd-sch2j_bbdc7510-ab52-49ea-a9cb-4ed5a39e6614/kube-rbac-proxy-rules/0.log"
Apr 21 07:54:52.707327 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:52.707302 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd6765fd-sch2j_bbdc7510-ab52-49ea-a9cb-4ed5a39e6614/kube-rbac-proxy-metrics/0.log"
Apr 21 07:54:54.907620 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:54.907589 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xmj7g_9a3e6d1e-7020-4e01-b08c-6965f9908a29/dns/0.log"
Apr 21 07:54:55.107289 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:55.107260 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xmj7g_9a3e6d1e-7020-4e01-b08c-6965f9908a29/kube-rbac-proxy/0.log"
Apr 21 07:54:55.706087 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:55.706060 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zglz2_2e1e8c1e-08e4-4ee4-a7bf-e84dfd5e0fa1/dns-node-resolver/0.log"
Apr 21 07:54:56.307314 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:54:56.307290 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-fw8gv_cc6f988e-e651-47c1-b9ef-5edf69838385/serve-healthcheck-canary/0.log"
Apr 21 07:55:12.139185 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:12.139150 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:12.158201 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:12.158177 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:12.819470 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:12.819440 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:30.259168 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.259129 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 21 07:55:30.259807 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.259718 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="prometheus" containerID="cri-o://8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab" gracePeriod=600
Apr 21 07:55:30.259807 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.259803 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="config-reloader" containerID="cri-o://a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d" gracePeriod=600
Apr 21 07:55:30.260102 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.259749 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="kube-rbac-proxy" containerID="cri-o://e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a" gracePeriod=600
Apr 21 07:55:30.260102 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.259744 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="kube-rbac-proxy-thanos" containerID="cri-o://9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a" gracePeriod=600
Apr 21 07:55:30.260102 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.259802 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="kube-rbac-proxy-web" containerID="cri-o://0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582" gracePeriod=600
Apr 21 07:55:30.260102 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.259756 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="thanos-sidecar" containerID="cri-o://346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f" gracePeriod=600
Apr 21 07:55:30.499485 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.499458 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:30.671713 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.671681 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-config\") pod \"037c36c7-08ed-4a06-ab93-55d7b251322e\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") "
Apr 21 07:55:30.671855 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.671746 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-thanos-prometheus-http-client-file\") pod \"037c36c7-08ed-4a06-ab93-55d7b251322e\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") "
Apr 21 07:55:30.671855 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.671768 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-grpc-tls\") pod \"037c36c7-08ed-4a06-ab93-55d7b251322e\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") "
Apr 21 07:55:30.671855 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.671796 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"037c36c7-08ed-4a06-ab93-55d7b251322e\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") "
Apr 21 07:55:30.671855 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.671823 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-kube-rbac-proxy\") pod \"037c36c7-08ed-4a06-ab93-55d7b251322e\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") "
Apr 21 07:55:30.671855 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.671854 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-prometheus-k8s-tls\") pod \"037c36c7-08ed-4a06-ab93-55d7b251322e\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") "
Apr 21 07:55:30.672139 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.671884 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-prometheus-trusted-ca-bundle\") pod \"037c36c7-08ed-4a06-ab93-55d7b251322e\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") "
Apr 21 07:55:30.672139 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.671908 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-web-config\") pod \"037c36c7-08ed-4a06-ab93-55d7b251322e\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") "
Apr 21 07:55:30.672139 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.671963 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-configmap-metrics-client-ca\") pod \"037c36c7-08ed-4a06-ab93-55d7b251322e\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") "
Apr 21 07:55:30.672139 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.671988 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/037c36c7-08ed-4a06-ab93-55d7b251322e-tls-assets\") pod \"037c36c7-08ed-4a06-ab93-55d7b251322e\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") "
Apr 21 07:55:30.672139 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.672016 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-prometheus-k8s-rulefiles-0\") pod \"037c36c7-08ed-4a06-ab93-55d7b251322e\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") "
Apr 21 07:55:30.672139 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.672035 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-configmap-kubelet-serving-ca-bundle\") pod \"037c36c7-08ed-4a06-ab93-55d7b251322e\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") "
Apr 21 07:55:30.672139 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.672057 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/037c36c7-08ed-4a06-ab93-55d7b251322e-prometheus-k8s-db\") pod \"037c36c7-08ed-4a06-ab93-55d7b251322e\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") "
Apr 21 07:55:30.672139 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.672076 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"037c36c7-08ed-4a06-ab93-55d7b251322e\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") "
Apr 21 07:55:30.672139 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.672115 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-configmap-serving-certs-ca-bundle\") pod \"037c36c7-08ed-4a06-ab93-55d7b251322e\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") "
Apr 21 07:55:30.672570 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.672155 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz9vz\" (UniqueName: \"kubernetes.io/projected/037c36c7-08ed-4a06-ab93-55d7b251322e-kube-api-access-nz9vz\") pod \"037c36c7-08ed-4a06-ab93-55d7b251322e\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") "
Apr 21 07:55:30.672570 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.672178 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/037c36c7-08ed-4a06-ab93-55d7b251322e-config-out\") pod \"037c36c7-08ed-4a06-ab93-55d7b251322e\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") "
Apr 21 07:55:30.672570 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.672205 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-metrics-client-certs\") pod \"037c36c7-08ed-4a06-ab93-55d7b251322e\" (UID: \"037c36c7-08ed-4a06-ab93-55d7b251322e\") "
Apr 21 07:55:30.673212 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.673171 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "037c36c7-08ed-4a06-ab93-55d7b251322e" (UID: "037c36c7-08ed-4a06-ab93-55d7b251322e"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 07:55:30.673578 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.673544 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/037c36c7-08ed-4a06-ab93-55d7b251322e-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "037c36c7-08ed-4a06-ab93-55d7b251322e" (UID: "037c36c7-08ed-4a06-ab93-55d7b251322e"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 07:55:30.673673 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.673572 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "037c36c7-08ed-4a06-ab93-55d7b251322e" (UID: "037c36c7-08ed-4a06-ab93-55d7b251322e"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 07:55:30.675272 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.675245 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "037c36c7-08ed-4a06-ab93-55d7b251322e" (UID: "037c36c7-08ed-4a06-ab93-55d7b251322e"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 07:55:30.676778 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.675562 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "037c36c7-08ed-4a06-ab93-55d7b251322e" (UID: "037c36c7-08ed-4a06-ab93-55d7b251322e"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 07:55:30.676889 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.676449 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "037c36c7-08ed-4a06-ab93-55d7b251322e" (UID: "037c36c7-08ed-4a06-ab93-55d7b251322e"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 07:55:30.676889 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.676514 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "037c36c7-08ed-4a06-ab93-55d7b251322e" (UID: "037c36c7-08ed-4a06-ab93-55d7b251322e"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 07:55:30.676889 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.676589 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "037c36c7-08ed-4a06-ab93-55d7b251322e" (UID: "037c36c7-08ed-4a06-ab93-55d7b251322e"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 07:55:30.677180 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.677096 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "037c36c7-08ed-4a06-ab93-55d7b251322e" (UID: "037c36c7-08ed-4a06-ab93-55d7b251322e"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 07:55:30.677180 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.677118 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "037c36c7-08ed-4a06-ab93-55d7b251322e" (UID: "037c36c7-08ed-4a06-ab93-55d7b251322e"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 07:55:30.677356 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.677181 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/037c36c7-08ed-4a06-ab93-55d7b251322e-kube-api-access-nz9vz" (OuterVolumeSpecName: "kube-api-access-nz9vz") pod "037c36c7-08ed-4a06-ab93-55d7b251322e" (UID: "037c36c7-08ed-4a06-ab93-55d7b251322e"). InnerVolumeSpecName "kube-api-access-nz9vz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 07:55:30.677356 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.677186 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "037c36c7-08ed-4a06-ab93-55d7b251322e" (UID: "037c36c7-08ed-4a06-ab93-55d7b251322e"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 07:55:30.677356 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.677218 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "037c36c7-08ed-4a06-ab93-55d7b251322e" (UID: "037c36c7-08ed-4a06-ab93-55d7b251322e"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 07:55:30.677356 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.677274 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/037c36c7-08ed-4a06-ab93-55d7b251322e-config-out" (OuterVolumeSpecName: "config-out") pod "037c36c7-08ed-4a06-ab93-55d7b251322e" (UID: "037c36c7-08ed-4a06-ab93-55d7b251322e"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 07:55:30.677356 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.677316 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-config" (OuterVolumeSpecName: "config") pod "037c36c7-08ed-4a06-ab93-55d7b251322e" (UID: "037c36c7-08ed-4a06-ab93-55d7b251322e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 07:55:30.678507 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.678474 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "037c36c7-08ed-4a06-ab93-55d7b251322e" (UID: "037c36c7-08ed-4a06-ab93-55d7b251322e"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 07:55:30.678686 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.678670 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/037c36c7-08ed-4a06-ab93-55d7b251322e-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "037c36c7-08ed-4a06-ab93-55d7b251322e" (UID: "037c36c7-08ed-4a06-ab93-55d7b251322e"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 07:55:30.686420 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.686396 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-web-config" (OuterVolumeSpecName: "web-config") pod "037c36c7-08ed-4a06-ab93-55d7b251322e" (UID: "037c36c7-08ed-4a06-ab93-55d7b251322e"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 07:55:30.773289 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.773254 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\""
Apr 21 07:55:30.773289 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.773283 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\""
Apr 21 07:55:30.773289 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.773294 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nz9vz\" (UniqueName: \"kubernetes.io/projected/037c36c7-08ed-4a06-ab93-55d7b251322e-kube-api-access-nz9vz\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\""
Apr 21 07:55:30.773504 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.773303 2574 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/037c36c7-08ed-4a06-ab93-55d7b251322e-config-out\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\""
Apr 21 07:55:30.773504 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.773312 2574 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-metrics-client-certs\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\""
Apr 21 07:55:30.773504 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.773321 2574 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-config\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\""
Apr 21 07:55:30.773504 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.773330 2574 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-thanos-prometheus-http-client-file\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\""
Apr 21 07:55:30.773504 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.773339 2574 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-grpc-tls\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\""
Apr 21 07:55:30.773504 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.773348 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\""
Apr 21 07:55:30.773504 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.773358 2574 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-kube-rbac-proxy\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\""
Apr 21 07:55:30.773504 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.773366 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-secret-prometheus-k8s-tls\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\""
Apr 21 07:55:30.773504 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.773374 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-prometheus-trusted-ca-bundle\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\""
Apr 21 07:55:30.773504 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.773385 2574 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/037c36c7-08ed-4a06-ab93-55d7b251322e-web-config\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\""
Apr 21 07:55:30.773504 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.773396 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-configmap-metrics-client-ca\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\""
Apr 21 07:55:30.773504 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.773404 2574 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/037c36c7-08ed-4a06-ab93-55d7b251322e-tls-assets\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\""
Apr 21 07:55:30.773504 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.773413 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\""
Apr 21 07:55:30.773504 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.773422 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/037c36c7-08ed-4a06-ab93-55d7b251322e-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\""
Apr 21 07:55:30.773504 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.773432 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/037c36c7-08ed-4a06-ab93-55d7b251322e-prometheus-k8s-db\") on node \"ip-10-0-138-20.ec2.internal\" DevicePath \"\""
Apr 21 07:55:30.859698 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.859665 2574 generic.go:358] "Generic (PLEG): container finished" podID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerID="9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a" exitCode=0
Apr 21 07:55:30.859698 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.859692 2574 generic.go:358] "Generic (PLEG): container finished" podID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerID="e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a" exitCode=0
Apr 21 07:55:30.859698 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.859698 2574 generic.go:358] "Generic (PLEG): container finished" podID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerID="0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582" exitCode=0
Apr 21 07:55:30.859698 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.859704 2574 generic.go:358] "Generic (PLEG): container finished" podID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerID="346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f" exitCode=0
Apr 21 07:55:30.859966 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.859709 2574 generic.go:358] "Generic (PLEG): container finished" podID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerID="a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d" exitCode=0
Apr 21 07:55:30.859966 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.859714 2574 generic.go:358] "Generic (PLEG): container finished" podID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerID="8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab" exitCode=0
Apr 21 07:55:30.859966 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.859753 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"037c36c7-08ed-4a06-ab93-55d7b251322e","Type":"ContainerDied","Data":"9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a"}
Apr 21 07:55:30.859966 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.859798 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:30.859966 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.859811 2574 scope.go:117] "RemoveContainer" containerID="9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a"
Apr 21 07:55:30.859966 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.859799 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"037c36c7-08ed-4a06-ab93-55d7b251322e","Type":"ContainerDied","Data":"e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a"}
Apr 21 07:55:30.859966 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.859952 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"037c36c7-08ed-4a06-ab93-55d7b251322e","Type":"ContainerDied","Data":"0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582"}
Apr 21 07:55:30.859966 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.859966 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"037c36c7-08ed-4a06-ab93-55d7b251322e","Type":"ContainerDied","Data":"346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f"}
Apr 21 07:55:30.860264 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.859975 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"037c36c7-08ed-4a06-ab93-55d7b251322e","Type":"ContainerDied","Data":"a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d"}
Apr 21 07:55:30.860264 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.859984 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"037c36c7-08ed-4a06-ab93-55d7b251322e","Type":"ContainerDied","Data":"8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab"}
Apr 21 07:55:30.860264 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.859997 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"037c36c7-08ed-4a06-ab93-55d7b251322e","Type":"ContainerDied","Data":"285a3a7a6d7bfd2a6c8f59b044bcc1ab8e251715eb16564a18f3245ede2cb41b"}
Apr 21 07:55:30.870165 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.868797 2574 scope.go:117] "RemoveContainer" containerID="e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a"
Apr 21 07:55:30.876966 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.876938 2574 scope.go:117] "RemoveContainer" containerID="0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582"
Apr 21 07:55:30.883903 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.883889 2574 scope.go:117] "RemoveContainer" containerID="346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f"
Apr 21 07:55:30.889438 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.889416 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 21 07:55:30.891214 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.891184 2574 scope.go:117] "RemoveContainer" containerID="a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d"
Apr 21 07:55:30.893830 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.893807 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 21 07:55:30.899990 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.899971 2574 scope.go:117] "RemoveContainer" containerID="8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab"
Apr 21 07:55:30.906702 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.906673 2574 scope.go:117] "RemoveContainer" containerID="207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a"
Apr 21 07:55:30.913188 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.913171 2574 scope.go:117] "RemoveContainer" containerID="9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a"
Apr 21 07:55:30.913437 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:55:30.913416 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a\": container with ID starting with 9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a not found: ID does not exist" containerID="9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a"
Apr 21 07:55:30.913483 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.913446 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a"} err="failed to get container status \"9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a\": rpc error: code = NotFound desc = could not find container \"9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a\": container with ID starting with 9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a not found: ID does not exist"
Apr 21 07:55:30.913524 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.913482 2574 scope.go:117] "RemoveContainer" containerID="e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a"
Apr 21 07:55:30.913677 ip-10-0-138-20 kubenswrapper[2574]: E0421
07:55:30.913662 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a\": container with ID starting with e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a not found: ID does not exist" containerID="e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a" Apr 21 07:55:30.913712 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.913681 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a"} err="failed to get container status \"e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a\": rpc error: code = NotFound desc = could not find container \"e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a\": container with ID starting with e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a not found: ID does not exist" Apr 21 07:55:30.913712 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.913694 2574 scope.go:117] "RemoveContainer" containerID="0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582" Apr 21 07:55:30.913887 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:55:30.913870 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582\": container with ID starting with 0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582 not found: ID does not exist" containerID="0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582" Apr 21 07:55:30.913936 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.913894 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582"} err="failed to get container status 
\"0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582\": rpc error: code = NotFound desc = could not find container \"0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582\": container with ID starting with 0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582 not found: ID does not exist" Apr 21 07:55:30.913936 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.913916 2574 scope.go:117] "RemoveContainer" containerID="346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f" Apr 21 07:55:30.914120 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:55:30.914105 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f\": container with ID starting with 346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f not found: ID does not exist" containerID="346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f" Apr 21 07:55:30.914164 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.914124 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f"} err="failed to get container status \"346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f\": rpc error: code = NotFound desc = could not find container \"346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f\": container with ID starting with 346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f not found: ID does not exist" Apr 21 07:55:30.914164 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.914137 2574 scope.go:117] "RemoveContainer" containerID="a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d" Apr 21 07:55:30.914318 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:55:30.914304 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d\": container with ID starting with a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d not found: ID does not exist" containerID="a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d" Apr 21 07:55:30.914356 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.914321 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d"} err="failed to get container status \"a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d\": rpc error: code = NotFound desc = could not find container \"a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d\": container with ID starting with a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d not found: ID does not exist" Apr 21 07:55:30.914356 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.914333 2574 scope.go:117] "RemoveContainer" containerID="8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab" Apr 21 07:55:30.914502 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:55:30.914488 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab\": container with ID starting with 8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab not found: ID does not exist" containerID="8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab" Apr 21 07:55:30.914536 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.914504 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab"} err="failed to get container status \"8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab\": rpc error: code = NotFound desc = could not find 
container \"8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab\": container with ID starting with 8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab not found: ID does not exist" Apr 21 07:55:30.914536 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.914515 2574 scope.go:117] "RemoveContainer" containerID="207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a" Apr 21 07:55:30.914736 ip-10-0-138-20 kubenswrapper[2574]: E0421 07:55:30.914717 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a\": container with ID starting with 207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a not found: ID does not exist" containerID="207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a" Apr 21 07:55:30.914782 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.914737 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a"} err="failed to get container status \"207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a\": rpc error: code = NotFound desc = could not find container \"207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a\": container with ID starting with 207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a not found: ID does not exist" Apr 21 07:55:30.914782 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.914749 2574 scope.go:117] "RemoveContainer" containerID="9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a" Apr 21 07:55:30.915007 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.914910 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a"} err="failed to get container status 
\"9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a\": rpc error: code = NotFound desc = could not find container \"9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a\": container with ID starting with 9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a not found: ID does not exist" Apr 21 07:55:30.915007 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.914937 2574 scope.go:117] "RemoveContainer" containerID="e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a" Apr 21 07:55:30.915158 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.915142 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a"} err="failed to get container status \"e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a\": rpc error: code = NotFound desc = could not find container \"e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a\": container with ID starting with e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a not found: ID does not exist" Apr 21 07:55:30.915199 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.915158 2574 scope.go:117] "RemoveContainer" containerID="0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582" Apr 21 07:55:30.915384 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.915368 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582"} err="failed to get container status \"0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582\": rpc error: code = NotFound desc = could not find container \"0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582\": container with ID starting with 0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582 not found: ID does not exist" Apr 21 07:55:30.915430 ip-10-0-138-20 
kubenswrapper[2574]: I0421 07:55:30.915384 2574 scope.go:117] "RemoveContainer" containerID="346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f" Apr 21 07:55:30.915574 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.915558 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f"} err="failed to get container status \"346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f\": rpc error: code = NotFound desc = could not find container \"346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f\": container with ID starting with 346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f not found: ID does not exist" Apr 21 07:55:30.915617 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.915575 2574 scope.go:117] "RemoveContainer" containerID="a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d" Apr 21 07:55:30.915740 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.915720 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d"} err="failed to get container status \"a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d\": rpc error: code = NotFound desc = could not find container \"a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d\": container with ID starting with a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d not found: ID does not exist" Apr 21 07:55:30.915791 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.915741 2574 scope.go:117] "RemoveContainer" containerID="8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab" Apr 21 07:55:30.915989 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.915969 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab"} err="failed to get container status \"8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab\": rpc error: code = NotFound desc = could not find container \"8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab\": container with ID starting with 8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab not found: ID does not exist" Apr 21 07:55:30.916040 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.915990 2574 scope.go:117] "RemoveContainer" containerID="207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a" Apr 21 07:55:30.916220 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.916203 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a"} err="failed to get container status \"207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a\": rpc error: code = NotFound desc = could not find container \"207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a\": container with ID starting with 207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a not found: ID does not exist" Apr 21 07:55:30.916258 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.916220 2574 scope.go:117] "RemoveContainer" containerID="9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a" Apr 21 07:55:30.916449 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.916428 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a"} err="failed to get container status \"9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a\": rpc error: code = NotFound desc = could not find container \"9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a\": container with ID starting with 
9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a not found: ID does not exist" Apr 21 07:55:30.916514 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.916452 2574 scope.go:117] "RemoveContainer" containerID="e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a" Apr 21 07:55:30.916647 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.916623 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a"} err="failed to get container status \"e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a\": rpc error: code = NotFound desc = could not find container \"e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a\": container with ID starting with e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a not found: ID does not exist" Apr 21 07:55:30.916690 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.916650 2574 scope.go:117] "RemoveContainer" containerID="0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582" Apr 21 07:55:30.916867 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.916849 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582"} err="failed to get container status \"0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582\": rpc error: code = NotFound desc = could not find container \"0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582\": container with ID starting with 0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582 not found: ID does not exist" Apr 21 07:55:30.916973 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.916869 2574 scope.go:117] "RemoveContainer" containerID="346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f" Apr 21 07:55:30.917115 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.917097 2574 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f"} err="failed to get container status \"346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f\": rpc error: code = NotFound desc = could not find container \"346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f\": container with ID starting with 346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f not found: ID does not exist" Apr 21 07:55:30.917162 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.917118 2574 scope.go:117] "RemoveContainer" containerID="a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d" Apr 21 07:55:30.917323 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.917306 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d"} err="failed to get container status \"a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d\": rpc error: code = NotFound desc = could not find container \"a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d\": container with ID starting with a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d not found: ID does not exist" Apr 21 07:55:30.917387 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.917325 2574 scope.go:117] "RemoveContainer" containerID="8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab" Apr 21 07:55:30.917542 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.917525 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab"} err="failed to get container status \"8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab\": rpc error: code = NotFound desc = could not find container 
\"8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab\": container with ID starting with 8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab not found: ID does not exist" Apr 21 07:55:30.917585 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.917543 2574 scope.go:117] "RemoveContainer" containerID="207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a" Apr 21 07:55:30.917753 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.917736 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a"} err="failed to get container status \"207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a\": rpc error: code = NotFound desc = could not find container \"207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a\": container with ID starting with 207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a not found: ID does not exist" Apr 21 07:55:30.917796 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.917754 2574 scope.go:117] "RemoveContainer" containerID="9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a" Apr 21 07:55:30.917964 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.917947 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a"} err="failed to get container status \"9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a\": rpc error: code = NotFound desc = could not find container \"9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a\": container with ID starting with 9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a not found: ID does not exist" Apr 21 07:55:30.918027 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.917966 2574 scope.go:117] "RemoveContainer" 
containerID="e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a" Apr 21 07:55:30.918171 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.918154 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a"} err="failed to get container status \"e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a\": rpc error: code = NotFound desc = could not find container \"e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a\": container with ID starting with e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a not found: ID does not exist" Apr 21 07:55:30.918219 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.918171 2574 scope.go:117] "RemoveContainer" containerID="0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582" Apr 21 07:55:30.918355 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.918340 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582"} err="failed to get container status \"0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582\": rpc error: code = NotFound desc = could not find container \"0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582\": container with ID starting with 0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582 not found: ID does not exist" Apr 21 07:55:30.918401 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.918356 2574 scope.go:117] "RemoveContainer" containerID="346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f" Apr 21 07:55:30.918528 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.918512 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f"} err="failed to get container status 
\"346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f\": rpc error: code = NotFound desc = could not find container \"346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f\": container with ID starting with 346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f not found: ID does not exist" Apr 21 07:55:30.918594 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.918529 2574 scope.go:117] "RemoveContainer" containerID="a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d" Apr 21 07:55:30.918719 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.918705 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d"} err="failed to get container status \"a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d\": rpc error: code = NotFound desc = could not find container \"a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d\": container with ID starting with a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d not found: ID does not exist" Apr 21 07:55:30.918763 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.918720 2574 scope.go:117] "RemoveContainer" containerID="8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab" Apr 21 07:55:30.918902 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.918886 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab"} err="failed to get container status \"8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab\": rpc error: code = NotFound desc = could not find container \"8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab\": container with ID starting with 8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab not found: ID does not exist" Apr 21 07:55:30.918969 ip-10-0-138-20 
kubenswrapper[2574]: I0421 07:55:30.918902 2574 scope.go:117] "RemoveContainer" containerID="207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a" Apr 21 07:55:30.919124 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.919108 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a"} err="failed to get container status \"207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a\": rpc error: code = NotFound desc = could not find container \"207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a\": container with ID starting with 207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a not found: ID does not exist" Apr 21 07:55:30.919188 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.919126 2574 scope.go:117] "RemoveContainer" containerID="9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a" Apr 21 07:55:30.919356 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.919338 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a"} err="failed to get container status \"9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a\": rpc error: code = NotFound desc = could not find container \"9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a\": container with ID starting with 9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a not found: ID does not exist" Apr 21 07:55:30.919398 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.919357 2574 scope.go:117] "RemoveContainer" containerID="e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a" Apr 21 07:55:30.919552 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.919535 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a"} err="failed to get container status \"e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a\": rpc error: code = NotFound desc = could not find container \"e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a\": container with ID starting with e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a not found: ID does not exist"
Apr 21 07:55:30.919624 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.919553 2574 scope.go:117] "RemoveContainer" containerID="0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582"
Apr 21 07:55:30.919762 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.919744 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582"} err="failed to get container status \"0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582\": rpc error: code = NotFound desc = could not find container \"0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582\": container with ID starting with 0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582 not found: ID does not exist"
Apr 21 07:55:30.919806 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.919763 2574 scope.go:117] "RemoveContainer" containerID="346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f"
Apr 21 07:55:30.920009 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.919991 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f"} err="failed to get container status \"346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f\": rpc error: code = NotFound desc = could not find container \"346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f\": container with ID starting with 346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f not found: ID does not exist"
Apr 21 07:55:30.920083 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.920009 2574 scope.go:117] "RemoveContainer" containerID="a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d"
Apr 21 07:55:30.920181 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.920165 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d"} err="failed to get container status \"a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d\": rpc error: code = NotFound desc = could not find container \"a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d\": container with ID starting with a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d not found: ID does not exist"
Apr 21 07:55:30.920245 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.920183 2574 scope.go:117] "RemoveContainer" containerID="8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab"
Apr 21 07:55:30.920382 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.920365 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab"} err="failed to get container status \"8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab\": rpc error: code = NotFound desc = could not find container \"8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab\": container with ID starting with 8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab not found: ID does not exist"
Apr 21 07:55:30.920437 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.920383 2574 scope.go:117] "RemoveContainer" containerID="207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a"
Apr 21 07:55:30.920565 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.920543 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a"} err="failed to get container status \"207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a\": rpc error: code = NotFound desc = could not find container \"207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a\": container with ID starting with 207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a not found: ID does not exist"
Apr 21 07:55:30.920633 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.920567 2574 scope.go:117] "RemoveContainer" containerID="9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a"
Apr 21 07:55:30.920797 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.920768 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a"} err="failed to get container status \"9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a\": rpc error: code = NotFound desc = could not find container \"9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a\": container with ID starting with 9cc5892fb8c3ed30fecad3e437f870d2849d4caf1aea721d90ad06f8dc4a555a not found: ID does not exist"
Apr 21 07:55:30.920892 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.920800 2574 scope.go:117] "RemoveContainer" containerID="e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a"
Apr 21 07:55:30.921104 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.921083 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a"} err="failed to get container status \"e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a\": rpc error: code = NotFound desc = could not find container \"e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a\": container with ID starting with e9235e91c8b1f410c957bd38ca9f48843c2b6a90231e978805a9facc7f4ce06a not found: ID does not exist"
Apr 21 07:55:30.921104 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.921103 2574 scope.go:117] "RemoveContainer" containerID="0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582"
Apr 21 07:55:30.926262 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.926191 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582"} err="failed to get container status \"0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582\": rpc error: code = NotFound desc = could not find container \"0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582\": container with ID starting with 0aac2861c3b62b1ae78daf01a67ce8a739f05b6dda2545c5b25ecddbee69e582 not found: ID does not exist"
Apr 21 07:55:30.926262 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.926220 2574 scope.go:117] "RemoveContainer" containerID="346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f"
Apr 21 07:55:30.927033 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.927009 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f"} err="failed to get container status \"346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f\": rpc error: code = NotFound desc = could not find container \"346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f\": container with ID starting with 346fbbe1e480ca011d39726589bb76fa9d22dbc2de8ea46d20f326a2c310742f not found: ID does not exist"
Apr 21 07:55:30.927119 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.927053 2574 scope.go:117] "RemoveContainer" containerID="a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d"
Apr 21 07:55:30.927392 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.927371 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d"} err="failed to get container status \"a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d\": rpc error: code = NotFound desc = could not find container \"a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d\": container with ID starting with a29acab6d44001ef784db5ef3d9e77f24c3ef4edb071d27b44081546d2b2530d not found: ID does not exist"
Apr 21 07:55:30.927466 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.927395 2574 scope.go:117] "RemoveContainer" containerID="8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab"
Apr 21 07:55:30.927690 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.927671 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab"} err="failed to get container status \"8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab\": rpc error: code = NotFound desc = could not find container \"8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab\": container with ID starting with 8354aa8c1dd6c993456821712d5e8597b6659c1b2760d5f18d0f233d10b517ab not found: ID does not exist"
Apr 21 07:55:30.927690 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.927691 2574 scope.go:117] "RemoveContainer" containerID="207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a"
Apr 21 07:55:30.927972 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.927948 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a"} err="failed to get container status \"207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a\": rpc error: code = NotFound desc = could not find container \"207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a\": container with ID starting with 207c67d146e4eb7e29f0c7c2bebbd940e52af058affe8e50e50a6afb394ad09a not found: ID does not exist"
Apr 21 07:55:30.928317 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.928292 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 21 07:55:30.928618 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.928607 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="kube-rbac-proxy-thanos"
Apr 21 07:55:30.928658 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.928631 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="kube-rbac-proxy-thanos"
Apr 21 07:55:30.928658 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.928641 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="config-reloader"
Apr 21 07:55:30.928658 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.928648 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="config-reloader"
Apr 21 07:55:30.928658 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.928653 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="kube-rbac-proxy"
Apr 21 07:55:30.928658 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.928658 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="kube-rbac-proxy"
Apr 21 07:55:30.928801 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.928666 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="init-config-reloader"
Apr 21 07:55:30.928801 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.928673 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="init-config-reloader"
Apr 21 07:55:30.928801 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.928682 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="kube-rbac-proxy-web"
Apr 21 07:55:30.928801 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.928687 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="kube-rbac-proxy-web"
Apr 21 07:55:30.928801 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.928693 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="prometheus"
Apr 21 07:55:30.928801 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.928698 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="prometheus"
Apr 21 07:55:30.928801 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.928704 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="thanos-sidecar"
Apr 21 07:55:30.928801 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.928709 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="thanos-sidecar"
Apr 21 07:55:30.928801 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.928751 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="config-reloader"
Apr 21 07:55:30.928801 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.928759 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="kube-rbac-proxy"
Apr 21 07:55:30.928801 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.928767 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="kube-rbac-proxy-thanos"
Apr 21 07:55:30.928801 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.928773 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="kube-rbac-proxy-web"
Apr 21 07:55:30.928801 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.928779 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="prometheus"
Apr 21 07:55:30.928801 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.928786 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" containerName="thanos-sidecar"
Apr 21 07:55:30.932183 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.932166 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:30.935067 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.935047 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 21 07:55:30.935334 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.935321 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 21 07:55:30.935516 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.935502 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 21 07:55:30.935867 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.935849 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 21 07:55:30.935968 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.935901 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 21 07:55:30.935968 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.935909 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 21 07:55:30.936204 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.936172 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 21 07:55:30.936204 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.936177 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 21 07:55:30.936358 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.936237 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 21 07:55:30.936447 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.936431 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-gj5j8\""
Apr 21 07:55:30.936498 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.936444 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-urvvqtsb9eh4\""
Apr 21 07:55:30.936568 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.936538 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 21 07:55:30.936943 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.936899 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 21 07:55:30.938812 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.938773 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 21 07:55:30.942127 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.942108 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 21 07:55:30.947842 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:30.947821 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 21 07:55:31.075793 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.075767 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.075793 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.075799 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.076026 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.075827 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.076026 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.075890 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.076026 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.075918 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.076026 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.075976 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-config-out\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.076026 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.075993 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-web-config\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.076026 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.076015 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.076311 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.076041 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.076311 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.076057 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.076311 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.076100 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.076311 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.076144 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm76x\" (UniqueName: \"kubernetes.io/projected/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-kube-api-access-gm76x\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.076311 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.076197 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.076311 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.076245 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.076311 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.076281 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.076517 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.076312 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.076517 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.076331 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.076517 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.076360 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-config\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.177162 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.177103 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.177162 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.177132 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.177162 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.177149 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.177374 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.177168 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-config-out\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.177374 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.177183 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-web-config\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.177374 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.177211 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.177374 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.177242 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.177374 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.177347 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.177374 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.177364 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.177646 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.177386 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gm76x\" (UniqueName: \"kubernetes.io/projected/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-kube-api-access-gm76x\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.177646 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.177423 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.177646 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.177455 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.177646 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.177477 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.177646 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.177495 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.177646 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.177517 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.177646 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.177537 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.177646 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.177546 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-config\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.177646 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.177598 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.177646 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.177617 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.178375 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.178226 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.178375 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.178226 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.180344 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.178854 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.180344 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.179956 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.180580 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.180556 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-config\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.180649 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.180592 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-config-out\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.180649 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.180604 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-web-config\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.180756 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.180655 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.180801 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.180760 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.181155 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.181128 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.181254 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.181157 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.182798 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.182769 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.183057 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.183038 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.183216 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.183190 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.183734 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.183715 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.183792 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.183748 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 21 07:55:31.189863 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.189847 2574 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"kube-api-access-gm76x\" (UniqueName: \"kubernetes.io/projected/0d02ad7c-8322-4415-8ed6-4a55a47d4ed8-kube-api-access-gm76x\") pod \"prometheus-k8s-0\" (UID: \"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:55:31.242987 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.242956 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:55:31.370961 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.370915 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 07:55:31.373732 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:55:31.373700 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d02ad7c_8322_4415_8ed6_4a55a47d4ed8.slice/crio-20d777e336410a65be8150abd13843824d525cade6848c0231515aefdc12496d WatchSource:0}: Error finding container 20d777e336410a65be8150abd13843824d525cade6848c0231515aefdc12496d: Status 404 returned error can't find the container with id 20d777e336410a65be8150abd13843824d525cade6848c0231515aefdc12496d Apr 21 07:55:31.864759 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.864672 2574 generic.go:358] "Generic (PLEG): container finished" podID="0d02ad7c-8322-4415-8ed6-4a55a47d4ed8" containerID="23c4e33cbff562d1d5b617680f1028f767d64c3d6a54710b2795b6eae4fc619f" exitCode=0 Apr 21 07:55:31.864759 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.864740 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8","Type":"ContainerDied","Data":"23c4e33cbff562d1d5b617680f1028f767d64c3d6a54710b2795b6eae4fc619f"} Apr 21 07:55:31.864970 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:31.864761 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8","Type":"ContainerStarted","Data":"20d777e336410a65be8150abd13843824d525cade6848c0231515aefdc12496d"} Apr 21 07:55:32.094785 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:32.094753 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="037c36c7-08ed-4a06-ab93-55d7b251322e" path="/var/lib/kubelet/pods/037c36c7-08ed-4a06-ab93-55d7b251322e/volumes" Apr 21 07:55:32.869902 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:32.869871 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8","Type":"ContainerStarted","Data":"9a06642901acc237d5888a023ae695a5eef34f426282cdd98a078fe27d509a1b"} Apr 21 07:55:32.869902 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:32.869905 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8","Type":"ContainerStarted","Data":"5ad857a3dfa3590e2a75c932537ba9b7a002aa47cc664b5cf1d310bc754672d3"} Apr 21 07:55:32.870294 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:32.869914 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8","Type":"ContainerStarted","Data":"dff4b3674515cce005ab9d1c9c567a8177d0d575499e509ed3c0686874895c9c"} Apr 21 07:55:32.870294 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:32.869950 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8","Type":"ContainerStarted","Data":"70b26512255dbadca003424292de95b5bd502e54959b3d91ac35471b84516a7e"} Apr 21 07:55:32.870294 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:32.869963 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8","Type":"ContainerStarted","Data":"b7db43ca02e8f8aa7494e99e79166090c8419557298294eba5a5dba696ae668d"} Apr 21 07:55:32.870294 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:32.869976 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0d02ad7c-8322-4415-8ed6-4a55a47d4ed8","Type":"ContainerStarted","Data":"6537876d70bacccca5574194c27922550820137d7c4779e1f606f3dfaaf85372"} Apr 21 07:55:32.899004 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:32.898918 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.898902389 podStartE2EDuration="2.898902389s" podCreationTimestamp="2026-04-21 07:55:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:55:32.896539809 +0000 UTC m=+219.309324945" watchObservedRunningTime="2026-04-21 07:55:32.898902389 +0000 UTC m=+219.311687504" Apr 21 07:55:36.243574 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:55:36.243534 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:56:00.476461 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:00.476423 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-fvx4b"] Apr 21 07:56:00.484842 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:00.484799 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvx4b" Apr 21 07:56:00.487998 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:00.487974 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 07:56:00.495812 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:00.495783 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fvx4b"] Apr 21 07:56:00.618786 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:00.618758 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6e453765-1a60-4c7e-bbcf-2c5b182ea0b9-original-pull-secret\") pod \"global-pull-secret-syncer-fvx4b\" (UID: \"6e453765-1a60-4c7e-bbcf-2c5b182ea0b9\") " pod="kube-system/global-pull-secret-syncer-fvx4b" Apr 21 07:56:00.618948 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:00.618817 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6e453765-1a60-4c7e-bbcf-2c5b182ea0b9-dbus\") pod \"global-pull-secret-syncer-fvx4b\" (UID: \"6e453765-1a60-4c7e-bbcf-2c5b182ea0b9\") " pod="kube-system/global-pull-secret-syncer-fvx4b" Apr 21 07:56:00.618948 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:00.618889 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6e453765-1a60-4c7e-bbcf-2c5b182ea0b9-kubelet-config\") pod \"global-pull-secret-syncer-fvx4b\" (UID: \"6e453765-1a60-4c7e-bbcf-2c5b182ea0b9\") " pod="kube-system/global-pull-secret-syncer-fvx4b" Apr 21 07:56:00.720063 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:00.720024 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/6e453765-1a60-4c7e-bbcf-2c5b182ea0b9-original-pull-secret\") pod \"global-pull-secret-syncer-fvx4b\" (UID: \"6e453765-1a60-4c7e-bbcf-2c5b182ea0b9\") " pod="kube-system/global-pull-secret-syncer-fvx4b" Apr 21 07:56:00.720255 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:00.720122 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6e453765-1a60-4c7e-bbcf-2c5b182ea0b9-dbus\") pod \"global-pull-secret-syncer-fvx4b\" (UID: \"6e453765-1a60-4c7e-bbcf-2c5b182ea0b9\") " pod="kube-system/global-pull-secret-syncer-fvx4b" Apr 21 07:56:00.720255 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:00.720172 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6e453765-1a60-4c7e-bbcf-2c5b182ea0b9-kubelet-config\") pod \"global-pull-secret-syncer-fvx4b\" (UID: \"6e453765-1a60-4c7e-bbcf-2c5b182ea0b9\") " pod="kube-system/global-pull-secret-syncer-fvx4b" Apr 21 07:56:00.720345 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:00.720262 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6e453765-1a60-4c7e-bbcf-2c5b182ea0b9-kubelet-config\") pod \"global-pull-secret-syncer-fvx4b\" (UID: \"6e453765-1a60-4c7e-bbcf-2c5b182ea0b9\") " pod="kube-system/global-pull-secret-syncer-fvx4b" Apr 21 07:56:00.720345 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:00.720297 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6e453765-1a60-4c7e-bbcf-2c5b182ea0b9-dbus\") pod \"global-pull-secret-syncer-fvx4b\" (UID: \"6e453765-1a60-4c7e-bbcf-2c5b182ea0b9\") " pod="kube-system/global-pull-secret-syncer-fvx4b" Apr 21 07:56:00.722474 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:00.722457 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6e453765-1a60-4c7e-bbcf-2c5b182ea0b9-original-pull-secret\") pod \"global-pull-secret-syncer-fvx4b\" (UID: \"6e453765-1a60-4c7e-bbcf-2c5b182ea0b9\") " pod="kube-system/global-pull-secret-syncer-fvx4b" Apr 21 07:56:00.793715 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:00.793623 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fvx4b" Apr 21 07:56:00.914118 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:00.914053 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fvx4b"] Apr 21 07:56:00.919194 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:56:00.919166 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e453765_1a60_4c7e_bbcf_2c5b182ea0b9.slice/crio-a298ba7a35eca9c42049d34018cbdb72deefd2f36938fd612127223f81059ff0 WatchSource:0}: Error finding container a298ba7a35eca9c42049d34018cbdb72deefd2f36938fd612127223f81059ff0: Status 404 returned error can't find the container with id a298ba7a35eca9c42049d34018cbdb72deefd2f36938fd612127223f81059ff0 Apr 21 07:56:00.955546 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:00.955517 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fvx4b" event={"ID":"6e453765-1a60-4c7e-bbcf-2c5b182ea0b9","Type":"ContainerStarted","Data":"a298ba7a35eca9c42049d34018cbdb72deefd2f36938fd612127223f81059ff0"} Apr 21 07:56:05.971728 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:05.971690 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fvx4b" event={"ID":"6e453765-1a60-4c7e-bbcf-2c5b182ea0b9","Type":"ContainerStarted","Data":"fd9ecf6b847e1d17c39e352c0745a09895777c06e7a2a052ccb4d898bac9c88d"} Apr 21 07:56:05.997503 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:05.997451 2574 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-fvx4b" podStartSLOduration=1.586103722 podStartE2EDuration="5.99743389s" podCreationTimestamp="2026-04-21 07:56:00 +0000 UTC" firstStartedPulling="2026-04-21 07:56:00.920824438 +0000 UTC m=+247.333609530" lastFinishedPulling="2026-04-21 07:56:05.332154606 +0000 UTC m=+251.744939698" observedRunningTime="2026-04-21 07:56:05.996811799 +0000 UTC m=+252.409596924" watchObservedRunningTime="2026-04-21 07:56:05.99743389 +0000 UTC m=+252.410219004" Apr 21 07:56:31.244009 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:31.243960 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:56:31.260037 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:31.260004 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:56:31.981965 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:31.981916 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9fzd8/must-gather-gddqz"] Apr 21 07:56:31.985432 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:31.985413 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9fzd8/must-gather-gddqz" Apr 21 07:56:31.987645 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:31.987620 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9fzd8\"/\"openshift-service-ca.crt\"" Apr 21 07:56:31.987756 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:31.987620 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9fzd8\"/\"kube-root-ca.crt\"" Apr 21 07:56:31.988449 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:31.988430 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-9fzd8\"/\"default-dockercfg-v97gr\"" Apr 21 07:56:31.991660 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:31.991629 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9fzd8/must-gather-gddqz"] Apr 21 07:56:32.077234 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:32.077208 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 07:56:32.077710 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:32.077598 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6edb690c-ff70-400e-bf1f-2093c2c842cb-must-gather-output\") pod \"must-gather-gddqz\" (UID: \"6edb690c-ff70-400e-bf1f-2093c2c842cb\") " pod="openshift-must-gather-9fzd8/must-gather-gddqz" Apr 21 07:56:32.077710 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:32.077659 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qghn8\" (UniqueName: \"kubernetes.io/projected/6edb690c-ff70-400e-bf1f-2093c2c842cb-kube-api-access-qghn8\") pod \"must-gather-gddqz\" (UID: \"6edb690c-ff70-400e-bf1f-2093c2c842cb\") " pod="openshift-must-gather-9fzd8/must-gather-gddqz" Apr 21 
07:56:32.178833 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:32.178797 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6edb690c-ff70-400e-bf1f-2093c2c842cb-must-gather-output\") pod \"must-gather-gddqz\" (UID: \"6edb690c-ff70-400e-bf1f-2093c2c842cb\") " pod="openshift-must-gather-9fzd8/must-gather-gddqz" Apr 21 07:56:32.178833 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:32.178841 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qghn8\" (UniqueName: \"kubernetes.io/projected/6edb690c-ff70-400e-bf1f-2093c2c842cb-kube-api-access-qghn8\") pod \"must-gather-gddqz\" (UID: \"6edb690c-ff70-400e-bf1f-2093c2c842cb\") " pod="openshift-must-gather-9fzd8/must-gather-gddqz" Apr 21 07:56:32.179181 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:32.179159 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/6edb690c-ff70-400e-bf1f-2093c2c842cb-must-gather-output\") pod \"must-gather-gddqz\" (UID: \"6edb690c-ff70-400e-bf1f-2093c2c842cb\") " pod="openshift-must-gather-9fzd8/must-gather-gddqz" Apr 21 07:56:32.186882 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:32.186853 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qghn8\" (UniqueName: \"kubernetes.io/projected/6edb690c-ff70-400e-bf1f-2093c2c842cb-kube-api-access-qghn8\") pod \"must-gather-gddqz\" (UID: \"6edb690c-ff70-400e-bf1f-2093c2c842cb\") " pod="openshift-must-gather-9fzd8/must-gather-gddqz" Apr 21 07:56:32.311727 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:32.311637 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9fzd8/must-gather-gddqz" Apr 21 07:56:32.431490 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:32.431457 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9fzd8/must-gather-gddqz"] Apr 21 07:56:32.434627 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:56:32.434600 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6edb690c_ff70_400e_bf1f_2093c2c842cb.slice/crio-a471abbba347e070471191e729e4ece608e952135b73a76b21918807adb81263 WatchSource:0}: Error finding container a471abbba347e070471191e729e4ece608e952135b73a76b21918807adb81263: Status 404 returned error can't find the container with id a471abbba347e070471191e729e4ece608e952135b73a76b21918807adb81263 Apr 21 07:56:33.065665 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:33.065627 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fzd8/must-gather-gddqz" event={"ID":"6edb690c-ff70-400e-bf1f-2093c2c842cb","Type":"ContainerStarted","Data":"a471abbba347e070471191e729e4ece608e952135b73a76b21918807adb81263"} Apr 21 07:56:34.071030 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:34.070989 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fzd8/must-gather-gddqz" event={"ID":"6edb690c-ff70-400e-bf1f-2093c2c842cb","Type":"ContainerStarted","Data":"b12d699f3b116830312d886167b5761b86a2f593fab8c71df2c6302b3a25607d"} Apr 21 07:56:34.071030 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:34.071033 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fzd8/must-gather-gddqz" event={"ID":"6edb690c-ff70-400e-bf1f-2093c2c842cb","Type":"ContainerStarted","Data":"2b3d992b3812fa13247213f195aa5c19f3c4e33577cdf1954ce6ae14aaf4d05f"} Apr 21 07:56:34.085321 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:34.085259 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-9fzd8/must-gather-gddqz" podStartSLOduration=2.308447036 podStartE2EDuration="3.08523894s" podCreationTimestamp="2026-04-21 07:56:31 +0000 UTC" firstStartedPulling="2026-04-21 07:56:32.436343758 +0000 UTC m=+278.849128854" lastFinishedPulling="2026-04-21 07:56:33.213135652 +0000 UTC m=+279.625920758" observedRunningTime="2026-04-21 07:56:34.084290904 +0000 UTC m=+280.497076031" watchObservedRunningTime="2026-04-21 07:56:34.08523894 +0000 UTC m=+280.498024055" Apr 21 07:56:34.525496 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:34.525468 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-fvx4b_6e453765-1a60-4c7e-bbcf-2c5b182ea0b9/global-pull-secret-syncer/0.log" Apr 21 07:56:34.655492 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:34.655427 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-l96hv_a7cb393b-d3ae-4c28-9350-a79f012ee6c8/konnectivity-agent/0.log" Apr 21 07:56:34.722238 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:34.722204 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-20.ec2.internal_2219300bfb8a5cea9f09d55fabfc69ab/haproxy/0.log" Apr 21 07:56:37.740872 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:37.740817 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-8jlgf_60f40b44-8487-4f5b-af99-1681ffa94740/cluster-monitoring-operator/0.log" Apr 21 07:56:37.766515 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:37.766442 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-t4dg6_bc770555-e352-4582-b1e8-001e6f487668/kube-state-metrics/0.log" Apr 21 07:56:37.794011 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:37.793982 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-t4dg6_bc770555-e352-4582-b1e8-001e6f487668/kube-rbac-proxy-main/0.log" Apr 21 07:56:37.820670 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:37.820633 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-t4dg6_bc770555-e352-4582-b1e8-001e6f487668/kube-rbac-proxy-self/0.log" Apr 21 07:56:38.067186 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:38.067075 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zvf59_78cf98b5-cf16-43e9-9ffb-27d7f8a58917/node-exporter/0.log" Apr 21 07:56:38.086759 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:38.086728 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zvf59_78cf98b5-cf16-43e9-9ffb-27d7f8a58917/kube-rbac-proxy/0.log" Apr 21 07:56:38.104685 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:38.104660 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zvf59_78cf98b5-cf16-43e9-9ffb-27d7f8a58917/init-textfile/0.log" Apr 21 07:56:38.213202 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:38.213159 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0d02ad7c-8322-4415-8ed6-4a55a47d4ed8/prometheus/0.log" Apr 21 07:56:38.232709 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:38.232684 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0d02ad7c-8322-4415-8ed6-4a55a47d4ed8/config-reloader/0.log" Apr 21 07:56:38.252860 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:38.252824 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0d02ad7c-8322-4415-8ed6-4a55a47d4ed8/thanos-sidecar/0.log" Apr 21 07:56:38.271208 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:38.271177 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0d02ad7c-8322-4415-8ed6-4a55a47d4ed8/kube-rbac-proxy-web/0.log" Apr 21 07:56:38.290579 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:38.290524 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0d02ad7c-8322-4415-8ed6-4a55a47d4ed8/kube-rbac-proxy/0.log" Apr 21 07:56:38.312067 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:38.312040 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0d02ad7c-8322-4415-8ed6-4a55a47d4ed8/kube-rbac-proxy-thanos/0.log" Apr 21 07:56:38.336489 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:38.336411 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_0d02ad7c-8322-4415-8ed6-4a55a47d4ed8/init-config-reloader/0.log" Apr 21 07:56:38.362789 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:38.362753 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-js2pk_5e42f657-c2c4-4850-8aee-6981435fe148/prometheus-operator/0.log" Apr 21 07:56:38.384523 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:38.384479 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-js2pk_5e42f657-c2c4-4850-8aee-6981435fe148/kube-rbac-proxy/0.log" Apr 21 07:56:38.403654 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:38.403606 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-ddc4h_31c3668a-6489-488c-87ac-270ec5e7bc32/prometheus-operator-admission-webhook/0.log" Apr 21 07:56:38.428294 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:38.428267 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-75679f49d-lrpkx_79ec319c-106c-4f6b-8565-b2204159d925/telemeter-client/0.log" Apr 21 07:56:38.448116 ip-10-0-138-20 
kubenswrapper[2574]: I0421 07:56:38.448083 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-75679f49d-lrpkx_79ec319c-106c-4f6b-8565-b2204159d925/reload/0.log"
Apr 21 07:56:38.464481 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:38.464457 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-75679f49d-lrpkx_79ec319c-106c-4f6b-8565-b2204159d925/kube-rbac-proxy/0.log"
Apr 21 07:56:38.492904 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:38.492864 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd6765fd-sch2j_bbdc7510-ab52-49ea-a9cb-4ed5a39e6614/thanos-query/0.log"
Apr 21 07:56:38.509601 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:38.509562 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd6765fd-sch2j_bbdc7510-ab52-49ea-a9cb-4ed5a39e6614/kube-rbac-proxy-web/0.log"
Apr 21 07:56:38.529282 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:38.529251 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd6765fd-sch2j_bbdc7510-ab52-49ea-a9cb-4ed5a39e6614/kube-rbac-proxy/0.log"
Apr 21 07:56:38.547486 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:38.547457 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd6765fd-sch2j_bbdc7510-ab52-49ea-a9cb-4ed5a39e6614/prom-label-proxy/0.log"
Apr 21 07:56:38.568396 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:38.568367 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd6765fd-sch2j_bbdc7510-ab52-49ea-a9cb-4ed5a39e6614/kube-rbac-proxy-rules/0.log"
Apr 21 07:56:38.595979 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:38.595875 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd6765fd-sch2j_bbdc7510-ab52-49ea-a9cb-4ed5a39e6614/kube-rbac-proxy-metrics/0.log"
Apr 21 07:56:40.810964 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:40.810863 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-n27nl_d9a5643a-cb31-4bd1-aab6-1b040f232ad8/volume-data-source-validator/0.log"
Apr 21 07:56:41.202731 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:41.202699 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9fzd8/perf-node-gather-daemonset-nc6hc"]
Apr 21 07:56:41.207466 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:41.207441 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-nc6hc"
Apr 21 07:56:41.212358 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:41.212331 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9fzd8/perf-node-gather-daemonset-nc6hc"]
Apr 21 07:56:41.267462 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:41.267421 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh47v\" (UniqueName: \"kubernetes.io/projected/f8baa47d-2c8d-4003-ac08-189b34c49368-kube-api-access-gh47v\") pod \"perf-node-gather-daemonset-nc6hc\" (UID: \"f8baa47d-2c8d-4003-ac08-189b34c49368\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-nc6hc"
Apr 21 07:56:41.267644 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:41.267477 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f8baa47d-2c8d-4003-ac08-189b34c49368-proc\") pod \"perf-node-gather-daemonset-nc6hc\" (UID: \"f8baa47d-2c8d-4003-ac08-189b34c49368\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-nc6hc"
Apr 21 07:56:41.267644 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:41.267510 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8baa47d-2c8d-4003-ac08-189b34c49368-sys\") pod \"perf-node-gather-daemonset-nc6hc\" (UID: \"f8baa47d-2c8d-4003-ac08-189b34c49368\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-nc6hc"
Apr 21 07:56:41.267644 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:41.267539 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f8baa47d-2c8d-4003-ac08-189b34c49368-podres\") pod \"perf-node-gather-daemonset-nc6hc\" (UID: \"f8baa47d-2c8d-4003-ac08-189b34c49368\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-nc6hc"
Apr 21 07:56:41.267644 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:41.267577 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f8baa47d-2c8d-4003-ac08-189b34c49368-lib-modules\") pod \"perf-node-gather-daemonset-nc6hc\" (UID: \"f8baa47d-2c8d-4003-ac08-189b34c49368\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-nc6hc"
Apr 21 07:56:41.368975 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:41.368911 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gh47v\" (UniqueName: \"kubernetes.io/projected/f8baa47d-2c8d-4003-ac08-189b34c49368-kube-api-access-gh47v\") pod \"perf-node-gather-daemonset-nc6hc\" (UID: \"f8baa47d-2c8d-4003-ac08-189b34c49368\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-nc6hc"
Apr 21 07:56:41.369257 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:41.369227 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f8baa47d-2c8d-4003-ac08-189b34c49368-proc\") pod \"perf-node-gather-daemonset-nc6hc\" (UID: \"f8baa47d-2c8d-4003-ac08-189b34c49368\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-nc6hc"
Apr 21 07:56:41.369429 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:41.369401 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8baa47d-2c8d-4003-ac08-189b34c49368-sys\") pod \"perf-node-gather-daemonset-nc6hc\" (UID: \"f8baa47d-2c8d-4003-ac08-189b34c49368\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-nc6hc"
Apr 21 07:56:41.369574 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:41.369559 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f8baa47d-2c8d-4003-ac08-189b34c49368-podres\") pod \"perf-node-gather-daemonset-nc6hc\" (UID: \"f8baa47d-2c8d-4003-ac08-189b34c49368\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-nc6hc"
Apr 21 07:56:41.369720 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:41.369703 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f8baa47d-2c8d-4003-ac08-189b34c49368-lib-modules\") pod \"perf-node-gather-daemonset-nc6hc\" (UID: \"f8baa47d-2c8d-4003-ac08-189b34c49368\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-nc6hc"
Apr 21 07:56:41.369811 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:41.369787 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f8baa47d-2c8d-4003-ac08-189b34c49368-lib-modules\") pod \"perf-node-gather-daemonset-nc6hc\" (UID: \"f8baa47d-2c8d-4003-ac08-189b34c49368\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-nc6hc"
Apr 21 07:56:41.369886 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:41.369496 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8baa47d-2c8d-4003-ac08-189b34c49368-sys\") pod \"perf-node-gather-daemonset-nc6hc\" (UID: \"f8baa47d-2c8d-4003-ac08-189b34c49368\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-nc6hc"
Apr 21 07:56:41.369886 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:41.369329 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f8baa47d-2c8d-4003-ac08-189b34c49368-proc\") pod \"perf-node-gather-daemonset-nc6hc\" (UID: \"f8baa47d-2c8d-4003-ac08-189b34c49368\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-nc6hc"
Apr 21 07:56:41.369886 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:41.369706 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f8baa47d-2c8d-4003-ac08-189b34c49368-podres\") pod \"perf-node-gather-daemonset-nc6hc\" (UID: \"f8baa47d-2c8d-4003-ac08-189b34c49368\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-nc6hc"
Apr 21 07:56:41.377440 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:41.377409 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh47v\" (UniqueName: \"kubernetes.io/projected/f8baa47d-2c8d-4003-ac08-189b34c49368-kube-api-access-gh47v\") pod \"perf-node-gather-daemonset-nc6hc\" (UID: \"f8baa47d-2c8d-4003-ac08-189b34c49368\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-nc6hc"
Apr 21 07:56:41.464897 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:41.464820 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xmj7g_9a3e6d1e-7020-4e01-b08c-6965f9908a29/dns/0.log"
Apr 21 07:56:41.481266 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:41.481246 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xmj7g_9a3e6d1e-7020-4e01-b08c-6965f9908a29/kube-rbac-proxy/0.log"
Apr 21 07:56:41.518801 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:41.518782 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-nc6hc"
Apr 21 07:56:41.536357 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:41.536336 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zglz2_2e1e8c1e-08e4-4ee4-a7bf-e84dfd5e0fa1/dns-node-resolver/0.log"
Apr 21 07:56:41.659713 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:41.659684 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9fzd8/perf-node-gather-daemonset-nc6hc"]
Apr 21 07:56:41.666909 ip-10-0-138-20 kubenswrapper[2574]: W0421 07:56:41.666877 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf8baa47d_2c8d_4003_ac08_189b34c49368.slice/crio-913f767fdd127ee3e222c85fcd264b52d245034c465e9c2cb4057c54e6565d6d WatchSource:0}: Error finding container 913f767fdd127ee3e222c85fcd264b52d245034c465e9c2cb4057c54e6565d6d: Status 404 returned error can't find the container with id 913f767fdd127ee3e222c85fcd264b52d245034c465e9c2cb4057c54e6565d6d
Apr 21 07:56:41.918161 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:41.918130 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-gktqp_db4c65d3-fa4f-4575-b518-0d9e5c9215b9/node-ca/0.log"
Apr 21 07:56:42.103804 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:42.103715 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-nc6hc" event={"ID":"f8baa47d-2c8d-4003-ac08-189b34c49368","Type":"ContainerStarted","Data":"2cb6450c8bdf193a8eb07ef2fc5423ea08d5f06f29432a54f7b86c17790c598f"}
Apr 21 07:56:42.103804 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:42.103758 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-nc6hc" event={"ID":"f8baa47d-2c8d-4003-ac08-189b34c49368","Type":"ContainerStarted","Data":"913f767fdd127ee3e222c85fcd264b52d245034c465e9c2cb4057c54e6565d6d"}
Apr 21 07:56:42.103804 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:42.103777 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-nc6hc"
Apr 21 07:56:42.118793 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:42.118749 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-nc6hc" podStartSLOduration=1.118735236 podStartE2EDuration="1.118735236s" podCreationTimestamp="2026-04-21 07:56:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:56:42.117796007 +0000 UTC m=+288.530581112" watchObservedRunningTime="2026-04-21 07:56:42.118735236 +0000 UTC m=+288.531520351"
Apr 21 07:56:42.852978 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:42.852943 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-fw8gv_cc6f988e-e651-47c1-b9ef-5edf69838385/serve-healthcheck-canary/0.log"
Apr 21 07:56:43.171260 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:43.171231 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-78688_c63570c8-68d2-427e-8592-5b3ee57b7d7a/insights-operator/0.log"
Apr 21 07:56:43.172748 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:43.172722 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-78688_c63570c8-68d2-427e-8592-5b3ee57b7d7a/insights-operator/1.log"
Apr 21 07:56:43.261040 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:43.261006 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fm5jh_4f44f401-0bd8-4a90-870d-3e4ab5afe97d/kube-rbac-proxy/0.log"
Apr 21 07:56:43.279088 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:43.279053 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fm5jh_4f44f401-0bd8-4a90-870d-3e4ab5afe97d/exporter/0.log"
Apr 21 07:56:43.298577 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:43.298550 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fm5jh_4f44f401-0bd8-4a90-870d-3e4ab5afe97d/extractor/0.log"
Apr 21 07:56:46.787716 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:46.787689 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-xm8hm_0f1ecc94-e328-44b0-8fc4-cb27c09c3a99/migrator/0.log"
Apr 21 07:56:46.804155 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:46.804129 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-xm8hm_0f1ecc94-e328-44b0-8fc4-cb27c09c3a99/graceful-termination/0.log"
Apr 21 07:56:48.043269 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:48.043239 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kl7xh_2d12ced6-a6eb-40bb-8087-c53f467d8c26/kube-multus-additional-cni-plugins/0.log"
Apr 21 07:56:48.062832 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:48.062805 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kl7xh_2d12ced6-a6eb-40bb-8087-c53f467d8c26/egress-router-binary-copy/0.log"
Apr 21 07:56:48.080785 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:48.080764 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kl7xh_2d12ced6-a6eb-40bb-8087-c53f467d8c26/cni-plugins/0.log"
Apr 21 07:56:48.099539 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:48.099517 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kl7xh_2d12ced6-a6eb-40bb-8087-c53f467d8c26/bond-cni-plugin/0.log"
Apr 21 07:56:48.117629 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:48.117607 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kl7xh_2d12ced6-a6eb-40bb-8087-c53f467d8c26/routeoverride-cni/0.log"
Apr 21 07:56:48.119430 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:48.119412 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-nc6hc"
Apr 21 07:56:48.136415 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:48.136397 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kl7xh_2d12ced6-a6eb-40bb-8087-c53f467d8c26/whereabouts-cni-bincopy/0.log"
Apr 21 07:56:48.160395 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:48.160365 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kl7xh_2d12ced6-a6eb-40bb-8087-c53f467d8c26/whereabouts-cni/0.log"
Apr 21 07:56:48.340134 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:48.340066 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hmr5g_c5afc987-f5b7-46b4-91c1-5f015f3b2010/kube-multus/0.log"
Apr 21 07:56:48.418299 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:48.418270 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jjnl5_40e690b0-0cf8-4414-b2e4-2f3c492f2196/network-metrics-daemon/0.log"
Apr 21 07:56:48.434079 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:48.434056 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jjnl5_40e690b0-0cf8-4414-b2e4-2f3c492f2196/kube-rbac-proxy/0.log"
Apr 21 07:56:49.341583 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:49.341538 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-452pv_5ca5f8c6-71ae-48cf-87d8-4190acb7d09e/ovn-controller/0.log"
Apr 21 07:56:49.358412 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:49.358383 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-452pv_5ca5f8c6-71ae-48cf-87d8-4190acb7d09e/ovn-acl-logging/0.log"
Apr 21 07:56:49.360669 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:49.360644 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-452pv_5ca5f8c6-71ae-48cf-87d8-4190acb7d09e/ovn-acl-logging/1.log"
Apr 21 07:56:49.380384 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:49.380353 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-452pv_5ca5f8c6-71ae-48cf-87d8-4190acb7d09e/kube-rbac-proxy-node/0.log"
Apr 21 07:56:49.405149 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:49.405118 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-452pv_5ca5f8c6-71ae-48cf-87d8-4190acb7d09e/kube-rbac-proxy-ovn-metrics/0.log"
Apr 21 07:56:49.426845 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:49.426815 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-452pv_5ca5f8c6-71ae-48cf-87d8-4190acb7d09e/northd/0.log"
Apr 21 07:56:49.453231 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:49.453206 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-452pv_5ca5f8c6-71ae-48cf-87d8-4190acb7d09e/nbdb/0.log"
Apr 21 07:56:49.484255 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:49.484230 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-452pv_5ca5f8c6-71ae-48cf-87d8-4190acb7d09e/sbdb/0.log"
Apr 21 07:56:49.606635 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:49.606539 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-452pv_5ca5f8c6-71ae-48cf-87d8-4190acb7d09e/ovnkube-controller/0.log"
Apr 21 07:56:50.927061 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:50.927029 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-q86fn_e5633692-d3ef-4f27-aec7-1ddc39fd1781/network-check-target-container/0.log"
Apr 21 07:56:51.738613 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:51.738586 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-59f92_5fed3a2e-1505-4288-acc9-d9b3aabca4d0/iptables-alerter/0.log"
Apr 21 07:56:52.379402 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:52.379334 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-6dmvw_ff084e90-da5c-4aa6-97ea-239d7b8f0827/tuned/0.log"
Apr 21 07:56:54.023890 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:54.023861 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-452pv_5ca5f8c6-71ae-48cf-87d8-4190acb7d09e/ovn-acl-logging/0.log"
Apr 21 07:56:54.024305 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:54.024153 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-452pv_5ca5f8c6-71ae-48cf-87d8-4190acb7d09e/ovn-acl-logging/0.log"
Apr 21 07:56:54.601276 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:54.601243 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-pz5qt_62f93d32-f152-4b0d-a51e-3e51bdd7183b/service-ca-operator/1.log"
Apr 21 07:56:54.605919 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:54.602989 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-pz5qt_62f93d32-f152-4b0d-a51e-3e51bdd7183b/service-ca-operator/0.log"
Apr 21 07:56:54.872622 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:54.872526 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-865cb79987-r927t_d62814c4-6729-4bbd-9170-e3bb1249bce1/service-ca-controller/0.log"
Apr 21 07:56:55.262779 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:55.262753 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-ppf7s_5d721100-b3b1-4a7b-b896-0f8f1e63c33b/csi-driver/0.log"
Apr 21 07:56:55.280897 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:55.280869 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-ppf7s_5d721100-b3b1-4a7b-b896-0f8f1e63c33b/csi-node-driver-registrar/0.log"
Apr 21 07:56:55.297527 ip-10-0-138-20 kubenswrapper[2574]: I0421 07:56:55.297507 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-ppf7s_5d721100-b3b1-4a7b-b896-0f8f1e63c33b/csi-liveness-probe/0.log"