Apr 17 09:20:55.279504 ip-10-0-138-237 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 09:20:55.724261 ip-10-0-138-237 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 09:20:55.724261 ip-10-0-138-237 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 09:20:55.724261 ip-10-0-138-237 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 09:20:55.724261 ip-10-0-138-237 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 09:20:55.724261 ip-10-0-138-237 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 09:20:55.727651 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.727585    2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 09:20:55.733307 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733285    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 09:20:55.733307 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733304    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 09:20:55.733307 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733308    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 09:20:55.733307 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733311    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 09:20:55.733466 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733315    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 09:20:55.733466 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733318    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 09:20:55.733466 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733321    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 09:20:55.733466 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733323    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 09:20:55.733466 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733327    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 09:20:55.733466 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733330    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 09:20:55.733466 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733333    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 09:20:55.733466 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733336    2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 09:20:55.733466 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733339    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 09:20:55.733466 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733341    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 09:20:55.733466 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733344    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 09:20:55.733466 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733347    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 09:20:55.733466 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733350    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 09:20:55.733466 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733352    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 09:20:55.733466 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733355    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 09:20:55.733466 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733357    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 09:20:55.733466 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733360    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 09:20:55.733466 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733362    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 09:20:55.733466 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733365    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 09:20:55.733466 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733368    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 09:20:55.733948 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733370    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 09:20:55.733948 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733373    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 09:20:55.733948 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733375    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 09:20:55.733948 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733378    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 09:20:55.733948 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733386    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 09:20:55.733948 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733389    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 09:20:55.733948 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733392    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 09:20:55.733948 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733395    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 09:20:55.733948 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733397    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 09:20:55.733948 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733400    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 09:20:55.733948 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733403    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 09:20:55.733948 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733405    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 09:20:55.733948 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733408    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 09:20:55.733948 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733411    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 09:20:55.733948 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733414    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 09:20:55.733948 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733417    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 09:20:55.733948 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733420    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 09:20:55.733948 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733423    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 09:20:55.733948 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733425    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 09:20:55.733948 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733430    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 09:20:55.734452 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733434    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 09:20:55.734452 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733437    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 09:20:55.734452 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733439    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 09:20:55.734452 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733442    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 09:20:55.734452 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733444    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 09:20:55.734452 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733447    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 09:20:55.734452 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733449    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 09:20:55.734452 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733452    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 09:20:55.734452 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733454    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 09:20:55.734452 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733457    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 09:20:55.734452 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733459    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 09:20:55.734452 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733463    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 09:20:55.734452 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733466    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 09:20:55.734452 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733468    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 09:20:55.734452 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733471    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 09:20:55.734452 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733474    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 09:20:55.734452 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733477    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 09:20:55.734452 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733480    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 09:20:55.734452 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733483    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 09:20:55.734922 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733487    2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 09:20:55.734922 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733490    2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 09:20:55.734922 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733493    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 09:20:55.734922 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733495    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 09:20:55.734922 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733498    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 09:20:55.734922 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733501    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 09:20:55.734922 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733504    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 09:20:55.734922 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733506    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 09:20:55.734922 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733509    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 09:20:55.734922 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733512    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 09:20:55.734922 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733516    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 09:20:55.734922 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733520    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 09:20:55.734922 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733527    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 09:20:55.734922 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733531    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 09:20:55.734922 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733535    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 09:20:55.734922 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733540    2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 09:20:55.734922 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733544    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 09:20:55.734922 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733548    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 09:20:55.734922 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733551    2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 09:20:55.734922 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733553    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 09:20:55.735437 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733556    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 09:20:55.735437 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733559    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 09:20:55.735437 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733562    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 09:20:55.735437 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733970    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 09:20:55.735437 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733976    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 09:20:55.735437 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733979    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 09:20:55.735437 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733982    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 09:20:55.735437 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733984    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 09:20:55.735437 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733987    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 09:20:55.735437 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733990    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 09:20:55.735437 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733993    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 09:20:55.735437 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.733996    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 09:20:55.735437 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734000    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 09:20:55.735437 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734002    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 09:20:55.735437 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734006    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 09:20:55.735437 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734009    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 09:20:55.735437 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734012    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 09:20:55.735437 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734014    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 09:20:55.735437 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734019    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 09:20:55.735437 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734022    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 09:20:55.735955 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734025    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 09:20:55.735955 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734027    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 09:20:55.735955 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734030    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 09:20:55.735955 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734032    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 09:20:55.735955 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734035    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 09:20:55.735955 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734037    2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 09:20:55.735955 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734040    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 09:20:55.735955 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734043    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 09:20:55.735955 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734045    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 09:20:55.735955 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734048    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 09:20:55.735955 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734051    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 09:20:55.735955 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734053    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 09:20:55.735955 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734055    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 09:20:55.735955 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734058    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 09:20:55.735955 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734060    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 09:20:55.735955 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734063    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 09:20:55.735955 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734066    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 09:20:55.735955 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734068    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 09:20:55.735955 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734071    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 09:20:55.736446 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734073    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 09:20:55.736446 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734076    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 09:20:55.736446 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734078    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 09:20:55.736446 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734081    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 09:20:55.736446 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734083    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 09:20:55.736446 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734086    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 09:20:55.736446 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734089    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 09:20:55.736446 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734094    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 09:20:55.736446 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734097    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 09:20:55.736446 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734100    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 09:20:55.736446 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734103    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 09:20:55.736446 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734107    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 09:20:55.736446 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734110    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 09:20:55.736446 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734112    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 09:20:55.736446 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734115    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 09:20:55.736446 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734118    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 09:20:55.736446 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734120    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 09:20:55.736446 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734123    2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 09:20:55.736446 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734125    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 09:20:55.736911 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734128    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 09:20:55.736911 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734130    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 09:20:55.736911 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734134    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 09:20:55.736911 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734136    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 09:20:55.736911 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734139    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 09:20:55.736911 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734141    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 09:20:55.736911 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734144    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 09:20:55.736911 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734146    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 09:20:55.736911 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734149    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 09:20:55.736911 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734151    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 09:20:55.736911 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734154    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 09:20:55.736911 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734156    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 09:20:55.736911 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734159    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 09:20:55.736911 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734161    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 09:20:55.736911 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734164    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 09:20:55.736911 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734166    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 09:20:55.736911 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734188    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 09:20:55.736911 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734191    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 09:20:55.736911 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734194    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 09:20:55.737399 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734198    2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 09:20:55.737399 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734202    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 09:20:55.737399 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734205    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 09:20:55.737399 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734207    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 09:20:55.737399 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734210    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 09:20:55.737399 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734213    2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 09:20:55.737399 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734215    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 09:20:55.737399 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734218    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 09:20:55.737399 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734220    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 09:20:55.737399 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734223    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 09:20:55.737399 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734226    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 09:20:55.737399 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734228    2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 09:20:55.737399 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734295    2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 09:20:55.737399 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734306    2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 09:20:55.737399 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734314    2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 09:20:55.737399 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734321    2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 09:20:55.737399 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734328    2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 09:20:55.737399 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734334    2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 09:20:55.737399 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734339    2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 09:20:55.737399 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734343    2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 09:20:55.737399 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734347    2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 09:20:55.737917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734350    2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 09:20:55.737917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734354    2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 09:20:55.737917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734357    2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 09:20:55.737917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734360    2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 09:20:55.737917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734363    2574 flags.go:64] FLAG: --cgroup-root=""
Apr 17 09:20:55.737917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734366    2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 09:20:55.737917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734369    2574 flags.go:64] FLAG: --client-ca-file=""
Apr 17 09:20:55.737917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734372    2574 flags.go:64] FLAG: --cloud-config=""
Apr 17 09:20:55.737917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734375    2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 09:20:55.737917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734378    2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 09:20:55.737917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734383    2574 flags.go:64] FLAG: --cluster-domain=""
Apr 17 09:20:55.737917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734386    2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 09:20:55.737917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734389    2574 flags.go:64] FLAG: --config-dir=""
Apr 17 09:20:55.737917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734392    2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 09:20:55.737917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734395    2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 09:20:55.737917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734399    2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 09:20:55.737917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734402    2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 09:20:55.737917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734405    2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 09:20:55.737917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734408    2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 09:20:55.737917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734411    2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 09:20:55.737917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734414    2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 09:20:55.737917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734417    2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 09:20:55.737917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734420    2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 09:20:55.737917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734423    2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 09:20:55.737917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734427    2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 09:20:55.738526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734430    2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 09:20:55.738526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734433    2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 09:20:55.738526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734436    2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 09:20:55.738526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734440    2574 flags.go:64] FLAG: --enable-server="true"
Apr 17 09:20:55.738526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734443    2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 09:20:55.738526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734449    2574 flags.go:64] FLAG: --event-burst="100"
Apr 17 09:20:55.738526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734452    2574 flags.go:64] FLAG: --event-qps="50"
Apr 17 09:20:55.738526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734455    2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 09:20:55.738526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734459    2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 09:20:55.738526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734462    2574 flags.go:64] FLAG: --eviction-hard=""
Apr 17 09:20:55.738526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734466    2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 09:20:55.738526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734469    2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 09:20:55.738526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734472    2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 09:20:55.738526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734475    2574 flags.go:64] FLAG: --eviction-soft=""
Apr 17 09:20:55.738526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734477    2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 09:20:55.738526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734480    2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 09:20:55.738526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734483    2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 09:20:55.738526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734486    2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 09:20:55.738526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734489    2574 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 09:20:55.738526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734492    2574 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 09:20:55.738526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734494    2574 flags.go:64] FLAG: --feature-gates=""
Apr 17 09:20:55.738526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734498    2574 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 09:20:55.738526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734501    2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 09:20:55.738526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734504    2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 09:20:55.738526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734508    2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 09:20:55.739140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734511    2574 flags.go:64] FLAG:
--healthz-port="10248" Apr 17 09:20:55.739140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734513 2574 flags.go:64] FLAG: --help="false" Apr 17 09:20:55.739140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734516 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-138-237.ec2.internal" Apr 17 09:20:55.739140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734519 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 09:20:55.739140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734522 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 09:20:55.739140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734525 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 09:20:55.739140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734528 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 09:20:55.739140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734532 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 09:20:55.739140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734534 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 09:20:55.739140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734537 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 09:20:55.739140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734540 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 09:20:55.739140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734544 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 09:20:55.739140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734548 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 09:20:55.739140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734552 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 09:20:55.739140 ip-10-0-138-237 
kubenswrapper[2574]: I0417 09:20:55.734554 2574 flags.go:64] FLAG: --kube-reserved="" Apr 17 09:20:55.739140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734558 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 09:20:55.739140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734561 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 09:20:55.739140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734564 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 09:20:55.739140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734567 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 09:20:55.739140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734570 2574 flags.go:64] FLAG: --lock-file="" Apr 17 09:20:55.739140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734573 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 09:20:55.739140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734575 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 09:20:55.739140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734578 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 09:20:55.739140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734584 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 09:20:55.739734 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734587 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 09:20:55.739734 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734589 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 09:20:55.739734 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734592 2574 flags.go:64] FLAG: --logging-format="text" Apr 17 09:20:55.739734 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734595 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 09:20:55.739734 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734598 2574 flags.go:64] FLAG: 
--make-iptables-util-chains="true" Apr 17 09:20:55.739734 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734601 2574 flags.go:64] FLAG: --manifest-url="" Apr 17 09:20:55.739734 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734604 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 17 09:20:55.739734 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734608 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 09:20:55.739734 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734611 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 09:20:55.739734 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734615 2574 flags.go:64] FLAG: --max-pods="110" Apr 17 09:20:55.739734 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734618 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 09:20:55.739734 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734621 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 09:20:55.739734 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734624 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 09:20:55.739734 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734627 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 09:20:55.739734 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734630 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 09:20:55.739734 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734633 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 09:20:55.739734 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734636 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 09:20:55.739734 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734643 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 09:20:55.739734 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734646 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 
09:20:55.739734 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734649 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 09:20:55.739734 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734654 2574 flags.go:64] FLAG: --pod-cidr="" Apr 17 09:20:55.739734 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734659 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 09:20:55.739734 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734664 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 09:20:55.740303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734667 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 09:20:55.740303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734670 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 17 09:20:55.740303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734673 2574 flags.go:64] FLAG: --port="10250" Apr 17 09:20:55.740303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734676 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 09:20:55.740303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734679 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0352d1dda5e0f8161" Apr 17 09:20:55.740303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734682 2574 flags.go:64] FLAG: --qos-reserved="" Apr 17 09:20:55.740303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734685 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 17 09:20:55.740303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734688 2574 flags.go:64] FLAG: --register-node="true" Apr 17 09:20:55.740303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734691 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 17 09:20:55.740303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734694 2574 flags.go:64] FLAG: --register-with-taints="" Apr 17 09:20:55.740303 ip-10-0-138-237 
kubenswrapper[2574]: I0417 09:20:55.734697 2574 flags.go:64] FLAG: --registry-burst="10" Apr 17 09:20:55.740303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734700 2574 flags.go:64] FLAG: --registry-qps="5" Apr 17 09:20:55.740303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734703 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 17 09:20:55.740303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734706 2574 flags.go:64] FLAG: --reserved-memory="" Apr 17 09:20:55.740303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734710 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 09:20:55.740303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734713 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 09:20:55.740303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734716 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 09:20:55.740303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734719 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 09:20:55.740303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734722 2574 flags.go:64] FLAG: --runonce="false" Apr 17 09:20:55.740303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734724 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 09:20:55.740303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734727 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 09:20:55.740303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734730 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 17 09:20:55.740303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734733 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 09:20:55.740303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734736 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 09:20:55.740303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734739 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 
09:20:55.740303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734742 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 09:20:55.740967 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734745 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 09:20:55.740967 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734748 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 09:20:55.740967 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734751 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 09:20:55.740967 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734755 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 09:20:55.740967 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734759 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 09:20:55.740967 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734763 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 09:20:55.740967 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734766 2574 flags.go:64] FLAG: --system-cgroups="" Apr 17 09:20:55.740967 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734769 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 09:20:55.740967 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734775 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 09:20:55.740967 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734777 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 17 09:20:55.740967 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734780 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 09:20:55.740967 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734785 2574 flags.go:64] FLAG: --tls-min-version="" Apr 17 09:20:55.740967 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734788 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 09:20:55.740967 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734791 2574 
flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 09:20:55.740967 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734793 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 09:20:55.740967 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734796 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 09:20:55.740967 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734799 2574 flags.go:64] FLAG: --v="2" Apr 17 09:20:55.740967 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734804 2574 flags.go:64] FLAG: --version="false" Apr 17 09:20:55.740967 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734808 2574 flags.go:64] FLAG: --vmodule="" Apr 17 09:20:55.740967 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734812 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 09:20:55.740967 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.734815 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 09:20:55.740967 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734911 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 09:20:55.740967 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734915 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 09:20:55.740967 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734918 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 09:20:55.741549 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734921 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 09:20:55.741549 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734924 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 09:20:55.741549 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734927 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 09:20:55.741549 ip-10-0-138-237 kubenswrapper[2574]: 
W0417 09:20:55.734930 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 09:20:55.741549 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734933 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 09:20:55.741549 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734936 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 09:20:55.741549 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734938 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 09:20:55.741549 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734941 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 09:20:55.741549 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734944 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 09:20:55.741549 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734946 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 09:20:55.741549 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734949 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 09:20:55.741549 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734952 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 09:20:55.741549 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734955 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 09:20:55.741549 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734959 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 09:20:55.741549 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734962 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 09:20:55.741549 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734965 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 
09:20:55.741549 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734967 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 09:20:55.741549 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734971 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 09:20:55.741549 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734975 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 09:20:55.742023 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734978 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 09:20:55.742023 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734981 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 09:20:55.742023 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734984 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 09:20:55.742023 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734986 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 09:20:55.742023 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734989 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 09:20:55.742023 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734992 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 09:20:55.742023 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734995 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 09:20:55.742023 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.734997 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 09:20:55.742023 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735000 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 09:20:55.742023 ip-10-0-138-237 kubenswrapper[2574]: 
W0417 09:20:55.735002 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 09:20:55.742023 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735005 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 09:20:55.742023 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735008 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 09:20:55.742023 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735010 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 09:20:55.742023 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735013 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 09:20:55.742023 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735015 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 09:20:55.742023 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735018 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 09:20:55.742023 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735020 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 09:20:55.742023 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735023 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 09:20:55.742023 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735025 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 09:20:55.742023 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735028 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 09:20:55.742529 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735030 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 09:20:55.742529 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735033 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 09:20:55.742529 
ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735036 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 09:20:55.742529 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735038 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 09:20:55.742529 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735041 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 09:20:55.742529 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735044 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 09:20:55.742529 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735057 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 09:20:55.742529 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735062 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 09:20:55.742529 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735065 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 09:20:55.742529 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735068 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 09:20:55.742529 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735070 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 09:20:55.742529 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735074 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 09:20:55.742529 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735078 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 09:20:55.742529 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735081 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 09:20:55.742529 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735084 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 09:20:55.742529 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735087 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 09:20:55.742529 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735089 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 09:20:55.742529 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735092 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 17 09:20:55.742529 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735095 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 09:20:55.743041 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735097 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 09:20:55.743041 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735100 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 09:20:55.743041 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735103 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 09:20:55.743041 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735105 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 09:20:55.743041 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735108 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 09:20:55.743041 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735110 2574 
feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 09:20:55.743041 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735113 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 09:20:55.743041 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735115 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 09:20:55.743041 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735118 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 09:20:55.743041 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735121 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 09:20:55.743041 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735123 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 09:20:55.743041 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735126 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 09:20:55.743041 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735128 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 09:20:55.743041 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735131 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 09:20:55.743041 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735133 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 09:20:55.743041 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735136 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 09:20:55.743041 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735138 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 09:20:55.743041 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735141 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 
09:20:55.743041 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735144 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 09:20:55.743041 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735148 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 09:20:55.743563 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735150 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 09:20:55.743563 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735153 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 09:20:55.743563 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735156 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 09:20:55.743563 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735158 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 09:20:55.743563 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.735161 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 09:20:55.743563 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.735935 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 09:20:55.743563 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.741875 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 09:20:55.743563 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.741890 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 
09:20:55.743563 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.741942 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 09:20:55.743563 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.741947 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 09:20:55.743563 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.741950 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 09:20:55.743563 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.741953 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 09:20:55.743563 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.741956 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 09:20:55.743563 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.741959 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 09:20:55.743563 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.741962 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 09:20:55.743563 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.741964 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 09:20:55.743958 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.741967 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 09:20:55.743958 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.741969 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 09:20:55.743958 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.741972 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 09:20:55.743958 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.741976 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 09:20:55.743958 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.741980 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 09:20:55.743958 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.741983 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 09:20:55.743958 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.741985 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 09:20:55.743958 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.741988 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 09:20:55.743958 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.741991 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 09:20:55.743958 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.741994 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 09:20:55.743958 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.741998 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 09:20:55.743958 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742001 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 09:20:55.743958 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742004 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 09:20:55.743958 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742007 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 09:20:55.743958 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742010 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 09:20:55.743958 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742012 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 09:20:55.743958 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742015 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 09:20:55.743958 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742023 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 09:20:55.743958 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742027 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 09:20:55.743958 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742029 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 09:20:55.744484 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742032 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 09:20:55.744484 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742035 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 09:20:55.744484 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742038 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 09:20:55.744484 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742041 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 09:20:55.744484 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742044 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 09:20:55.744484 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742046 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 09:20:55.744484 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742049 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 09:20:55.744484 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742052 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 09:20:55.744484 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742055 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 09:20:55.744484 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742057 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 09:20:55.744484 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742060 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 09:20:55.744484 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742063 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 09:20:55.744484 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742065 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 09:20:55.744484 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742068 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 09:20:55.744484 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742070 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 09:20:55.744484 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742073 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 09:20:55.744484 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742075 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 09:20:55.744484 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742078 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 09:20:55.744484 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742081 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 09:20:55.744484 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742083 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 09:20:55.744972 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742086 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 09:20:55.744972 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742088 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 09:20:55.744972 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742091 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 09:20:55.744972 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742093 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 09:20:55.744972 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742096 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 09:20:55.744972 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742099 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 09:20:55.744972 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742101 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 09:20:55.744972 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742104 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 09:20:55.744972 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742106 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 09:20:55.744972 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742109 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 09:20:55.744972 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742112 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 09:20:55.744972 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742114 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 09:20:55.744972 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742117 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 09:20:55.744972 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742119 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 09:20:55.744972 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742122 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 09:20:55.744972 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742125 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 09:20:55.744972 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742128 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 09:20:55.744972 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742130 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 09:20:55.744972 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742133 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 09:20:55.744972 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742135 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 09:20:55.745476 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742138 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 09:20:55.745476 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742140 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 09:20:55.745476 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742143 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 09:20:55.745476 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742145 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 09:20:55.745476 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742148 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 09:20:55.745476 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742150 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 09:20:55.745476 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742153 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 09:20:55.745476 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742156 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 09:20:55.745476 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742158 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 09:20:55.745476 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742161 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 09:20:55.745476 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742163 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 09:20:55.745476 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742166 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 09:20:55.745476 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742183 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 09:20:55.745476 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742187 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 09:20:55.745476 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742190 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 09:20:55.745476 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742193 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 09:20:55.745476 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742195 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 09:20:55.745476 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742198 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 09:20:55.745916 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.742204 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 09:20:55.745916 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742319 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 09:20:55.745916 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742324 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 09:20:55.745916 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742327 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 09:20:55.745916 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742330 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 09:20:55.745916 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742334 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 09:20:55.745916 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742337 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 09:20:55.745916 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742340 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 09:20:55.745916 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742342 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 09:20:55.745916 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742345 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 09:20:55.745916 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742348 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 09:20:55.745916 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742351 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 09:20:55.745916 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742353 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 09:20:55.745916 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742356 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 09:20:55.745916 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742358 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 09:20:55.746309 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742361 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 09:20:55.746309 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742363 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 09:20:55.746309 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742366 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 09:20:55.746309 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742368 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 09:20:55.746309 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742371 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 09:20:55.746309 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742374 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 09:20:55.746309 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742376 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 09:20:55.746309 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742378 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 09:20:55.746309 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742381 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 09:20:55.746309 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742383 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 09:20:55.746309 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742386 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 09:20:55.746309 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742388 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 09:20:55.746309 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742391 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 09:20:55.746309 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742394 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 09:20:55.746309 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742396 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 09:20:55.746309 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742399 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 09:20:55.746309 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742401 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 09:20:55.746309 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742404 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 09:20:55.746309 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742406 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 09:20:55.746766 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742409 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 09:20:55.746766 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742411 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 09:20:55.746766 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742414 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 09:20:55.746766 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742416 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 09:20:55.746766 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742422 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 09:20:55.746766 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742424 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 09:20:55.746766 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742428 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 09:20:55.746766 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742430 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 09:20:55.746766 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742433 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 09:20:55.746766 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742435 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 09:20:55.746766 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742438 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 09:20:55.746766 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742440 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 09:20:55.746766 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742443 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 09:20:55.746766 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742445 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 09:20:55.746766 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742448 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 09:20:55.746766 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742450 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 09:20:55.746766 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742453 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 09:20:55.746766 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742455 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 09:20:55.746766 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742458 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 09:20:55.746766 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742460 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 09:20:55.747271 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742463 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 09:20:55.747271 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742465 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 09:20:55.747271 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742468 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 09:20:55.747271 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742470 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 09:20:55.747271 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742473 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 09:20:55.747271 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742475 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 09:20:55.747271 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742478 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 09:20:55.747271 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742480 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 09:20:55.747271 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742483 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 09:20:55.747271 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742485 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 09:20:55.747271 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742488 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 09:20:55.747271 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742491 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 09:20:55.747271 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742493 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 09:20:55.747271 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742495 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 09:20:55.747271 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742498 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 09:20:55.747271 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742500 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 09:20:55.747271 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742504 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 09:20:55.747271 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742512 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 09:20:55.747271 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742516 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 09:20:55.747739 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742520 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 09:20:55.747739 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742523 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 09:20:55.747739 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742526 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 09:20:55.747739 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742528 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 09:20:55.747739 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742531 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 09:20:55.747739 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742533 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 09:20:55.747739 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742536 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 09:20:55.747739 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742539 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 09:20:55.747739 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742541 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 09:20:55.747739 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742544 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 09:20:55.747739 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742546 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 09:20:55.747739 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742549 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 09:20:55.747739 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742551 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 09:20:55.747739 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:55.742554 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 09:20:55.747739 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.742558 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 09:20:55.747739 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.743458 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 09:20:55.748123 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.745454 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 09:20:55.748123 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.746230 2574 server.go:1019] "Starting client certificate rotation"
Apr 17 09:20:55.748123 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.746322 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 09:20:55.748123 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.746354 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 09:20:55.772306 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.772288 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 09:20:55.774813 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.774788 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 09:20:55.790291 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.790086 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 17 09:20:55.796198 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.796182 2574 log.go:25] "Validated CRI v1 image API"
Apr 17 09:20:55.797396 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.797379 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 09:20:55.800515 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.800485 2574 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 8540a346-1e2a-49b4-a4ae-fc9ccaeacdd5:/dev/nvme0n1p4 e60fad1a-bb87-4845-8199-cb50bdf76662:/dev/nvme0n1p3]
Apr 17 09:20:55.800515 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.800506 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 09:20:55.803554 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.803535 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 09:20:55.806011 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.805902 2574 manager.go:217] Machine: {Timestamp:2026-04-17 09:20:55.804164584 +0000 UTC m=+0.400216469 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3080216 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec23a8ebd9e73bc838efad32b71abd4c SystemUUID:ec23a8eb-d9e7-3bc8-38ef-ad32b71abd4c BootID:d6631b96-794d-47d4-ac7d-036af1fd6692 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:5a:e6:1c:30:f1 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:5a:e6:1c:30:f1 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:72:3c:74:63:2a:4d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 09:20:55.806824 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.806815 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 09:20:55.806899 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.806888 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 09:20:55.809610 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.809585 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 09:20:55.809745 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.809612 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-237.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 09:20:55.809790 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.809753 2574 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 09:20:55.809790 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.809761 2574 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 09:20:55.809790 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.809774 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 09:20:55.810633 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.810623 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 09:20:55.811511 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.811501 2574 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 09:20:55.811610 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.811601 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 09:20:55.814044 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.814034 2574 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 09:20:55.814078 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.814049 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 09:20:55.814078 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.814061 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 09:20:55.814078 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.814070 2574 kubelet.go:397] "Adding apiserver pod source"
Apr 17 09:20:55.814078 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.814077 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 09:20:55.815100 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.815085 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 09:20:55.815142 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.815112 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 09:20:55.818839 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.818822 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 09:20:55.820095 ip-10-0-138-237
kubenswrapper[2574]: I0417 09:20:55.820079 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 09:20:55.821975 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.821959 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 09:20:55.822052 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.821980 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 09:20:55.822052 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.821989 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 09:20:55.822052 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.822009 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 09:20:55.822052 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.822018 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 09:20:55.822052 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.822027 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 09:20:55.822052 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.822036 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 09:20:55.822052 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.822045 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 09:20:55.822052 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.822054 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 09:20:55.822338 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.822064 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 09:20:55.822338 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.822082 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 
09:20:55.822338 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.822095 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 09:20:55.823717 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.823705 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 09:20:55.823768 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.823720 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 09:20:55.824326 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.824303 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-75r6z" Apr 17 09:20:55.825418 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:55.825389 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 09:20:55.825505 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:55.825432 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-237.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 09:20:55.827052 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.827037 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 09:20:55.827135 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.827076 2574 server.go:1295] "Started kubelet" Apr 17 09:20:55.827214 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.827191 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 09:20:55.827266 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.827199 2574 ratelimit.go:55] 
"Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 09:20:55.827310 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.827272 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 09:20:55.827763 ip-10-0-138-237 systemd[1]: Started Kubernetes Kubelet. Apr 17 09:20:55.828375 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.828318 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 09:20:55.828923 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.828903 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-237.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 09:20:55.829922 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.829908 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 17 09:20:55.832155 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.832135 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-75r6z" Apr 17 09:20:55.835356 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.835337 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 09:20:55.835884 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.835869 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 09:20:55.836235 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:55.835276 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-237.ec2.internal.18a71a72344daae6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-237.ec2.internal,UID:ip-10-0-138-237.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-237.ec2.internal,},FirstTimestamp:2026-04-17 09:20:55.82704919 +0000 UTC m=+0.423101081,LastTimestamp:2026-04-17 09:20:55.82704919 +0000 UTC m=+0.423101081,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-237.ec2.internal,}" Apr 17 09:20:55.836519 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.836501 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 09:20:55.836519 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.836520 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 09:20:55.837487 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:55.837465 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-237.ec2.internal\" not found" Apr 17 09:20:55.837609 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.837535 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 17 09:20:55.837609 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.837544 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 17 09:20:55.837609 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.837557 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 09:20:55.838915 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.838896 2574 factory.go:153] Registering CRI-O factory Apr 17 09:20:55.838986 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.838932 2574 factory.go:223] Registration of the crio container factory successfully Apr 17 09:20:55.839042 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.838986 2574 factory.go:221] Registration of the containerd container factory failed: 
unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 09:20:55.839042 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.838996 2574 factory.go:55] Registering systemd factory Apr 17 09:20:55.839042 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.839006 2574 factory.go:223] Registration of the systemd container factory successfully Apr 17 09:20:55.839042 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.839036 2574 factory.go:103] Registering Raw factory Apr 17 09:20:55.839250 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.839051 2574 manager.go:1196] Started watching for new ooms in manager Apr 17 09:20:55.839370 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:55.838991 2574 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 09:20:55.839474 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.839463 2574 manager.go:319] Starting recovery of all containers Apr 17 09:20:55.840978 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.840953 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 09:20:55.843626 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:55.843604 2574 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-138-237.ec2.internal\" not found" node="ip-10-0-138-237.ec2.internal" Apr 17 09:20:55.849325 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.849217 2574 manager.go:324] Recovery completed Apr 17 09:20:55.853475 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.853463 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 09:20:55.855692 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.855676 2574 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-237.ec2.internal" event="NodeHasSufficientMemory" Apr 17 09:20:55.855770 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.855702 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-237.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 09:20:55.855770 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.855715 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-237.ec2.internal" event="NodeHasSufficientPID" Apr 17 09:20:55.856228 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.856213 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 09:20:55.856228 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.856224 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 09:20:55.856331 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.856239 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 17 09:20:55.858829 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.858816 2574 policy_none.go:49] "None policy: Start" Apr 17 09:20:55.858873 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.858833 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 09:20:55.858873 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.858842 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 17 09:20:55.905540 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.895381 2574 manager.go:341] "Starting Device Plugin manager" Apr 17 09:20:55.905540 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:55.895421 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 09:20:55.905540 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.895432 2574 server.go:85] "Starting device plugin registration server" Apr 17 09:20:55.905540 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.895626 2574 
eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 09:20:55.905540 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.895635 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 09:20:55.905540 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.895724 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 09:20:55.905540 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.895789 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 09:20:55.905540 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.895795 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 09:20:55.905540 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:55.896184 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 09:20:55.905540 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:55.896222 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-237.ec2.internal\" not found" Apr 17 09:20:55.960388 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.960369 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 09:20:55.961512 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.961500 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 09:20:55.961577 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.961521 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 09:20:55.961577 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.961535 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
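The `nodeConfig` dump logged at startup (container_manager_linux.go:275, above) records the effective hard-eviction thresholds, system reservations, and cgroup settings this kubelet is running with. The deprecation warnings at boot say flag-driven values like `--system-reserved` should instead live in the kubelet's file-based config; as a sketch only, the logged values would correspond to a KubeletConfiguration fragment like the following (field names are from the upstream `kubelet.config.k8s.io/v1beta1` API — this is not this node's actual config file, which is not shown in the log):

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
cgroupDriver: systemd            # nodeConfig: "CgroupDriver":"systemd"
podPidsLimit: 4096               # nodeConfig: "PodPidsLimit":4096
systemReserved:                  # nodeConfig: "SystemReserved"
  cpu: 500m
  memory: 1Gi
  ephemeral-storage: 1Gi
evictionHard:                    # nodeConfig: "HardEvictionThresholds"
  memory.available: "100Mi"     #   Quantity 100Mi
  nodefs.available: "10%"       #   Percentage 0.1
  nodefs.inodesFree: "5%"       #   Percentage 0.05
  imagefs.available: "15%"      #   Percentage 0.15
  imagefs.inodesFree: "5%"      #   Percentage 0.05
```

The eviction-manager errors immediately above ("failed to get summary stats", "non-existent label \"crio-containers\"") are consistent with these thresholds being evaluated before node registration and stats initialization have completed, and they stop recurring once the node object exists.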
Apr 17 09:20:55.961577 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.961542 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 09:20:55.961577 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:55.961569 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 09:20:55.963656 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.963640 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 09:20:55.996415 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.996371 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 09:20:55.997109 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.997095 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-237.ec2.internal" event="NodeHasSufficientMemory" Apr 17 09:20:55.997146 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.997121 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-237.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 09:20:55.997146 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.997135 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-237.ec2.internal" event="NodeHasSufficientPID" Apr 17 09:20:55.997232 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:55.997166 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-237.ec2.internal" Apr 17 09:20:56.007381 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.007365 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-237.ec2.internal" Apr 17 09:20:56.007430 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:56.007383 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-237.ec2.internal\": node \"ip-10-0-138-237.ec2.internal\" not found" Apr 17 
09:20:56.028121 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:56.028097 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-237.ec2.internal\" not found" Apr 17 09:20:56.062497 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.062472 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-138-237.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-237.ec2.internal"] Apr 17 09:20:56.062559 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.062526 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 09:20:56.063923 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.063909 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-237.ec2.internal" event="NodeHasSufficientMemory" Apr 17 09:20:56.063978 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.063932 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-237.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 09:20:56.063978 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.063942 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-237.ec2.internal" event="NodeHasSufficientPID" Apr 17 09:20:56.065261 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.065250 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 09:20:56.065394 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.065382 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-237.ec2.internal" Apr 17 09:20:56.065428 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.065406 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 09:20:56.065878 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.065852 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-237.ec2.internal" event="NodeHasSufficientMemory" Apr 17 09:20:56.065878 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.065863 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-237.ec2.internal" event="NodeHasSufficientMemory" Apr 17 09:20:56.065878 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.065879 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-237.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 09:20:56.066052 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.065886 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-237.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 09:20:56.066052 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.065894 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-237.ec2.internal" event="NodeHasSufficientPID" Apr 17 09:20:56.066052 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.065897 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-237.ec2.internal" event="NodeHasSufficientPID" Apr 17 09:20:56.067288 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.067271 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-237.ec2.internal" Apr 17 09:20:56.067371 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.067302 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 09:20:56.067878 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.067863 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-237.ec2.internal" event="NodeHasSufficientMemory" Apr 17 09:20:56.067936 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.067891 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-237.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 09:20:56.067936 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.067906 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-237.ec2.internal" event="NodeHasSufficientPID" Apr 17 09:20:56.095808 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:56.095783 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-237.ec2.internal\" not found" node="ip-10-0-138-237.ec2.internal" Apr 17 09:20:56.100113 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:56.100099 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-237.ec2.internal\" not found" node="ip-10-0-138-237.ec2.internal" Apr 17 09:20:56.128273 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:56.128255 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-237.ec2.internal\" not found" Apr 17 09:20:56.139637 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.139619 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9ca6567298fdecb7125a67e335ee762b-config\") pod 
\"kube-apiserver-proxy-ip-10-0-138-237.ec2.internal\" (UID: \"9ca6567298fdecb7125a67e335ee762b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-237.ec2.internal" Apr 17 09:20:56.139689 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.139640 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/eb58e212e7c3efb18409513c0745b3f7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-237.ec2.internal\" (UID: \"eb58e212e7c3efb18409513c0745b3f7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-237.ec2.internal" Apr 17 09:20:56.139689 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.139656 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb58e212e7c3efb18409513c0745b3f7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-237.ec2.internal\" (UID: \"eb58e212e7c3efb18409513c0745b3f7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-237.ec2.internal" Apr 17 09:20:56.228463 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:56.228440 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-237.ec2.internal\" not found" Apr 17 09:20:56.239877 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.239857 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9ca6567298fdecb7125a67e335ee762b-config\") pod \"kube-apiserver-proxy-ip-10-0-138-237.ec2.internal\" (UID: \"9ca6567298fdecb7125a67e335ee762b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-237.ec2.internal" Apr 17 09:20:56.239932 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.239881 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/eb58e212e7c3efb18409513c0745b3f7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-237.ec2.internal\" (UID: \"eb58e212e7c3efb18409513c0745b3f7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-237.ec2.internal" Apr 17 09:20:56.239932 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.239898 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb58e212e7c3efb18409513c0745b3f7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-237.ec2.internal\" (UID: \"eb58e212e7c3efb18409513c0745b3f7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-237.ec2.internal" Apr 17 09:20:56.240010 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.239938 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb58e212e7c3efb18409513c0745b3f7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-237.ec2.internal\" (UID: \"eb58e212e7c3efb18409513c0745b3f7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-237.ec2.internal" Apr 17 09:20:56.240010 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.239959 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/9ca6567298fdecb7125a67e335ee762b-config\") pod \"kube-apiserver-proxy-ip-10-0-138-237.ec2.internal\" (UID: \"9ca6567298fdecb7125a67e335ee762b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-237.ec2.internal" Apr 17 09:20:56.240010 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.239974 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/eb58e212e7c3efb18409513c0745b3f7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-237.ec2.internal\" (UID: \"eb58e212e7c3efb18409513c0745b3f7\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-237.ec2.internal"
Apr 17 09:20:56.329263 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:56.329225 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-237.ec2.internal\" not found"
Apr 17 09:20:56.397773 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.397745 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-237.ec2.internal"
Apr 17 09:20:56.403285 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.403269 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-237.ec2.internal"
Apr 17 09:20:56.430185 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:56.430156 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-237.ec2.internal\" not found"
Apr 17 09:20:56.530715 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:56.530694 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-237.ec2.internal\" not found"
Apr 17 09:20:56.631246 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:56.631227 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-237.ec2.internal\" not found"
Apr 17 09:20:56.731792 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:56.731767 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-237.ec2.internal\" not found"
Apr 17 09:20:56.746246 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.746231 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 09:20:56.746361 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.746347 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 09:20:56.746398 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.746377 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 09:20:56.831887 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:56.831862 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-237.ec2.internal\" not found"
Apr 17 09:20:56.836029 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.836009 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 09:20:56.836128 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.836024 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 09:15:55 +0000 UTC" deadline="2027-12-26 20:06:58.101922476 +0000 UTC"
Apr 17 09:20:56.836128 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.836051 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14842h46m1.265874341s"
Apr 17 09:20:56.850720 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.850701 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 09:20:56.869218 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.869195 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-xmz5j"
Apr 17 09:20:56.876424 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.876409 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-xmz5j"
Apr 17 09:20:56.885727 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:56.885260 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ca6567298fdecb7125a67e335ee762b.slice/crio-52a78181f47d8955f54ecc064977714054db34f4cfa52e17dfebd331c28425a0 WatchSource:0}: Error finding container 52a78181f47d8955f54ecc064977714054db34f4cfa52e17dfebd331c28425a0: Status 404 returned error can't find the container with id 52a78181f47d8955f54ecc064977714054db34f4cfa52e17dfebd331c28425a0
Apr 17 09:20:56.885727 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:56.885668 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb58e212e7c3efb18409513c0745b3f7.slice/crio-cd7ffbed3dc89f9d20985bab19184d70202a1f5b424953fc5ffca2877c869a3a WatchSource:0}: Error finding container cd7ffbed3dc89f9d20985bab19184d70202a1f5b424953fc5ffca2877c869a3a: Status 404 returned error can't find the container with id cd7ffbed3dc89f9d20985bab19184d70202a1f5b424953fc5ffca2877c869a3a
Apr 17 09:20:56.889910 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.889894 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 09:20:56.932607 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:56.932588 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-237.ec2.internal\" not found"
Apr 17 09:20:56.964717 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.964680 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-237.ec2.internal" event={"ID":"9ca6567298fdecb7125a67e335ee762b","Type":"ContainerStarted","Data":"52a78181f47d8955f54ecc064977714054db34f4cfa52e17dfebd331c28425a0"}
Apr 17 09:20:56.965552 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:56.965531 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-237.ec2.internal" event={"ID":"eb58e212e7c3efb18409513c0745b3f7","Type":"ContainerStarted","Data":"cd7ffbed3dc89f9d20985bab19184d70202a1f5b424953fc5ffca2877c869a3a"}
Apr 17 09:20:57.032654 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:57.032631 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-237.ec2.internal\" not found"
Apr 17 09:20:57.133185 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:57.133161 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-237.ec2.internal\" not found"
Apr 17 09:20:57.233725 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:57.233675 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-237.ec2.internal\" not found"
Apr 17 09:20:57.275112 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.275086 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 09:20:57.336417 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.336398 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-237.ec2.internal"
Apr 17 09:20:57.346137 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.346115 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 09:20:57.347854 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.347836 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-237.ec2.internal"
Apr 17 09:20:57.356966 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.356950 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 09:20:57.397322 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.397294 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 09:20:57.815183 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.815155 2574 apiserver.go:52] "Watching apiserver"
Apr 17 09:20:57.821046 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.820997 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 09:20:57.822836 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.822809 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rb9kg","kube-system/konnectivity-agent-fgpzd","openshift-cluster-node-tuning-operator/tuned-nl7c6","openshift-image-registry/node-ca-l7tg5","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-237.ec2.internal","openshift-multus/multus-additional-cni-plugins-g62mf","openshift-multus/network-metrics-daemon-22cz6","openshift-network-operator/iptables-alerter-2g9gn","kube-system/kube-apiserver-proxy-ip-10-0-138-237.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55","openshift-dns/node-resolver-g5ls5","openshift-multus/multus-c8z79","openshift-network-diagnostics/network-check-target-v6qts"]
Apr 17 09:20:57.824541 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.824519 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-22cz6"
Apr 17 09:20:57.824643 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:57.824616 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-22cz6" podUID="fba6f7ca-a68b-4315-91fd-d249cb9d13d1"
Apr 17 09:20:57.827146 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.827127 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fgpzd"
Apr 17 09:20:57.828596 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.828575 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nl7c6"
Apr 17 09:20:57.828687 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.828663 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-l7tg5"
Apr 17 09:20:57.829820 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.829798 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 09:20:57.829936 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.829878 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 09:20:57.830024 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.829989 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g62mf"
Apr 17 09:20:57.830243 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.830099 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-tvg9s\""
Apr 17 09:20:57.831280 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.831044 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 09:20:57.831280 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.831107 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 09:20:57.831280 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.831280 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 09:20:57.831509 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.831307 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 09:20:57.831509 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.831496 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gw65g\""
Apr 17 09:20:57.831617 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.831545 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 09:20:57.831754 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.831737 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-ldmbl\""
Apr 17 09:20:57.832434 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.832410 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 09:20:57.832540 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.832505 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 09:20:57.832590 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.832537 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 09:20:57.832693 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.832677 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.832875 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.832782 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2g9gn"
Apr 17 09:20:57.832965 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.832951 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 09:20:57.833198 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.833126 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 09:20:57.833198 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.833160 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-4xddh\""
Apr 17 09:20:57.834288 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.834272 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55"
Apr 17 09:20:57.835588 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.835513 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 09:20:57.835962 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.835943 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 09:20:57.836040 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.835997 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-s2p4j\""
Apr 17 09:20:57.836040 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.836024 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 09:20:57.836153 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.835943 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 09:20:57.836297 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.836281 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 09:20:57.836352 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.836286 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 09:20:57.836352 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.836304 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 09:20:57.836352 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.836346 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 09:20:57.836470 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.836288 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 09:20:57.836470 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.836397 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-nfd8z\""
Apr 17 09:20:57.836777 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.836760 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 09:20:57.836866 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.836847 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 09:20:57.836921 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.836867 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-m5qnc\""
Apr 17 09:20:57.836921 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.836881 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 09:20:57.837091 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.837074 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-g5ls5"
Apr 17 09:20:57.837255 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.837232 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.839585 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.839555 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 09:20:57.839723 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.839706 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 09:20:57.839802 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.839769 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6qts"
Apr 17 09:20:57.840069 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.839814 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-fc98p\""
Apr 17 09:20:57.840069 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:57.839827 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6qts" podUID="da8df46a-d006-4e3a-8a95-df428038ed39"
Apr 17 09:20:57.840069 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.839890 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 09:20:57.840540 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.840516 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-5zcxm\""
Apr 17 09:20:57.848717 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.848694 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-run-systemd\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.848802 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.848744 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/19bff221-f968-4a84-9891-8578f50203f2-ovn-node-metrics-cert\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.848802 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.848773 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0755186c-8ac0-47fe-abc7-dd4eae84ad55-tmp-dir\") pod \"node-resolver-g5ls5\" (UID: \"0755186c-8ac0-47fe-abc7-dd4eae84ad55\") " pod="openshift-dns/node-resolver-g5ls5"
Apr 17 09:20:57.848912 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.848805 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03476f8a-15ce-445d-b484-102c5da8fbed-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c7s55\" (UID: \"03476f8a-15ce-445d-b484-102c5da8fbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55"
Apr 17 09:20:57.848912 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.848829 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/03476f8a-15ce-445d-b484-102c5da8fbed-device-dir\") pod \"aws-ebs-csi-driver-node-c7s55\" (UID: \"03476f8a-15ce-445d-b484-102c5da8fbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55"
Apr 17 09:20:57.848912 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.848852 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-host-run-multus-certs\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.848912 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.848885 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-etc-systemd\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6"
Apr 17 09:20:57.849072 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.848912 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-lib-modules\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6"
Apr 17 09:20:57.849072 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.848936 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/19bff221-f968-4a84-9891-8578f50203f2-ovnkube-config\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.849072 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.848974 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2a16b37e-cd01-4bbd-9f94-16d59a30ae97-iptables-alerter-script\") pod \"iptables-alerter-2g9gn\" (UID: \"2a16b37e-cd01-4bbd-9f94-16d59a30ae97\") " pod="openshift-network-operator/iptables-alerter-2g9gn"
Apr 17 09:20:57.849072 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.848998 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/03476f8a-15ce-445d-b484-102c5da8fbed-socket-dir\") pod \"aws-ebs-csi-driver-node-c7s55\" (UID: \"03476f8a-15ce-445d-b484-102c5da8fbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55"
Apr 17 09:20:57.849072 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849022 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-multus-cni-dir\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.849072 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849046 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-etc-sysctl-d\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6"
Apr 17 09:20:57.849072 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849069 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-var-lib-kubelet\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6"
Apr 17 09:20:57.849399 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849092 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs\") pod \"network-metrics-daemon-22cz6\" (UID: \"fba6f7ca-a68b-4315-91fd-d249cb9d13d1\") " pod="openshift-multus/network-metrics-daemon-22cz6"
Apr 17 09:20:57.849399 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849115 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-etc-openvswitch\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.849399 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849142 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-host-slash\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.849399 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849189 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName:
\"kubernetes.io/host-path/03476f8a-15ce-445d-b484-102c5da8fbed-sys-fs\") pod \"aws-ebs-csi-driver-node-c7s55\" (UID: \"03476f8a-15ce-445d-b484-102c5da8fbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55"
Apr 17 09:20:57.849399 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849214 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/34bc662a-193e-440f-9c2e-1dee8a208524-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g62mf\" (UID: \"34bc662a-193e-440f-9c2e-1dee8a208524\") " pod="openshift-multus/multus-additional-cni-plugins-g62mf"
Apr 17 09:20:57.849399 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849238 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-systemd-units\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.849399 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849260 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-host-var-lib-cni-bin\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.849399 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849306 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34bc662a-193e-440f-9c2e-1dee8a208524-system-cni-dir\") pod \"multus-additional-cni-plugins-g62mf\" (UID: \"34bc662a-193e-440f-9c2e-1dee8a208524\") " pod="openshift-multus/multus-additional-cni-plugins-g62mf"
Apr 17 09:20:57.849399 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849344 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ce7cf71b-3181-4cb7-84c1-caec23780d1c-konnectivity-ca\") pod \"konnectivity-agent-fgpzd\" (UID: \"ce7cf71b-3181-4cb7-84c1-caec23780d1c\") " pod="kube-system/konnectivity-agent-fgpzd"
Apr 17 09:20:57.849399 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849372 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-host-kubelet\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.849399 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849399 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlrrl\" (UniqueName: \"kubernetes.io/projected/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-kube-api-access-dlrrl\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6"
Apr 17 09:20:57.849994 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849421 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-run-openvswitch\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.849994 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849458 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/19bff221-f968-4a84-9891-8578f50203f2-env-overrides\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.849994 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849482 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/03476f8a-15ce-445d-b484-102c5da8fbed-etc-selinux\") pod \"aws-ebs-csi-driver-node-c7s55\" (UID: \"03476f8a-15ce-445d-b484-102c5da8fbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55"
Apr 17 09:20:57.849994 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849504 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-cnibin\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.849994 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849527 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-multus-socket-dir-parent\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.849994 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849550 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-etc-sysctl-conf\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6"
Apr 17 09:20:57.849994 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849572 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/34bc662a-193e-440f-9c2e-1dee8a208524-cnibin\") pod \"multus-additional-cni-plugins-g62mf\" (UID: \"34bc662a-193e-440f-9c2e-1dee8a208524\") " pod="openshift-multus/multus-additional-cni-plugins-g62mf"
Apr 17 09:20:57.849994 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849596 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x24pr\" (UniqueName: \"kubernetes.io/projected/2a16b37e-cd01-4bbd-9f94-16d59a30ae97-kube-api-access-x24pr\") pod \"iptables-alerter-2g9gn\" (UID: \"2a16b37e-cd01-4bbd-9f94-16d59a30ae97\") " pod="openshift-network-operator/iptables-alerter-2g9gn"
Apr 17 09:20:57.849994 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849619 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-host-var-lib-cni-multus\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.849994 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849645 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lthfs\" (UniqueName: \"kubernetes.io/projected/7a3125fc-e8c4-420c-8d7b-684643355422-kube-api-access-lthfs\") pod \"node-ca-l7tg5\" (UID: \"7a3125fc-e8c4-420c-8d7b-684643355422\") " pod="openshift-image-registry/node-ca-l7tg5"
Apr 17 09:20:57.849994 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849673 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0755186c-8ac0-47fe-abc7-dd4eae84ad55-hosts-file\") pod \"node-resolver-g5ls5\" (UID: \"0755186c-8ac0-47fe-abc7-dd4eae84ad55\") " pod="openshift-dns/node-resolver-g5ls5"
Apr 17 09:20:57.849994 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849715 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt4tb\" (UniqueName: \"kubernetes.io/projected/da8df46a-d006-4e3a-8a95-df428038ed39-kube-api-access-vt4tb\") pod \"network-check-target-v6qts\" (UID: \"da8df46a-d006-4e3a-8a95-df428038ed39\") " pod="openshift-network-diagnostics/network-check-target-v6qts"
Apr 17 09:20:57.849994 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849749 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/19bff221-f968-4a84-9891-8578f50203f2-ovnkube-script-lib\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.849994 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849780 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/34bc662a-193e-440f-9c2e-1dee8a208524-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g62mf\" (UID: \"34bc662a-193e-440f-9c2e-1dee8a208524\") " pod="openshift-multus/multus-additional-cni-plugins-g62mf"
Apr 17 09:20:57.849994 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849805 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-os-release\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.849994 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849829 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e2050949-863c-4e07-8b7f-adfdaf82601d-multus-daemon-config\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.850667 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849851 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-run\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6"
Apr 17 09:20:57.850667 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849874 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7a3125fc-e8c4-420c-8d7b-684643355422-serviceca\") pod \"node-ca-l7tg5\" (UID: \"7a3125fc-e8c4-420c-8d7b-684643355422\") " pod="openshift-image-registry/node-ca-l7tg5"
Apr 17 09:20:57.850667 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849897 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-host-var-lib-kubelet\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.850667 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849920 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-multus-conf-dir\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.850667 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849944 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName:
\"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-etc-sysconfig\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6" Apr 17 09:20:57.850667 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.849968 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/34bc662a-193e-440f-9c2e-1dee8a208524-cni-binary-copy\") pod \"multus-additional-cni-plugins-g62mf\" (UID: \"34bc662a-193e-440f-9c2e-1dee8a208524\") " pod="openshift-multus/multus-additional-cni-plugins-g62mf" Apr 17 09:20:57.850667 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.850011 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bbrk\" (UniqueName: \"kubernetes.io/projected/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-kube-api-access-7bbrk\") pod \"network-metrics-daemon-22cz6\" (UID: \"fba6f7ca-a68b-4315-91fd-d249cb9d13d1\") " pod="openshift-multus/network-metrics-daemon-22cz6" Apr 17 09:20:57.850667 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.850052 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-node-log\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:57.850667 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.850079 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-host-run-k8s-cni-cncf-io\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79" Apr 17 09:20:57.850667 ip-10-0-138-237 kubenswrapper[2574]: 
I0417 09:20:57.850103 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-etc-modprobe-d\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6" Apr 17 09:20:57.850667 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.850127 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-etc-tuned\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6" Apr 17 09:20:57.850667 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.850151 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwcb8\" (UniqueName: \"kubernetes.io/projected/0755186c-8ac0-47fe-abc7-dd4eae84ad55-kube-api-access-dwcb8\") pod \"node-resolver-g5ls5\" (UID: \"0755186c-8ac0-47fe-abc7-dd4eae84ad55\") " pod="openshift-dns/node-resolver-g5ls5" Apr 17 09:20:57.850667 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.850189 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-var-lib-openvswitch\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:57.850667 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.850226 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-log-socket\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:57.850667 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.850267 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-host-run-ovn-kubernetes\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:57.850667 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.850301 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-host-cni-bin\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:57.850667 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.850327 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-hostroot\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79" Apr 17 09:20:57.851365 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.850350 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-etc-kubernetes\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6" Apr 17 09:20:57.851365 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.850374 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-tmp\") pod 
\"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6" Apr 17 09:20:57.851365 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.850402 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-host-run-netns\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:57.851365 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.850425 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-host-cni-netd\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:57.851365 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.850449 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:57.851365 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.850489 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx85m\" (UniqueName: \"kubernetes.io/projected/e2050949-863c-4e07-8b7f-adfdaf82601d-kube-api-access-tx85m\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79" Apr 17 09:20:57.851365 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.850524 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrpbj\" (UniqueName: \"kubernetes.io/projected/03476f8a-15ce-445d-b484-102c5da8fbed-kube-api-access-hrpbj\") pod \"aws-ebs-csi-driver-node-c7s55\" (UID: \"03476f8a-15ce-445d-b484-102c5da8fbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55" Apr 17 09:20:57.851365 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.850550 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e2050949-863c-4e07-8b7f-adfdaf82601d-cni-binary-copy\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79" Apr 17 09:20:57.851365 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.850573 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-run-ovn\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:57.851365 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.850609 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-system-cni-dir\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79" Apr 17 09:20:57.851365 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.850632 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-host-run-netns\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79" Apr 17 09:20:57.851365 
ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.850655 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-etc-kubernetes\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79" Apr 17 09:20:57.851365 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.850680 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/34bc662a-193e-440f-9c2e-1dee8a208524-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g62mf\" (UID: \"34bc662a-193e-440f-9c2e-1dee8a208524\") " pod="openshift-multus/multus-additional-cni-plugins-g62mf" Apr 17 09:20:57.851365 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.850998 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlwvp\" (UniqueName: \"kubernetes.io/projected/34bc662a-193e-440f-9c2e-1dee8a208524-kube-api-access-xlwvp\") pod \"multus-additional-cni-plugins-g62mf\" (UID: \"34bc662a-193e-440f-9c2e-1dee8a208524\") " pod="openshift-multus/multus-additional-cni-plugins-g62mf" Apr 17 09:20:57.851365 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.851038 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ce7cf71b-3181-4cb7-84c1-caec23780d1c-agent-certs\") pod \"konnectivity-agent-fgpzd\" (UID: \"ce7cf71b-3181-4cb7-84c1-caec23780d1c\") " pod="kube-system/konnectivity-agent-fgpzd" Apr 17 09:20:57.851365 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.851146 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7f7w\" (UniqueName: 
\"kubernetes.io/projected/19bff221-f968-4a84-9891-8578f50203f2-kube-api-access-p7f7w\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:57.852063 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.851197 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2a16b37e-cd01-4bbd-9f94-16d59a30ae97-host-slash\") pod \"iptables-alerter-2g9gn\" (UID: \"2a16b37e-cd01-4bbd-9f94-16d59a30ae97\") " pod="openshift-network-operator/iptables-alerter-2g9gn" Apr 17 09:20:57.852063 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.851259 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/03476f8a-15ce-445d-b484-102c5da8fbed-registration-dir\") pod \"aws-ebs-csi-driver-node-c7s55\" (UID: \"03476f8a-15ce-445d-b484-102c5da8fbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55" Apr 17 09:20:57.852063 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.851298 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-host\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6" Apr 17 09:20:57.852063 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.851333 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7a3125fc-e8c4-420c-8d7b-684643355422-host\") pod \"node-ca-l7tg5\" (UID: \"7a3125fc-e8c4-420c-8d7b-684643355422\") " pod="openshift-image-registry/node-ca-l7tg5" Apr 17 09:20:57.852063 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.851365 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-sys\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6" Apr 17 09:20:57.852063 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.851427 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/34bc662a-193e-440f-9c2e-1dee8a208524-os-release\") pod \"multus-additional-cni-plugins-g62mf\" (UID: \"34bc662a-193e-440f-9c2e-1dee8a208524\") " pod="openshift-multus/multus-additional-cni-plugins-g62mf" Apr 17 09:20:57.877841 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.877818 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 09:15:56 +0000 UTC" deadline="2027-10-08 23:42:53.228956682 +0000 UTC" Apr 17 09:20:57.877841 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.877839 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12950h21m55.351119252s" Apr 17 09:20:57.938389 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.938367 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 09:20:57.951768 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.951742 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs\") pod \"network-metrics-daemon-22cz6\" (UID: \"fba6f7ca-a68b-4315-91fd-d249cb9d13d1\") " pod="openshift-multus/network-metrics-daemon-22cz6" Apr 17 09:20:57.951898 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.951782 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-etc-openvswitch\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:57.951898 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.951806 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-host-slash\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:57.951898 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.951832 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/03476f8a-15ce-445d-b484-102c5da8fbed-sys-fs\") pod \"aws-ebs-csi-driver-node-c7s55\" (UID: \"03476f8a-15ce-445d-b484-102c5da8fbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55" Apr 17 09:20:57.951898 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.951863 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/34bc662a-193e-440f-9c2e-1dee8a208524-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g62mf\" (UID: \"34bc662a-193e-440f-9c2e-1dee8a208524\") " pod="openshift-multus/multus-additional-cni-plugins-g62mf" Apr 17 09:20:57.951898 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.951881 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-host-slash\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:57.952186 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.951882 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-etc-openvswitch\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:57.952186 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.951917 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/03476f8a-15ce-445d-b484-102c5da8fbed-sys-fs\") pod \"aws-ebs-csi-driver-node-c7s55\" (UID: \"03476f8a-15ce-445d-b484-102c5da8fbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55" Apr 17 09:20:57.952186 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.951928 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-systemd-units\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:57.952186 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.951888 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-systemd-units\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:57.952186 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.951964 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-host-var-lib-cni-bin\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79" Apr 17 09:20:57.952186 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.951982 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34bc662a-193e-440f-9c2e-1dee8a208524-system-cni-dir\") pod \"multus-additional-cni-plugins-g62mf\" (UID: \"34bc662a-193e-440f-9c2e-1dee8a208524\") " pod="openshift-multus/multus-additional-cni-plugins-g62mf" Apr 17 09:20:57.952186 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952002 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ce7cf71b-3181-4cb7-84c1-caec23780d1c-konnectivity-ca\") pod \"konnectivity-agent-fgpzd\" (UID: \"ce7cf71b-3181-4cb7-84c1-caec23780d1c\") " pod="kube-system/konnectivity-agent-fgpzd" Apr 17 09:20:57.952186 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952025 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-host-kubelet\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:57.952186 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952031 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/34bc662a-193e-440f-9c2e-1dee8a208524-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g62mf\" (UID: \"34bc662a-193e-440f-9c2e-1dee8a208524\") " pod="openshift-multus/multus-additional-cni-plugins-g62mf" Apr 17 09:20:57.952186 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952040 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/34bc662a-193e-440f-9c2e-1dee8a208524-system-cni-dir\") pod \"multus-additional-cni-plugins-g62mf\" (UID: \"34bc662a-193e-440f-9c2e-1dee8a208524\") " pod="openshift-multus/multus-additional-cni-plugins-g62mf" Apr 17 09:20:57.952186 
ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:57.952049 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:20:57.952186 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952051 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dlrrl\" (UniqueName: \"kubernetes.io/projected/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-kube-api-access-dlrrl\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6" Apr 17 09:20:57.952186 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952043 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-host-var-lib-cni-bin\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79" Apr 17 09:20:57.952186 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952087 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-host-kubelet\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:57.952719 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:57.952208 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs podName:fba6f7ca-a68b-4315-91fd-d249cb9d13d1 nodeName:}" failed. No retries permitted until 2026-04-17 09:20:58.452156559 +0000 UTC m=+3.048208452 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs") pod "network-metrics-daemon-22cz6" (UID: "fba6f7ca-a68b-4315-91fd-d249cb9d13d1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:20:57.952719 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952233 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-run-openvswitch\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:57.952719 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952262 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/19bff221-f968-4a84-9891-8578f50203f2-env-overrides\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:57.952719 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952284 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-run-openvswitch\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:57.952719 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952286 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/03476f8a-15ce-445d-b484-102c5da8fbed-etc-selinux\") pod \"aws-ebs-csi-driver-node-c7s55\" (UID: \"03476f8a-15ce-445d-b484-102c5da8fbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55" Apr 17 09:20:57.952719 ip-10-0-138-237 kubenswrapper[2574]: I0417 
09:20:57.952325 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-cnibin\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.952719 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952359 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-multus-socket-dir-parent\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.952719 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952377 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/03476f8a-15ce-445d-b484-102c5da8fbed-etc-selinux\") pod \"aws-ebs-csi-driver-node-c7s55\" (UID: \"03476f8a-15ce-445d-b484-102c5da8fbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55"
Apr 17 09:20:57.952719 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952385 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-etc-sysctl-conf\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6"
Apr 17 09:20:57.952719 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952428 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-multus-socket-dir-parent\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.952719 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952431 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/34bc662a-193e-440f-9c2e-1dee8a208524-cnibin\") pod \"multus-additional-cni-plugins-g62mf\" (UID: \"34bc662a-193e-440f-9c2e-1dee8a208524\") " pod="openshift-multus/multus-additional-cni-plugins-g62mf"
Apr 17 09:20:57.952719 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952436 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-cnibin\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.952719 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952463 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x24pr\" (UniqueName: \"kubernetes.io/projected/2a16b37e-cd01-4bbd-9f94-16d59a30ae97-kube-api-access-x24pr\") pod \"iptables-alerter-2g9gn\" (UID: \"2a16b37e-cd01-4bbd-9f94-16d59a30ae97\") " pod="openshift-network-operator/iptables-alerter-2g9gn"
Apr 17 09:20:57.952719 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952492 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-host-var-lib-cni-multus\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.952719 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952519 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-etc-sysctl-conf\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6"
Apr 17 09:20:57.952719 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952521 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lthfs\" (UniqueName: \"kubernetes.io/projected/7a3125fc-e8c4-420c-8d7b-684643355422-kube-api-access-lthfs\") pod \"node-ca-l7tg5\" (UID: \"7a3125fc-e8c4-420c-8d7b-684643355422\") " pod="openshift-image-registry/node-ca-l7tg5"
Apr 17 09:20:57.952719 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952465 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/34bc662a-193e-440f-9c2e-1dee8a208524-cnibin\") pod \"multus-additional-cni-plugins-g62mf\" (UID: \"34bc662a-193e-440f-9c2e-1dee8a208524\") " pod="openshift-multus/multus-additional-cni-plugins-g62mf"
Apr 17 09:20:57.953602 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952584 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-host-var-lib-cni-multus\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.953602 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952617 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0755186c-8ac0-47fe-abc7-dd4eae84ad55-hosts-file\") pod \"node-resolver-g5ls5\" (UID: \"0755186c-8ac0-47fe-abc7-dd4eae84ad55\") " pod="openshift-dns/node-resolver-g5ls5"
Apr 17 09:20:57.953602 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952652 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vt4tb\" (UniqueName: \"kubernetes.io/projected/da8df46a-d006-4e3a-8a95-df428038ed39-kube-api-access-vt4tb\") pod \"network-check-target-v6qts\" (UID: \"da8df46a-d006-4e3a-8a95-df428038ed39\") " pod="openshift-network-diagnostics/network-check-target-v6qts"
Apr 17 09:20:57.953602 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952654 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ce7cf71b-3181-4cb7-84c1-caec23780d1c-konnectivity-ca\") pod \"konnectivity-agent-fgpzd\" (UID: \"ce7cf71b-3181-4cb7-84c1-caec23780d1c\") " pod="kube-system/konnectivity-agent-fgpzd"
Apr 17 09:20:57.953602 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952677 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/19bff221-f968-4a84-9891-8578f50203f2-ovnkube-script-lib\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.953602 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952683 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0755186c-8ac0-47fe-abc7-dd4eae84ad55-hosts-file\") pod \"node-resolver-g5ls5\" (UID: \"0755186c-8ac0-47fe-abc7-dd4eae84ad55\") " pod="openshift-dns/node-resolver-g5ls5"
Apr 17 09:20:57.953602 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952704 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/34bc662a-193e-440f-9c2e-1dee8a208524-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g62mf\" (UID: \"34bc662a-193e-440f-9c2e-1dee8a208524\") " pod="openshift-multus/multus-additional-cni-plugins-g62mf"
Apr 17 09:20:57.953602 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952730 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-os-release\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.953602 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952754 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e2050949-863c-4e07-8b7f-adfdaf82601d-multus-daemon-config\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.953602 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952779 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-run\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6"
Apr 17 09:20:57.953602 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952804 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7a3125fc-e8c4-420c-8d7b-684643355422-serviceca\") pod \"node-ca-l7tg5\" (UID: \"7a3125fc-e8c4-420c-8d7b-684643355422\") " pod="openshift-image-registry/node-ca-l7tg5"
Apr 17 09:20:57.953602 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952806 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/19bff221-f968-4a84-9891-8578f50203f2-env-overrides\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.953602 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952817 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-os-release\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.953602 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952828 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-host-var-lib-kubelet\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.953602 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952853 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-multus-conf-dir\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.953602 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952870 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-run\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6"
Apr 17 09:20:57.953602 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952877 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-etc-sysconfig\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6"
Apr 17 09:20:57.953602 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952921 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-host-var-lib-kubelet\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.954546 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952931 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-etc-sysconfig\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6"
Apr 17 09:20:57.954546 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952948 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/34bc662a-193e-440f-9c2e-1dee8a208524-cni-binary-copy\") pod \"multus-additional-cni-plugins-g62mf\" (UID: \"34bc662a-193e-440f-9c2e-1dee8a208524\") " pod="openshift-multus/multus-additional-cni-plugins-g62mf"
Apr 17 09:20:57.954546 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952973 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-multus-conf-dir\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.954546 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.952975 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bbrk\" (UniqueName: \"kubernetes.io/projected/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-kube-api-access-7bbrk\") pod \"network-metrics-daemon-22cz6\" (UID: \"fba6f7ca-a68b-4315-91fd-d249cb9d13d1\") " pod="openshift-multus/network-metrics-daemon-22cz6"
Apr 17 09:20:57.954546 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953002 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-node-log\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.954546 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953028 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-host-run-k8s-cni-cncf-io\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.954546 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953053 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-etc-modprobe-d\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6"
Apr 17 09:20:57.954546 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953077 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-etc-tuned\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6"
Apr 17 09:20:57.954546 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953092 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/34bc662a-193e-440f-9c2e-1dee8a208524-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g62mf\" (UID: \"34bc662a-193e-440f-9c2e-1dee8a208524\") " pod="openshift-multus/multus-additional-cni-plugins-g62mf"
Apr 17 09:20:57.954546 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953101 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwcb8\" (UniqueName: \"kubernetes.io/projected/0755186c-8ac0-47fe-abc7-dd4eae84ad55-kube-api-access-dwcb8\") pod \"node-resolver-g5ls5\" (UID: \"0755186c-8ac0-47fe-abc7-dd4eae84ad55\") " pod="openshift-dns/node-resolver-g5ls5"
Apr 17 09:20:57.954546 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953137 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-host-run-k8s-cni-cncf-io\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.954546 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953149 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-var-lib-openvswitch\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.954546 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953163 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-node-log\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.954546 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953194 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-log-socket\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.954546 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953220 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-host-run-ovn-kubernetes\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.954546 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953244 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-host-cni-bin\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.954546 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953246 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/19bff221-f968-4a84-9891-8578f50203f2-ovnkube-script-lib\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.955379 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953283 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7a3125fc-e8c4-420c-8d7b-684643355422-serviceca\") pod \"node-ca-l7tg5\" (UID: \"7a3125fc-e8c4-420c-8d7b-684643355422\") " pod="openshift-image-registry/node-ca-l7tg5"
Apr 17 09:20:57.955379 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953316 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-log-socket\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.955379 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953333 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-host-run-ovn-kubernetes\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.955379 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953370 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-hostroot\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.955379 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953270 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-hostroot\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.955379 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953382 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-host-cni-bin\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.955379 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953398 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e2050949-863c-4e07-8b7f-adfdaf82601d-multus-daemon-config\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.955379 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953405 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 09:20:57.955379 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953426 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-etc-kubernetes\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6"
Apr 17 09:20:57.955379 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953437 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-var-lib-openvswitch\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.955379 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953453 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-tmp\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6"
Apr 17 09:20:57.955379 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953474 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-etc-modprobe-d\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6"
Apr 17 09:20:57.955379 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953483 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-host-run-netns\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.955379 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953491 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-etc-kubernetes\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6"
Apr 17 09:20:57.955379 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953506 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-host-cni-netd\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.955379 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953526 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-host-run-netns\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.955379 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953531 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.955379 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953539 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-host-cni-netd\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.956144 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953559 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx85m\" (UniqueName: \"kubernetes.io/projected/e2050949-863c-4e07-8b7f-adfdaf82601d-kube-api-access-tx85m\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.956144 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953563 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.956144 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953585 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrpbj\" (UniqueName: \"kubernetes.io/projected/03476f8a-15ce-445d-b484-102c5da8fbed-kube-api-access-hrpbj\") pod \"aws-ebs-csi-driver-node-c7s55\" (UID: \"03476f8a-15ce-445d-b484-102c5da8fbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55"
Apr 17 09:20:57.956144 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953611 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e2050949-863c-4e07-8b7f-adfdaf82601d-cni-binary-copy\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.956144 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953635 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-run-ovn\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.956144 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953659 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-system-cni-dir\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.956144 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953684 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-host-run-netns\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.956144 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953708 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-etc-kubernetes\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.956144 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953735 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/34bc662a-193e-440f-9c2e-1dee8a208524-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g62mf\" (UID: \"34bc662a-193e-440f-9c2e-1dee8a208524\") " pod="openshift-multus/multus-additional-cni-plugins-g62mf"
Apr 17 09:20:57.956144 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953738 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-run-ovn\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.956144 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953784 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-host-run-netns\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.956144 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953808 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-etc-kubernetes\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.956144 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953839 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xlwvp\" (UniqueName: \"kubernetes.io/projected/34bc662a-193e-440f-9c2e-1dee8a208524-kube-api-access-xlwvp\") pod \"multus-additional-cni-plugins-g62mf\" (UID: \"34bc662a-193e-440f-9c2e-1dee8a208524\") " pod="openshift-multus/multus-additional-cni-plugins-g62mf"
Apr 17 09:20:57.956144 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953855 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-system-cni-dir\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.956144 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953863 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ce7cf71b-3181-4cb7-84c1-caec23780d1c-agent-certs\") pod \"konnectivity-agent-fgpzd\" (UID: \"ce7cf71b-3181-4cb7-84c1-caec23780d1c\") " pod="kube-system/konnectivity-agent-fgpzd"
Apr 17 09:20:57.956144 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953900 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7f7w\" (UniqueName: \"kubernetes.io/projected/19bff221-f968-4a84-9891-8578f50203f2-kube-api-access-p7f7w\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.956144 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953927 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2a16b37e-cd01-4bbd-9f94-16d59a30ae97-host-slash\") pod \"iptables-alerter-2g9gn\" (UID: \"2a16b37e-cd01-4bbd-9f94-16d59a30ae97\") " pod="openshift-network-operator/iptables-alerter-2g9gn"
Apr 17 09:20:57.956871 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953951 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/03476f8a-15ce-445d-b484-102c5da8fbed-registration-dir\") pod \"aws-ebs-csi-driver-node-c7s55\" (UID: \"03476f8a-15ce-445d-b484-102c5da8fbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55"
Apr 17 09:20:57.956871 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953973 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-host\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6"
Apr 17 09:20:57.956871 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.953993 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7a3125fc-e8c4-420c-8d7b-684643355422-host\") pod \"node-ca-l7tg5\" (UID: \"7a3125fc-e8c4-420c-8d7b-684643355422\") " pod="openshift-image-registry/node-ca-l7tg5"
Apr 17 09:20:57.956871 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954027 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-sys\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6"
Apr 17 09:20:57.956871 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954047 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/34bc662a-193e-440f-9c2e-1dee8a208524-os-release\") pod \"multus-additional-cni-plugins-g62mf\" (UID: \"34bc662a-193e-440f-9c2e-1dee8a208524\") " pod="openshift-multus/multus-additional-cni-plugins-g62mf"
Apr 17 09:20:57.956871 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954066 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-run-systemd\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.956871 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954088 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/19bff221-f968-4a84-9891-8578f50203f2-ovn-node-metrics-cert\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.956871 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954107 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0755186c-8ac0-47fe-abc7-dd4eae84ad55-tmp-dir\") pod \"node-resolver-g5ls5\" (UID: \"0755186c-8ac0-47fe-abc7-dd4eae84ad55\") " pod="openshift-dns/node-resolver-g5ls5"
Apr 17 09:20:57.956871 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954127 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03476f8a-15ce-445d-b484-102c5da8fbed-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c7s55\" (UID: \"03476f8a-15ce-445d-b484-102c5da8fbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55"
Apr 17 09:20:57.956871 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954147 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/03476f8a-15ce-445d-b484-102c5da8fbed-device-dir\") pod \"aws-ebs-csi-driver-node-c7s55\" (UID: \"03476f8a-15ce-445d-b484-102c5da8fbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55"
Apr 17 09:20:57.956871 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954145 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/34bc662a-193e-440f-9c2e-1dee8a208524-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g62mf\" (UID: \"34bc662a-193e-440f-9c2e-1dee8a208524\") " pod="openshift-multus/multus-additional-cni-plugins-g62mf"
Apr 17 09:20:57.956871 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954184 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-host-run-multus-certs\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.956871 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954197 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/34bc662a-193e-440f-9c2e-1dee8a208524-cni-binary-copy\") pod \"multus-additional-cni-plugins-g62mf\" (UID: \"34bc662a-193e-440f-9c2e-1dee8a208524\") " pod="openshift-multus/multus-additional-cni-plugins-g62mf"
Apr 17 09:20:57.956871 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954206 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-etc-systemd\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6"
Apr 17 09:20:57.956871 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954228 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-lib-modules\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6"
Apr 17 09:20:57.956871 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954248 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/19bff221-f968-4a84-9891-8578f50203f2-ovnkube-config\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:20:57.956871 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954268 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/34bc662a-193e-440f-9c2e-1dee8a208524-os-release\") pod \"multus-additional-cni-plugins-g62mf\" (UID: \"34bc662a-193e-440f-9c2e-1dee8a208524\") " pod="openshift-multus/multus-additional-cni-plugins-g62mf"
Apr 17 09:20:57.957565 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954270 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2a16b37e-cd01-4bbd-9f94-16d59a30ae97-iptables-alerter-script\") pod \"iptables-alerter-2g9gn\" (UID: \"2a16b37e-cd01-4bbd-9f94-16d59a30ae97\") " pod="openshift-network-operator/iptables-alerter-2g9gn"
Apr 17 09:20:57.957565 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954309 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/03476f8a-15ce-445d-b484-102c5da8fbed-socket-dir\") pod \"aws-ebs-csi-driver-node-c7s55\" (UID: \"03476f8a-15ce-445d-b484-102c5da8fbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55"
Apr 17 09:20:57.957565 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954334 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-multus-cni-dir\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79"
Apr 17 09:20:57.957565 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954367 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-etc-sysctl-d\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6"
Apr 17 09:20:57.957565 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954395 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName:
\"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-var-lib-kubelet\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6" Apr 17 09:20:57.957565 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954405 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/03476f8a-15ce-445d-b484-102c5da8fbed-device-dir\") pod \"aws-ebs-csi-driver-node-c7s55\" (UID: \"03476f8a-15ce-445d-b484-102c5da8fbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55" Apr 17 09:20:57.957565 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954514 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-sys\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6" Apr 17 09:20:57.957565 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954764 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7a3125fc-e8c4-420c-8d7b-684643355422-host\") pod \"node-ca-l7tg5\" (UID: \"7a3125fc-e8c4-420c-8d7b-684643355422\") " pod="openshift-image-registry/node-ca-l7tg5" Apr 17 09:20:57.957565 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954781 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e2050949-863c-4e07-8b7f-adfdaf82601d-cni-binary-copy\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79" Apr 17 09:20:57.957565 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954786 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/2a16b37e-cd01-4bbd-9f94-16d59a30ae97-iptables-alerter-script\") pod \"iptables-alerter-2g9gn\" (UID: \"2a16b37e-cd01-4bbd-9f94-16d59a30ae97\") " pod="openshift-network-operator/iptables-alerter-2g9gn" Apr 17 09:20:57.957565 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954812 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2a16b37e-cd01-4bbd-9f94-16d59a30ae97-host-slash\") pod \"iptables-alerter-2g9gn\" (UID: \"2a16b37e-cd01-4bbd-9f94-16d59a30ae97\") " pod="openshift-network-operator/iptables-alerter-2g9gn" Apr 17 09:20:57.957565 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954848 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-host-run-multus-certs\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79" Apr 17 09:20:57.957565 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954885 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-etc-systemd\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6" Apr 17 09:20:57.957565 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.954980 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03476f8a-15ce-445d-b484-102c5da8fbed-kubelet-dir\") pod \"aws-ebs-csi-driver-node-c7s55\" (UID: \"03476f8a-15ce-445d-b484-102c5da8fbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55" Apr 17 09:20:57.957565 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.955024 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" 
(UniqueName: \"kubernetes.io/host-path/03476f8a-15ce-445d-b484-102c5da8fbed-socket-dir\") pod \"aws-ebs-csi-driver-node-c7s55\" (UID: \"03476f8a-15ce-445d-b484-102c5da8fbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55" Apr 17 09:20:57.957565 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.955063 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-etc-sysctl-d\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6" Apr 17 09:20:57.957565 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.955070 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-host\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6" Apr 17 09:20:57.957565 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.955103 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/19bff221-f968-4a84-9891-8578f50203f2-run-systemd\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:57.958361 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.955214 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e2050949-863c-4e07-8b7f-adfdaf82601d-multus-cni-dir\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79" Apr 17 09:20:57.958361 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.955236 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/03476f8a-15ce-445d-b484-102c5da8fbed-registration-dir\") pod \"aws-ebs-csi-driver-node-c7s55\" (UID: \"03476f8a-15ce-445d-b484-102c5da8fbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55" Apr 17 09:20:57.958361 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.955330 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-lib-modules\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6" Apr 17 09:20:57.958361 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.955334 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-var-lib-kubelet\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6" Apr 17 09:20:57.958361 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.955351 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0755186c-8ac0-47fe-abc7-dd4eae84ad55-tmp-dir\") pod \"node-resolver-g5ls5\" (UID: \"0755186c-8ac0-47fe-abc7-dd4eae84ad55\") " pod="openshift-dns/node-resolver-g5ls5" Apr 17 09:20:57.958361 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.955758 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/19bff221-f968-4a84-9891-8578f50203f2-ovnkube-config\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:57.958361 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.956932 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-tmp\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6" Apr 17 09:20:57.958361 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.957126 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ce7cf71b-3181-4cb7-84c1-caec23780d1c-agent-certs\") pod \"konnectivity-agent-fgpzd\" (UID: \"ce7cf71b-3181-4cb7-84c1-caec23780d1c\") " pod="kube-system/konnectivity-agent-fgpzd" Apr 17 09:20:57.958361 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.957660 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-etc-tuned\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6" Apr 17 09:20:57.958361 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.957741 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/19bff221-f968-4a84-9891-8578f50203f2-ovn-node-metrics-cert\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:57.960231 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:57.960208 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 09:20:57.960231 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:57.960233 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 09:20:57.960376 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:57.960245 2574 projected.go:194] Error 
preparing data for projected volume kube-api-access-vt4tb for pod openshift-network-diagnostics/network-check-target-v6qts: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:20:57.960376 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:57.960301 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da8df46a-d006-4e3a-8a95-df428038ed39-kube-api-access-vt4tb podName:da8df46a-d006-4e3a-8a95-df428038ed39 nodeName:}" failed. No retries permitted until 2026-04-17 09:20:58.460283634 +0000 UTC m=+3.056335520 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-vt4tb" (UniqueName: "kubernetes.io/projected/da8df46a-d006-4e3a-8a95-df428038ed39-kube-api-access-vt4tb") pod "network-check-target-v6qts" (UID: "da8df46a-d006-4e3a-8a95-df428038ed39") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:20:57.960693 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.960667 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlrrl\" (UniqueName: \"kubernetes.io/projected/a0dde629-4b00-4d8e-8f44-daa979a1e1a8-kube-api-access-dlrrl\") pod \"tuned-nl7c6\" (UID: \"a0dde629-4b00-4d8e-8f44-daa979a1e1a8\") " pod="openshift-cluster-node-tuning-operator/tuned-nl7c6" Apr 17 09:20:57.962532 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.962472 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lthfs\" (UniqueName: \"kubernetes.io/projected/7a3125fc-e8c4-420c-8d7b-684643355422-kube-api-access-lthfs\") pod \"node-ca-l7tg5\" (UID: \"7a3125fc-e8c4-420c-8d7b-684643355422\") " pod="openshift-image-registry/node-ca-l7tg5" Apr 17 09:20:57.962532 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.962490 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx85m\" (UniqueName: \"kubernetes.io/projected/e2050949-863c-4e07-8b7f-adfdaf82601d-kube-api-access-tx85m\") pod \"multus-c8z79\" (UID: \"e2050949-863c-4e07-8b7f-adfdaf82601d\") " pod="openshift-multus/multus-c8z79" Apr 17 09:20:57.963640 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.963586 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwcb8\" (UniqueName: \"kubernetes.io/projected/0755186c-8ac0-47fe-abc7-dd4eae84ad55-kube-api-access-dwcb8\") pod \"node-resolver-g5ls5\" (UID: \"0755186c-8ac0-47fe-abc7-dd4eae84ad55\") " pod="openshift-dns/node-resolver-g5ls5" Apr 17 09:20:57.964132 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.964093 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlwvp\" (UniqueName: \"kubernetes.io/projected/34bc662a-193e-440f-9c2e-1dee8a208524-kube-api-access-xlwvp\") pod \"multus-additional-cni-plugins-g62mf\" (UID: \"34bc662a-193e-440f-9c2e-1dee8a208524\") " pod="openshift-multus/multus-additional-cni-plugins-g62mf" Apr 17 09:20:57.964418 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.964345 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bbrk\" (UniqueName: \"kubernetes.io/projected/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-kube-api-access-7bbrk\") pod \"network-metrics-daemon-22cz6\" (UID: \"fba6f7ca-a68b-4315-91fd-d249cb9d13d1\") " pod="openshift-multus/network-metrics-daemon-22cz6" Apr 17 09:20:57.964749 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.964661 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrpbj\" (UniqueName: \"kubernetes.io/projected/03476f8a-15ce-445d-b484-102c5da8fbed-kube-api-access-hrpbj\") pod \"aws-ebs-csi-driver-node-c7s55\" (UID: \"03476f8a-15ce-445d-b484-102c5da8fbed\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55" Apr 
17 09:20:57.965334 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.965274 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x24pr\" (UniqueName: \"kubernetes.io/projected/2a16b37e-cd01-4bbd-9f94-16d59a30ae97-kube-api-access-x24pr\") pod \"iptables-alerter-2g9gn\" (UID: \"2a16b37e-cd01-4bbd-9f94-16d59a30ae97\") " pod="openshift-network-operator/iptables-alerter-2g9gn" Apr 17 09:20:57.965535 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:57.965503 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7f7w\" (UniqueName: \"kubernetes.io/projected/19bff221-f968-4a84-9891-8578f50203f2-kube-api-access-p7f7w\") pod \"ovnkube-node-rb9kg\" (UID: \"19bff221-f968-4a84-9891-8578f50203f2\") " pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:58.140033 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:58.139990 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fgpzd" Apr 17 09:20:58.145824 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:58.145802 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nl7c6" Apr 17 09:20:58.154679 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:58.154660 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-l7tg5" Apr 17 09:20:58.160231 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:58.160213 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g62mf" Apr 17 09:20:58.166828 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:58.166811 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:20:58.173377 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:58.173358 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2g9gn" Apr 17 09:20:58.179918 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:58.179899 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55" Apr 17 09:20:58.187478 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:58.187458 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-g5ls5" Apr 17 09:20:58.193080 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:58.193055 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-c8z79" Apr 17 09:20:58.292511 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:58.292487 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 09:20:58.349128 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:58.349095 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 09:20:58.457211 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:58.457123 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs\") pod \"network-metrics-daemon-22cz6\" (UID: \"fba6f7ca-a68b-4315-91fd-d249cb9d13d1\") " pod="openshift-multus/network-metrics-daemon-22cz6" Apr 17 09:20:58.457355 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:58.457276 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:20:58.457396 ip-10-0-138-237 
kubenswrapper[2574]: E0417 09:20:58.457354 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs podName:fba6f7ca-a68b-4315-91fd-d249cb9d13d1 nodeName:}" failed. No retries permitted until 2026-04-17 09:20:59.457337008 +0000 UTC m=+4.053388884 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs") pod "network-metrics-daemon-22cz6" (UID: "fba6f7ca-a68b-4315-91fd-d249cb9d13d1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:20:58.517071 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:58.517041 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34bc662a_193e_440f_9c2e_1dee8a208524.slice/crio-62c977d6ce6581777277940ab5e99028b382887ba60e773533beea97533fe24a WatchSource:0}: Error finding container 62c977d6ce6581777277940ab5e99028b382887ba60e773533beea97533fe24a: Status 404 returned error can't find the container with id 62c977d6ce6581777277940ab5e99028b382887ba60e773533beea97533fe24a Apr 17 09:20:58.518232 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:58.518106 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0dde629_4b00_4d8e_8f44_daa979a1e1a8.slice/crio-7b05c8310af1b4cbc3c0abfa67c0bd04b72d71ef8071cf7dc3229405c9101205 WatchSource:0}: Error finding container 7b05c8310af1b4cbc3c0abfa67c0bd04b72d71ef8071cf7dc3229405c9101205: Status 404 returned error can't find the container with id 7b05c8310af1b4cbc3c0abfa67c0bd04b72d71ef8071cf7dc3229405c9101205 Apr 17 09:20:58.519222 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:58.519164 2574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a3125fc_e8c4_420c_8d7b_684643355422.slice/crio-c319f4a920a99f5ade4d6dfb25f74b97b39d260bb69c76688d201461612a7485 WatchSource:0}: Error finding container c319f4a920a99f5ade4d6dfb25f74b97b39d260bb69c76688d201461612a7485: Status 404 returned error can't find the container with id c319f4a920a99f5ade4d6dfb25f74b97b39d260bb69c76688d201461612a7485 Apr 17 09:20:58.521007 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:58.520986 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a16b37e_cd01_4bbd_9f94_16d59a30ae97.slice/crio-b7bd8f7cdb8f493ba812936a963b84fd4d25de6e6052465cfdf9a5755185c43e WatchSource:0}: Error finding container b7bd8f7cdb8f493ba812936a963b84fd4d25de6e6052465cfdf9a5755185c43e: Status 404 returned error can't find the container with id b7bd8f7cdb8f493ba812936a963b84fd4d25de6e6052465cfdf9a5755185c43e Apr 17 09:20:58.522558 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:58.522425 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2050949_863c_4e07_8b7f_adfdaf82601d.slice/crio-6495a8e5949e1faeb888eadcc761ecae4ac05148c8ce60364c8145932706a79b WatchSource:0}: Error finding container 6495a8e5949e1faeb888eadcc761ecae4ac05148c8ce60364c8145932706a79b: Status 404 returned error can't find the container with id 6495a8e5949e1faeb888eadcc761ecae4ac05148c8ce60364c8145932706a79b Apr 17 09:20:58.524877 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:58.524796 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03476f8a_15ce_445d_b484_102c5da8fbed.slice/crio-090313632b7d1cc14819359b0f385d7e3f52d8288924387f21bbef6de648f29a WatchSource:0}: Error finding container 090313632b7d1cc14819359b0f385d7e3f52d8288924387f21bbef6de648f29a: Status 404 returned error can't find 
the container with id 090313632b7d1cc14819359b0f385d7e3f52d8288924387f21bbef6de648f29a Apr 17 09:20:58.525534 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:58.525512 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0755186c_8ac0_47fe_abc7_dd4eae84ad55.slice/crio-5fd96644d6cf5b4320f51781dbc10f28b565fe534e380c39b6264f845dca4027 WatchSource:0}: Error finding container 5fd96644d6cf5b4320f51781dbc10f28b565fe534e380c39b6264f845dca4027: Status 404 returned error can't find the container with id 5fd96644d6cf5b4320f51781dbc10f28b565fe534e380c39b6264f845dca4027 Apr 17 09:20:58.529059 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:58.528606 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce7cf71b_3181_4cb7_84c1_caec23780d1c.slice/crio-f0ae4ae3fd0b8292efd538705332d0bafb92a939c8c26ae7afcbfa3fa67eeb69 WatchSource:0}: Error finding container f0ae4ae3fd0b8292efd538705332d0bafb92a939c8c26ae7afcbfa3fa67eeb69: Status 404 returned error can't find the container with id f0ae4ae3fd0b8292efd538705332d0bafb92a939c8c26ae7afcbfa3fa67eeb69 Apr 17 09:20:58.530979 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:20:58.530953 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19bff221_f968_4a84_9891_8578f50203f2.slice/crio-2fadcad160a2faaee6e12ccd3c788ed3d518a5941356ee3a429f38344aea582a WatchSource:0}: Error finding container 2fadcad160a2faaee6e12ccd3c788ed3d518a5941356ee3a429f38344aea582a: Status 404 returned error can't find the container with id 2fadcad160a2faaee6e12ccd3c788ed3d518a5941356ee3a429f38344aea582a Apr 17 09:20:58.557785 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:58.557766 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vt4tb\" (UniqueName: 
\"kubernetes.io/projected/da8df46a-d006-4e3a-8a95-df428038ed39-kube-api-access-vt4tb\") pod \"network-check-target-v6qts\" (UID: \"da8df46a-d006-4e3a-8a95-df428038ed39\") " pod="openshift-network-diagnostics/network-check-target-v6qts" Apr 17 09:20:58.557907 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:58.557895 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 09:20:58.557947 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:58.557910 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 09:20:58.557947 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:58.557919 2574 projected.go:194] Error preparing data for projected volume kube-api-access-vt4tb for pod openshift-network-diagnostics/network-check-target-v6qts: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:20:58.558019 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:58.557962 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da8df46a-d006-4e3a-8a95-df428038ed39-kube-api-access-vt4tb podName:da8df46a-d006-4e3a-8a95-df428038ed39 nodeName:}" failed. No retries permitted until 2026-04-17 09:20:59.557945912 +0000 UTC m=+4.153997785 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-vt4tb" (UniqueName: "kubernetes.io/projected/da8df46a-d006-4e3a-8a95-df428038ed39-kube-api-access-vt4tb") pod "network-check-target-v6qts" (UID: "da8df46a-d006-4e3a-8a95-df428038ed39") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:20:58.878666 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:58.878409 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 09:15:56 +0000 UTC" deadline="2027-12-08 21:18:32.986791861 +0000 UTC" Apr 17 09:20:58.878666 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:58.878642 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14411h57m34.10815797s" Apr 17 09:20:58.972997 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:58.972960 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c8z79" event={"ID":"e2050949-863c-4e07-8b7f-adfdaf82601d","Type":"ContainerStarted","Data":"6495a8e5949e1faeb888eadcc761ecae4ac05148c8ce60364c8145932706a79b"} Apr 17 09:20:58.982495 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:58.980622 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nl7c6" event={"ID":"a0dde629-4b00-4d8e-8f44-daa979a1e1a8","Type":"ContainerStarted","Data":"7b05c8310af1b4cbc3c0abfa67c0bd04b72d71ef8071cf7dc3229405c9101205"} Apr 17 09:20:58.982495 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:58.982194 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l7tg5" event={"ID":"7a3125fc-e8c4-420c-8d7b-684643355422","Type":"ContainerStarted","Data":"c319f4a920a99f5ade4d6dfb25f74b97b39d260bb69c76688d201461612a7485"} Apr 17 09:20:58.985111 ip-10-0-138-237 kubenswrapper[2574]: I0417 
09:20:58.985066 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" event={"ID":"19bff221-f968-4a84-9891-8578f50203f2","Type":"ContainerStarted","Data":"2fadcad160a2faaee6e12ccd3c788ed3d518a5941356ee3a429f38344aea582a"} Apr 17 09:20:58.986408 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:58.986385 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-g5ls5" event={"ID":"0755186c-8ac0-47fe-abc7-dd4eae84ad55","Type":"ContainerStarted","Data":"5fd96644d6cf5b4320f51781dbc10f28b565fe534e380c39b6264f845dca4027"} Apr 17 09:20:58.992039 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:58.992007 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2g9gn" event={"ID":"2a16b37e-cd01-4bbd-9f94-16d59a30ae97","Type":"ContainerStarted","Data":"b7bd8f7cdb8f493ba812936a963b84fd4d25de6e6052465cfdf9a5755185c43e"} Apr 17 09:20:58.994897 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:58.994875 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g62mf" event={"ID":"34bc662a-193e-440f-9c2e-1dee8a208524","Type":"ContainerStarted","Data":"62c977d6ce6581777277940ab5e99028b382887ba60e773533beea97533fe24a"} Apr 17 09:20:59.015728 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:59.015701 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-237.ec2.internal" event={"ID":"9ca6567298fdecb7125a67e335ee762b","Type":"ContainerStarted","Data":"1c04d2be8df4b836debb5970e40027563febd31db2e0272e584c99402e76763d"} Apr 17 09:20:59.018835 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:59.018790 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fgpzd" event={"ID":"ce7cf71b-3181-4cb7-84c1-caec23780d1c","Type":"ContainerStarted","Data":"f0ae4ae3fd0b8292efd538705332d0bafb92a939c8c26ae7afcbfa3fa67eeb69"} Apr 17 
09:20:59.023825 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:59.023779 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55" event={"ID":"03476f8a-15ce-445d-b484-102c5da8fbed","Type":"ContainerStarted","Data":"090313632b7d1cc14819359b0f385d7e3f52d8288924387f21bbef6de648f29a"} Apr 17 09:20:59.030959 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:59.030239 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-237.ec2.internal" podStartSLOduration=2.030224942 podStartE2EDuration="2.030224942s" podCreationTimestamp="2026-04-17 09:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:20:59.029906439 +0000 UTC m=+3.625958335" watchObservedRunningTime="2026-04-17 09:20:59.030224942 +0000 UTC m=+3.626276836" Apr 17 09:20:59.465615 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:59.465585 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs\") pod \"network-metrics-daemon-22cz6\" (UID: \"fba6f7ca-a68b-4315-91fd-d249cb9d13d1\") " pod="openshift-multus/network-metrics-daemon-22cz6" Apr 17 09:20:59.465749 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:59.465735 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:20:59.465824 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:59.465797 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs podName:fba6f7ca-a68b-4315-91fd-d249cb9d13d1 nodeName:}" failed. No retries permitted until 2026-04-17 09:21:01.465777449 +0000 UTC m=+6.061829356 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs") pod "network-metrics-daemon-22cz6" (UID: "fba6f7ca-a68b-4315-91fd-d249cb9d13d1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:20:59.566832 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:59.566799 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vt4tb\" (UniqueName: \"kubernetes.io/projected/da8df46a-d006-4e3a-8a95-df428038ed39-kube-api-access-vt4tb\") pod \"network-check-target-v6qts\" (UID: \"da8df46a-d006-4e3a-8a95-df428038ed39\") " pod="openshift-network-diagnostics/network-check-target-v6qts" Apr 17 09:20:59.567013 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:59.566962 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 09:20:59.567013 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:59.566987 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 09:20:59.567013 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:59.566999 2574 projected.go:194] Error preparing data for projected volume kube-api-access-vt4tb for pod openshift-network-diagnostics/network-check-target-v6qts: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:20:59.567169 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:59.567051 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da8df46a-d006-4e3a-8a95-df428038ed39-kube-api-access-vt4tb podName:da8df46a-d006-4e3a-8a95-df428038ed39 nodeName:}" failed. 
No retries permitted until 2026-04-17 09:21:01.567033174 +0000 UTC m=+6.163085062 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-vt4tb" (UniqueName: "kubernetes.io/projected/da8df46a-d006-4e3a-8a95-df428038ed39-kube-api-access-vt4tb") pod "network-check-target-v6qts" (UID: "da8df46a-d006-4e3a-8a95-df428038ed39") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:20:59.964961 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:59.964929 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6qts" Apr 17 09:20:59.965414 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:59.965054 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6qts" podUID="da8df46a-d006-4e3a-8a95-df428038ed39" Apr 17 09:20:59.965910 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:20:59.965554 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-22cz6" Apr 17 09:20:59.965910 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:20:59.965721 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-22cz6" podUID="fba6f7ca-a68b-4315-91fd-d249cb9d13d1" Apr 17 09:21:00.051785 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:00.051753 2574 generic.go:358] "Generic (PLEG): container finished" podID="eb58e212e7c3efb18409513c0745b3f7" containerID="970fb7445171efd9b31b6c1a8d025cfe8c0dfd69624d254c150e541bc64246bc" exitCode=0 Apr 17 09:21:00.051948 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:00.051850 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-237.ec2.internal" event={"ID":"eb58e212e7c3efb18409513c0745b3f7","Type":"ContainerDied","Data":"970fb7445171efd9b31b6c1a8d025cfe8c0dfd69624d254c150e541bc64246bc"} Apr 17 09:21:01.062195 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:01.061956 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-237.ec2.internal" event={"ID":"eb58e212e7c3efb18409513c0745b3f7","Type":"ContainerStarted","Data":"46c2a22a9c82003a4ea03635881ceff5892067468219079afeace42f44009d71"} Apr 17 09:21:01.481013 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:01.480977 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs\") pod \"network-metrics-daemon-22cz6\" (UID: \"fba6f7ca-a68b-4315-91fd-d249cb9d13d1\") " pod="openshift-multus/network-metrics-daemon-22cz6" Apr 17 09:21:01.481223 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:01.481168 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:21:01.481284 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:01.481246 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs 
podName:fba6f7ca-a68b-4315-91fd-d249cb9d13d1 nodeName:}" failed. No retries permitted until 2026-04-17 09:21:05.481227562 +0000 UTC m=+10.077279668 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs") pod "network-metrics-daemon-22cz6" (UID: "fba6f7ca-a68b-4315-91fd-d249cb9d13d1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:21:01.581924 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:01.581888 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vt4tb\" (UniqueName: \"kubernetes.io/projected/da8df46a-d006-4e3a-8a95-df428038ed39-kube-api-access-vt4tb\") pod \"network-check-target-v6qts\" (UID: \"da8df46a-d006-4e3a-8a95-df428038ed39\") " pod="openshift-network-diagnostics/network-check-target-v6qts" Apr 17 09:21:01.582116 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:01.582052 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 09:21:01.582116 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:01.582077 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 09:21:01.582116 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:01.582089 2574 projected.go:194] Error preparing data for projected volume kube-api-access-vt4tb for pod openshift-network-diagnostics/network-check-target-v6qts: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:21:01.582300 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:01.582146 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/da8df46a-d006-4e3a-8a95-df428038ed39-kube-api-access-vt4tb podName:da8df46a-d006-4e3a-8a95-df428038ed39 nodeName:}" failed. No retries permitted until 2026-04-17 09:21:05.582127721 +0000 UTC m=+10.178179609 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-vt4tb" (UniqueName: "kubernetes.io/projected/da8df46a-d006-4e3a-8a95-df428038ed39-kube-api-access-vt4tb") pod "network-check-target-v6qts" (UID: "da8df46a-d006-4e3a-8a95-df428038ed39") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:21:01.963340 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:01.963308 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6qts" Apr 17 09:21:01.963517 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:01.963432 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6qts" podUID="da8df46a-d006-4e3a-8a95-df428038ed39" Apr 17 09:21:01.963990 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:01.963968 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-22cz6" Apr 17 09:21:01.964141 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:01.964079 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-22cz6" podUID="fba6f7ca-a68b-4315-91fd-d249cb9d13d1" Apr 17 09:21:03.962566 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:03.962536 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6qts" Apr 17 09:21:03.962566 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:03.962564 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-22cz6" Apr 17 09:21:03.963067 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:03.962682 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6qts" podUID="da8df46a-d006-4e3a-8a95-df428038ed39" Apr 17 09:21:03.963140 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:03.963112 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-22cz6" podUID="fba6f7ca-a68b-4315-91fd-d249cb9d13d1" Apr 17 09:21:05.512133 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:05.512095 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs\") pod \"network-metrics-daemon-22cz6\" (UID: \"fba6f7ca-a68b-4315-91fd-d249cb9d13d1\") " pod="openshift-multus/network-metrics-daemon-22cz6" Apr 17 09:21:05.512586 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:05.512269 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:21:05.512586 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:05.512334 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs podName:fba6f7ca-a68b-4315-91fd-d249cb9d13d1 nodeName:}" failed. No retries permitted until 2026-04-17 09:21:13.512313991 +0000 UTC m=+18.108365874 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs") pod "network-metrics-daemon-22cz6" (UID: "fba6f7ca-a68b-4315-91fd-d249cb9d13d1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:21:05.612835 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:05.612798 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vt4tb\" (UniqueName: \"kubernetes.io/projected/da8df46a-d006-4e3a-8a95-df428038ed39-kube-api-access-vt4tb\") pod \"network-check-target-v6qts\" (UID: \"da8df46a-d006-4e3a-8a95-df428038ed39\") " pod="openshift-network-diagnostics/network-check-target-v6qts" Apr 17 09:21:05.613092 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:05.612956 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 09:21:05.613092 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:05.612974 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 09:21:05.613092 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:05.612988 2574 projected.go:194] Error preparing data for projected volume kube-api-access-vt4tb for pod openshift-network-diagnostics/network-check-target-v6qts: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:21:05.613092 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:05.613049 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da8df46a-d006-4e3a-8a95-df428038ed39-kube-api-access-vt4tb podName:da8df46a-d006-4e3a-8a95-df428038ed39 nodeName:}" failed. 
No retries permitted until 2026-04-17 09:21:13.613031171 +0000 UTC m=+18.209083048 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-vt4tb" (UniqueName: "kubernetes.io/projected/da8df46a-d006-4e3a-8a95-df428038ed39-kube-api-access-vt4tb") pod "network-check-target-v6qts" (UID: "da8df46a-d006-4e3a-8a95-df428038ed39") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:21:05.964307 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:05.964270 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6qts" Apr 17 09:21:05.964513 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:05.964379 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6qts" podUID="da8df46a-d006-4e3a-8a95-df428038ed39" Apr 17 09:21:05.964513 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:05.964467 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-22cz6" Apr 17 09:21:05.964655 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:05.964587 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-22cz6" podUID="fba6f7ca-a68b-4315-91fd-d249cb9d13d1" Apr 17 09:21:07.884240 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:07.884183 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-237.ec2.internal" podStartSLOduration=10.884151959 podStartE2EDuration="10.884151959s" podCreationTimestamp="2026-04-17 09:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:21:01.078367652 +0000 UTC m=+5.674419546" watchObservedRunningTime="2026-04-17 09:21:07.884151959 +0000 UTC m=+12.480203863" Apr 17 09:21:07.884951 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:07.884926 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-tp8sk"] Apr 17 09:21:07.887746 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:07.887726 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tp8sk" Apr 17 09:21:07.887850 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:07.887807 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-tp8sk" podUID="c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11" Apr 17 09:21:07.929388 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:07.929365 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-original-pull-secret\") pod \"global-pull-secret-syncer-tp8sk\" (UID: \"c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11\") " pod="kube-system/global-pull-secret-syncer-tp8sk" Apr 17 09:21:07.929499 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:07.929399 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-kubelet-config\") pod \"global-pull-secret-syncer-tp8sk\" (UID: \"c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11\") " pod="kube-system/global-pull-secret-syncer-tp8sk" Apr 17 09:21:07.929499 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:07.929432 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-dbus\") pod \"global-pull-secret-syncer-tp8sk\" (UID: \"c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11\") " pod="kube-system/global-pull-secret-syncer-tp8sk" Apr 17 09:21:07.962452 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:07.962429 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-22cz6" Apr 17 09:21:07.962573 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:07.962548 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-22cz6" podUID="fba6f7ca-a68b-4315-91fd-d249cb9d13d1" Apr 17 09:21:07.962651 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:07.962635 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6qts" Apr 17 09:21:07.962765 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:07.962741 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6qts" podUID="da8df46a-d006-4e3a-8a95-df428038ed39" Apr 17 09:21:08.030369 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:08.030338 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-original-pull-secret\") pod \"global-pull-secret-syncer-tp8sk\" (UID: \"c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11\") " pod="kube-system/global-pull-secret-syncer-tp8sk" Apr 17 09:21:08.030478 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:08.030380 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-kubelet-config\") pod \"global-pull-secret-syncer-tp8sk\" (UID: \"c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11\") " pod="kube-system/global-pull-secret-syncer-tp8sk" Apr 17 09:21:08.030478 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:08.030406 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-dbus\") pod \"global-pull-secret-syncer-tp8sk\" (UID: \"c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11\") " 
pod="kube-system/global-pull-secret-syncer-tp8sk" Apr 17 09:21:08.030478 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:08.030439 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 09:21:08.030633 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:08.030520 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-original-pull-secret podName:c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11 nodeName:}" failed. No retries permitted until 2026-04-17 09:21:08.530499557 +0000 UTC m=+13.126551442 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-original-pull-secret") pod "global-pull-secret-syncer-tp8sk" (UID: "c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11") : object "kube-system"/"original-pull-secret" not registered Apr 17 09:21:08.030633 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:08.030513 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-kubelet-config\") pod \"global-pull-secret-syncer-tp8sk\" (UID: \"c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11\") " pod="kube-system/global-pull-secret-syncer-tp8sk" Apr 17 09:21:08.030633 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:08.030555 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-dbus\") pod \"global-pull-secret-syncer-tp8sk\" (UID: \"c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11\") " pod="kube-system/global-pull-secret-syncer-tp8sk" Apr 17 09:21:08.533939 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:08.533909 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-original-pull-secret\") pod \"global-pull-secret-syncer-tp8sk\" (UID: \"c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11\") " pod="kube-system/global-pull-secret-syncer-tp8sk" Apr 17 09:21:08.534132 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:08.534072 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 09:21:08.534132 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:08.534130 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-original-pull-secret podName:c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11 nodeName:}" failed. No retries permitted until 2026-04-17 09:21:09.534112258 +0000 UTC m=+14.130164134 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-original-pull-secret") pod "global-pull-secret-syncer-tp8sk" (UID: "c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11") : object "kube-system"/"original-pull-secret" not registered Apr 17 09:21:09.541730 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:09.541693 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-original-pull-secret\") pod \"global-pull-secret-syncer-tp8sk\" (UID: \"c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11\") " pod="kube-system/global-pull-secret-syncer-tp8sk" Apr 17 09:21:09.542166 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:09.541824 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 09:21:09.542166 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:09.541892 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-original-pull-secret podName:c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11 nodeName:}" failed. No retries permitted until 2026-04-17 09:21:11.541870179 +0000 UTC m=+16.137922063 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-original-pull-secret") pod "global-pull-secret-syncer-tp8sk" (UID: "c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11") : object "kube-system"/"original-pull-secret" not registered Apr 17 09:21:09.962658 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:09.962624 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6qts" Apr 17 09:21:09.962836 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:09.962624 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-22cz6" Apr 17 09:21:09.962836 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:09.962756 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6qts" podUID="da8df46a-d006-4e3a-8a95-df428038ed39" Apr 17 09:21:09.962836 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:09.962624 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tp8sk" Apr 17 09:21:09.963011 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:09.962874 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-22cz6" podUID="fba6f7ca-a68b-4315-91fd-d249cb9d13d1" Apr 17 09:21:09.963011 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:09.962906 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tp8sk" podUID="c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11" Apr 17 09:21:11.555705 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:11.555668 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-original-pull-secret\") pod \"global-pull-secret-syncer-tp8sk\" (UID: \"c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11\") " pod="kube-system/global-pull-secret-syncer-tp8sk" Apr 17 09:21:11.556112 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:11.555819 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 09:21:11.556112 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:11.555889 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-original-pull-secret podName:c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11 nodeName:}" failed. 
No retries permitted until 2026-04-17 09:21:15.555870386 +0000 UTC m=+20.151922258 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-original-pull-secret") pod "global-pull-secret-syncer-tp8sk" (UID: "c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11") : object "kube-system"/"original-pull-secret" not registered Apr 17 09:21:11.962355 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:11.962321 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tp8sk" Apr 17 09:21:11.962355 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:11.962340 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-22cz6" Apr 17 09:21:11.962593 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:11.962387 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6qts" Apr 17 09:21:11.962593 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:11.962492 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tp8sk" podUID="c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11" Apr 17 09:21:11.962593 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:11.962530 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-v6qts" podUID="da8df46a-d006-4e3a-8a95-df428038ed39" Apr 17 09:21:11.962778 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:11.962589 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-22cz6" podUID="fba6f7ca-a68b-4315-91fd-d249cb9d13d1" Apr 17 09:21:13.568935 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:13.568889 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs\") pod \"network-metrics-daemon-22cz6\" (UID: \"fba6f7ca-a68b-4315-91fd-d249cb9d13d1\") " pod="openshift-multus/network-metrics-daemon-22cz6" Apr 17 09:21:13.569398 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:13.569068 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:21:13.569398 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:13.569148 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs podName:fba6f7ca-a68b-4315-91fd-d249cb9d13d1 nodeName:}" failed. No retries permitted until 2026-04-17 09:21:29.569126309 +0000 UTC m=+34.165178194 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs") pod "network-metrics-daemon-22cz6" (UID: "fba6f7ca-a68b-4315-91fd-d249cb9d13d1") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 09:21:13.669405 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:13.669366 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vt4tb\" (UniqueName: \"kubernetes.io/projected/da8df46a-d006-4e3a-8a95-df428038ed39-kube-api-access-vt4tb\") pod \"network-check-target-v6qts\" (UID: \"da8df46a-d006-4e3a-8a95-df428038ed39\") " pod="openshift-network-diagnostics/network-check-target-v6qts" Apr 17 09:21:13.669565 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:13.669544 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 09:21:13.669565 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:13.669564 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 09:21:13.669639 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:13.669575 2574 projected.go:194] Error preparing data for projected volume kube-api-access-vt4tb for pod openshift-network-diagnostics/network-check-target-v6qts: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:21:13.669639 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:13.669633 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da8df46a-d006-4e3a-8a95-df428038ed39-kube-api-access-vt4tb podName:da8df46a-d006-4e3a-8a95-df428038ed39 nodeName:}" failed. 
No retries permitted until 2026-04-17 09:21:29.66961322 +0000 UTC m=+34.265665100 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-vt4tb" (UniqueName: "kubernetes.io/projected/da8df46a-d006-4e3a-8a95-df428038ed39-kube-api-access-vt4tb") pod "network-check-target-v6qts" (UID: "da8df46a-d006-4e3a-8a95-df428038ed39") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 09:21:13.962255 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:13.962221 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6qts" Apr 17 09:21:13.962431 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:13.962228 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-22cz6" Apr 17 09:21:13.962431 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:13.962339 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6qts" podUID="da8df46a-d006-4e3a-8a95-df428038ed39" Apr 17 09:21:13.962540 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:13.962440 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-22cz6" podUID="fba6f7ca-a68b-4315-91fd-d249cb9d13d1" Apr 17 09:21:13.962540 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:13.962230 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tp8sk" Apr 17 09:21:13.962632 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:13.962545 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tp8sk" podUID="c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11" Apr 17 09:21:15.585741 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:15.585596 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-original-pull-secret\") pod \"global-pull-secret-syncer-tp8sk\" (UID: \"c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11\") " pod="kube-system/global-pull-secret-syncer-tp8sk" Apr 17 09:21:15.585741 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:15.585732 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 09:21:15.586077 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:15.585789 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-original-pull-secret podName:c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11 nodeName:}" failed. No retries permitted until 2026-04-17 09:21:23.585773874 +0000 UTC m=+28.181825746 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-original-pull-secret") pod "global-pull-secret-syncer-tp8sk" (UID: "c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11") : object "kube-system"/"original-pull-secret" not registered Apr 17 09:21:15.963146 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:15.962804 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6qts" Apr 17 09:21:15.963146 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:15.963104 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6qts" podUID="da8df46a-d006-4e3a-8a95-df428038ed39" Apr 17 09:21:15.963146 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:15.962910 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-22cz6" Apr 17 09:21:15.963422 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:15.962883 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tp8sk" Apr 17 09:21:15.963422 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:15.963206 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-22cz6" podUID="fba6f7ca-a68b-4315-91fd-d249cb9d13d1" Apr 17 09:21:15.963422 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:15.963312 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tp8sk" podUID="c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11" Apr 17 09:21:16.086358 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.086323 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-c8z79" event={"ID":"e2050949-863c-4e07-8b7f-adfdaf82601d","Type":"ContainerStarted","Data":"1dfeaeb99dfb1ba304120e1b976155aa7d3f78c657bab867a07a25e11e685535"} Apr 17 09:21:16.087582 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.087558 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nl7c6" event={"ID":"a0dde629-4b00-4d8e-8f44-daa979a1e1a8","Type":"ContainerStarted","Data":"ecca43ac3c4a2abf902911f56d17e1a4cf8e174a53f41ed192d7edf7a8483695"} Apr 17 09:21:16.088826 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.088804 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l7tg5" event={"ID":"7a3125fc-e8c4-420c-8d7b-684643355422","Type":"ContainerStarted","Data":"342cac105aaef6e9ea5c577f3bc62be833d5287c291dad2900462901a7397e62"} Apr 17 09:21:16.091792 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.091771 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rb9kg_19bff221-f968-4a84-9891-8578f50203f2/ovn-acl-logging/0.log" Apr 17 09:21:16.092107 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.092088 2574 generic.go:358] "Generic (PLEG): container finished" 
podID="19bff221-f968-4a84-9891-8578f50203f2" containerID="d7ca7a2d049a58d719e1765811884f67e86e2e5453c9547dea0b09a7cd443cf6" exitCode=1 Apr 17 09:21:16.092219 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.092162 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" event={"ID":"19bff221-f968-4a84-9891-8578f50203f2","Type":"ContainerStarted","Data":"97a4edc5263d7b75b58a0f50cfbcd6984ba2c8f8fef033c9bde4b10eb029be00"} Apr 17 09:21:16.092219 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.092212 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" event={"ID":"19bff221-f968-4a84-9891-8578f50203f2","Type":"ContainerStarted","Data":"972816875f9e3a1a1e1a07e106965d16505d22100a0a36b62e4d3ad7494a005f"} Apr 17 09:21:16.092300 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.092227 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" event={"ID":"19bff221-f968-4a84-9891-8578f50203f2","Type":"ContainerStarted","Data":"258867efb8cc0a659ec58ced4d2d5f5d3a302f02d706bc249455e349fd2417f3"} Apr 17 09:21:16.092300 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.092241 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" event={"ID":"19bff221-f968-4a84-9891-8578f50203f2","Type":"ContainerStarted","Data":"4ea7a8e24830721a8784a7eeb7693dff5be55e89ea935086ae2de34e3147815c"} Apr 17 09:21:16.092300 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.092253 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" event={"ID":"19bff221-f968-4a84-9891-8578f50203f2","Type":"ContainerDied","Data":"d7ca7a2d049a58d719e1765811884f67e86e2e5453c9547dea0b09a7cd443cf6"} Apr 17 09:21:16.092300 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.092269 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" event={"ID":"19bff221-f968-4a84-9891-8578f50203f2","Type":"ContainerStarted","Data":"edfd5749a8aa6aa80ef95a10cf050e45a774dbf25e81cc5392f94d1a5ab5a542"} Apr 17 09:21:16.093400 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.093372 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-g5ls5" event={"ID":"0755186c-8ac0-47fe-abc7-dd4eae84ad55","Type":"ContainerStarted","Data":"1c484141ad3f5b55e243d9ccd8f9af245d4ef7440da9dc082787ab8f4c62c825"} Apr 17 09:21:16.094629 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.094610 2574 generic.go:358] "Generic (PLEG): container finished" podID="34bc662a-193e-440f-9c2e-1dee8a208524" containerID="c42db168bf465f22f4bccc7b984f757253b0d66f2a27307b8aa36516a0db65ef" exitCode=0 Apr 17 09:21:16.094705 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.094666 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g62mf" event={"ID":"34bc662a-193e-440f-9c2e-1dee8a208524","Type":"ContainerDied","Data":"c42db168bf465f22f4bccc7b984f757253b0d66f2a27307b8aa36516a0db65ef"} Apr 17 09:21:16.095986 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.095905 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fgpzd" event={"ID":"ce7cf71b-3181-4cb7-84c1-caec23780d1c","Type":"ContainerStarted","Data":"ff64769b4f353985c8d8d6e29371fd70486b84b5ee79b14822e21ff5f97d5927"} Apr 17 09:21:16.097346 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.097325 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55" event={"ID":"03476f8a-15ce-445d-b484-102c5da8fbed","Type":"ContainerStarted","Data":"72ada92842a956db8b2ae2e0e3926213f0ca0a5d7e8f4650a74cbba57d0bb866"} Apr 17 09:21:16.108384 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.107518 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-c8z79" podStartSLOduration=3.40274885 podStartE2EDuration="20.107496726s" podCreationTimestamp="2026-04-17 09:20:56 +0000 UTC" firstStartedPulling="2026-04-17 09:20:58.524534943 +0000 UTC m=+3.120586815" lastFinishedPulling="2026-04-17 09:21:15.229282805 +0000 UTC m=+19.825334691" observedRunningTime="2026-04-17 09:21:16.105481373 +0000 UTC m=+20.701533266" watchObservedRunningTime="2026-04-17 09:21:16.107496726 +0000 UTC m=+20.703548621" Apr 17 09:21:16.119963 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.119922 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-fgpzd" podStartSLOduration=3.47623996 podStartE2EDuration="20.119910641s" podCreationTimestamp="2026-04-17 09:20:56 +0000 UTC" firstStartedPulling="2026-04-17 09:20:58.531963737 +0000 UTC m=+3.128015618" lastFinishedPulling="2026-04-17 09:21:15.175634412 +0000 UTC m=+19.771686299" observedRunningTime="2026-04-17 09:21:16.119448245 +0000 UTC m=+20.715500138" watchObservedRunningTime="2026-04-17 09:21:16.119910641 +0000 UTC m=+20.715962535" Apr 17 09:21:16.134429 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.134384 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-nl7c6" podStartSLOduration=3.477121708 podStartE2EDuration="20.134370224s" podCreationTimestamp="2026-04-17 09:20:56 +0000 UTC" firstStartedPulling="2026-04-17 09:20:58.520117719 +0000 UTC m=+3.116169595" lastFinishedPulling="2026-04-17 09:21:15.177366226 +0000 UTC m=+19.773418111" observedRunningTime="2026-04-17 09:21:16.133984748 +0000 UTC m=+20.730036642" watchObservedRunningTime="2026-04-17 09:21:16.134370224 +0000 UTC m=+20.730422118" Apr 17 09:21:16.172205 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.172148 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-l7tg5" podStartSLOduration=3.565447527 
podStartE2EDuration="20.172133212s" podCreationTimestamp="2026-04-17 09:20:56 +0000 UTC" firstStartedPulling="2026-04-17 09:20:58.52099835 +0000 UTC m=+3.117050229" lastFinishedPulling="2026-04-17 09:21:15.127684028 +0000 UTC m=+19.723735914" observedRunningTime="2026-04-17 09:21:16.172078923 +0000 UTC m=+20.768130819" watchObservedRunningTime="2026-04-17 09:21:16.172133212 +0000 UTC m=+20.768185106" Apr 17 09:21:16.186863 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.186816 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-g5ls5" podStartSLOduration=3.534185486 podStartE2EDuration="20.186800429s" podCreationTimestamp="2026-04-17 09:20:56 +0000 UTC" firstStartedPulling="2026-04-17 09:20:58.529143293 +0000 UTC m=+3.125195165" lastFinishedPulling="2026-04-17 09:21:15.181758237 +0000 UTC m=+19.777810108" observedRunningTime="2026-04-17 09:21:16.186746604 +0000 UTC m=+20.782798497" watchObservedRunningTime="2026-04-17 09:21:16.186800429 +0000 UTC m=+20.782852326" Apr 17 09:21:16.273260 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.273186 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-fgpzd" Apr 17 09:21:16.273752 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.273735 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-fgpzd" Apr 17 09:21:16.845505 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.845465 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 09:21:16.906932 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.906757 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T09:21:16.845486393Z","UUID":"b0a66fa4-4fd7-4a16-894c-aaa4824950d2","Handler":null,"Name":"","Endpoint":""} Apr 17 09:21:16.908256 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.908195 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 09:21:16.908256 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:16.908222 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 09:21:17.101410 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:17.101371 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55" event={"ID":"03476f8a-15ce-445d-b484-102c5da8fbed","Type":"ContainerStarted","Data":"11c40b3261507cab92efe79b3f2f21116b162ecdf86457474f47c0c9cbdbdb1d"} Apr 17 09:21:17.102959 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:17.102929 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2g9gn" event={"ID":"2a16b37e-cd01-4bbd-9f94-16d59a30ae97","Type":"ContainerStarted","Data":"ec7aacd9ed0933eb818732ba0726597e38a3e28a71f8ccf698a16ed4a067a269"} Apr 17 09:21:17.117019 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:17.116972 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2g9gn" podStartSLOduration=4.464626485 podStartE2EDuration="21.11696027s" podCreationTimestamp="2026-04-17 09:20:56 +0000 UTC" firstStartedPulling="2026-04-17 09:20:58.52327785 +0000 UTC m=+3.119329731" lastFinishedPulling="2026-04-17 09:21:15.175611627 +0000 UTC m=+19.771663516" observedRunningTime="2026-04-17 09:21:17.116788683 +0000 UTC m=+21.712840576" 
watchObservedRunningTime="2026-04-17 09:21:17.11696027 +0000 UTC m=+21.713012163" Apr 17 09:21:17.962658 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:17.962460 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6qts" Apr 17 09:21:17.962658 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:17.962473 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tp8sk" Apr 17 09:21:17.962658 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:17.962584 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6qts" podUID="da8df46a-d006-4e3a-8a95-df428038ed39" Apr 17 09:21:17.962658 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:17.962625 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tp8sk" podUID="c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11" Apr 17 09:21:17.963216 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:17.962667 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-22cz6" Apr 17 09:21:17.963216 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:17.962719 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-22cz6" podUID="fba6f7ca-a68b-4315-91fd-d249cb9d13d1" Apr 17 09:21:18.106593 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:18.106559 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55" event={"ID":"03476f8a-15ce-445d-b484-102c5da8fbed","Type":"ContainerStarted","Data":"e2a3a0e6d8b885b8647661458fd2fcff787dcbf6a82cfd0b61f08464a332013b"} Apr 17 09:21:18.109493 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:18.109474 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rb9kg_19bff221-f968-4a84-9891-8578f50203f2/ovn-acl-logging/0.log" Apr 17 09:21:18.109907 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:18.109878 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" event={"ID":"19bff221-f968-4a84-9891-8578f50203f2","Type":"ContainerStarted","Data":"518e87cbc427efb443b14450b5fd9da45ba4622c57ec73dd00500e0a8fa2d970"} Apr 17 09:21:18.110000 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:18.109892 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 09:21:18.124781 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:18.124735 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-c7s55" podStartSLOduration=2.755892396 podStartE2EDuration="22.124722349s" podCreationTimestamp="2026-04-17 09:20:56 +0000 
UTC" firstStartedPulling="2026-04-17 09:20:58.527793024 +0000 UTC m=+3.123844911" lastFinishedPulling="2026-04-17 09:21:17.896622984 +0000 UTC m=+22.492674864" observedRunningTime="2026-04-17 09:21:18.124540008 +0000 UTC m=+22.720591906" watchObservedRunningTime="2026-04-17 09:21:18.124722349 +0000 UTC m=+22.720774242" Apr 17 09:21:19.812294 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:19.812105 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-fgpzd" Apr 17 09:21:19.812724 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:19.812371 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 17 09:21:19.812724 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:19.812689 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-fgpzd" Apr 17 09:21:19.962681 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:19.962649 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6qts" Apr 17 09:21:19.962835 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:19.962649 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tp8sk" Apr 17 09:21:19.962835 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:19.962783 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-v6qts" podUID="da8df46a-d006-4e3a-8a95-df428038ed39" Apr 17 09:21:19.962931 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:19.962853 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tp8sk" podUID="c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11" Apr 17 09:21:19.962931 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:19.962649 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-22cz6" Apr 17 09:21:19.963029 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:19.962958 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-22cz6" podUID="fba6f7ca-a68b-4315-91fd-d249cb9d13d1" Apr 17 09:21:21.117607 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:21.117446 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rb9kg_19bff221-f968-4a84-9891-8578f50203f2/ovn-acl-logging/0.log" Apr 17 09:21:21.118332 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:21.117883 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" event={"ID":"19bff221-f968-4a84-9891-8578f50203f2","Type":"ContainerStarted","Data":"81357f6b4057a7565a53b1b5cd3b7011a43e76546e772ca0eaaf9594a9a99589"} Apr 17 09:21:21.118332 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:21.118160 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:21:21.118455 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:21.118435 2574 scope.go:117] "RemoveContainer" containerID="d7ca7a2d049a58d719e1765811884f67e86e2e5453c9547dea0b09a7cd443cf6" Apr 17 09:21:21.119699 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:21.119680 2574 generic.go:358] "Generic (PLEG): container finished" podID="34bc662a-193e-440f-9c2e-1dee8a208524" containerID="a6274e7567c262ba65df7f3c60315eacb59d19f927830ac10702a990a917c616" exitCode=0 Apr 17 09:21:21.119776 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:21.119714 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g62mf" event={"ID":"34bc662a-193e-440f-9c2e-1dee8a208524","Type":"ContainerDied","Data":"a6274e7567c262ba65df7f3c60315eacb59d19f927830ac10702a990a917c616"} Apr 17 09:21:21.132976 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:21.132958 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:21:21.962124 ip-10-0-138-237 kubenswrapper[2574]: I0417 
09:21:21.962084 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-22cz6"
Apr 17 09:21:21.962288 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:21.962128 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tp8sk"
Apr 17 09:21:21.962288 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:21.962093 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6qts"
Apr 17 09:21:21.962288 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:21.962242 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-22cz6" podUID="fba6f7ca-a68b-4315-91fd-d249cb9d13d1"
Apr 17 09:21:21.962464 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:21.962361 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tp8sk" podUID="c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11"
Apr 17 09:21:21.962464 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:21.962440 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6qts" podUID="da8df46a-d006-4e3a-8a95-df428038ed39"
Apr 17 09:21:22.126221 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:22.126197 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rb9kg_19bff221-f968-4a84-9891-8578f50203f2/ovn-acl-logging/0.log"
Apr 17 09:21:22.126603 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:22.126578 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" event={"ID":"19bff221-f968-4a84-9891-8578f50203f2","Type":"ContainerStarted","Data":"0f1663bc8476c4b8d39d60719cf444066dd613982644a91ca71fd962aae4c586"}
Apr 17 09:21:22.126760 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:22.126743 2574 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 17 09:21:22.127055 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:22.127012 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:21:22.129028 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:22.128739 2574 generic.go:358] "Generic (PLEG): container finished" podID="34bc662a-193e-440f-9c2e-1dee8a208524" containerID="9a1f571136f5e1792c1c59b7127b08a2bf17a0db1737781df27d5497824d89ef" exitCode=0
Apr 17 09:21:22.129028 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:22.128783 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g62mf" event={"ID":"34bc662a-193e-440f-9c2e-1dee8a208524","Type":"ContainerDied","Data":"9a1f571136f5e1792c1c59b7127b08a2bf17a0db1737781df27d5497824d89ef"}
Apr 17 09:21:22.142360 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:22.142342 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:21:22.157479 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:22.157444 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" podStartSLOduration=9.382555651 podStartE2EDuration="26.157433927s" podCreationTimestamp="2026-04-17 09:20:56 +0000 UTC" firstStartedPulling="2026-04-17 09:20:58.532924387 +0000 UTC m=+3.128976269" lastFinishedPulling="2026-04-17 09:21:15.307802664 +0000 UTC m=+19.903854545" observedRunningTime="2026-04-17 09:21:22.156027328 +0000 UTC m=+26.752079222" watchObservedRunningTime="2026-04-17 09:21:22.157433927 +0000 UTC m=+26.753485821"
Apr 17 09:21:22.341424 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:22.341396 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-v6qts"]
Apr 17 09:21:22.341592 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:22.341485 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6qts"
Apr 17 09:21:22.341592 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:22.341564 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6qts" podUID="da8df46a-d006-4e3a-8a95-df428038ed39"
Apr 17 09:21:22.344689 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:22.344660 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tp8sk"]
Apr 17 09:21:22.344817 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:22.344753 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tp8sk"
Apr 17 09:21:22.344879 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:22.344858 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tp8sk" podUID="c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11"
Apr 17 09:21:22.345485 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:22.345392 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-22cz6"]
Apr 17 09:21:22.345485 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:22.345484 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-22cz6"
Apr 17 09:21:22.345652 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:22.345585 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-22cz6" podUID="fba6f7ca-a68b-4315-91fd-d249cb9d13d1"
Apr 17 09:21:22.522853 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:22.522813 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg"
Apr 17 09:21:23.134646 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:23.134618 2574 generic.go:358] "Generic (PLEG): container finished" podID="34bc662a-193e-440f-9c2e-1dee8a208524" containerID="aeb70fbb3e0a4d4930133a0635a5322fd1c74f0e9102851c2924e6ff703502f4" exitCode=0
Apr 17 09:21:23.135011 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:23.134695 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g62mf" event={"ID":"34bc662a-193e-440f-9c2e-1dee8a208524","Type":"ContainerDied","Data":"aeb70fbb3e0a4d4930133a0635a5322fd1c74f0e9102851c2924e6ff703502f4"}
Apr 17 09:21:23.646133 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:23.646102 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-original-pull-secret\") pod \"global-pull-secret-syncer-tp8sk\" (UID: \"c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11\") " pod="kube-system/global-pull-secret-syncer-tp8sk"
Apr 17 09:21:23.646300 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:23.646261 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 09:21:23.646353 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:23.646322 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-original-pull-secret podName:c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11 nodeName:}" failed. No retries permitted until 2026-04-17 09:21:39.646303302 +0000 UTC m=+44.242355197 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-original-pull-secret") pod "global-pull-secret-syncer-tp8sk" (UID: "c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11") : object "kube-system"/"original-pull-secret" not registered
Apr 17 09:21:23.962273 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:23.962201 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tp8sk"
Apr 17 09:21:23.962424 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:23.962201 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-22cz6"
Apr 17 09:21:23.962424 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:23.962329 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tp8sk" podUID="c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11"
Apr 17 09:21:23.962424 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:23.962201 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6qts"
Apr 17 09:21:23.962424 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:23.962411 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-22cz6" podUID="fba6f7ca-a68b-4315-91fd-d249cb9d13d1"
Apr 17 09:21:23.962632 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:23.962492 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6qts" podUID="da8df46a-d006-4e3a-8a95-df428038ed39"
Apr 17 09:21:25.963576 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:25.963378 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6qts"
Apr 17 09:21:25.964159 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:25.963438 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tp8sk"
Apr 17 09:21:25.964159 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:25.963796 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tp8sk" podUID="c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11"
Apr 17 09:21:25.964159 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:25.963460 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-22cz6"
Apr 17 09:21:25.964159 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:25.963658 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6qts" podUID="da8df46a-d006-4e3a-8a95-df428038ed39"
Apr 17 09:21:25.964159 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:25.963945 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-22cz6" podUID="fba6f7ca-a68b-4315-91fd-d249cb9d13d1"
Apr 17 09:21:27.962707 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:27.962672 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6qts"
Apr 17 09:21:27.963396 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:27.962672 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tp8sk"
Apr 17 09:21:27.963396 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:27.962800 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-v6qts" podUID="da8df46a-d006-4e3a-8a95-df428038ed39"
Apr 17 09:21:27.963396 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:27.962880 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-tp8sk" podUID="c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11"
Apr 17 09:21:27.963396 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:27.962678 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-22cz6"
Apr 17 09:21:27.963396 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:27.963000 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-22cz6" podUID="fba6f7ca-a68b-4315-91fd-d249cb9d13d1"
Apr 17 09:21:28.216212 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.216121 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-237.ec2.internal" event="NodeReady"
Apr 17 09:21:28.216382 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.216288 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 09:21:28.259366 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.259329 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zkvsl"]
Apr 17 09:21:28.290821 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.290789 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-v2rkj"]
Apr 17 09:21:28.290993 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.290971 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zkvsl"
Apr 17 09:21:28.293845 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.293820 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-nblxl\""
Apr 17 09:21:28.294354 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.294167 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 09:21:28.294354 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.294302 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 09:21:28.308539 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.308515 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zkvsl"]
Apr 17 09:21:28.308539 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.308540 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-v2rkj"]
Apr 17 09:21:28.308742 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.308635 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-v2rkj"
Apr 17 09:21:28.311428 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.311403 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9szrj\""
Apr 17 09:21:28.311525 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.311447 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 09:21:28.311525 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.311457 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 09:21:28.311525 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.311466 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 09:21:28.402191 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.402148 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7281a4b6-76d6-494b-98e2-8fd1f322c7de-config-volume\") pod \"dns-default-zkvsl\" (UID: \"7281a4b6-76d6-494b-98e2-8fd1f322c7de\") " pod="openshift-dns/dns-default-zkvsl"
Apr 17 09:21:28.402362 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.402197 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn297\" (UniqueName: \"kubernetes.io/projected/7281a4b6-76d6-494b-98e2-8fd1f322c7de-kube-api-access-sn297\") pod \"dns-default-zkvsl\" (UID: \"7281a4b6-76d6-494b-98e2-8fd1f322c7de\") " pod="openshift-dns/dns-default-zkvsl"
Apr 17 09:21:28.402362 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.402230 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert\") pod \"ingress-canary-v2rkj\" (UID: \"2721fe42-279c-4536-9769-411e4918ceac\") " pod="openshift-ingress-canary/ingress-canary-v2rkj"
Apr 17 09:21:28.402362 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.402251 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f92qs\" (UniqueName: \"kubernetes.io/projected/2721fe42-279c-4536-9769-411e4918ceac-kube-api-access-f92qs\") pod \"ingress-canary-v2rkj\" (UID: \"2721fe42-279c-4536-9769-411e4918ceac\") " pod="openshift-ingress-canary/ingress-canary-v2rkj"
Apr 17 09:21:28.402362 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.402276 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls\") pod \"dns-default-zkvsl\" (UID: \"7281a4b6-76d6-494b-98e2-8fd1f322c7de\") " pod="openshift-dns/dns-default-zkvsl"
Apr 17 09:21:28.402362 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.402301 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7281a4b6-76d6-494b-98e2-8fd1f322c7de-tmp-dir\") pod \"dns-default-zkvsl\" (UID: \"7281a4b6-76d6-494b-98e2-8fd1f322c7de\") " pod="openshift-dns/dns-default-zkvsl"
Apr 17 09:21:28.503461 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.503389 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f92qs\" (UniqueName: \"kubernetes.io/projected/2721fe42-279c-4536-9769-411e4918ceac-kube-api-access-f92qs\") pod \"ingress-canary-v2rkj\" (UID: \"2721fe42-279c-4536-9769-411e4918ceac\") " pod="openshift-ingress-canary/ingress-canary-v2rkj"
Apr 17 09:21:28.503461 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.503448 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls\") pod \"dns-default-zkvsl\" (UID: \"7281a4b6-76d6-494b-98e2-8fd1f322c7de\") " pod="openshift-dns/dns-default-zkvsl"
Apr 17 09:21:28.503693 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.503495 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7281a4b6-76d6-494b-98e2-8fd1f322c7de-tmp-dir\") pod \"dns-default-zkvsl\" (UID: \"7281a4b6-76d6-494b-98e2-8fd1f322c7de\") " pod="openshift-dns/dns-default-zkvsl"
Apr 17 09:21:28.503693 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:28.503546 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 09:21:28.503693 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:28.503609 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls podName:7281a4b6-76d6-494b-98e2-8fd1f322c7de nodeName:}" failed. No retries permitted until 2026-04-17 09:21:29.003593475 +0000 UTC m=+33.599645346 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls") pod "dns-default-zkvsl" (UID: "7281a4b6-76d6-494b-98e2-8fd1f322c7de") : secret "dns-default-metrics-tls" not found
Apr 17 09:21:28.503693 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.503553 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7281a4b6-76d6-494b-98e2-8fd1f322c7de-config-volume\") pod \"dns-default-zkvsl\" (UID: \"7281a4b6-76d6-494b-98e2-8fd1f322c7de\") " pod="openshift-dns/dns-default-zkvsl"
Apr 17 09:21:28.503894 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.503699 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn297\" (UniqueName: \"kubernetes.io/projected/7281a4b6-76d6-494b-98e2-8fd1f322c7de-kube-api-access-sn297\") pod \"dns-default-zkvsl\" (UID: \"7281a4b6-76d6-494b-98e2-8fd1f322c7de\") " pod="openshift-dns/dns-default-zkvsl"
Apr 17 09:21:28.503894 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.503738 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert\") pod \"ingress-canary-v2rkj\" (UID: \"2721fe42-279c-4536-9769-411e4918ceac\") " pod="openshift-ingress-canary/ingress-canary-v2rkj"
Apr 17 09:21:28.503894 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.503809 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7281a4b6-76d6-494b-98e2-8fd1f322c7de-tmp-dir\") pod \"dns-default-zkvsl\" (UID: \"7281a4b6-76d6-494b-98e2-8fd1f322c7de\") " pod="openshift-dns/dns-default-zkvsl"
Apr 17 09:21:28.503894 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:28.503851 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 09:21:28.504032 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:28.503910 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert podName:2721fe42-279c-4536-9769-411e4918ceac nodeName:}" failed. No retries permitted until 2026-04-17 09:21:29.003893411 +0000 UTC m=+33.599945285 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert") pod "ingress-canary-v2rkj" (UID: "2721fe42-279c-4536-9769-411e4918ceac") : secret "canary-serving-cert" not found
Apr 17 09:21:28.504032 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.504001 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7281a4b6-76d6-494b-98e2-8fd1f322c7de-config-volume\") pod \"dns-default-zkvsl\" (UID: \"7281a4b6-76d6-494b-98e2-8fd1f322c7de\") " pod="openshift-dns/dns-default-zkvsl"
Apr 17 09:21:28.515783 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.515758 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f92qs\" (UniqueName: \"kubernetes.io/projected/2721fe42-279c-4536-9769-411e4918ceac-kube-api-access-f92qs\") pod \"ingress-canary-v2rkj\" (UID: \"2721fe42-279c-4536-9769-411e4918ceac\") " pod="openshift-ingress-canary/ingress-canary-v2rkj"
Apr 17 09:21:28.515783 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:28.515769 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn297\" (UniqueName: \"kubernetes.io/projected/7281a4b6-76d6-494b-98e2-8fd1f322c7de-kube-api-access-sn297\") pod \"dns-default-zkvsl\" (UID: \"7281a4b6-76d6-494b-98e2-8fd1f322c7de\") " pod="openshift-dns/dns-default-zkvsl"
Apr 17 09:21:29.006994 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:29.006964 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert\") pod \"ingress-canary-v2rkj\" (UID: \"2721fe42-279c-4536-9769-411e4918ceac\") " pod="openshift-ingress-canary/ingress-canary-v2rkj"
Apr 17 09:21:29.007503 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:29.007016 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls\") pod \"dns-default-zkvsl\" (UID: \"7281a4b6-76d6-494b-98e2-8fd1f322c7de\") " pod="openshift-dns/dns-default-zkvsl"
Apr 17 09:21:29.007503 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:29.007100 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 09:21:29.007503 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:29.007145 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls podName:7281a4b6-76d6-494b-98e2-8fd1f322c7de nodeName:}" failed. No retries permitted until 2026-04-17 09:21:30.007132225 +0000 UTC m=+34.603184096 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls") pod "dns-default-zkvsl" (UID: "7281a4b6-76d6-494b-98e2-8fd1f322c7de") : secret "dns-default-metrics-tls" not found
Apr 17 09:21:29.007503 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:29.007098 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 09:21:29.007503 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:29.007241 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert podName:2721fe42-279c-4536-9769-411e4918ceac nodeName:}" failed. No retries permitted until 2026-04-17 09:21:30.007227243 +0000 UTC m=+34.603279115 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert") pod "ingress-canary-v2rkj" (UID: "2721fe42-279c-4536-9769-411e4918ceac") : secret "canary-serving-cert" not found
Apr 17 09:21:29.150799 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:29.150773 2574 generic.go:358] "Generic (PLEG): container finished" podID="34bc662a-193e-440f-9c2e-1dee8a208524" containerID="7d4a905dc0095493f88d537a14fb0456586d59b787e78e12524b3bf064d3593b" exitCode=0
Apr 17 09:21:29.150916 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:29.150826 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g62mf" event={"ID":"34bc662a-193e-440f-9c2e-1dee8a208524","Type":"ContainerDied","Data":"7d4a905dc0095493f88d537a14fb0456586d59b787e78e12524b3bf064d3593b"}
Apr 17 09:21:29.612123 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:29.612091 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs\") pod \"network-metrics-daemon-22cz6\" (UID: \"fba6f7ca-a68b-4315-91fd-d249cb9d13d1\") " pod="openshift-multus/network-metrics-daemon-22cz6"
Apr 17 09:21:29.612285 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:29.612261 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 09:21:29.612334 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:29.612324 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs podName:fba6f7ca-a68b-4315-91fd-d249cb9d13d1 nodeName:}" failed. No retries permitted until 2026-04-17 09:22:01.612310318 +0000 UTC m=+66.208362194 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs") pod "network-metrics-daemon-22cz6" (UID: "fba6f7ca-a68b-4315-91fd-d249cb9d13d1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 09:21:29.713276 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:29.713244 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vt4tb\" (UniqueName: \"kubernetes.io/projected/da8df46a-d006-4e3a-8a95-df428038ed39-kube-api-access-vt4tb\") pod \"network-check-target-v6qts\" (UID: \"da8df46a-d006-4e3a-8a95-df428038ed39\") " pod="openshift-network-diagnostics/network-check-target-v6qts"
Apr 17 09:21:29.713424 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:29.713363 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 09:21:29.713424 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:29.713376 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 09:21:29.713424 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:29.713385 2574 projected.go:194] Error preparing data for projected volume kube-api-access-vt4tb for pod openshift-network-diagnostics/network-check-target-v6qts: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 09:21:29.713537 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:29.713429 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/da8df46a-d006-4e3a-8a95-df428038ed39-kube-api-access-vt4tb podName:da8df46a-d006-4e3a-8a95-df428038ed39 nodeName:}" failed. No retries permitted until 2026-04-17 09:22:01.713416158 +0000 UTC m=+66.309468030 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-vt4tb" (UniqueName: "kubernetes.io/projected/da8df46a-d006-4e3a-8a95-df428038ed39-kube-api-access-vt4tb") pod "network-check-target-v6qts" (UID: "da8df46a-d006-4e3a-8a95-df428038ed39") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 09:21:29.962697 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:29.962660 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6qts"
Apr 17 09:21:29.962863 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:29.962778 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-tp8sk"
Apr 17 09:21:29.962973 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:29.962945 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-22cz6"
Apr 17 09:21:29.965572 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:29.965551 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 09:21:29.965572 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:29.965559 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 09:21:29.965745 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:29.965585 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 09:21:29.965745 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:29.965551 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4w67p\""
Apr 17 09:21:29.965745 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:29.965684 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-8wjkc\""
Apr 17 09:21:29.965858 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:29.965788 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 17 09:21:30.014856 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:30.014830 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls\") pod \"dns-default-zkvsl\" (UID: \"7281a4b6-76d6-494b-98e2-8fd1f322c7de\") " pod="openshift-dns/dns-default-zkvsl"
Apr 17 09:21:30.015160 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:30.014893 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert\") pod \"ingress-canary-v2rkj\" (UID: \"2721fe42-279c-4536-9769-411e4918ceac\") " pod="openshift-ingress-canary/ingress-canary-v2rkj"
Apr 17 09:21:30.015160 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:30.014975 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 09:21:30.015160 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:30.014985 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 09:21:30.015160 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:30.015031 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls podName:7281a4b6-76d6-494b-98e2-8fd1f322c7de nodeName:}" failed. No retries permitted until 2026-04-17 09:21:32.015013947 +0000 UTC m=+36.611065818 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls") pod "dns-default-zkvsl" (UID: "7281a4b6-76d6-494b-98e2-8fd1f322c7de") : secret "dns-default-metrics-tls" not found
Apr 17 09:21:30.015160 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:30.015048 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert podName:2721fe42-279c-4536-9769-411e4918ceac nodeName:}" failed. No retries permitted until 2026-04-17 09:21:32.015041731 +0000 UTC m=+36.611093603 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert") pod "ingress-canary-v2rkj" (UID: "2721fe42-279c-4536-9769-411e4918ceac") : secret "canary-serving-cert" not found
Apr 17 09:21:30.155074 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:30.155047 2574 generic.go:358] "Generic (PLEG): container finished" podID="34bc662a-193e-440f-9c2e-1dee8a208524" containerID="71a7b7032653d79f147a52680508a3a4bd0fa4d880ceb1bbb69a90f3a470941f" exitCode=0
Apr 17 09:21:30.155231 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:30.155095 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g62mf" event={"ID":"34bc662a-193e-440f-9c2e-1dee8a208524","Type":"ContainerDied","Data":"71a7b7032653d79f147a52680508a3a4bd0fa4d880ceb1bbb69a90f3a470941f"}
Apr 17 09:21:31.160291 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:31.160113 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g62mf" event={"ID":"34bc662a-193e-440f-9c2e-1dee8a208524","Type":"ContainerStarted","Data":"c9b6ad1fc1a001407569ec998b8d194d96e41b38138b19740be225f45eff77c2"}
Apr 17 09:21:31.183327 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:31.183286 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-g62mf" podStartSLOduration=5.017049379 podStartE2EDuration="35.183274096s" podCreationTimestamp="2026-04-17 09:20:56 +0000 UTC" firstStartedPulling="2026-04-17 09:20:58.518785226 +0000 UTC m=+3.114837110" lastFinishedPulling="2026-04-17 09:21:28.685009955 +0000 UTC m=+33.281061827" observedRunningTime="2026-04-17 09:21:31.181650723 +0000 UTC m=+35.777702628" watchObservedRunningTime="2026-04-17 09:21:31.183274096 +0000 UTC m=+35.779325989"
Apr 17 09:21:32.028132 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:32.028095 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert\") pod \"ingress-canary-v2rkj\" (UID: \"2721fe42-279c-4536-9769-411e4918ceac\") " pod="openshift-ingress-canary/ingress-canary-v2rkj"
Apr 17 09:21:32.028298 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:32.028142 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls\") pod \"dns-default-zkvsl\" (UID: \"7281a4b6-76d6-494b-98e2-8fd1f322c7de\") " pod="openshift-dns/dns-default-zkvsl"
Apr 17 09:21:32.028298 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:32.028244 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 09:21:32.028298 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:32.028248 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 09:21:32.028445 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:32.028304 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert podName:2721fe42-279c-4536-9769-411e4918ceac nodeName:}" failed. No retries permitted until 2026-04-17 09:21:36.028290084 +0000 UTC m=+40.624341956 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert") pod "ingress-canary-v2rkj" (UID: "2721fe42-279c-4536-9769-411e4918ceac") : secret "canary-serving-cert" not found
Apr 17 09:21:32.028445 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:32.028319 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls podName:7281a4b6-76d6-494b-98e2-8fd1f322c7de nodeName:}" failed.
No retries permitted until 2026-04-17 09:21:36.028313169 +0000 UTC m=+40.624365040 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls") pod "dns-default-zkvsl" (UID: "7281a4b6-76d6-494b-98e2-8fd1f322c7de") : secret "dns-default-metrics-tls" not found Apr 17 09:21:36.052315 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:36.052267 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert\") pod \"ingress-canary-v2rkj\" (UID: \"2721fe42-279c-4536-9769-411e4918ceac\") " pod="openshift-ingress-canary/ingress-canary-v2rkj" Apr 17 09:21:36.052782 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:36.052333 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls\") pod \"dns-default-zkvsl\" (UID: \"7281a4b6-76d6-494b-98e2-8fd1f322c7de\") " pod="openshift-dns/dns-default-zkvsl" Apr 17 09:21:36.052782 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:36.052409 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 09:21:36.052782 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:36.052439 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 09:21:36.052782 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:36.052465 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert podName:2721fe42-279c-4536-9769-411e4918ceac nodeName:}" failed. No retries permitted until 2026-04-17 09:21:44.052451377 +0000 UTC m=+48.648503254 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert") pod "ingress-canary-v2rkj" (UID: "2721fe42-279c-4536-9769-411e4918ceac") : secret "canary-serving-cert" not found Apr 17 09:21:36.052782 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:36.052501 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls podName:7281a4b6-76d6-494b-98e2-8fd1f322c7de nodeName:}" failed. No retries permitted until 2026-04-17 09:21:44.052485062 +0000 UTC m=+48.648536934 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls") pod "dns-default-zkvsl" (UID: "7281a4b6-76d6-494b-98e2-8fd1f322c7de") : secret "dns-default-metrics-tls" not found Apr 17 09:21:39.677282 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:39.677251 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-original-pull-secret\") pod \"global-pull-secret-syncer-tp8sk\" (UID: \"c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11\") " pod="kube-system/global-pull-secret-syncer-tp8sk" Apr 17 09:21:39.680121 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:39.680098 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11-original-pull-secret\") pod \"global-pull-secret-syncer-tp8sk\" (UID: \"c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11\") " pod="kube-system/global-pull-secret-syncer-tp8sk" Apr 17 09:21:39.878160 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:39.878125 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-tp8sk" Apr 17 09:21:40.058321 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:40.058291 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-tp8sk"] Apr 17 09:21:40.061277 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:21:40.061235 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3c029a5_8abb_404f_bd4f_0b2cbf2dfe11.slice/crio-1e91f78fea3ce34f5e20c72665a9a552ad1c97b91ce08a510266376f50998f83 WatchSource:0}: Error finding container 1e91f78fea3ce34f5e20c72665a9a552ad1c97b91ce08a510266376f50998f83: Status 404 returned error can't find the container with id 1e91f78fea3ce34f5e20c72665a9a552ad1c97b91ce08a510266376f50998f83 Apr 17 09:21:40.177698 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:40.177658 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tp8sk" event={"ID":"c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11","Type":"ContainerStarted","Data":"1e91f78fea3ce34f5e20c72665a9a552ad1c97b91ce08a510266376f50998f83"} Apr 17 09:21:44.110599 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:44.110546 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert\") pod \"ingress-canary-v2rkj\" (UID: \"2721fe42-279c-4536-9769-411e4918ceac\") " pod="openshift-ingress-canary/ingress-canary-v2rkj" Apr 17 09:21:44.110950 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:44.110624 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls\") pod \"dns-default-zkvsl\" (UID: \"7281a4b6-76d6-494b-98e2-8fd1f322c7de\") " pod="openshift-dns/dns-default-zkvsl" Apr 17 09:21:44.110950 ip-10-0-138-237 kubenswrapper[2574]: E0417 
09:21:44.110701 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 09:21:44.110950 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:44.110721 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 09:21:44.110950 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:44.110782 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert podName:2721fe42-279c-4536-9769-411e4918ceac nodeName:}" failed. No retries permitted until 2026-04-17 09:22:00.110758823 +0000 UTC m=+64.706810708 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert") pod "ingress-canary-v2rkj" (UID: "2721fe42-279c-4536-9769-411e4918ceac") : secret "canary-serving-cert" not found Apr 17 09:21:44.110950 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:21:44.110805 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls podName:7281a4b6-76d6-494b-98e2-8fd1f322c7de nodeName:}" failed. No retries permitted until 2026-04-17 09:22:00.110793255 +0000 UTC m=+64.706845134 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls") pod "dns-default-zkvsl" (UID: "7281a4b6-76d6-494b-98e2-8fd1f322c7de") : secret "dns-default-metrics-tls" not found Apr 17 09:21:45.189005 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:45.188973 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-tp8sk" event={"ID":"c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11","Type":"ContainerStarted","Data":"100b098ff3747ade8f0b4e1d5046615b4cc46d4d11d78cb17718853b12266100"} Apr 17 09:21:45.205082 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:45.205024 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-tp8sk" podStartSLOduration=34.031235645 podStartE2EDuration="38.205008557s" podCreationTimestamp="2026-04-17 09:21:07 +0000 UTC" firstStartedPulling="2026-04-17 09:21:40.062951319 +0000 UTC m=+44.659003205" lastFinishedPulling="2026-04-17 09:21:44.236724245 +0000 UTC m=+48.832776117" observedRunningTime="2026-04-17 09:21:45.204771069 +0000 UTC m=+49.800822962" watchObservedRunningTime="2026-04-17 09:21:45.205008557 +0000 UTC m=+49.801060451" Apr 17 09:21:54.147918 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:21:54.147891 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rb9kg" Apr 17 09:22:00.209746 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:00.209713 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls\") pod \"dns-default-zkvsl\" (UID: \"7281a4b6-76d6-494b-98e2-8fd1f322c7de\") " pod="openshift-dns/dns-default-zkvsl" Apr 17 09:22:00.210195 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:00.209786 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert\") pod \"ingress-canary-v2rkj\" (UID: \"2721fe42-279c-4536-9769-411e4918ceac\") " pod="openshift-ingress-canary/ingress-canary-v2rkj" Apr 17 09:22:00.210195 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:00.209861 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 09:22:00.210195 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:00.209882 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 09:22:00.210195 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:00.209936 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls podName:7281a4b6-76d6-494b-98e2-8fd1f322c7de nodeName:}" failed. No retries permitted until 2026-04-17 09:22:32.209920449 +0000 UTC m=+96.805972320 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls") pod "dns-default-zkvsl" (UID: "7281a4b6-76d6-494b-98e2-8fd1f322c7de") : secret "dns-default-metrics-tls" not found Apr 17 09:22:00.210195 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:00.209950 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert podName:2721fe42-279c-4536-9769-411e4918ceac nodeName:}" failed. No retries permitted until 2026-04-17 09:22:32.20994459 +0000 UTC m=+96.805996463 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert") pod "ingress-canary-v2rkj" (UID: "2721fe42-279c-4536-9769-411e4918ceac") : secret "canary-serving-cert" not found Apr 17 09:22:01.619505 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:01.619461 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs\") pod \"network-metrics-daemon-22cz6\" (UID: \"fba6f7ca-a68b-4315-91fd-d249cb9d13d1\") " pod="openshift-multus/network-metrics-daemon-22cz6" Apr 17 09:22:01.622551 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:01.622531 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 09:22:01.629774 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:01.629753 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 09:22:01.629873 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:01.629818 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs podName:fba6f7ca-a68b-4315-91fd-d249cb9d13d1 nodeName:}" failed. No retries permitted until 2026-04-17 09:23:05.6297981 +0000 UTC m=+130.225849972 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs") pod "network-metrics-daemon-22cz6" (UID: "fba6f7ca-a68b-4315-91fd-d249cb9d13d1") : secret "metrics-daemon-secret" not found Apr 17 09:22:01.720344 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:01.720314 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vt4tb\" (UniqueName: \"kubernetes.io/projected/da8df46a-d006-4e3a-8a95-df428038ed39-kube-api-access-vt4tb\") pod \"network-check-target-v6qts\" (UID: \"da8df46a-d006-4e3a-8a95-df428038ed39\") " pod="openshift-network-diagnostics/network-check-target-v6qts" Apr 17 09:22:01.722932 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:01.722915 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 09:22:01.733259 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:01.733240 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 09:22:01.743514 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:01.743490 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt4tb\" (UniqueName: \"kubernetes.io/projected/da8df46a-d006-4e3a-8a95-df428038ed39-kube-api-access-vt4tb\") pod \"network-check-target-v6qts\" (UID: \"da8df46a-d006-4e3a-8a95-df428038ed39\") " pod="openshift-network-diagnostics/network-check-target-v6qts" Apr 17 09:22:01.774990 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:01.774963 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4w67p\"" Apr 17 09:22:01.783117 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:01.783089 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-v6qts" Apr 17 09:22:01.887843 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:01.887814 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-v6qts"] Apr 17 09:22:01.890778 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:22:01.890753 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda8df46a_d006_4e3a_8a95_df428038ed39.slice/crio-69cfcbc0f3847e0cd8c163da3c73ce82918504a3fc5795b606a5f8faacfe7449 WatchSource:0}: Error finding container 69cfcbc0f3847e0cd8c163da3c73ce82918504a3fc5795b606a5f8faacfe7449: Status 404 returned error can't find the container with id 69cfcbc0f3847e0cd8c163da3c73ce82918504a3fc5795b606a5f8faacfe7449 Apr 17 09:22:02.220426 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:02.220352 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-v6qts" event={"ID":"da8df46a-d006-4e3a-8a95-df428038ed39","Type":"ContainerStarted","Data":"69cfcbc0f3847e0cd8c163da3c73ce82918504a3fc5795b606a5f8faacfe7449"} Apr 17 09:22:05.227423 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:05.227379 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-v6qts" event={"ID":"da8df46a-d006-4e3a-8a95-df428038ed39","Type":"ContainerStarted","Data":"01adde30df8c0410efad0f7c2e763d879c738a99553315699d413c90b58c3c9e"} Apr 17 09:22:05.227863 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:05.227600 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-v6qts" Apr 17 09:22:05.245099 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:05.245061 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-v6qts" 
podStartSLOduration=66.702269156 podStartE2EDuration="1m9.245047755s" podCreationTimestamp="2026-04-17 09:20:56 +0000 UTC" firstStartedPulling="2026-04-17 09:22:01.8925928 +0000 UTC m=+66.488644675" lastFinishedPulling="2026-04-17 09:22:04.435371393 +0000 UTC m=+69.031423274" observedRunningTime="2026-04-17 09:22:05.244819225 +0000 UTC m=+69.840871123" watchObservedRunningTime="2026-04-17 09:22:05.245047755 +0000 UTC m=+69.841099648" Apr 17 09:22:32.220228 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:32.220138 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert\") pod \"ingress-canary-v2rkj\" (UID: \"2721fe42-279c-4536-9769-411e4918ceac\") " pod="openshift-ingress-canary/ingress-canary-v2rkj" Apr 17 09:22:32.220228 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:32.220193 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls\") pod \"dns-default-zkvsl\" (UID: \"7281a4b6-76d6-494b-98e2-8fd1f322c7de\") " pod="openshift-dns/dns-default-zkvsl" Apr 17 09:22:32.220602 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:32.220286 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 09:22:32.220602 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:32.220287 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 09:22:32.220602 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:32.220345 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls podName:7281a4b6-76d6-494b-98e2-8fd1f322c7de nodeName:}" failed. 
No retries permitted until 2026-04-17 09:23:36.220329803 +0000 UTC m=+160.816381677 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls") pod "dns-default-zkvsl" (UID: "7281a4b6-76d6-494b-98e2-8fd1f322c7de") : secret "dns-default-metrics-tls" not found Apr 17 09:22:32.220602 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:32.220360 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert podName:2721fe42-279c-4536-9769-411e4918ceac nodeName:}" failed. No retries permitted until 2026-04-17 09:23:36.220354263 +0000 UTC m=+160.816406135 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert") pod "ingress-canary-v2rkj" (UID: "2721fe42-279c-4536-9769-411e4918ceac") : secret "canary-serving-cert" not found Apr 17 09:22:36.231899 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:36.231866 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-v6qts" Apr 17 09:22:55.969288 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:55.969257 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zjszk"] Apr 17 09:22:55.974047 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:55.974029 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-59cd84dcb8-fhdx4"] Apr 17 09:22:55.974204 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:55.974187 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zjszk" Apr 17 09:22:55.976827 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:55.976805 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-nkmjl\"" Apr 17 09:22:55.976827 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:55.976821 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 17 09:22:55.977010 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:55.976995 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-gmmqr"] Apr 17 09:22:55.977088 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:55.977075 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 17 09:22:55.977137 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:55.977123 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:22:55.979927 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:55.979902 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 17 09:22:55.980063 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:55.979950 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 17 09:22:55.980063 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:55.980035 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-bdvnv\"" Apr 17 09:22:55.980198 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:55.980078 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zjszk"] Apr 17 09:22:55.980198 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:55.980194 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-gmmqr" Apr 17 09:22:55.980312 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:55.980219 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 09:22:55.980381 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:55.980360 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 17 09:22:55.980457 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:55.980373 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 09:22:55.980734 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:55.980709 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 17 09:22:55.983541 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:55.983523 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 17 09:22:55.984393 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:55.984068 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 17 09:22:55.984506 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:55.984312 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 17 09:22:55.984506 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:55.984334 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-gmmqr"] Apr 17 09:22:55.984663 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:55.984542 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-nnr7p\"" Apr 17 09:22:55.984821 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:55.984805 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 17 09:22:55.984955 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:55.984931 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-59cd84dcb8-fhdx4"] Apr 17 09:22:56.083101 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.083070 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq78g\" (UniqueName: \"kubernetes.io/projected/32be7a79-e2e8-447f-9b7b-731ca24adef9-kube-api-access-lq78g\") pod \"volume-data-source-validator-7c6cbb6c87-zjszk\" (UID: \"32be7a79-e2e8-447f-9b7b-731ca24adef9\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zjszk" Apr 17 09:22:56.083101 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.083111 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-service-ca-bundle\") pod \"router-default-59cd84dcb8-fhdx4\" (UID: \"74b4cdd2-7175-4d47-9486-0863bdb1bdb2\") " pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:22:56.083383 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.083126 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-metrics-certs\") pod \"router-default-59cd84dcb8-fhdx4\" (UID: \"74b4cdd2-7175-4d47-9486-0863bdb1bdb2\") " pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:22:56.083383 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.083151 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-default-certificate\") pod \"router-default-59cd84dcb8-fhdx4\" (UID: \"74b4cdd2-7175-4d47-9486-0863bdb1bdb2\") " pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:22:56.083383 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.083218 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdxgk\" (UniqueName: \"kubernetes.io/projected/824a0058-01c3-4126-965f-c9f5a5d55e99-kube-api-access-qdxgk\") pod \"service-ca-operator-d6fc45fc5-gmmqr\" (UID: \"824a0058-01c3-4126-965f-c9f5a5d55e99\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-gmmqr" Apr 17 09:22:56.083383 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.083251 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-stats-auth\") pod \"router-default-59cd84dcb8-fhdx4\" (UID: \"74b4cdd2-7175-4d47-9486-0863bdb1bdb2\") " pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:22:56.083383 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.083299 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lcgv\" (UniqueName: \"kubernetes.io/projected/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-kube-api-access-2lcgv\") pod \"router-default-59cd84dcb8-fhdx4\" (UID: \"74b4cdd2-7175-4d47-9486-0863bdb1bdb2\") " pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:22:56.083383 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.083315 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/824a0058-01c3-4126-965f-c9f5a5d55e99-serving-cert\") pod 
\"service-ca-operator-d6fc45fc5-gmmqr\" (UID: \"824a0058-01c3-4126-965f-c9f5a5d55e99\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-gmmqr" Apr 17 09:22:56.083383 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.083361 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/824a0058-01c3-4126-965f-c9f5a5d55e99-config\") pod \"service-ca-operator-d6fc45fc5-gmmqr\" (UID: \"824a0058-01c3-4126-965f-c9f5a5d55e99\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-gmmqr" Apr 17 09:22:56.121770 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.121746 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-4c5bp"] Apr 17 09:22:56.128294 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.125085 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c5bp" Apr 17 09:22:56.133369 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.133348 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 17 09:22:56.133493 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.133383 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-bn4bx\"" Apr 17 09:22:56.134287 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.134264 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 17 09:22:56.135023 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.135005 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-sfz45"] Apr 17 09:22:56.137997 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.137978 2574 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-sfz45" Apr 17 09:22:56.142437 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.142404 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 09:22:56.142780 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.142760 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-g2knl\"" Apr 17 09:22:56.142867 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.142766 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 17 09:22:56.142867 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.142801 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 17 09:22:56.143131 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.143115 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 09:22:56.151458 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.151439 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 17 09:22:56.173389 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.173351 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-4c5bp"] Apr 17 09:22:56.174072 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.174052 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-sfz45"] Apr 17 09:22:56.184269 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.184250 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-service-ca-bundle\") pod \"router-default-59cd84dcb8-fhdx4\" (UID: \"74b4cdd2-7175-4d47-9486-0863bdb1bdb2\") " pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:22:56.184326 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.184278 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-metrics-certs\") pod \"router-default-59cd84dcb8-fhdx4\" (UID: \"74b4cdd2-7175-4d47-9486-0863bdb1bdb2\") " pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:22:56.184326 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.184309 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-default-certificate\") pod \"router-default-59cd84dcb8-fhdx4\" (UID: \"74b4cdd2-7175-4d47-9486-0863bdb1bdb2\") " pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:22:56.184411 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:56.184391 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 09:22:56.184474 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:56.184464 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-metrics-certs podName:74b4cdd2-7175-4d47-9486-0863bdb1bdb2 nodeName:}" failed. No retries permitted until 2026-04-17 09:22:56.684445586 +0000 UTC m=+121.280497458 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-metrics-certs") pod "router-default-59cd84dcb8-fhdx4" (UID: "74b4cdd2-7175-4d47-9486-0863bdb1bdb2") : secret "router-metrics-certs-default" not found Apr 17 09:22:56.184562 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:56.184547 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-service-ca-bundle podName:74b4cdd2-7175-4d47-9486-0863bdb1bdb2 nodeName:}" failed. No retries permitted until 2026-04-17 09:22:56.684532421 +0000 UTC m=+121.280584297 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-service-ca-bundle") pod "router-default-59cd84dcb8-fhdx4" (UID: "74b4cdd2-7175-4d47-9486-0863bdb1bdb2") : configmap references non-existent config key: service-ca.crt Apr 17 09:22:56.184607 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.184573 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdxgk\" (UniqueName: \"kubernetes.io/projected/824a0058-01c3-4126-965f-c9f5a5d55e99-kube-api-access-qdxgk\") pod \"service-ca-operator-d6fc45fc5-gmmqr\" (UID: \"824a0058-01c3-4126-965f-c9f5a5d55e99\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-gmmqr" Apr 17 09:22:56.184607 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.184601 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c68efb67-a1eb-4e0b-9af1-47c6e61f4d10-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-4c5bp\" (UID: \"c68efb67-a1eb-4e0b-9af1-47c6e61f4d10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c5bp" Apr 17 09:22:56.184705 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.184643 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-stats-auth\") pod \"router-default-59cd84dcb8-fhdx4\" (UID: \"74b4cdd2-7175-4d47-9486-0863bdb1bdb2\") " pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:22:56.184756 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.184704 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lcgv\" (UniqueName: \"kubernetes.io/projected/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-kube-api-access-2lcgv\") pod \"router-default-59cd84dcb8-fhdx4\" (UID: \"74b4cdd2-7175-4d47-9486-0863bdb1bdb2\") " pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:22:56.184756 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.184725 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/824a0058-01c3-4126-965f-c9f5a5d55e99-serving-cert\") pod \"service-ca-operator-d6fc45fc5-gmmqr\" (UID: \"824a0058-01c3-4126-965f-c9f5a5d55e99\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-gmmqr" Apr 17 09:22:56.184849 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.184806 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/824a0058-01c3-4126-965f-c9f5a5d55e99-config\") pod \"service-ca-operator-d6fc45fc5-gmmqr\" (UID: \"824a0058-01c3-4126-965f-c9f5a5d55e99\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-gmmqr" Apr 17 09:22:56.184849 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.184832 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lq78g\" (UniqueName: \"kubernetes.io/projected/32be7a79-e2e8-447f-9b7b-731ca24adef9-kube-api-access-lq78g\") pod \"volume-data-source-validator-7c6cbb6c87-zjszk\" (UID: 
\"32be7a79-e2e8-447f-9b7b-731ca24adef9\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zjszk" Apr 17 09:22:56.184947 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.184873 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c68efb67-a1eb-4e0b-9af1-47c6e61f4d10-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4c5bp\" (UID: \"c68efb67-a1eb-4e0b-9af1-47c6e61f4d10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c5bp" Apr 17 09:22:56.185319 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.185288 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/824a0058-01c3-4126-965f-c9f5a5d55e99-config\") pod \"service-ca-operator-d6fc45fc5-gmmqr\" (UID: \"824a0058-01c3-4126-965f-c9f5a5d55e99\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-gmmqr" Apr 17 09:22:56.186777 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.186750 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-default-certificate\") pod \"router-default-59cd84dcb8-fhdx4\" (UID: \"74b4cdd2-7175-4d47-9486-0863bdb1bdb2\") " pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:22:56.186863 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.186788 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/824a0058-01c3-4126-965f-c9f5a5d55e99-serving-cert\") pod \"service-ca-operator-d6fc45fc5-gmmqr\" (UID: \"824a0058-01c3-4126-965f-c9f5a5d55e99\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-gmmqr" Apr 17 09:22:56.186863 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.186800 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-stats-auth\") pod \"router-default-59cd84dcb8-fhdx4\" (UID: \"74b4cdd2-7175-4d47-9486-0863bdb1bdb2\") " pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:22:56.220690 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.220628 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lcgv\" (UniqueName: \"kubernetes.io/projected/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-kube-api-access-2lcgv\") pod \"router-default-59cd84dcb8-fhdx4\" (UID: \"74b4cdd2-7175-4d47-9486-0863bdb1bdb2\") " pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:22:56.221069 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.221049 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdxgk\" (UniqueName: \"kubernetes.io/projected/824a0058-01c3-4126-965f-c9f5a5d55e99-kube-api-access-qdxgk\") pod \"service-ca-operator-d6fc45fc5-gmmqr\" (UID: \"824a0058-01c3-4126-965f-c9f5a5d55e99\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-gmmqr" Apr 17 09:22:56.221969 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.221944 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq78g\" (UniqueName: \"kubernetes.io/projected/32be7a79-e2e8-447f-9b7b-731ca24adef9-kube-api-access-lq78g\") pod \"volume-data-source-validator-7c6cbb6c87-zjszk\" (UID: \"32be7a79-e2e8-447f-9b7b-731ca24adef9\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zjszk" Apr 17 09:22:56.285308 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.285285 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51eeb544-5d28-4c8c-8577-6f932bfee2ce-trusted-ca-bundle\") pod 
\"insights-operator-585dfdc468-sfz45\" (UID: \"51eeb544-5d28-4c8c-8577-6f932bfee2ce\") " pod="openshift-insights/insights-operator-585dfdc468-sfz45" Apr 17 09:22:56.285425 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.285313 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qjvt\" (UniqueName: \"kubernetes.io/projected/51eeb544-5d28-4c8c-8577-6f932bfee2ce-kube-api-access-6qjvt\") pod \"insights-operator-585dfdc468-sfz45\" (UID: \"51eeb544-5d28-4c8c-8577-6f932bfee2ce\") " pod="openshift-insights/insights-operator-585dfdc468-sfz45" Apr 17 09:22:56.285425 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.285354 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51eeb544-5d28-4c8c-8577-6f932bfee2ce-tmp\") pod \"insights-operator-585dfdc468-sfz45\" (UID: \"51eeb544-5d28-4c8c-8577-6f932bfee2ce\") " pod="openshift-insights/insights-operator-585dfdc468-sfz45" Apr 17 09:22:56.285425 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.285369 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/51eeb544-5d28-4c8c-8577-6f932bfee2ce-snapshots\") pod \"insights-operator-585dfdc468-sfz45\" (UID: \"51eeb544-5d28-4c8c-8577-6f932bfee2ce\") " pod="openshift-insights/insights-operator-585dfdc468-sfz45" Apr 17 09:22:56.285425 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.285414 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zjszk" Apr 17 09:22:56.285556 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.285515 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51eeb544-5d28-4c8c-8577-6f932bfee2ce-serving-cert\") pod \"insights-operator-585dfdc468-sfz45\" (UID: \"51eeb544-5d28-4c8c-8577-6f932bfee2ce\") " pod="openshift-insights/insights-operator-585dfdc468-sfz45" Apr 17 09:22:56.285589 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.285575 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c68efb67-a1eb-4e0b-9af1-47c6e61f4d10-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4c5bp\" (UID: \"c68efb67-a1eb-4e0b-9af1-47c6e61f4d10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c5bp" Apr 17 09:22:56.285658 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.285644 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51eeb544-5d28-4c8c-8577-6f932bfee2ce-service-ca-bundle\") pod \"insights-operator-585dfdc468-sfz45\" (UID: \"51eeb544-5d28-4c8c-8577-6f932bfee2ce\") " pod="openshift-insights/insights-operator-585dfdc468-sfz45" Apr 17 09:22:56.285658 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:56.285653 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 09:22:56.285719 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.285681 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c68efb67-a1eb-4e0b-9af1-47c6e61f4d10-nginx-conf\") pod 
\"networking-console-plugin-cb95c66f6-4c5bp\" (UID: \"c68efb67-a1eb-4e0b-9af1-47c6e61f4d10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c5bp" Apr 17 09:22:56.285719 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:56.285699 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c68efb67-a1eb-4e0b-9af1-47c6e61f4d10-networking-console-plugin-cert podName:c68efb67-a1eb-4e0b-9af1-47c6e61f4d10 nodeName:}" failed. No retries permitted until 2026-04-17 09:22:56.785685561 +0000 UTC m=+121.381737432 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c68efb67-a1eb-4e0b-9af1-47c6e61f4d10-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4c5bp" (UID: "c68efb67-a1eb-4e0b-9af1-47c6e61f4d10") : secret "networking-console-plugin-cert" not found Apr 17 09:22:56.286334 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.286226 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c68efb67-a1eb-4e0b-9af1-47c6e61f4d10-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-4c5bp\" (UID: \"c68efb67-a1eb-4e0b-9af1-47c6e61f4d10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c5bp" Apr 17 09:22:56.299152 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.299128 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-gmmqr" Apr 17 09:22:56.386671 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.386626 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51eeb544-5d28-4c8c-8577-6f932bfee2ce-tmp\") pod \"insights-operator-585dfdc468-sfz45\" (UID: \"51eeb544-5d28-4c8c-8577-6f932bfee2ce\") " pod="openshift-insights/insights-operator-585dfdc468-sfz45" Apr 17 09:22:56.386822 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.386742 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/51eeb544-5d28-4c8c-8577-6f932bfee2ce-snapshots\") pod \"insights-operator-585dfdc468-sfz45\" (UID: \"51eeb544-5d28-4c8c-8577-6f932bfee2ce\") " pod="openshift-insights/insights-operator-585dfdc468-sfz45" Apr 17 09:22:56.386822 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.386794 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51eeb544-5d28-4c8c-8577-6f932bfee2ce-serving-cert\") pod \"insights-operator-585dfdc468-sfz45\" (UID: \"51eeb544-5d28-4c8c-8577-6f932bfee2ce\") " pod="openshift-insights/insights-operator-585dfdc468-sfz45" Apr 17 09:22:56.387015 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.386899 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51eeb544-5d28-4c8c-8577-6f932bfee2ce-service-ca-bundle\") pod \"insights-operator-585dfdc468-sfz45\" (UID: \"51eeb544-5d28-4c8c-8577-6f932bfee2ce\") " pod="openshift-insights/insights-operator-585dfdc468-sfz45" Apr 17 09:22:56.387015 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.386939 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/51eeb544-5d28-4c8c-8577-6f932bfee2ce-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-sfz45\" (UID: \"51eeb544-5d28-4c8c-8577-6f932bfee2ce\") " pod="openshift-insights/insights-operator-585dfdc468-sfz45" Apr 17 09:22:56.387015 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.386964 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6qjvt\" (UniqueName: \"kubernetes.io/projected/51eeb544-5d28-4c8c-8577-6f932bfee2ce-kube-api-access-6qjvt\") pod \"insights-operator-585dfdc468-sfz45\" (UID: \"51eeb544-5d28-4c8c-8577-6f932bfee2ce\") " pod="openshift-insights/insights-operator-585dfdc468-sfz45" Apr 17 09:22:56.387248 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.387007 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/51eeb544-5d28-4c8c-8577-6f932bfee2ce-tmp\") pod \"insights-operator-585dfdc468-sfz45\" (UID: \"51eeb544-5d28-4c8c-8577-6f932bfee2ce\") " pod="openshift-insights/insights-operator-585dfdc468-sfz45" Apr 17 09:22:56.387917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.387364 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/51eeb544-5d28-4c8c-8577-6f932bfee2ce-snapshots\") pod \"insights-operator-585dfdc468-sfz45\" (UID: \"51eeb544-5d28-4c8c-8577-6f932bfee2ce\") " pod="openshift-insights/insights-operator-585dfdc468-sfz45" Apr 17 09:22:56.387917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.387557 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51eeb544-5d28-4c8c-8577-6f932bfee2ce-service-ca-bundle\") pod \"insights-operator-585dfdc468-sfz45\" (UID: \"51eeb544-5d28-4c8c-8577-6f932bfee2ce\") " pod="openshift-insights/insights-operator-585dfdc468-sfz45" Apr 17 09:22:56.388815 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.388789 
2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51eeb544-5d28-4c8c-8577-6f932bfee2ce-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-sfz45\" (UID: \"51eeb544-5d28-4c8c-8577-6f932bfee2ce\") " pod="openshift-insights/insights-operator-585dfdc468-sfz45" Apr 17 09:22:56.389690 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.389667 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51eeb544-5d28-4c8c-8577-6f932bfee2ce-serving-cert\") pod \"insights-operator-585dfdc468-sfz45\" (UID: \"51eeb544-5d28-4c8c-8577-6f932bfee2ce\") " pod="openshift-insights/insights-operator-585dfdc468-sfz45" Apr 17 09:22:56.395850 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.395826 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qjvt\" (UniqueName: \"kubernetes.io/projected/51eeb544-5d28-4c8c-8577-6f932bfee2ce-kube-api-access-6qjvt\") pod \"insights-operator-585dfdc468-sfz45\" (UID: \"51eeb544-5d28-4c8c-8577-6f932bfee2ce\") " pod="openshift-insights/insights-operator-585dfdc468-sfz45" Apr 17 09:22:56.409006 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.408983 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zjszk"] Apr 17 09:22:56.411938 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:22:56.411909 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32be7a79_e2e8_447f_9b7b_731ca24adef9.slice/crio-a64e2887a27b19809aa99966f2e46cd7aceae453accf8aaf8001018ba1f92909 WatchSource:0}: Error finding container a64e2887a27b19809aa99966f2e46cd7aceae453accf8aaf8001018ba1f92909: Status 404 returned error can't find the container with id a64e2887a27b19809aa99966f2e46cd7aceae453accf8aaf8001018ba1f92909 Apr 17 09:22:56.426399 
ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.426378 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-gmmqr"] Apr 17 09:22:56.428848 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:22:56.428827 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod824a0058_01c3_4126_965f_c9f5a5d55e99.slice/crio-73fdf52dcd6066b5239d2bc794ccf70a39ebc51b49d1e920dce669d75d458cdb WatchSource:0}: Error finding container 73fdf52dcd6066b5239d2bc794ccf70a39ebc51b49d1e920dce669d75d458cdb: Status 404 returned error can't find the container with id 73fdf52dcd6066b5239d2bc794ccf70a39ebc51b49d1e920dce669d75d458cdb Apr 17 09:22:56.446221 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.446203 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-sfz45" Apr 17 09:22:56.559700 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.559666 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-sfz45"] Apr 17 09:22:56.562740 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:22:56.562707 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51eeb544_5d28_4c8c_8577_6f932bfee2ce.slice/crio-8b57f13ec64b066e7c31b6115b80be6b05e6a04f518b20f56c96d0fa91a8b48e WatchSource:0}: Error finding container 8b57f13ec64b066e7c31b6115b80be6b05e6a04f518b20f56c96d0fa91a8b48e: Status 404 returned error can't find the container with id 8b57f13ec64b066e7c31b6115b80be6b05e6a04f518b20f56c96d0fa91a8b48e Apr 17 09:22:56.689999 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.689958 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-service-ca-bundle\") pod \"router-default-59cd84dcb8-fhdx4\" (UID: \"74b4cdd2-7175-4d47-9486-0863bdb1bdb2\") " pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:22:56.689999 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.689998 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-metrics-certs\") pod \"router-default-59cd84dcb8-fhdx4\" (UID: \"74b4cdd2-7175-4d47-9486-0863bdb1bdb2\") " pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:22:56.690201 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:56.690137 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-service-ca-bundle podName:74b4cdd2-7175-4d47-9486-0863bdb1bdb2 nodeName:}" failed. No retries permitted until 2026-04-17 09:22:57.690116477 +0000 UTC m=+122.286168367 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-service-ca-bundle") pod "router-default-59cd84dcb8-fhdx4" (UID: "74b4cdd2-7175-4d47-9486-0863bdb1bdb2") : configmap references non-existent config key: service-ca.crt Apr 17 09:22:56.690201 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:56.690137 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 09:22:56.690201 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:56.690199 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-metrics-certs podName:74b4cdd2-7175-4d47-9486-0863bdb1bdb2 nodeName:}" failed. No retries permitted until 2026-04-17 09:22:57.69016683 +0000 UTC m=+122.286218701 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-metrics-certs") pod "router-default-59cd84dcb8-fhdx4" (UID: "74b4cdd2-7175-4d47-9486-0863bdb1bdb2") : secret "router-metrics-certs-default" not found Apr 17 09:22:56.791104 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:56.791016 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c68efb67-a1eb-4e0b-9af1-47c6e61f4d10-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4c5bp\" (UID: \"c68efb67-a1eb-4e0b-9af1-47c6e61f4d10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c5bp" Apr 17 09:22:56.791249 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:56.791160 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 09:22:56.791249 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:56.791237 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c68efb67-a1eb-4e0b-9af1-47c6e61f4d10-networking-console-plugin-cert podName:c68efb67-a1eb-4e0b-9af1-47c6e61f4d10 nodeName:}" failed. No retries permitted until 2026-04-17 09:22:57.791221622 +0000 UTC m=+122.387273495 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c68efb67-a1eb-4e0b-9af1-47c6e61f4d10-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4c5bp" (UID: "c68efb67-a1eb-4e0b-9af1-47c6e61f4d10") : secret "networking-console-plugin-cert" not found Apr 17 09:22:57.320823 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:57.320787 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zjszk" event={"ID":"32be7a79-e2e8-447f-9b7b-731ca24adef9","Type":"ContainerStarted","Data":"a64e2887a27b19809aa99966f2e46cd7aceae453accf8aaf8001018ba1f92909"} Apr 17 09:22:57.321945 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:57.321915 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-sfz45" event={"ID":"51eeb544-5d28-4c8c-8577-6f932bfee2ce","Type":"ContainerStarted","Data":"8b57f13ec64b066e7c31b6115b80be6b05e6a04f518b20f56c96d0fa91a8b48e"} Apr 17 09:22:57.323045 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:57.323002 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-gmmqr" event={"ID":"824a0058-01c3-4126-965f-c9f5a5d55e99","Type":"ContainerStarted","Data":"73fdf52dcd6066b5239d2bc794ccf70a39ebc51b49d1e920dce669d75d458cdb"} Apr 17 09:22:57.699978 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:57.699941 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-service-ca-bundle\") pod \"router-default-59cd84dcb8-fhdx4\" (UID: \"74b4cdd2-7175-4d47-9486-0863bdb1bdb2\") " pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:22:57.700153 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:57.699988 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-metrics-certs\") pod \"router-default-59cd84dcb8-fhdx4\" (UID: \"74b4cdd2-7175-4d47-9486-0863bdb1bdb2\") " pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:22:57.700153 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:57.700132 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-service-ca-bundle podName:74b4cdd2-7175-4d47-9486-0863bdb1bdb2 nodeName:}" failed. No retries permitted until 2026-04-17 09:22:59.70010952 +0000 UTC m=+124.296161406 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-service-ca-bundle") pod "router-default-59cd84dcb8-fhdx4" (UID: "74b4cdd2-7175-4d47-9486-0863bdb1bdb2") : configmap references non-existent config key: service-ca.crt Apr 17 09:22:57.700303 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:57.700188 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 09:22:57.700303 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:57.700255 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-metrics-certs podName:74b4cdd2-7175-4d47-9486-0863bdb1bdb2 nodeName:}" failed. No retries permitted until 2026-04-17 09:22:59.70023793 +0000 UTC m=+124.296289814 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-metrics-certs") pod "router-default-59cd84dcb8-fhdx4" (UID: "74b4cdd2-7175-4d47-9486-0863bdb1bdb2") : secret "router-metrics-certs-default" not found Apr 17 09:22:57.801478 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:57.801438 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c68efb67-a1eb-4e0b-9af1-47c6e61f4d10-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4c5bp\" (UID: \"c68efb67-a1eb-4e0b-9af1-47c6e61f4d10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c5bp" Apr 17 09:22:57.801658 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:57.801560 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 09:22:57.801658 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:57.801631 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c68efb67-a1eb-4e0b-9af1-47c6e61f4d10-networking-console-plugin-cert podName:c68efb67-a1eb-4e0b-9af1-47c6e61f4d10 nodeName:}" failed. No retries permitted until 2026-04-17 09:22:59.80161273 +0000 UTC m=+124.397664613 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c68efb67-a1eb-4e0b-9af1-47c6e61f4d10-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4c5bp" (UID: "c68efb67-a1eb-4e0b-9af1-47c6e61f4d10") : secret "networking-console-plugin-cert" not found Apr 17 09:22:59.329109 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:59.329018 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-sfz45" event={"ID":"51eeb544-5d28-4c8c-8577-6f932bfee2ce","Type":"ContainerStarted","Data":"7ba8277748fb166a5f3eaef23b020670b13d5f262e7b8bcb48a70f2c7ba7ee87"} Apr 17 09:22:59.330440 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:59.330413 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-gmmqr" event={"ID":"824a0058-01c3-4126-965f-c9f5a5d55e99","Type":"ContainerStarted","Data":"b80419a9144e6874815284e9e47349e6eddca1fd6859dce4dfa58c0ee59cafa8"} Apr 17 09:22:59.331612 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:59.331589 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zjszk" event={"ID":"32be7a79-e2e8-447f-9b7b-731ca24adef9","Type":"ContainerStarted","Data":"85d15de1c0a3fed9d25f47d722deae33878114b2a32c4ea5bb4e1a34fd45f2b5"} Apr 17 09:22:59.345752 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:59.345704 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-sfz45" podStartSLOduration=0.859164405 podStartE2EDuration="3.345691501s" podCreationTimestamp="2026-04-17 09:22:56 +0000 UTC" firstStartedPulling="2026-04-17 09:22:56.564552576 +0000 UTC m=+121.160604451" lastFinishedPulling="2026-04-17 09:22:59.051079667 +0000 UTC m=+123.647131547" observedRunningTime="2026-04-17 09:22:59.344631853 +0000 UTC m=+123.940683749" 
watchObservedRunningTime="2026-04-17 09:22:59.345691501 +0000 UTC m=+123.941743396" Apr 17 09:22:59.359230 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:59.359168 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-gmmqr" podStartSLOduration=1.741801675 podStartE2EDuration="4.359152513s" podCreationTimestamp="2026-04-17 09:22:55 +0000 UTC" firstStartedPulling="2026-04-17 09:22:56.430738933 +0000 UTC m=+121.026790808" lastFinishedPulling="2026-04-17 09:22:59.048089769 +0000 UTC m=+123.644141646" observedRunningTime="2026-04-17 09:22:59.358961559 +0000 UTC m=+123.955013454" watchObservedRunningTime="2026-04-17 09:22:59.359152513 +0000 UTC m=+123.955204408" Apr 17 09:22:59.379236 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:59.379196 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zjszk" podStartSLOduration=1.748053998 podStartE2EDuration="4.379158324s" podCreationTimestamp="2026-04-17 09:22:55 +0000 UTC" firstStartedPulling="2026-04-17 09:22:56.413762669 +0000 UTC m=+121.009814546" lastFinishedPulling="2026-04-17 09:22:59.044866993 +0000 UTC m=+123.640918872" observedRunningTime="2026-04-17 09:22:59.378445046 +0000 UTC m=+123.974496941" watchObservedRunningTime="2026-04-17 09:22:59.379158324 +0000 UTC m=+123.975210217" Apr 17 09:22:59.716456 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:59.716419 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-service-ca-bundle\") pod \"router-default-59cd84dcb8-fhdx4\" (UID: \"74b4cdd2-7175-4d47-9486-0863bdb1bdb2\") " pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:22:59.716456 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:59.716459 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-metrics-certs\") pod \"router-default-59cd84dcb8-fhdx4\" (UID: \"74b4cdd2-7175-4d47-9486-0863bdb1bdb2\") " pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:22:59.716704 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:59.716585 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 09:22:59.716704 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:59.716600 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-service-ca-bundle podName:74b4cdd2-7175-4d47-9486-0863bdb1bdb2 nodeName:}" failed. No retries permitted until 2026-04-17 09:23:03.716578486 +0000 UTC m=+128.312630371 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-service-ca-bundle") pod "router-default-59cd84dcb8-fhdx4" (UID: "74b4cdd2-7175-4d47-9486-0863bdb1bdb2") : configmap references non-existent config key: service-ca.crt Apr 17 09:22:59.716704 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:59.716622 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-metrics-certs podName:74b4cdd2-7175-4d47-9486-0863bdb1bdb2 nodeName:}" failed. No retries permitted until 2026-04-17 09:23:03.716612071 +0000 UTC m=+128.312663942 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-metrics-certs") pod "router-default-59cd84dcb8-fhdx4" (UID: "74b4cdd2-7175-4d47-9486-0863bdb1bdb2") : secret "router-metrics-certs-default" not found Apr 17 09:22:59.817828 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:59.817795 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c68efb67-a1eb-4e0b-9af1-47c6e61f4d10-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4c5bp\" (UID: \"c68efb67-a1eb-4e0b-9af1-47c6e61f4d10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c5bp" Apr 17 09:22:59.817981 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:59.817952 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 09:22:59.818031 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:22:59.818020 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c68efb67-a1eb-4e0b-9af1-47c6e61f4d10-networking-console-plugin-cert podName:c68efb67-a1eb-4e0b-9af1-47c6e61f4d10 nodeName:}" failed. No retries permitted until 2026-04-17 09:23:03.817999453 +0000 UTC m=+128.414051325 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c68efb67-a1eb-4e0b-9af1-47c6e61f4d10-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4c5bp" (UID: "c68efb67-a1eb-4e0b-9af1-47c6e61f4d10") : secret "networking-console-plugin-cert" not found Apr 17 09:22:59.974399 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:59.974330 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-f9crx"] Apr 17 09:22:59.977491 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:59.977457 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-f9crx" Apr 17 09:22:59.980281 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:59.980260 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 17 09:22:59.980423 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:59.980407 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-77plf\"" Apr 17 09:22:59.980468 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:59.980459 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 17 09:22:59.986711 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:22:59.986691 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-f9crx"] Apr 17 09:23:00.119261 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:00.119231 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnkrb\" (UniqueName: 
\"kubernetes.io/projected/5b0284d6-e5f1-458f-a124-d9b4696c61af-kube-api-access-pnkrb\") pod \"migrator-74bb7799d9-f9crx\" (UID: \"5b0284d6-e5f1-458f-a124-d9b4696c61af\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-f9crx" Apr 17 09:23:00.219598 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:00.219571 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pnkrb\" (UniqueName: \"kubernetes.io/projected/5b0284d6-e5f1-458f-a124-d9b4696c61af-kube-api-access-pnkrb\") pod \"migrator-74bb7799d9-f9crx\" (UID: \"5b0284d6-e5f1-458f-a124-d9b4696c61af\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-f9crx" Apr 17 09:23:00.227806 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:00.227743 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnkrb\" (UniqueName: \"kubernetes.io/projected/5b0284d6-e5f1-458f-a124-d9b4696c61af-kube-api-access-pnkrb\") pod \"migrator-74bb7799d9-f9crx\" (UID: \"5b0284d6-e5f1-458f-a124-d9b4696c61af\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-f9crx" Apr 17 09:23:00.287023 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:00.287002 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-f9crx" Apr 17 09:23:00.400278 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:00.400242 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-f9crx"] Apr 17 09:23:00.404019 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:23:00.403989 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b0284d6_e5f1_458f_a124_d9b4696c61af.slice/crio-8a39e7aaa9cfdcf964bb6dc8336a39044a58731ba56299459733a2a221d25177 WatchSource:0}: Error finding container 8a39e7aaa9cfdcf964bb6dc8336a39044a58731ba56299459733a2a221d25177: Status 404 returned error can't find the container with id 8a39e7aaa9cfdcf964bb6dc8336a39044a58731ba56299459733a2a221d25177 Apr 17 09:23:01.343048 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:01.342999 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-f9crx" event={"ID":"5b0284d6-e5f1-458f-a124-d9b4696c61af","Type":"ContainerStarted","Data":"8a39e7aaa9cfdcf964bb6dc8336a39044a58731ba56299459733a2a221d25177"} Apr 17 09:23:02.277066 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:02.277033 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-44k6p"] Apr 17 09:23:02.280023 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:02.280007 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-44k6p" Apr 17 09:23:02.282714 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:02.282686 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 17 09:23:02.282823 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:02.282693 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 17 09:23:02.284020 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:02.283991 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 17 09:23:02.284020 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:02.284013 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-pv5qp\"" Apr 17 09:23:02.284119 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:02.284084 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 17 09:23:02.286771 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:02.286533 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-44k6p"] Apr 17 09:23:02.347020 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:02.346990 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-f9crx" event={"ID":"5b0284d6-e5f1-458f-a124-d9b4696c61af","Type":"ContainerStarted","Data":"87732d6b9201ef4e35c070862bc18cb98dd6c1ed0210f7a78b5bf11fa028bd28"} Apr 17 09:23:02.347148 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:02.347026 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-f9crx" 
event={"ID":"5b0284d6-e5f1-458f-a124-d9b4696c61af","Type":"ContainerStarted","Data":"b55a7d4e303c5b960ef05933f5e4ce10391d9d14b817c82ba365556c2a5fff3d"} Apr 17 09:23:02.361831 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:02.361790 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-f9crx" podStartSLOduration=2.23867051 podStartE2EDuration="3.361777768s" podCreationTimestamp="2026-04-17 09:22:59 +0000 UTC" firstStartedPulling="2026-04-17 09:23:00.40592402 +0000 UTC m=+125.001975905" lastFinishedPulling="2026-04-17 09:23:01.529031291 +0000 UTC m=+126.125083163" observedRunningTime="2026-04-17 09:23:02.360910236 +0000 UTC m=+126.956962129" watchObservedRunningTime="2026-04-17 09:23:02.361777768 +0000 UTC m=+126.957829662" Apr 17 09:23:02.438603 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:02.438574 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/35dfea14-84cc-4879-ad5e-6c2cc44d00de-signing-key\") pod \"service-ca-865cb79987-44k6p\" (UID: \"35dfea14-84cc-4879-ad5e-6c2cc44d00de\") " pod="openshift-service-ca/service-ca-865cb79987-44k6p" Apr 17 09:23:02.438718 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:02.438656 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxq9b\" (UniqueName: \"kubernetes.io/projected/35dfea14-84cc-4879-ad5e-6c2cc44d00de-kube-api-access-rxq9b\") pod \"service-ca-865cb79987-44k6p\" (UID: \"35dfea14-84cc-4879-ad5e-6c2cc44d00de\") " pod="openshift-service-ca/service-ca-865cb79987-44k6p" Apr 17 09:23:02.438718 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:02.438710 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/35dfea14-84cc-4879-ad5e-6c2cc44d00de-signing-cabundle\") pod 
\"service-ca-865cb79987-44k6p\" (UID: \"35dfea14-84cc-4879-ad5e-6c2cc44d00de\") " pod="openshift-service-ca/service-ca-865cb79987-44k6p" Apr 17 09:23:02.539926 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:02.539849 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/35dfea14-84cc-4879-ad5e-6c2cc44d00de-signing-cabundle\") pod \"service-ca-865cb79987-44k6p\" (UID: \"35dfea14-84cc-4879-ad5e-6c2cc44d00de\") " pod="openshift-service-ca/service-ca-865cb79987-44k6p" Apr 17 09:23:02.539926 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:02.539900 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/35dfea14-84cc-4879-ad5e-6c2cc44d00de-signing-key\") pod \"service-ca-865cb79987-44k6p\" (UID: \"35dfea14-84cc-4879-ad5e-6c2cc44d00de\") " pod="openshift-service-ca/service-ca-865cb79987-44k6p" Apr 17 09:23:02.540096 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:02.540073 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxq9b\" (UniqueName: \"kubernetes.io/projected/35dfea14-84cc-4879-ad5e-6c2cc44d00de-kube-api-access-rxq9b\") pod \"service-ca-865cb79987-44k6p\" (UID: \"35dfea14-84cc-4879-ad5e-6c2cc44d00de\") " pod="openshift-service-ca/service-ca-865cb79987-44k6p" Apr 17 09:23:02.541091 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:02.541074 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/35dfea14-84cc-4879-ad5e-6c2cc44d00de-signing-cabundle\") pod \"service-ca-865cb79987-44k6p\" (UID: \"35dfea14-84cc-4879-ad5e-6c2cc44d00de\") " pod="openshift-service-ca/service-ca-865cb79987-44k6p" Apr 17 09:23:02.542281 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:02.542266 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/35dfea14-84cc-4879-ad5e-6c2cc44d00de-signing-key\") pod \"service-ca-865cb79987-44k6p\" (UID: \"35dfea14-84cc-4879-ad5e-6c2cc44d00de\") " pod="openshift-service-ca/service-ca-865cb79987-44k6p" Apr 17 09:23:02.548111 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:02.548087 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxq9b\" (UniqueName: \"kubernetes.io/projected/35dfea14-84cc-4879-ad5e-6c2cc44d00de-kube-api-access-rxq9b\") pod \"service-ca-865cb79987-44k6p\" (UID: \"35dfea14-84cc-4879-ad5e-6c2cc44d00de\") " pod="openshift-service-ca/service-ca-865cb79987-44k6p" Apr 17 09:23:02.589080 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:02.589056 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-44k6p" Apr 17 09:23:02.713187 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:02.713153 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-44k6p"] Apr 17 09:23:02.716478 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:23:02.716454 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35dfea14_84cc_4879_ad5e_6c2cc44d00de.slice/crio-31351482338f9b559e30017a32bbd3b117e61547398b2caa41bc60d390d27a01 WatchSource:0}: Error finding container 31351482338f9b559e30017a32bbd3b117e61547398b2caa41bc60d390d27a01: Status 404 returned error can't find the container with id 31351482338f9b559e30017a32bbd3b117e61547398b2caa41bc60d390d27a01 Apr 17 09:23:03.351871 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:03.351829 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-44k6p" event={"ID":"35dfea14-84cc-4879-ad5e-6c2cc44d00de","Type":"ContainerStarted","Data":"e2a73aae57316b9e31bdcbd9651aba17f234010cb364c702c836e874633d17bd"} Apr 17 09:23:03.352238 ip-10-0-138-237 
kubenswrapper[2574]: I0417 09:23:03.351878 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-44k6p" event={"ID":"35dfea14-84cc-4879-ad5e-6c2cc44d00de","Type":"ContainerStarted","Data":"31351482338f9b559e30017a32bbd3b117e61547398b2caa41bc60d390d27a01"} Apr 17 09:23:03.369450 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:03.369399 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-44k6p" podStartSLOduration=1.369383826 podStartE2EDuration="1.369383826s" podCreationTimestamp="2026-04-17 09:23:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:23:03.368928369 +0000 UTC m=+127.964980263" watchObservedRunningTime="2026-04-17 09:23:03.369383826 +0000 UTC m=+127.965435721" Apr 17 09:23:03.604939 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:03.604874 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-g5ls5_0755186c-8ac0-47fe-abc7-dd4eae84ad55/dns-node-resolver/0.log" Apr 17 09:23:03.752284 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:03.752251 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-service-ca-bundle\") pod \"router-default-59cd84dcb8-fhdx4\" (UID: \"74b4cdd2-7175-4d47-9486-0863bdb1bdb2\") " pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:23:03.752440 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:03.752330 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-metrics-certs\") pod \"router-default-59cd84dcb8-fhdx4\" (UID: \"74b4cdd2-7175-4d47-9486-0863bdb1bdb2\") " pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 
09:23:03.752506 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:23:03.752444 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-service-ca-bundle podName:74b4cdd2-7175-4d47-9486-0863bdb1bdb2 nodeName:}" failed. No retries permitted until 2026-04-17 09:23:11.752420692 +0000 UTC m=+136.348472568 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-service-ca-bundle") pod "router-default-59cd84dcb8-fhdx4" (UID: "74b4cdd2-7175-4d47-9486-0863bdb1bdb2") : configmap references non-existent config key: service-ca.crt Apr 17 09:23:03.752506 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:23:03.752453 2574 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 09:23:03.752595 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:23:03.752509 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-metrics-certs podName:74b4cdd2-7175-4d47-9486-0863bdb1bdb2 nodeName:}" failed. No retries permitted until 2026-04-17 09:23:11.752491972 +0000 UTC m=+136.348543851 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-metrics-certs") pod "router-default-59cd84dcb8-fhdx4" (UID: "74b4cdd2-7175-4d47-9486-0863bdb1bdb2") : secret "router-metrics-certs-default" not found Apr 17 09:23:03.853063 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:03.853037 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c68efb67-a1eb-4e0b-9af1-47c6e61f4d10-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4c5bp\" (UID: \"c68efb67-a1eb-4e0b-9af1-47c6e61f4d10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c5bp" Apr 17 09:23:03.853219 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:23:03.853132 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 09:23:03.853219 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:23:03.853205 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c68efb67-a1eb-4e0b-9af1-47c6e61f4d10-networking-console-plugin-cert podName:c68efb67-a1eb-4e0b-9af1-47c6e61f4d10 nodeName:}" failed. No retries permitted until 2026-04-17 09:23:11.85316817 +0000 UTC m=+136.449220041 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c68efb67-a1eb-4e0b-9af1-47c6e61f4d10-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4c5bp" (UID: "c68efb67-a1eb-4e0b-9af1-47c6e61f4d10") : secret "networking-console-plugin-cert" not found Apr 17 09:23:04.205402 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:04.205368 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-l7tg5_7a3125fc-e8c4-420c-8d7b-684643355422/node-ca/0.log" Apr 17 09:23:05.606102 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:05.606069 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-f9crx_5b0284d6-e5f1-458f-a124-d9b4696c61af/migrator/0.log" Apr 17 09:23:05.668770 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:05.668742 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs\") pod \"network-metrics-daemon-22cz6\" (UID: \"fba6f7ca-a68b-4315-91fd-d249cb9d13d1\") " pod="openshift-multus/network-metrics-daemon-22cz6" Apr 17 09:23:05.668923 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:23:05.668873 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 09:23:05.668981 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:23:05.668926 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs podName:fba6f7ca-a68b-4315-91fd-d249cb9d13d1 nodeName:}" failed. No retries permitted until 2026-04-17 09:25:07.668912656 +0000 UTC m=+252.264964532 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs") pod "network-metrics-daemon-22cz6" (UID: "fba6f7ca-a68b-4315-91fd-d249cb9d13d1") : secret "metrics-daemon-secret" not found Apr 17 09:23:05.806024 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:05.806001 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-f9crx_5b0284d6-e5f1-458f-a124-d9b4696c61af/graceful-termination/0.log" Apr 17 09:23:11.821394 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:11.821359 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-service-ca-bundle\") pod \"router-default-59cd84dcb8-fhdx4\" (UID: \"74b4cdd2-7175-4d47-9486-0863bdb1bdb2\") " pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:23:11.821808 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:11.821435 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-metrics-certs\") pod \"router-default-59cd84dcb8-fhdx4\" (UID: \"74b4cdd2-7175-4d47-9486-0863bdb1bdb2\") " pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:23:11.821945 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:11.821923 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-service-ca-bundle\") pod \"router-default-59cd84dcb8-fhdx4\" (UID: \"74b4cdd2-7175-4d47-9486-0863bdb1bdb2\") " pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:23:11.823652 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:11.823628 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/74b4cdd2-7175-4d47-9486-0863bdb1bdb2-metrics-certs\") pod \"router-default-59cd84dcb8-fhdx4\" (UID: \"74b4cdd2-7175-4d47-9486-0863bdb1bdb2\") " pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:23:11.894034 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:11.893998 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" Apr 17 09:23:11.921929 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:11.921894 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c68efb67-a1eb-4e0b-9af1-47c6e61f4d10-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4c5bp\" (UID: \"c68efb67-a1eb-4e0b-9af1-47c6e61f4d10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c5bp" Apr 17 09:23:11.922072 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:23:11.922034 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 09:23:11.922111 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:23:11.922094 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c68efb67-a1eb-4e0b-9af1-47c6e61f4d10-networking-console-plugin-cert podName:c68efb67-a1eb-4e0b-9af1-47c6e61f4d10 nodeName:}" failed. No retries permitted until 2026-04-17 09:23:27.922076867 +0000 UTC m=+152.518128749 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c68efb67-a1eb-4e0b-9af1-47c6e61f4d10-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4c5bp" (UID: "c68efb67-a1eb-4e0b-9af1-47c6e61f4d10") : secret "networking-console-plugin-cert" not found
Apr 17 09:23:12.010184 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:12.010120 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-59cd84dcb8-fhdx4"]
Apr 17 09:23:12.013959 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:23:12.013934 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74b4cdd2_7175_4d47_9486_0863bdb1bdb2.slice/crio-c03591778a295b133bf0bed1824a6b9edc50a0550a0990c5bd2dce2e947c9dc4 WatchSource:0}: Error finding container c03591778a295b133bf0bed1824a6b9edc50a0550a0990c5bd2dce2e947c9dc4: Status 404 returned error can't find the container with id c03591778a295b133bf0bed1824a6b9edc50a0550a0990c5bd2dce2e947c9dc4
Apr 17 09:23:12.376720 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:12.376685 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" event={"ID":"74b4cdd2-7175-4d47-9486-0863bdb1bdb2","Type":"ContainerStarted","Data":"f217ddca4630677896ebd0149826a2fd47fdae4a220285d993216fef1b8fd776"}
Apr 17 09:23:12.376720 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:12.376721 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" event={"ID":"74b4cdd2-7175-4d47-9486-0863bdb1bdb2","Type":"ContainerStarted","Data":"c03591778a295b133bf0bed1824a6b9edc50a0550a0990c5bd2dce2e947c9dc4"}
Apr 17 09:23:12.393731 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:12.393680 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-59cd84dcb8-fhdx4" podStartSLOduration=17.393667921 podStartE2EDuration="17.393667921s" podCreationTimestamp="2026-04-17 09:22:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:23:12.393498723 +0000 UTC m=+136.989550617" watchObservedRunningTime="2026-04-17 09:23:12.393667921 +0000 UTC m=+136.989719815"
Apr 17 09:23:12.894373 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:12.894338 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-59cd84dcb8-fhdx4"
Apr 17 09:23:12.896911 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:12.896885 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-59cd84dcb8-fhdx4"
Apr 17 09:23:13.379630 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:13.379600 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-59cd84dcb8-fhdx4"
Apr 17 09:23:13.380791 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:13.380767 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-59cd84dcb8-fhdx4"
Apr 17 09:23:25.120011 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:25.119980 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-d5b4r"]
Apr 17 09:23:25.124414 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:25.124394 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-d5b4r"
Apr 17 09:23:25.127062 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:25.127035 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 09:23:25.128333 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:25.128314 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 09:23:25.128429 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:25.128345 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-fqvbv\""
Apr 17 09:23:25.132146 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:25.132115 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-d5b4r"]
Apr 17 09:23:25.222321 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:25.222292 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a38ec03e-4cba-4d52-80e2-d579913e7f31-data-volume\") pod \"insights-runtime-extractor-d5b4r\" (UID: \"a38ec03e-4cba-4d52-80e2-d579913e7f31\") " pod="openshift-insights/insights-runtime-extractor-d5b4r"
Apr 17 09:23:25.222450 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:25.222325 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m864h\" (UniqueName: \"kubernetes.io/projected/a38ec03e-4cba-4d52-80e2-d579913e7f31-kube-api-access-m864h\") pod \"insights-runtime-extractor-d5b4r\" (UID: \"a38ec03e-4cba-4d52-80e2-d579913e7f31\") " pod="openshift-insights/insights-runtime-extractor-d5b4r"
Apr 17 09:23:25.222450 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:25.222394 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a38ec03e-4cba-4d52-80e2-d579913e7f31-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d5b4r\" (UID: \"a38ec03e-4cba-4d52-80e2-d579913e7f31\") " pod="openshift-insights/insights-runtime-extractor-d5b4r"
Apr 17 09:23:25.222450 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:25.222445 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a38ec03e-4cba-4d52-80e2-d579913e7f31-crio-socket\") pod \"insights-runtime-extractor-d5b4r\" (UID: \"a38ec03e-4cba-4d52-80e2-d579913e7f31\") " pod="openshift-insights/insights-runtime-extractor-d5b4r"
Apr 17 09:23:25.222546 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:25.222475 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a38ec03e-4cba-4d52-80e2-d579913e7f31-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d5b4r\" (UID: \"a38ec03e-4cba-4d52-80e2-d579913e7f31\") " pod="openshift-insights/insights-runtime-extractor-d5b4r"
Apr 17 09:23:25.323602 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:25.323579 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a38ec03e-4cba-4d52-80e2-d579913e7f31-crio-socket\") pod \"insights-runtime-extractor-d5b4r\" (UID: \"a38ec03e-4cba-4d52-80e2-d579913e7f31\") " pod="openshift-insights/insights-runtime-extractor-d5b4r"
Apr 17 09:23:25.323720 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:25.323616 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a38ec03e-4cba-4d52-80e2-d579913e7f31-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d5b4r\" (UID: \"a38ec03e-4cba-4d52-80e2-d579913e7f31\") " pod="openshift-insights/insights-runtime-extractor-d5b4r"
Apr 17 09:23:25.323720 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:25.323668 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a38ec03e-4cba-4d52-80e2-d579913e7f31-data-volume\") pod \"insights-runtime-extractor-d5b4r\" (UID: \"a38ec03e-4cba-4d52-80e2-d579913e7f31\") " pod="openshift-insights/insights-runtime-extractor-d5b4r"
Apr 17 09:23:25.323720 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:25.323701 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m864h\" (UniqueName: \"kubernetes.io/projected/a38ec03e-4cba-4d52-80e2-d579913e7f31-kube-api-access-m864h\") pod \"insights-runtime-extractor-d5b4r\" (UID: \"a38ec03e-4cba-4d52-80e2-d579913e7f31\") " pod="openshift-insights/insights-runtime-extractor-d5b4r"
Apr 17 09:23:25.323858 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:25.323767 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a38ec03e-4cba-4d52-80e2-d579913e7f31-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d5b4r\" (UID: \"a38ec03e-4cba-4d52-80e2-d579913e7f31\") " pod="openshift-insights/insights-runtime-extractor-d5b4r"
Apr 17 09:23:25.323858 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:25.323700 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a38ec03e-4cba-4d52-80e2-d579913e7f31-crio-socket\") pod \"insights-runtime-extractor-d5b4r\" (UID: \"a38ec03e-4cba-4d52-80e2-d579913e7f31\") " pod="openshift-insights/insights-runtime-extractor-d5b4r"
Apr 17 09:23:25.324105 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:25.324087 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a38ec03e-4cba-4d52-80e2-d579913e7f31-data-volume\") pod \"insights-runtime-extractor-d5b4r\" (UID: \"a38ec03e-4cba-4d52-80e2-d579913e7f31\") " pod="openshift-insights/insights-runtime-extractor-d5b4r"
Apr 17 09:23:25.324330 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:25.324309 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a38ec03e-4cba-4d52-80e2-d579913e7f31-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-d5b4r\" (UID: \"a38ec03e-4cba-4d52-80e2-d579913e7f31\") " pod="openshift-insights/insights-runtime-extractor-d5b4r"
Apr 17 09:23:25.326089 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:25.326070 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a38ec03e-4cba-4d52-80e2-d579913e7f31-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-d5b4r\" (UID: \"a38ec03e-4cba-4d52-80e2-d579913e7f31\") " pod="openshift-insights/insights-runtime-extractor-d5b4r"
Apr 17 09:23:25.331775 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:25.331726 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m864h\" (UniqueName: \"kubernetes.io/projected/a38ec03e-4cba-4d52-80e2-d579913e7f31-kube-api-access-m864h\") pod \"insights-runtime-extractor-d5b4r\" (UID: \"a38ec03e-4cba-4d52-80e2-d579913e7f31\") " pod="openshift-insights/insights-runtime-extractor-d5b4r"
Apr 17 09:23:25.433936 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:25.433905 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-d5b4r"
Apr 17 09:23:25.549498 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:25.549473 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-d5b4r"]
Apr 17 09:23:25.552689 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:23:25.552653 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda38ec03e_4cba_4d52_80e2_d579913e7f31.slice/crio-e19a50d210c63f1125fc4c2666c6d49856d01442928649092e48102666c86d32 WatchSource:0}: Error finding container e19a50d210c63f1125fc4c2666c6d49856d01442928649092e48102666c86d32: Status 404 returned error can't find the container with id e19a50d210c63f1125fc4c2666c6d49856d01442928649092e48102666c86d32
Apr 17 09:23:26.417451 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:26.417375 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d5b4r" event={"ID":"a38ec03e-4cba-4d52-80e2-d579913e7f31","Type":"ContainerStarted","Data":"f722ca1ef04b7a58e1e27e8119a858e2690cd80e921763c50a4d0b7ed36bea4f"}
Apr 17 09:23:26.417451 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:26.417414 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d5b4r" event={"ID":"a38ec03e-4cba-4d52-80e2-d579913e7f31","Type":"ContainerStarted","Data":"10162827f0eea05ada90211fa8f7d96bdf4c6bb2c52794d738b7f22e70cb19e0"}
Apr 17 09:23:26.417451 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:26.417424 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d5b4r" event={"ID":"a38ec03e-4cba-4d52-80e2-d579913e7f31","Type":"ContainerStarted","Data":"e19a50d210c63f1125fc4c2666c6d49856d01442928649092e48102666c86d32"}
Apr 17 09:23:27.946058 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:27.946028 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c68efb67-a1eb-4e0b-9af1-47c6e61f4d10-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4c5bp\" (UID: \"c68efb67-a1eb-4e0b-9af1-47c6e61f4d10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c5bp"
Apr 17 09:23:27.948344 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:27.948326 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c68efb67-a1eb-4e0b-9af1-47c6e61f4d10-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4c5bp\" (UID: \"c68efb67-a1eb-4e0b-9af1-47c6e61f4d10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c5bp"
Apr 17 09:23:28.234927 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:28.234855 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c5bp"
Apr 17 09:23:28.344542 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:28.344399 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-4c5bp"]
Apr 17 09:23:28.346973 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:23:28.346945 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc68efb67_a1eb_4e0b_9af1_47c6e61f4d10.slice/crio-5a1f443443240973d7f07915e5ff5cf2035d16a2c258c398c1f3af0f0762ea4d WatchSource:0}: Error finding container 5a1f443443240973d7f07915e5ff5cf2035d16a2c258c398c1f3af0f0762ea4d: Status 404 returned error can't find the container with id 5a1f443443240973d7f07915e5ff5cf2035d16a2c258c398c1f3af0f0762ea4d
Apr 17 09:23:28.423810 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:28.423778 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-d5b4r" event={"ID":"a38ec03e-4cba-4d52-80e2-d579913e7f31","Type":"ContainerStarted","Data":"31fd0d2a3c74430bd26fc4c0b338ca733e884ef54046179420dfbdb040672aa3"}
Apr 17 09:23:28.424778 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:28.424756 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c5bp" event={"ID":"c68efb67-a1eb-4e0b-9af1-47c6e61f4d10","Type":"ContainerStarted","Data":"5a1f443443240973d7f07915e5ff5cf2035d16a2c258c398c1f3af0f0762ea4d"}
Apr 17 09:23:28.441529 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:28.441488 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-d5b4r" podStartSLOduration=1.471667203 podStartE2EDuration="3.441476365s" podCreationTimestamp="2026-04-17 09:23:25 +0000 UTC" firstStartedPulling="2026-04-17 09:23:25.606886824 +0000 UTC m=+150.202938696" lastFinishedPulling="2026-04-17 09:23:27.57669597 +0000 UTC m=+152.172747858" observedRunningTime="2026-04-17 09:23:28.440258515 +0000 UTC m=+153.036310408" watchObservedRunningTime="2026-04-17 09:23:28.441476365 +0000 UTC m=+153.037528261"
Apr 17 09:23:29.428238 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:29.428209 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c5bp" event={"ID":"c68efb67-a1eb-4e0b-9af1-47c6e61f4d10","Type":"ContainerStarted","Data":"cc8f154b2a4f0338e62ea2baf050cda3456ed94f28fef5dfe1ba25704ff51b87"}
Apr 17 09:23:29.445390 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:29.445349 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4c5bp" podStartSLOduration=32.452045798 podStartE2EDuration="33.445335381s" podCreationTimestamp="2026-04-17 09:22:56 +0000 UTC" firstStartedPulling="2026-04-17 09:23:28.348985464 +0000 UTC m=+152.945037353" lastFinishedPulling="2026-04-17 09:23:29.342275065 +0000 UTC m=+153.938326936" observedRunningTime="2026-04-17 09:23:29.444743216 +0000 UTC m=+154.040795111" watchObservedRunningTime="2026-04-17 09:23:29.445335381 +0000 UTC m=+154.041387297"
Apr 17 09:23:31.308450 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:23:31.305944 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-zkvsl" podUID="7281a4b6-76d6-494b-98e2-8fd1f322c7de"
Apr 17 09:23:31.318334 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:23:31.318305 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-v2rkj" podUID="2721fe42-279c-4536-9769-411e4918ceac"
Apr 17 09:23:31.433111 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:31.433088 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zkvsl"
Apr 17 09:23:32.983039 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:23:32.983000 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-22cz6" podUID="fba6f7ca-a68b-4315-91fd-d249cb9d13d1"
Apr 17 09:23:34.523212 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:34.523132 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-nnbft"]
Apr 17 09:23:34.525266 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:34.525251 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-nnbft"
Apr 17 09:23:34.527918 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:34.527896 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-7w559\""
Apr 17 09:23:34.527918 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:34.527910 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 17 09:23:34.528089 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:34.527960 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 09:23:34.529372 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:34.529343 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 17 09:23:34.529482 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:34.529408 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 09:23:34.529482 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:34.529444 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 09:23:34.533820 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:34.533799 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-nnbft"]
Apr 17 09:23:34.595535 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:34.595511 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/976b87df-daa5-4000-84e8-1d40b45adac8-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-nnbft\" (UID: \"976b87df-daa5-4000-84e8-1d40b45adac8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-nnbft"
Apr 17 09:23:34.595620 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:34.595575 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/976b87df-daa5-4000-84e8-1d40b45adac8-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-nnbft\" (UID: \"976b87df-daa5-4000-84e8-1d40b45adac8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-nnbft"
Apr 17 09:23:34.595660 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:34.595625 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/976b87df-daa5-4000-84e8-1d40b45adac8-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-nnbft\" (UID: \"976b87df-daa5-4000-84e8-1d40b45adac8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-nnbft"
Apr 17 09:23:34.595695 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:34.595673 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q992t\" (UniqueName: \"kubernetes.io/projected/976b87df-daa5-4000-84e8-1d40b45adac8-kube-api-access-q992t\") pod \"prometheus-operator-5676c8c784-nnbft\" (UID: \"976b87df-daa5-4000-84e8-1d40b45adac8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-nnbft"
Apr 17 09:23:34.696726 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:34.696705 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/976b87df-daa5-4000-84e8-1d40b45adac8-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-nnbft\" (UID: \"976b87df-daa5-4000-84e8-1d40b45adac8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-nnbft"
Apr 17 09:23:34.696776 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:34.696749 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/976b87df-daa5-4000-84e8-1d40b45adac8-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-nnbft\" (UID: \"976b87df-daa5-4000-84e8-1d40b45adac8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-nnbft"
Apr 17 09:23:34.696815 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:34.696790 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/976b87df-daa5-4000-84e8-1d40b45adac8-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-nnbft\" (UID: \"976b87df-daa5-4000-84e8-1d40b45adac8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-nnbft"
Apr 17 09:23:34.696888 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:23:34.696870 2574 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 17 09:23:34.696956 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:23:34.696945 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/976b87df-daa5-4000-84e8-1d40b45adac8-prometheus-operator-tls podName:976b87df-daa5-4000-84e8-1d40b45adac8 nodeName:}" failed. No retries permitted until 2026-04-17 09:23:35.196925223 +0000 UTC m=+159.792977103 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/976b87df-daa5-4000-84e8-1d40b45adac8-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-nnbft" (UID: "976b87df-daa5-4000-84e8-1d40b45adac8") : secret "prometheus-operator-tls" not found
Apr 17 09:23:34.697041 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:34.697001 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q992t\" (UniqueName: \"kubernetes.io/projected/976b87df-daa5-4000-84e8-1d40b45adac8-kube-api-access-q992t\") pod \"prometheus-operator-5676c8c784-nnbft\" (UID: \"976b87df-daa5-4000-84e8-1d40b45adac8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-nnbft"
Apr 17 09:23:34.697465 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:34.697439 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/976b87df-daa5-4000-84e8-1d40b45adac8-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-nnbft\" (UID: \"976b87df-daa5-4000-84e8-1d40b45adac8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-nnbft"
Apr 17 09:23:34.699198 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:34.699162 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/976b87df-daa5-4000-84e8-1d40b45adac8-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-nnbft\" (UID: \"976b87df-daa5-4000-84e8-1d40b45adac8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-nnbft"
Apr 17 09:23:34.705637 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:34.705614 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q992t\" (UniqueName: \"kubernetes.io/projected/976b87df-daa5-4000-84e8-1d40b45adac8-kube-api-access-q992t\") pod \"prometheus-operator-5676c8c784-nnbft\" (UID: \"976b87df-daa5-4000-84e8-1d40b45adac8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-nnbft"
Apr 17 09:23:35.200452 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:35.200414 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/976b87df-daa5-4000-84e8-1d40b45adac8-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-nnbft\" (UID: \"976b87df-daa5-4000-84e8-1d40b45adac8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-nnbft"
Apr 17 09:23:35.202736 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:35.202715 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/976b87df-daa5-4000-84e8-1d40b45adac8-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-nnbft\" (UID: \"976b87df-daa5-4000-84e8-1d40b45adac8\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-nnbft"
Apr 17 09:23:35.434263 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:35.434239 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-nnbft"
Apr 17 09:23:35.565058 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:35.565030 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-nnbft"]
Apr 17 09:23:35.567694 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:23:35.567667 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod976b87df_daa5_4000_84e8_1d40b45adac8.slice/crio-625743997cf8f399fe0854a9126177573f9294837182465aefe9469ca4827071 WatchSource:0}: Error finding container 625743997cf8f399fe0854a9126177573f9294837182465aefe9469ca4827071: Status 404 returned error can't find the container with id 625743997cf8f399fe0854a9126177573f9294837182465aefe9469ca4827071
Apr 17 09:23:36.309430 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:36.309397 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls\") pod \"dns-default-zkvsl\" (UID: \"7281a4b6-76d6-494b-98e2-8fd1f322c7de\") " pod="openshift-dns/dns-default-zkvsl"
Apr 17 09:23:36.309611 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:36.309455 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert\") pod \"ingress-canary-v2rkj\" (UID: \"2721fe42-279c-4536-9769-411e4918ceac\") " pod="openshift-ingress-canary/ingress-canary-v2rkj"
Apr 17 09:23:36.311945 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:36.311917 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7281a4b6-76d6-494b-98e2-8fd1f322c7de-metrics-tls\") pod \"dns-default-zkvsl\" (UID: \"7281a4b6-76d6-494b-98e2-8fd1f322c7de\") " pod="openshift-dns/dns-default-zkvsl"
Apr 17 09:23:36.312220 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:36.312198 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2721fe42-279c-4536-9769-411e4918ceac-cert\") pod \"ingress-canary-v2rkj\" (UID: \"2721fe42-279c-4536-9769-411e4918ceac\") " pod="openshift-ingress-canary/ingress-canary-v2rkj"
Apr 17 09:23:36.450900 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:36.450854 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-nnbft" event={"ID":"976b87df-daa5-4000-84e8-1d40b45adac8","Type":"ContainerStarted","Data":"625743997cf8f399fe0854a9126177573f9294837182465aefe9469ca4827071"}
Apr 17 09:23:36.537016 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:36.536986 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-nblxl\""
Apr 17 09:23:36.545235 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:36.545204 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zkvsl"
Apr 17 09:23:36.889092 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:36.889071 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zkvsl"]
Apr 17 09:23:36.891221 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:23:36.891192 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7281a4b6_76d6_494b_98e2_8fd1f322c7de.slice/crio-b7497fdf66624e9560902600a2677cf92ab0f6db7231bd1842ead56b06ec7546 WatchSource:0}: Error finding container b7497fdf66624e9560902600a2677cf92ab0f6db7231bd1842ead56b06ec7546: Status 404 returned error can't find the container with id b7497fdf66624e9560902600a2677cf92ab0f6db7231bd1842ead56b06ec7546
Apr 17 09:23:37.455085 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:37.455039 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-nnbft" event={"ID":"976b87df-daa5-4000-84e8-1d40b45adac8","Type":"ContainerStarted","Data":"e82d43d5f83aeb5384a4a6c57d01782785144dc2f583da93cb5fcb8297fe6840"}
Apr 17 09:23:37.455085 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:37.455092 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-nnbft" event={"ID":"976b87df-daa5-4000-84e8-1d40b45adac8","Type":"ContainerStarted","Data":"1c3ac2076182e3f7ca129ddc6839fee38ba773639c726b77af1ff6b3b4aaa7fd"}
Apr 17 09:23:37.456065 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:37.456041 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zkvsl" event={"ID":"7281a4b6-76d6-494b-98e2-8fd1f322c7de","Type":"ContainerStarted","Data":"b7497fdf66624e9560902600a2677cf92ab0f6db7231bd1842ead56b06ec7546"}
Apr 17 09:23:37.472028 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:37.471980 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-nnbft" podStartSLOduration=2.232809304 podStartE2EDuration="3.471966566s" podCreationTimestamp="2026-04-17 09:23:34 +0000 UTC" firstStartedPulling="2026-04-17 09:23:35.569363912 +0000 UTC m=+160.165415788" lastFinishedPulling="2026-04-17 09:23:36.80852117 +0000 UTC m=+161.404573050" observedRunningTime="2026-04-17 09:23:37.470438596 +0000 UTC m=+162.066490491" watchObservedRunningTime="2026-04-17 09:23:37.471966566 +0000 UTC m=+162.068018460"
Apr 17 09:23:38.459626 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:38.459565 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zkvsl" event={"ID":"7281a4b6-76d6-494b-98e2-8fd1f322c7de","Type":"ContainerStarted","Data":"15c3514a249ceaffca94d329b03e26bd1b92b50df8598f95d0a8976c9f93e128"}
Apr 17 09:23:38.459626 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:38.459598 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zkvsl" event={"ID":"7281a4b6-76d6-494b-98e2-8fd1f322c7de","Type":"ContainerStarted","Data":"3d1c9ecba86edad068a707644a521f2c65a1b01bde305bd76f02d75d7e62f535"}
Apr 17 09:23:38.478290 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:38.478250 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zkvsl" podStartSLOduration=129.218125575 podStartE2EDuration="2m10.478236331s" podCreationTimestamp="2026-04-17 09:21:28 +0000 UTC" firstStartedPulling="2026-04-17 09:23:36.893084039 +0000 UTC m=+161.489135915" lastFinishedPulling="2026-04-17 09:23:38.153194783 +0000 UTC m=+162.749246671" observedRunningTime="2026-04-17 09:23:38.477733454 +0000 UTC m=+163.073785350" watchObservedRunningTime="2026-04-17 09:23:38.478236331 +0000 UTC m=+163.074288225"
Apr 17 09:23:38.877326 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:38.877269 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-2xhdl"]
Apr 17 09:23:38.879494 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:38.879479 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-2xhdl"
Apr 17 09:23:38.883662 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:38.883559 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-d6vgd\""
Apr 17 09:23:38.883662 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:38.883648 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 17 09:23:38.883836 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:38.883665 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 17 09:23:38.883836 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:38.883567 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 17 09:23:38.891335 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:38.891316 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-2xhdl"]
Apr 17 09:23:38.900800 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:38.900774 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-h622q"]
Apr 17 09:23:38.902781 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:38.902764 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-h622q"
Apr 17 09:23:38.905304 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:38.905283 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 09:23:38.905398 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:38.905334 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 09:23:38.905398 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:38.905349 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 09:23:38.905398 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:38.905294 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-hwqkh\""
Apr 17 09:23:38.934684 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:38.934664 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-2xhdl\" (UID: \"d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2xhdl"
Apr 17 09:23:38.934788 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:38.934721 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-2xhdl\" (UID: \"d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2xhdl"
Apr 17 09:23:38.934788 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:38.934769 2574 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-2xhdl\" (UID: \"d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2xhdl" Apr 17 09:23:38.934903 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:38.934817 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-2xhdl\" (UID: \"d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2xhdl" Apr 17 09:23:38.934903 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:38.934854 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-2xhdl\" (UID: \"d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2xhdl" Apr 17 09:23:38.935004 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:38.934906 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnmlv\" (UniqueName: \"kubernetes.io/projected/d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39-kube-api-access-lnmlv\") pod \"kube-state-metrics-69db897b98-2xhdl\" (UID: \"d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2xhdl" Apr 17 09:23:39.035680 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.035655 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-2xhdl\" (UID: \"d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2xhdl" Apr 17 09:23:39.035780 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.035686 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-2xhdl\" (UID: \"d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2xhdl" Apr 17 09:23:39.035780 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.035716 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96qnx\" (UniqueName: \"kubernetes.io/projected/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-kube-api-access-96qnx\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.035780 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.035744 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-node-exporter-textfile\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.035780 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.035776 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-2xhdl\" (UID: \"d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2xhdl" Apr 17 09:23:39.035931 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.035808 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-node-exporter-wtmp\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.035931 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.035888 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-node-exporter-tls\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.036005 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.035938 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lnmlv\" (UniqueName: \"kubernetes.io/projected/d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39-kube-api-access-lnmlv\") pod \"kube-state-metrics-69db897b98-2xhdl\" (UID: \"d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2xhdl" Apr 17 09:23:39.036005 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.035972 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-sys\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.036102 
ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.036019 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-2xhdl\" (UID: \"d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2xhdl" Apr 17 09:23:39.036102 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.036050 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-root\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.036102 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.036080 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-metrics-client-ca\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.036264 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.036107 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.036264 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.036135 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-node-exporter-accelerators-collector-config\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.036264 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.036213 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-2xhdl\" (UID: \"d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2xhdl" Apr 17 09:23:39.036417 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.036384 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-2xhdl\" (UID: \"d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2xhdl" Apr 17 09:23:39.036609 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.036585 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-2xhdl\" (UID: \"d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2xhdl" Apr 17 09:23:39.037234 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.037214 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-2xhdl\" (UID: \"d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-2xhdl" Apr 17 09:23:39.038050 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.038035 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-2xhdl\" (UID: \"d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2xhdl" Apr 17 09:23:39.038426 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.038405 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-2xhdl\" (UID: \"d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2xhdl" Apr 17 09:23:39.044217 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.044166 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnmlv\" (UniqueName: \"kubernetes.io/projected/d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39-kube-api-access-lnmlv\") pod \"kube-state-metrics-69db897b98-2xhdl\" (UID: \"d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-2xhdl" Apr 17 09:23:39.136558 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.136485 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-root\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.136558 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.136517 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-metrics-client-ca\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.136558 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.136537 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.136806 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.136566 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-node-exporter-accelerators-collector-config\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.136806 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.136586 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-root\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.136806 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.136638 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96qnx\" (UniqueName: \"kubernetes.io/projected/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-kube-api-access-96qnx\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.136806 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.136664 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-node-exporter-textfile\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.136806 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.136702 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-node-exporter-wtmp\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.136806 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.136769 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-node-exporter-tls\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.136806 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.136802 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-sys\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.137139 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.136864 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-node-exporter-wtmp\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.137139 ip-10-0-138-237 kubenswrapper[2574]: I0417 
09:23:39.136873 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-sys\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.137257 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.137158 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-metrics-client-ca\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.137297 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.137274 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-node-exporter-textfile\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.137525 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.137501 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-node-exporter-accelerators-collector-config\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.139077 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.139042 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-node-exporter-tls\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.139077 ip-10-0-138-237 
kubenswrapper[2574]: I0417 09:23:39.139061 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.143782 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.143760 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96qnx\" (UniqueName: \"kubernetes.io/projected/6d362b3e-0a77-4adf-ae6b-f51342f9fb8c-kube-api-access-96qnx\") pod \"node-exporter-h622q\" (UID: \"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c\") " pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.188621 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.188601 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-2xhdl" Apr 17 09:23:39.213427 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.213406 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-h622q" Apr 17 09:23:39.220994 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:23:39.220963 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d362b3e_0a77_4adf_ae6b_f51342f9fb8c.slice/crio-a1ea3eaac9c52fb2a38bb5b33c495a47186d2696d9490aa191870d262797c649 WatchSource:0}: Error finding container a1ea3eaac9c52fb2a38bb5b33c495a47186d2696d9490aa191870d262797c649: Status 404 returned error can't find the container with id a1ea3eaac9c52fb2a38bb5b33c495a47186d2696d9490aa191870d262797c649 Apr 17 09:23:39.309315 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.309280 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-2xhdl"] Apr 17 09:23:39.313087 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:23:39.313062 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd60dd8b0_ebdb_4af4_9f01_1bfe1c043b39.slice/crio-e9664467a26a5b7978fb82964f08be3bd3db331d14dd24e9e94a3e96a4a13c3f WatchSource:0}: Error finding container e9664467a26a5b7978fb82964f08be3bd3db331d14dd24e9e94a3e96a4a13c3f: Status 404 returned error can't find the container with id e9664467a26a5b7978fb82964f08be3bd3db331d14dd24e9e94a3e96a4a13c3f Apr 17 09:23:39.464021 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.463956 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-2xhdl" event={"ID":"d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39","Type":"ContainerStarted","Data":"e9664467a26a5b7978fb82964f08be3bd3db331d14dd24e9e94a3e96a4a13c3f"} Apr 17 09:23:39.464938 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.464911 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h622q" 
event={"ID":"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c","Type":"ContainerStarted","Data":"a1ea3eaac9c52fb2a38bb5b33c495a47186d2696d9490aa191870d262797c649"} Apr 17 09:23:39.465206 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:39.465155 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-zkvsl" Apr 17 09:23:40.470118 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:40.470083 2574 generic.go:358] "Generic (PLEG): container finished" podID="6d362b3e-0a77-4adf-ae6b-f51342f9fb8c" containerID="16a318c20a771cd3e0f3afe6fcfd477b038608b6ac131f64f0ccc82dacddbcc5" exitCode=0 Apr 17 09:23:40.470581 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:40.470137 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h622q" event={"ID":"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c","Type":"ContainerDied","Data":"16a318c20a771cd3e0f3afe6fcfd477b038608b6ac131f64f0ccc82dacddbcc5"} Apr 17 09:23:41.475190 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:41.475145 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h622q" event={"ID":"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c","Type":"ContainerStarted","Data":"e93cbe53896928602ca6f547dc1241bb5f77f88d52303f0f861ad80a02148825"} Apr 17 09:23:41.475655 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:41.475201 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-h622q" event={"ID":"6d362b3e-0a77-4adf-ae6b-f51342f9fb8c","Type":"ContainerStarted","Data":"4183203d81108de77e6607393e45d1674c0361d3c6830ba0608632cc635be701"} Apr 17 09:23:41.476768 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:41.476744 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-2xhdl" event={"ID":"d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39","Type":"ContainerStarted","Data":"7d1704191fa24a04362fde39a88ab8483f2c7bac50ae0ef5dba569469cf1ed92"} Apr 17 
09:23:41.476862 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:41.476774 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-2xhdl" event={"ID":"d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39","Type":"ContainerStarted","Data":"e35ca5b3c783ef5abdc7f3d99c0da3d8b99a9e9de70370578753c3a192081687"} Apr 17 09:23:41.476862 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:41.476788 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-2xhdl" event={"ID":"d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39","Type":"ContainerStarted","Data":"d9d6af9106c36a83588b35b183d89adae86cbb1c90d568ff31f87d7590acf382"} Apr 17 09:23:41.495884 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:41.495841 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-h622q" podStartSLOduration=2.68716781 podStartE2EDuration="3.495829918s" podCreationTimestamp="2026-04-17 09:23:38 +0000 UTC" firstStartedPulling="2026-04-17 09:23:39.22268494 +0000 UTC m=+163.818736815" lastFinishedPulling="2026-04-17 09:23:40.031347037 +0000 UTC m=+164.627398923" observedRunningTime="2026-04-17 09:23:41.494983451 +0000 UTC m=+166.091035344" watchObservedRunningTime="2026-04-17 09:23:41.495829918 +0000 UTC m=+166.091881811" Apr 17 09:23:41.515005 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:41.514967 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-2xhdl" podStartSLOduration=2.186346097 podStartE2EDuration="3.514954173s" podCreationTimestamp="2026-04-17 09:23:38 +0000 UTC" firstStartedPulling="2026-04-17 09:23:39.314886475 +0000 UTC m=+163.910938350" lastFinishedPulling="2026-04-17 09:23:40.643494549 +0000 UTC m=+165.239546426" observedRunningTime="2026-04-17 09:23:41.514000154 +0000 UTC m=+166.110052048" watchObservedRunningTime="2026-04-17 09:23:41.514954173 +0000 UTC m=+166.111006067" 
Apr 17 09:23:43.649168 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:43.649131 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-rcfh9"] Apr 17 09:23:43.652101 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:43.652079 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rcfh9" Apr 17 09:23:43.654780 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:43.654758 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-7h5l5\"" Apr 17 09:23:43.654883 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:43.654758 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 17 09:23:43.658309 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:43.658290 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-rcfh9"] Apr 17 09:23:43.779495 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:43.779450 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ce41555e-e114-4c1c-bce1-00dec4a79c09-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-rcfh9\" (UID: \"ce41555e-e114-4c1c-bce1-00dec4a79c09\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rcfh9" Apr 17 09:23:43.880535 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:43.880499 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ce41555e-e114-4c1c-bce1-00dec4a79c09-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-rcfh9\" (UID: \"ce41555e-e114-4c1c-bce1-00dec4a79c09\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rcfh9" Apr 17 09:23:43.880703 ip-10-0-138-237 
kubenswrapper[2574]: E0417 09:23:43.880640 2574 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 17 09:23:43.880743 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:23:43.880704 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce41555e-e114-4c1c-bce1-00dec4a79c09-monitoring-plugin-cert podName:ce41555e-e114-4c1c-bce1-00dec4a79c09 nodeName:}" failed. No retries permitted until 2026-04-17 09:23:44.38068781 +0000 UTC m=+168.976739683 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/ce41555e-e114-4c1c-bce1-00dec4a79c09-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-rcfh9" (UID: "ce41555e-e114-4c1c-bce1-00dec4a79c09") : secret "monitoring-plugin-cert" not found Apr 17 09:23:44.384849 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:44.384812 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ce41555e-e114-4c1c-bce1-00dec4a79c09-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-rcfh9\" (UID: \"ce41555e-e114-4c1c-bce1-00dec4a79c09\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rcfh9" Apr 17 09:23:44.387133 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:44.387109 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ce41555e-e114-4c1c-bce1-00dec4a79c09-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-rcfh9\" (UID: \"ce41555e-e114-4c1c-bce1-00dec4a79c09\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rcfh9" Apr 17 09:23:44.561963 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:44.561938 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rcfh9"
Apr 17 09:23:44.690984 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:44.690852 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-rcfh9"]
Apr 17 09:23:44.693430 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:23:44.693408 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce41555e_e114_4c1c_bce1_00dec4a79c09.slice/crio-cbaec4fbd423663c9a38757f071f8cca460f06ebb7d8afd5d78858620badddfc WatchSource:0}: Error finding container cbaec4fbd423663c9a38757f071f8cca460f06ebb7d8afd5d78858620badddfc: Status 404 returned error can't find the container with id cbaec4fbd423663c9a38757f071f8cca460f06ebb7d8afd5d78858620badddfc
Apr 17 09:23:45.090208 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.089831 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 09:23:45.093273 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.093248 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.097114 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.097078 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 17 09:23:45.097281 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.097146 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 17 09:23:45.097281 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.097204 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 17 09:23:45.097281 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.097246 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 17 09:23:45.097281 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.097257 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 17 09:23:45.097497 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.097423 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 17 09:23:45.098535 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.098511 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 17 09:23:45.098642 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.098580 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 17 09:23:45.098705 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.098648 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-e2mdjdsu4919b\""
Apr 17 09:23:45.098705 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.098684 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 17 09:23:45.098929 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.098908 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-rzxkx\""
Apr 17 09:23:45.099148 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.099131 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 17 09:23:45.099260 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.099166 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 17 09:23:45.100781 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.100759 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 17 09:23:45.104334 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.104298 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 17 09:23:45.106461 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.106439 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 09:23:45.191279 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.191252 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.191452 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.191286 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.191452 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.191305 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.191452 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.191327 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-web-config\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.191452 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.191377 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.191452 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.191424 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.191452 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.191444 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.191691 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.191475 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.191691 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.191492 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.191691 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.191513 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.191691 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.191531 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.191691 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.191551 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.191691 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.191591 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.191691 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.191616 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-config\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.191691 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.191636 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.191691 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.191654 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.191691 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.191671 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48bvv\" (UniqueName: \"kubernetes.io/projected/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-kube-api-access-48bvv\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.191691 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.191690 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-config-out\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.292884 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.292826 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.292884 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.292878 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.293133 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.292910 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.293133 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.292945 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-web-config\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.293133 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.292974 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.293133 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.293019 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.293133 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.293044 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.293133 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.293083 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.293133 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.293109 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.293508 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.293140 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.293508 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.293190 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.293508 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.293216 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.293508 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.293269 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.293508 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.293312 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-config\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.293508 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.293336 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.293508 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.293362 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.293508 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.293389 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48bvv\" (UniqueName: \"kubernetes.io/projected/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-kube-api-access-48bvv\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.293508 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.293411 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-config-out\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.293948 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.293730 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.293948 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.293804 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.295679 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.295209 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.296097 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.295994 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.296428 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.296395 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-web-config\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.296529 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.296496 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.297509 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.296898 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.297509 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.297380 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.298140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.298097 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.298702 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.298658 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-config-out\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.299794 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.299745 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.300580 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.300560 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.300787 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.300740 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.301148 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.301127 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-config\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.301902 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.301611 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.301902 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.301860 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.301902 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.301897 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.305671 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.305648 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48bvv\" (UniqueName: \"kubernetes.io/projected/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-kube-api-access-48bvv\") pod \"prometheus-k8s-0\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.406764 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.406728 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:23:45.491457 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.491390 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rcfh9" event={"ID":"ce41555e-e114-4c1c-bce1-00dec4a79c09","Type":"ContainerStarted","Data":"cbaec4fbd423663c9a38757f071f8cca460f06ebb7d8afd5d78858620badddfc"}
Apr 17 09:23:45.566719 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.566690 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 09:23:45.569374 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:23:45.569338 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6cdc27f_c3ed_4ada_96ec_2ddae988178b.slice/crio-140aee8c7ecaf860c7bc8b1b461112fbf6b9ef9085a971af246ccaf6084ad51a WatchSource:0}: Error finding container 140aee8c7ecaf860c7bc8b1b461112fbf6b9ef9085a971af246ccaf6084ad51a: Status 404 returned error can't find the container with id 140aee8c7ecaf860c7bc8b1b461112fbf6b9ef9085a971af246ccaf6084ad51a
Apr 17 09:23:45.966549 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.966511 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-v2rkj"
Apr 17 09:23:45.967008 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.966709 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-22cz6"
Apr 17 09:23:45.969272 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.969248 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9szrj\""
Apr 17 09:23:45.977303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:45.977283 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-v2rkj"
Apr 17 09:23:46.189962 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:46.189916 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-v2rkj"]
Apr 17 09:23:46.193055 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:23:46.193023 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2721fe42_279c_4536_9769_411e4918ceac.slice/crio-31e9065478a55ce3d3600083b2cf632fc255b4693eb369c25496985d4afbb3af WatchSource:0}: Error finding container 31e9065478a55ce3d3600083b2cf632fc255b4693eb369c25496985d4afbb3af: Status 404 returned error can't find the container with id 31e9065478a55ce3d3600083b2cf632fc255b4693eb369c25496985d4afbb3af
Apr 17 09:23:46.495757 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:46.495723 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rcfh9" event={"ID":"ce41555e-e114-4c1c-bce1-00dec4a79c09","Type":"ContainerStarted","Data":"c315a5b05e2eb7c9c236a1fe530c27fdd3c2ca8be327aad938713e1380f2cafe"}
Apr 17 09:23:46.495951 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:46.495892 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rcfh9"
Apr 17 09:23:46.497188 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:46.497128 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-v2rkj" event={"ID":"2721fe42-279c-4536-9769-411e4918ceac","Type":"ContainerStarted","Data":"31e9065478a55ce3d3600083b2cf632fc255b4693eb369c25496985d4afbb3af"}
Apr 17 09:23:46.498416 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:46.498389 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6cdc27f-c3ed-4ada-96ec-2ddae988178b","Type":"ContainerStarted","Data":"140aee8c7ecaf860c7bc8b1b461112fbf6b9ef9085a971af246ccaf6084ad51a"}
Apr 17 09:23:46.501933 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:46.501914 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rcfh9"
Apr 17 09:23:46.511997 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:46.511961 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-rcfh9" podStartSLOduration=2.118824481 podStartE2EDuration="3.511949858s" podCreationTimestamp="2026-04-17 09:23:43 +0000 UTC" firstStartedPulling="2026-04-17 09:23:44.695333438 +0000 UTC m=+169.291385312" lastFinishedPulling="2026-04-17 09:23:46.088458814 +0000 UTC m=+170.684510689" observedRunningTime="2026-04-17 09:23:46.510982915 +0000 UTC m=+171.107034810" watchObservedRunningTime="2026-04-17 09:23:46.511949858 +0000 UTC m=+171.108001783"
Apr 17 09:23:47.502838 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:47.502802 2574 generic.go:358] "Generic (PLEG): container finished" podID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerID="ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d" exitCode=0
Apr 17 09:23:47.503303 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:47.502886 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6cdc27f-c3ed-4ada-96ec-2ddae988178b","Type":"ContainerDied","Data":"ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d"}
Apr 17 09:23:48.507064 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:48.507018 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-v2rkj" event={"ID":"2721fe42-279c-4536-9769-411e4918ceac","Type":"ContainerStarted","Data":"ecf621e4899890b4180dea58e1af83ae7e68cf34047fbb45980992347f84c50d"}
Apr 17 09:23:48.524541 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:48.524496 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-v2rkj" podStartSLOduration=138.856612746 podStartE2EDuration="2m20.524483394s" podCreationTimestamp="2026-04-17 09:21:28 +0000 UTC" firstStartedPulling="2026-04-17 09:23:46.195278409 +0000 UTC m=+170.791330282" lastFinishedPulling="2026-04-17 09:23:47.863149058 +0000 UTC m=+172.459200930" observedRunningTime="2026-04-17 09:23:48.522984181 +0000 UTC m=+173.119036087" watchObservedRunningTime="2026-04-17 09:23:48.524483394 +0000 UTC m=+173.120535332"
Apr 17 09:23:49.473311 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:49.473279 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zkvsl"
Apr 17 09:23:50.529877 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:50.529841 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6cdc27f-c3ed-4ada-96ec-2ddae988178b","Type":"ContainerStarted","Data":"015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159"}
Apr 17 09:23:50.529877 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:50.529881 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6cdc27f-c3ed-4ada-96ec-2ddae988178b","Type":"ContainerStarted","Data":"2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc"}
Apr 17 09:23:52.538279 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:52.538210 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6cdc27f-c3ed-4ada-96ec-2ddae988178b","Type":"ContainerStarted","Data":"1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685"}
Apr 17 09:23:52.538279 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:52.538241 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6cdc27f-c3ed-4ada-96ec-2ddae988178b","Type":"ContainerStarted","Data":"7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c"}
Apr 17 09:23:52.538279 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:52.538251 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6cdc27f-c3ed-4ada-96ec-2ddae988178b","Type":"ContainerStarted","Data":"10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988"}
Apr 17 09:23:52.538279 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:52.538260 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6cdc27f-c3ed-4ada-96ec-2ddae988178b","Type":"ContainerStarted","Data":"75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0"}
Apr 17 09:23:52.568317 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:52.568275 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=0.879221098 podStartE2EDuration="7.56826253s" podCreationTimestamp="2026-04-17 09:23:45 +0000 UTC" firstStartedPulling="2026-04-17 09:23:45.571195959 +0000 UTC m=+170.167247836" lastFinishedPulling="2026-04-17 09:23:52.260237396 +0000 UTC m=+176.856289268" observedRunningTime="2026-04-17 09:23:52.566448822 +0000 UTC m=+177.162500712" watchObservedRunningTime="2026-04-17 09:23:52.56826253 +0000 UTC m=+177.164314425"
Apr 17 09:23:55.407609 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:23:55.407580 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:24:04.573288 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:04.573254 2574 generic.go:358] "Generic (PLEG): container finished" podID="824a0058-01c3-4126-965f-c9f5a5d55e99" containerID="b80419a9144e6874815284e9e47349e6eddca1fd6859dce4dfa58c0ee59cafa8" exitCode=0
Apr 17 09:24:04.573721 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:04.573339 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-gmmqr" event={"ID":"824a0058-01c3-4126-965f-c9f5a5d55e99","Type":"ContainerDied","Data":"b80419a9144e6874815284e9e47349e6eddca1fd6859dce4dfa58c0ee59cafa8"}
Apr 17 09:24:04.573721 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:04.573705 2574 scope.go:117] "RemoveContainer" containerID="b80419a9144e6874815284e9e47349e6eddca1fd6859dce4dfa58c0ee59cafa8"
Apr 17 09:24:05.578225 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:05.578164 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-gmmqr" event={"ID":"824a0058-01c3-4126-965f-c9f5a5d55e99","Type":"ContainerStarted","Data":"224d6b94ef042544f3145d3b5c51d6878ec95eba0074ee1b19be40b591033ea6"}
Apr 17 09:24:25.638860 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:25.638824 2574 generic.go:358] "Generic (PLEG): container finished" podID="51eeb544-5d28-4c8c-8577-6f932bfee2ce" containerID="7ba8277748fb166a5f3eaef23b020670b13d5f262e7b8bcb48a70f2c7ba7ee87" exitCode=0
Apr 17 09:24:25.639255 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:25.638903 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-sfz45" event={"ID":"51eeb544-5d28-4c8c-8577-6f932bfee2ce","Type":"ContainerDied","Data":"7ba8277748fb166a5f3eaef23b020670b13d5f262e7b8bcb48a70f2c7ba7ee87"}
Apr 17 09:24:25.639305 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:25.639292 2574 scope.go:117] "RemoveContainer" containerID="7ba8277748fb166a5f3eaef23b020670b13d5f262e7b8bcb48a70f2c7ba7ee87"
Apr 17 09:24:26.643340 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:26.643305 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-sfz45" event={"ID":"51eeb544-5d28-4c8c-8577-6f932bfee2ce","Type":"ContainerStarted","Data":"07c8998627ffa3472a081cce5c5e06df51ee5b5f0b4592d08c391922dc497be3"}
Apr 17 09:24:27.511921 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:27.511891 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-2xhdl_d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39/kube-state-metrics/0.log"
Apr 17 09:24:27.712229 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:27.712199 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-2xhdl_d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39/kube-rbac-proxy-main/0.log"
Apr 17 09:24:27.910656 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:27.910632 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-2xhdl_d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39/kube-rbac-proxy-self/0.log"
Apr 17 09:24:28.310675 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:28.310603 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-rcfh9_ce41555e-e114-4c1c-bce1-00dec4a79c09/monitoring-plugin/0.log"
Apr 17 09:24:29.110147 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:29.110120 2574 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-monitoring_node-exporter-h622q_6d362b3e-0a77-4adf-ae6b-f51342f9fb8c/init-textfile/0.log" Apr 17 09:24:29.312188 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:29.312153 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h622q_6d362b3e-0a77-4adf-ae6b-f51342f9fb8c/node-exporter/0.log" Apr 17 09:24:29.510702 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:29.510679 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h622q_6d362b3e-0a77-4adf-ae6b-f51342f9fb8c/kube-rbac-proxy/0.log" Apr 17 09:24:30.911336 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:30.911308 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f6cdc27f-c3ed-4ada-96ec-2ddae988178b/init-config-reloader/0.log" Apr 17 09:24:31.113040 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:31.113007 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f6cdc27f-c3ed-4ada-96ec-2ddae988178b/prometheus/0.log" Apr 17 09:24:31.311382 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:31.311316 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f6cdc27f-c3ed-4ada-96ec-2ddae988178b/config-reloader/0.log" Apr 17 09:24:31.510895 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:31.510873 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f6cdc27f-c3ed-4ada-96ec-2ddae988178b/thanos-sidecar/0.log" Apr 17 09:24:31.710976 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:31.710953 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f6cdc27f-c3ed-4ada-96ec-2ddae988178b/kube-rbac-proxy-web/0.log" Apr 17 09:24:31.910977 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:31.910952 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f6cdc27f-c3ed-4ada-96ec-2ddae988178b/kube-rbac-proxy/0.log" Apr 17 09:24:32.111456 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:32.111383 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f6cdc27f-c3ed-4ada-96ec-2ddae988178b/kube-rbac-proxy-thanos/0.log" Apr 17 09:24:32.313900 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:32.313874 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-nnbft_976b87df-daa5-4000-84e8-1d40b45adac8/prometheus-operator/0.log" Apr 17 09:24:32.510631 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:32.510584 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-nnbft_976b87df-daa5-4000-84e8-1d40b45adac8/kube-rbac-proxy/0.log" Apr 17 09:24:34.111014 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:34.110991 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-4c5bp_c68efb67-a1eb-4e0b-9af1-47c6e61f4d10/networking-console-plugin/0.log" Apr 17 09:24:45.406833 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:45.406801 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:24:45.421798 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:45.421772 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:24:45.712482 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:24:45.712409 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:25:03.439799 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.439769 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 09:25:03.440351 
ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.440197 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="prometheus" containerID="cri-o://2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc" gracePeriod=600 Apr 17 09:25:03.440351 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.440239 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="config-reloader" containerID="cri-o://015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159" gracePeriod=600 Apr 17 09:25:03.440351 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.440240 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="thanos-sidecar" containerID="cri-o://75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0" gracePeriod=600 Apr 17 09:25:03.440351 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.440269 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="kube-rbac-proxy-web" containerID="cri-o://10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988" gracePeriod=600 Apr 17 09:25:03.440351 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.440282 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="kube-rbac-proxy" containerID="cri-o://7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c" gracePeriod=600 Apr 17 09:25:03.440653 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.440614 2574 
kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="kube-rbac-proxy-thanos" containerID="cri-o://1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685" gracePeriod=600 Apr 17 09:25:03.683348 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.683322 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:25:03.749957 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.749868 2574 generic.go:358] "Generic (PLEG): container finished" podID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerID="1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685" exitCode=0 Apr 17 09:25:03.749957 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.749892 2574 generic.go:358] "Generic (PLEG): container finished" podID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerID="7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c" exitCode=0 Apr 17 09:25:03.749957 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.749899 2574 generic.go:358] "Generic (PLEG): container finished" podID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerID="10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988" exitCode=0 Apr 17 09:25:03.749957 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.749906 2574 generic.go:358] "Generic (PLEG): container finished" podID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerID="75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0" exitCode=0 Apr 17 09:25:03.749957 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.749911 2574 generic.go:358] "Generic (PLEG): container finished" podID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerID="015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159" exitCode=0 Apr 17 09:25:03.749957 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.749916 2574 generic.go:358] "Generic 
(PLEG): container finished" podID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerID="2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc" exitCode=0 Apr 17 09:25:03.750393 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.749952 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6cdc27f-c3ed-4ada-96ec-2ddae988178b","Type":"ContainerDied","Data":"1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685"} Apr 17 09:25:03.750393 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.749974 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:25:03.750393 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.749991 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6cdc27f-c3ed-4ada-96ec-2ddae988178b","Type":"ContainerDied","Data":"7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c"} Apr 17 09:25:03.750393 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.750004 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6cdc27f-c3ed-4ada-96ec-2ddae988178b","Type":"ContainerDied","Data":"10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988"} Apr 17 09:25:03.750393 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.750013 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6cdc27f-c3ed-4ada-96ec-2ddae988178b","Type":"ContainerDied","Data":"75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0"} Apr 17 09:25:03.750393 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.750023 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"f6cdc27f-c3ed-4ada-96ec-2ddae988178b","Type":"ContainerDied","Data":"015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159"} Apr 17 09:25:03.750393 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.750032 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6cdc27f-c3ed-4ada-96ec-2ddae988178b","Type":"ContainerDied","Data":"2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc"} Apr 17 09:25:03.750393 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.750041 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f6cdc27f-c3ed-4ada-96ec-2ddae988178b","Type":"ContainerDied","Data":"140aee8c7ecaf860c7bc8b1b461112fbf6b9ef9085a971af246ccaf6084ad51a"} Apr 17 09:25:03.750393 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.750059 2574 scope.go:117] "RemoveContainer" containerID="1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685" Apr 17 09:25:03.757212 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.757057 2574 scope.go:117] "RemoveContainer" containerID="7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c" Apr 17 09:25:03.763766 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.763747 2574 scope.go:117] "RemoveContainer" containerID="10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988" Apr 17 09:25:03.769697 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.769683 2574 scope.go:117] "RemoveContainer" containerID="75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0" Apr 17 09:25:03.775326 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.775311 2574 scope.go:117] "RemoveContainer" containerID="015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159" Apr 17 09:25:03.781279 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.781264 2574 scope.go:117] "RemoveContainer" containerID="2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc" 
Apr 17 09:25:03.787245 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.787229 2574 scope.go:117] "RemoveContainer" containerID="ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d" Apr 17 09:25:03.792724 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.792709 2574 scope.go:117] "RemoveContainer" containerID="1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685" Apr 17 09:25:03.792964 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:25:03.792944 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685\": container with ID starting with 1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685 not found: ID does not exist" containerID="1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685" Apr 17 09:25:03.793006 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.792972 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685"} err="failed to get container status \"1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685\": rpc error: code = NotFound desc = could not find container \"1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685\": container with ID starting with 1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685 not found: ID does not exist" Apr 17 09:25:03.793006 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.793002 2574 scope.go:117] "RemoveContainer" containerID="7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c" Apr 17 09:25:03.793246 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:25:03.793223 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c\": container with ID starting with 
7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c not found: ID does not exist" containerID="7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c" Apr 17 09:25:03.793322 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.793255 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c"} err="failed to get container status \"7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c\": rpc error: code = NotFound desc = could not find container \"7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c\": container with ID starting with 7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c not found: ID does not exist" Apr 17 09:25:03.793322 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.793277 2574 scope.go:117] "RemoveContainer" containerID="10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988" Apr 17 09:25:03.793485 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:25:03.793468 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988\": container with ID starting with 10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988 not found: ID does not exist" containerID="10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988" Apr 17 09:25:03.793524 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.793490 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988"} err="failed to get container status \"10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988\": rpc error: code = NotFound desc = could not find container \"10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988\": container with ID starting with 
10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988 not found: ID does not exist" Apr 17 09:25:03.793524 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.793504 2574 scope.go:117] "RemoveContainer" containerID="75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0" Apr 17 09:25:03.793726 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:25:03.793710 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0\": container with ID starting with 75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0 not found: ID does not exist" containerID="75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0" Apr 17 09:25:03.793774 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.793730 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0"} err="failed to get container status \"75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0\": rpc error: code = NotFound desc = could not find container \"75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0\": container with ID starting with 75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0 not found: ID does not exist" Apr 17 09:25:03.793774 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.793744 2574 scope.go:117] "RemoveContainer" containerID="015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159" Apr 17 09:25:03.793953 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:25:03.793936 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159\": container with ID starting with 015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159 not found: ID does not exist" 
containerID="015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159" Apr 17 09:25:03.793999 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.793958 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159"} err="failed to get container status \"015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159\": rpc error: code = NotFound desc = could not find container \"015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159\": container with ID starting with 015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159 not found: ID does not exist" Apr 17 09:25:03.793999 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.793970 2574 scope.go:117] "RemoveContainer" containerID="2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc" Apr 17 09:25:03.794217 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:25:03.794159 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc\": container with ID starting with 2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc not found: ID does not exist" containerID="2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc" Apr 17 09:25:03.794217 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.794197 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc"} err="failed to get container status \"2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc\": rpc error: code = NotFound desc = could not find container \"2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc\": container with ID starting with 2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc not found: ID does not exist" Apr 17 
09:25:03.794217 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.794210 2574 scope.go:117] "RemoveContainer" containerID="ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d" Apr 17 09:25:03.794453 ip-10-0-138-237 kubenswrapper[2574]: E0417 09:25:03.794438 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d\": container with ID starting with ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d not found: ID does not exist" containerID="ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d" Apr 17 09:25:03.794493 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.794458 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d"} err="failed to get container status \"ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d\": rpc error: code = NotFound desc = could not find container \"ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d\": container with ID starting with ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d not found: ID does not exist" Apr 17 09:25:03.794493 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.794471 2574 scope.go:117] "RemoveContainer" containerID="1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685" Apr 17 09:25:03.794680 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.794667 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685"} err="failed to get container status \"1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685\": rpc error: code = NotFound desc = could not find container \"1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685\": container with ID starting 
with 1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685 not found: ID does not exist" Apr 17 09:25:03.794716 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.794681 2574 scope.go:117] "RemoveContainer" containerID="7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c" Apr 17 09:25:03.794879 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.794863 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c"} err="failed to get container status \"7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c\": rpc error: code = NotFound desc = could not find container \"7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c\": container with ID starting with 7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c not found: ID does not exist" Apr 17 09:25:03.794917 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.794880 2574 scope.go:117] "RemoveContainer" containerID="10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988" Apr 17 09:25:03.795070 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.795056 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988"} err="failed to get container status \"10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988\": rpc error: code = NotFound desc = could not find container \"10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988\": container with ID starting with 10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988 not found: ID does not exist" Apr 17 09:25:03.795119 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.795070 2574 scope.go:117] "RemoveContainer" containerID="75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0" Apr 17 09:25:03.795275 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.795260 
2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0"} err="failed to get container status \"75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0\": rpc error: code = NotFound desc = could not find container \"75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0\": container with ID starting with 75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0 not found: ID does not exist" Apr 17 09:25:03.795317 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.795275 2574 scope.go:117] "RemoveContainer" containerID="015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159" Apr 17 09:25:03.795472 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.795450 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159"} err="failed to get container status \"015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159\": rpc error: code = NotFound desc = could not find container \"015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159\": container with ID starting with 015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159 not found: ID does not exist" Apr 17 09:25:03.795472 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.795470 2574 scope.go:117] "RemoveContainer" containerID="2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc" Apr 17 09:25:03.795675 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.795661 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc"} err="failed to get container status \"2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc\": rpc error: code = NotFound desc = could not find container 
\"2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc\": container with ID starting with 2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc not found: ID does not exist" Apr 17 09:25:03.795675 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.795674 2574 scope.go:117] "RemoveContainer" containerID="ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d" Apr 17 09:25:03.795851 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.795836 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d"} err="failed to get container status \"ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d\": rpc error: code = NotFound desc = could not find container \"ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d\": container with ID starting with ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d not found: ID does not exist" Apr 17 09:25:03.795891 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.795851 2574 scope.go:117] "RemoveContainer" containerID="1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685" Apr 17 09:25:03.796027 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.796010 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685"} err="failed to get container status \"1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685\": rpc error: code = NotFound desc = could not find container \"1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685\": container with ID starting with 1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685 not found: ID does not exist" Apr 17 09:25:03.796066 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.796028 2574 scope.go:117] "RemoveContainer" 
containerID="7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c" Apr 17 09:25:03.796271 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.796253 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c"} err="failed to get container status \"7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c\": rpc error: code = NotFound desc = could not find container \"7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c\": container with ID starting with 7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c not found: ID does not exist" Apr 17 09:25:03.796271 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.796271 2574 scope.go:117] "RemoveContainer" containerID="10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988" Apr 17 09:25:03.796470 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.796453 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988"} err="failed to get container status \"10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988\": rpc error: code = NotFound desc = could not find container \"10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988\": container with ID starting with 10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988 not found: ID does not exist" Apr 17 09:25:03.796512 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.796471 2574 scope.go:117] "RemoveContainer" containerID="75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0" Apr 17 09:25:03.796665 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.796651 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0"} err="failed to get container status 
\"75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0\": rpc error: code = NotFound desc = could not find container \"75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0\": container with ID starting with 75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0 not found: ID does not exist" Apr 17 09:25:03.796704 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.796665 2574 scope.go:117] "RemoveContainer" containerID="015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159" Apr 17 09:25:03.796849 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.796832 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159"} err="failed to get container status \"015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159\": rpc error: code = NotFound desc = could not find container \"015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159\": container with ID starting with 015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159 not found: ID does not exist" Apr 17 09:25:03.796902 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.796849 2574 scope.go:117] "RemoveContainer" containerID="2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc" Apr 17 09:25:03.797040 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.797027 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc"} err="failed to get container status \"2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc\": rpc error: code = NotFound desc = could not find container \"2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc\": container with ID starting with 2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc not found: ID does not exist" Apr 17 09:25:03.797040 ip-10-0-138-237 
kubenswrapper[2574]: I0417 09:25:03.797039 2574 scope.go:117] "RemoveContainer" containerID="ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d" Apr 17 09:25:03.797275 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.797256 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d"} err="failed to get container status \"ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d\": rpc error: code = NotFound desc = could not find container \"ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d\": container with ID starting with ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d not found: ID does not exist" Apr 17 09:25:03.797327 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.797275 2574 scope.go:117] "RemoveContainer" containerID="1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685" Apr 17 09:25:03.797498 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.797482 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685"} err="failed to get container status \"1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685\": rpc error: code = NotFound desc = could not find container \"1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685\": container with ID starting with 1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685 not found: ID does not exist" Apr 17 09:25:03.797548 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.797498 2574 scope.go:117] "RemoveContainer" containerID="7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c" Apr 17 09:25:03.797683 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.797665 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c"} err="failed to get container status \"7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c\": rpc error: code = NotFound desc = could not find container \"7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c\": container with ID starting with 7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c not found: ID does not exist" Apr 17 09:25:03.797724 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.797685 2574 scope.go:117] "RemoveContainer" containerID="10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988" Apr 17 09:25:03.797885 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.797869 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988"} err="failed to get container status \"10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988\": rpc error: code = NotFound desc = could not find container \"10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988\": container with ID starting with 10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988 not found: ID does not exist" Apr 17 09:25:03.797885 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.797884 2574 scope.go:117] "RemoveContainer" containerID="75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0" Apr 17 09:25:03.798085 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.798071 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0"} err="failed to get container status \"75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0\": rpc error: code = NotFound desc = could not find container \"75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0\": container with ID starting with 
75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0 not found: ID does not exist" Apr 17 09:25:03.798136 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.798085 2574 scope.go:117] "RemoveContainer" containerID="015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159" Apr 17 09:25:03.798283 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.798269 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159"} err="failed to get container status \"015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159\": rpc error: code = NotFound desc = could not find container \"015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159\": container with ID starting with 015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159 not found: ID does not exist" Apr 17 09:25:03.798331 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.798283 2574 scope.go:117] "RemoveContainer" containerID="2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc" Apr 17 09:25:03.798487 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.798473 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc"} err="failed to get container status \"2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc\": rpc error: code = NotFound desc = could not find container \"2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc\": container with ID starting with 2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc not found: ID does not exist" Apr 17 09:25:03.798526 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.798490 2574 scope.go:117] "RemoveContainer" containerID="ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d" Apr 17 09:25:03.798693 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.798676 2574 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d"} err="failed to get container status \"ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d\": rpc error: code = NotFound desc = could not find container \"ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d\": container with ID starting with ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d not found: ID does not exist" Apr 17 09:25:03.798737 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.798693 2574 scope.go:117] "RemoveContainer" containerID="1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685" Apr 17 09:25:03.798890 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.798874 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685"} err="failed to get container status \"1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685\": rpc error: code = NotFound desc = could not find container \"1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685\": container with ID starting with 1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685 not found: ID does not exist" Apr 17 09:25:03.798933 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.798891 2574 scope.go:117] "RemoveContainer" containerID="7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c" Apr 17 09:25:03.799140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.799120 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c"} err="failed to get container status \"7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c\": rpc error: code = NotFound desc = could not find container 
\"7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c\": container with ID starting with 7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c not found: ID does not exist" Apr 17 09:25:03.799140 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.799139 2574 scope.go:117] "RemoveContainer" containerID="10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988" Apr 17 09:25:03.799348 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.799333 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988"} err="failed to get container status \"10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988\": rpc error: code = NotFound desc = could not find container \"10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988\": container with ID starting with 10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988 not found: ID does not exist" Apr 17 09:25:03.799389 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.799349 2574 scope.go:117] "RemoveContainer" containerID="75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0" Apr 17 09:25:03.799523 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.799508 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0"} err="failed to get container status \"75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0\": rpc error: code = NotFound desc = could not find container \"75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0\": container with ID starting with 75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0 not found: ID does not exist" Apr 17 09:25:03.799566 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.799522 2574 scope.go:117] "RemoveContainer" 
containerID="015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159" Apr 17 09:25:03.799681 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.799667 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159"} err="failed to get container status \"015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159\": rpc error: code = NotFound desc = could not find container \"015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159\": container with ID starting with 015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159 not found: ID does not exist" Apr 17 09:25:03.799681 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.799680 2574 scope.go:117] "RemoveContainer" containerID="2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc" Apr 17 09:25:03.799851 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.799837 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc"} err="failed to get container status \"2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc\": rpc error: code = NotFound desc = could not find container \"2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc\": container with ID starting with 2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc not found: ID does not exist" Apr 17 09:25:03.799851 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.799850 2574 scope.go:117] "RemoveContainer" containerID="ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d" Apr 17 09:25:03.800033 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.800016 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d"} err="failed to get container status 
\"ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d\": rpc error: code = NotFound desc = could not find container \"ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d\": container with ID starting with ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d not found: ID does not exist" Apr 17 09:25:03.800113 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.800035 2574 scope.go:117] "RemoveContainer" containerID="1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685" Apr 17 09:25:03.800304 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.800277 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685"} err="failed to get container status \"1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685\": rpc error: code = NotFound desc = could not find container \"1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685\": container with ID starting with 1137ff2546ae2bb78b5ece9a5f5dc3c57fca2ed0ab9ccfd8d78ad77dc40fa685 not found: ID does not exist" Apr 17 09:25:03.800348 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.800305 2574 scope.go:117] "RemoveContainer" containerID="7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c" Apr 17 09:25:03.800495 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.800479 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c"} err="failed to get container status \"7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c\": rpc error: code = NotFound desc = could not find container \"7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c\": container with ID starting with 7d8bdbb1b280cd683e561458b49216f32fc972bd907bd312140e64dbccf3864c not found: ID does not exist" Apr 17 09:25:03.800541 ip-10-0-138-237 
kubenswrapper[2574]: I0417 09:25:03.800496 2574 scope.go:117] "RemoveContainer" containerID="10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988" Apr 17 09:25:03.800665 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.800647 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988"} err="failed to get container status \"10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988\": rpc error: code = NotFound desc = could not find container \"10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988\": container with ID starting with 10fab8aa47722fb804d078160ba27b7b9f02c2b48307fb3844a488f4b8d51988 not found: ID does not exist" Apr 17 09:25:03.800703 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.800665 2574 scope.go:117] "RemoveContainer" containerID="75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0" Apr 17 09:25:03.800837 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.800818 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0"} err="failed to get container status \"75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0\": rpc error: code = NotFound desc = could not find container \"75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0\": container with ID starting with 75462612f0a023b919db04b52b5e9db52103bddbc7cd174f899a5ab3d925cef0 not found: ID does not exist" Apr 17 09:25:03.800901 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.800839 2574 scope.go:117] "RemoveContainer" containerID="015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159" Apr 17 09:25:03.801037 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.801020 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159"} err="failed to get container status \"015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159\": rpc error: code = NotFound desc = could not find container \"015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159\": container with ID starting with 015b251757db2388ec27e9e4a0941ae6a85ffa156cce43a3d40bd18ea6f21159 not found: ID does not exist" Apr 17 09:25:03.801084 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.801038 2574 scope.go:117] "RemoveContainer" containerID="2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc" Apr 17 09:25:03.801222 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.801207 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc"} err="failed to get container status \"2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc\": rpc error: code = NotFound desc = could not find container \"2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc\": container with ID starting with 2151ecc424cd87fe1a0c29a837ff6bd79e9bce4729fac5a53f0a70aa6d40d2fc not found: ID does not exist" Apr 17 09:25:03.801262 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.801224 2574 scope.go:117] "RemoveContainer" containerID="ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d" Apr 17 09:25:03.801425 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.801410 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d"} err="failed to get container status \"ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d\": rpc error: code = NotFound desc = could not find container \"ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d\": container with ID starting with 
ba89b7b90dee281b53deb808d445f48c83827fb2fc0354fbc7adfe1adcb9c48d not found: ID does not exist" Apr 17 09:25:03.806664 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.806648 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-metrics-client-certs\") pod \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " Apr 17 09:25:03.806720 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.806672 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-kube-rbac-proxy\") pod \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " Apr 17 09:25:03.806720 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.806688 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48bvv\" (UniqueName: \"kubernetes.io/projected/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-kube-api-access-48bvv\") pod \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " Apr 17 09:25:03.806809 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.806720 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-prometheus-k8s-tls\") pod \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " Apr 17 09:25:03.806879 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.806858 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-configmap-serving-certs-ca-bundle\") pod 
\"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " Apr 17 09:25:03.806931 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.806890 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-configmap-kubelet-serving-ca-bundle\") pod \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " Apr 17 09:25:03.806931 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.806923 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " Apr 17 09:25:03.807043 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.807023 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-prometheus-k8s-db\") pod \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " Apr 17 09:25:03.807143 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.807076 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-web-config\") pod \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " Apr 17 09:25:03.807221 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.807163 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-prometheus-k8s-kube-rbac-proxy-web\") 
pod \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " Apr 17 09:25:03.807274 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.807231 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-configmap-metrics-client-ca\") pod \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " Apr 17 09:25:03.807274 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.807262 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-thanos-prometheus-http-client-file\") pod \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " Apr 17 09:25:03.807372 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.807289 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-prometheus-k8s-rulefiles-0\") pod \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " Apr 17 09:25:03.807372 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.807304 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "f6cdc27f-c3ed-4ada-96ec-2ddae988178b" (UID: "f6cdc27f-c3ed-4ada-96ec-2ddae988178b"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 09:25:03.807372 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.807318 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-tls-assets\") pod \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " Apr 17 09:25:03.807372 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.807341 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-config\") pod \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " Apr 17 09:25:03.807568 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.807349 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "f6cdc27f-c3ed-4ada-96ec-2ddae988178b" (UID: "f6cdc27f-c3ed-4ada-96ec-2ddae988178b"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 09:25:03.807568 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.807375 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-config-out\") pod \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " Apr 17 09:25:03.807669 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.807432 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-grpc-tls\") pod \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " Apr 17 09:25:03.808961 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.808032 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "f6cdc27f-c3ed-4ada-96ec-2ddae988178b" (UID: "f6cdc27f-c3ed-4ada-96ec-2ddae988178b"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 09:25:03.808961 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.808842 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-prometheus-trusted-ca-bundle\") pod \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\" (UID: \"f6cdc27f-c3ed-4ada-96ec-2ddae988178b\") " Apr 17 09:25:03.808961 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.808878 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "f6cdc27f-c3ed-4ada-96ec-2ddae988178b" (UID: "f6cdc27f-c3ed-4ada-96ec-2ddae988178b"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 09:25:03.809146 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.809123 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-138-237.ec2.internal\" DevicePath \"\"" Apr 17 09:25:03.809219 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.809143 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-138-237.ec2.internal\" DevicePath \"\"" Apr 17 09:25:03.809219 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.809162 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-prometheus-k8s-db\") on node \"ip-10-0-138-237.ec2.internal\" DevicePath \"\"" Apr 17 09:25:03.809219 ip-10-0-138-237 
kubenswrapper[2574]: I0417 09:25:03.809205 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-configmap-metrics-client-ca\") on node \"ip-10-0-138-237.ec2.internal\" DevicePath \"\""
Apr 17 09:25:03.809366 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.809275 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "f6cdc27f-c3ed-4ada-96ec-2ddae988178b" (UID: "f6cdc27f-c3ed-4ada-96ec-2ddae988178b"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 09:25:03.809608 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.809578 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "f6cdc27f-c3ed-4ada-96ec-2ddae988178b" (UID: "f6cdc27f-c3ed-4ada-96ec-2ddae988178b"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 09:25:03.810344 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.810297 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "f6cdc27f-c3ed-4ada-96ec-2ddae988178b" (UID: "f6cdc27f-c3ed-4ada-96ec-2ddae988178b"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 09:25:03.810344 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.810310 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "f6cdc27f-c3ed-4ada-96ec-2ddae988178b" (UID: "f6cdc27f-c3ed-4ada-96ec-2ddae988178b"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 09:25:03.810344 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.810322 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "f6cdc27f-c3ed-4ada-96ec-2ddae988178b" (UID: "f6cdc27f-c3ed-4ada-96ec-2ddae988178b"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 09:25:03.810546 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.810382 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-kube-api-access-48bvv" (OuterVolumeSpecName: "kube-api-access-48bvv") pod "f6cdc27f-c3ed-4ada-96ec-2ddae988178b" (UID: "f6cdc27f-c3ed-4ada-96ec-2ddae988178b"). InnerVolumeSpecName "kube-api-access-48bvv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 09:25:03.810546 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.810528 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "f6cdc27f-c3ed-4ada-96ec-2ddae988178b" (UID: "f6cdc27f-c3ed-4ada-96ec-2ddae988178b"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 09:25:03.810937 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.810906 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f6cdc27f-c3ed-4ada-96ec-2ddae988178b" (UID: "f6cdc27f-c3ed-4ada-96ec-2ddae988178b"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 09:25:03.811260 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.811238 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "f6cdc27f-c3ed-4ada-96ec-2ddae988178b" (UID: "f6cdc27f-c3ed-4ada-96ec-2ddae988178b"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 09:25:03.811616 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.811598 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "f6cdc27f-c3ed-4ada-96ec-2ddae988178b" (UID: "f6cdc27f-c3ed-4ada-96ec-2ddae988178b"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 09:25:03.811797 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.811782 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-config-out" (OuterVolumeSpecName: "config-out") pod "f6cdc27f-c3ed-4ada-96ec-2ddae988178b" (UID: "f6cdc27f-c3ed-4ada-96ec-2ddae988178b"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 09:25:03.811978 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.811963 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "f6cdc27f-c3ed-4ada-96ec-2ddae988178b" (UID: "f6cdc27f-c3ed-4ada-96ec-2ddae988178b"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 09:25:03.812115 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.812100 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-config" (OuterVolumeSpecName: "config") pod "f6cdc27f-c3ed-4ada-96ec-2ddae988178b" (UID: "f6cdc27f-c3ed-4ada-96ec-2ddae988178b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 09:25:03.820103 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.820086 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-web-config" (OuterVolumeSpecName: "web-config") pod "f6cdc27f-c3ed-4ada-96ec-2ddae988178b" (UID: "f6cdc27f-c3ed-4ada-96ec-2ddae988178b"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 09:25:03.910262 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.910229 2574 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-thanos-prometheus-http-client-file\") on node \"ip-10-0-138-237.ec2.internal\" DevicePath \"\""
Apr 17 09:25:03.910262 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.910260 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-138-237.ec2.internal\" DevicePath \"\""
Apr 17 09:25:03.910387 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.910273 2574 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-tls-assets\") on node \"ip-10-0-138-237.ec2.internal\" DevicePath \"\""
Apr 17 09:25:03.910387 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.910283 2574 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-config\") on node \"ip-10-0-138-237.ec2.internal\" DevicePath \"\""
Apr 17 09:25:03.910387 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.910293 2574 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-config-out\") on node \"ip-10-0-138-237.ec2.internal\" DevicePath \"\""
Apr 17 09:25:03.910387 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.910300 2574 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-grpc-tls\") on node \"ip-10-0-138-237.ec2.internal\" DevicePath \"\""
Apr 17 09:25:03.910387 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.910309 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-prometheus-trusted-ca-bundle\") on node \"ip-10-0-138-237.ec2.internal\" DevicePath \"\""
Apr 17 09:25:03.910387 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.910317 2574 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-metrics-client-certs\") on node \"ip-10-0-138-237.ec2.internal\" DevicePath \"\""
Apr 17 09:25:03.910387 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.910326 2574 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-kube-rbac-proxy\") on node \"ip-10-0-138-237.ec2.internal\" DevicePath \"\""
Apr 17 09:25:03.910387 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.910334 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-48bvv\" (UniqueName: \"kubernetes.io/projected/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-kube-api-access-48bvv\") on node \"ip-10-0-138-237.ec2.internal\" DevicePath \"\""
Apr 17 09:25:03.910387 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.910342 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-prometheus-k8s-tls\") on node \"ip-10-0-138-237.ec2.internal\" DevicePath \"\""
Apr 17 09:25:03.910387 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.910351 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-138-237.ec2.internal\" DevicePath \"\""
Apr 17 09:25:03.910387 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.910361 2574 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-web-config\") on node \"ip-10-0-138-237.ec2.internal\" DevicePath \"\""
Apr 17 09:25:03.910387 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:03.910370 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f6cdc27f-c3ed-4ada-96ec-2ddae988178b-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-138-237.ec2.internal\" DevicePath \"\""
Apr 17 09:25:04.068880 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.068820 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 09:25:04.072079 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.072059 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 09:25:04.094621 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.094592 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 09:25:04.094889 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.094873 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="init-config-reloader"
Apr 17 09:25:04.094954 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.094892 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="init-config-reloader"
Apr 17 09:25:04.094954 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.094906 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="kube-rbac-proxy"
Apr 17 09:25:04.094954 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.094915 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="kube-rbac-proxy"
Apr 17 09:25:04.094954 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.094926 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="thanos-sidecar"
Apr 17 09:25:04.094954 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.094934 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="thanos-sidecar"
Apr 17 09:25:04.094954 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.094951 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="prometheus"
Apr 17 09:25:04.095270 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.094959 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="prometheus"
Apr 17 09:25:04.095270 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.094973 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="config-reloader"
Apr 17 09:25:04.095270 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.094980 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="config-reloader"
Apr 17 09:25:04.095270 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.094993 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="kube-rbac-proxy-web"
Apr 17 09:25:04.095270 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.095001 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="kube-rbac-proxy-web"
Apr 17 09:25:04.095270 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.095023 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="kube-rbac-proxy-thanos"
Apr 17 09:25:04.095270 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.095031 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="kube-rbac-proxy-thanos"
Apr 17 09:25:04.095270 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.095102 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="prometheus"
Apr 17 09:25:04.095270 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.095113 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="thanos-sidecar"
Apr 17 09:25:04.095270 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.095124 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="kube-rbac-proxy-web"
Apr 17 09:25:04.095270 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.095144 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="config-reloader"
Apr 17 09:25:04.095270 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.095156 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="kube-rbac-proxy"
Apr 17 09:25:04.095270 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.095166 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" containerName="kube-rbac-proxy-thanos"
Apr 17 09:25:04.100430 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.100412 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.103273 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.103252 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 17 09:25:04.103402 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.103275 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 17 09:25:04.103402 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.103346 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 17 09:25:04.103518 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.103450 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-rzxkx\""
Apr 17 09:25:04.103518 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.103450 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 17 09:25:04.103612 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.103520 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 17 09:25:04.103612 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.103542 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 17 09:25:04.103732 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.103716 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 17 09:25:04.103800 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.103784 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 17 09:25:04.103800 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.103819 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 17 09:25:04.103936 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.103892 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-e2mdjdsu4919b\""
Apr 17 09:25:04.104130 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.104074 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 17 09:25:04.104130 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.104082 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 17 09:25:04.106681 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.106661 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 17 09:25:04.108505 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.108486 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 17 09:25:04.110726 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.110706 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 09:25:04.212525 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.212497 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80b874e6-5e93-4cf9-b760-06ed442c2763-config-out\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.212655 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.212532 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80b874e6-5e93-4cf9-b760-06ed442c2763-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.212655 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.212552 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.212655 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.212625 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80b874e6-5e93-4cf9-b760-06ed442c2763-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.212771 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.212663 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.212771 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.212685 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80b874e6-5e93-4cf9-b760-06ed442c2763-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.212771 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.212709 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-web-config\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.212771 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.212726 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-config\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.212771 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.212739 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/80b874e6-5e93-4cf9-b760-06ed442c2763-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.212771 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.212763 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.213049 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.212777 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/80b874e6-5e93-4cf9-b760-06ed442c2763-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.213049 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.212859 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/80b874e6-5e93-4cf9-b760-06ed442c2763-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.213049 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.212897 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.213049 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.212958 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.213049 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.212989 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.213049 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.213023 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.213285 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.213058 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dbzg\" (UniqueName: \"kubernetes.io/projected/80b874e6-5e93-4cf9-b760-06ed442c2763-kube-api-access-6dbzg\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.213285 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.213083 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80b874e6-5e93-4cf9-b760-06ed442c2763-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.314431 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.314395 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-web-config\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.314431 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.314431 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-config\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.314657 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.314448 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/80b874e6-5e93-4cf9-b760-06ed442c2763-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.314657 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.314476 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.314657 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.314599 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/80b874e6-5e93-4cf9-b760-06ed442c2763-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.314657 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.314650 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/80b874e6-5e93-4cf9-b760-06ed442c2763-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.314857 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.314808 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.314908 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.314871 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.314908 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.314887 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/80b874e6-5e93-4cf9-b760-06ed442c2763-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.314908 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.314903 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.315376 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.315354 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/80b874e6-5e93-4cf9-b760-06ed442c2763-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.315564 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.315544 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.315719 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.315704 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dbzg\" (UniqueName: \"kubernetes.io/projected/80b874e6-5e93-4cf9-b760-06ed442c2763-kube-api-access-6dbzg\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.315833 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.315816 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80b874e6-5e93-4cf9-b760-06ed442c2763-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.315944 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.315929 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80b874e6-5e93-4cf9-b760-06ed442c2763-config-out\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.316042 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.316028 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80b874e6-5e93-4cf9-b760-06ed442c2763-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.316151 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.316137 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.316278 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.316263 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80b874e6-5e93-4cf9-b760-06ed442c2763-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.316395 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.316382 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.316514 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.316499 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80b874e6-5e93-4cf9-b760-06ed442c2763-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.316638 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.316612 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80b874e6-5e93-4cf9-b760-06ed442c2763-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.317848 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.317795 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.317949 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.317912 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.319143 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.318423 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-config\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.319143 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.318831 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80b874e6-5e93-4cf9-b760-06ed442c2763-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.319143 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.319030 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.319380 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.319350 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.319483 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.319446 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80b874e6-5e93-4cf9-b760-06ed442c2763-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.319483 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.319467 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-web-config\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.319483 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.319473 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 09:25:04.319628 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.319610
2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80b874e6-5e93-4cf9-b760-06ed442c2763-config-out\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:25:04.319745 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.319703 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/80b874e6-5e93-4cf9-b760-06ed442c2763-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:25:04.319745 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.319726 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80b874e6-5e93-4cf9-b760-06ed442c2763-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:25:04.319804 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.319772 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:25:04.320297 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.320278 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/80b874e6-5e93-4cf9-b760-06ed442c2763-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:25:04.323292 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.323277 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dbzg\" (UniqueName: \"kubernetes.io/projected/80b874e6-5e93-4cf9-b760-06ed442c2763-kube-api-access-6dbzg\") pod \"prometheus-k8s-0\" (UID: \"80b874e6-5e93-4cf9-b760-06ed442c2763\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:25:04.410207 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.410168 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:25:04.538476 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.538455 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 09:25:04.540537 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:25:04.540509 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80b874e6_5e93_4cf9_b760_06ed442c2763.slice/crio-c428c5d9ab474a6cb94f63ca282ff6fa88157943729df72d07de9e2e2346e555 WatchSource:0}: Error finding container c428c5d9ab474a6cb94f63ca282ff6fa88157943729df72d07de9e2e2346e555: Status 404 returned error can't find the container with id c428c5d9ab474a6cb94f63ca282ff6fa88157943729df72d07de9e2e2346e555 Apr 17 09:25:04.753728 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.753700 2574 generic.go:358] "Generic (PLEG): container finished" podID="80b874e6-5e93-4cf9-b760-06ed442c2763" containerID="e17017f659f32f20e04c2885fae33a4c136f1a4c113db6502f43f85780f1f35a" exitCode=0 Apr 17 09:25:04.753839 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.753781 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"80b874e6-5e93-4cf9-b760-06ed442c2763","Type":"ContainerDied","Data":"e17017f659f32f20e04c2885fae33a4c136f1a4c113db6502f43f85780f1f35a"} Apr 17 09:25:04.753839 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:04.753811 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"80b874e6-5e93-4cf9-b760-06ed442c2763","Type":"ContainerStarted","Data":"c428c5d9ab474a6cb94f63ca282ff6fa88157943729df72d07de9e2e2346e555"} Apr 17 09:25:05.759519 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:05.759440 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"80b874e6-5e93-4cf9-b760-06ed442c2763","Type":"ContainerStarted","Data":"4a0ef6049e7f016c19023cf1bdfa9a4e924bb6857a924b92b6d69aa04847b73d"} Apr 17 09:25:05.759519 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:05.759471 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"80b874e6-5e93-4cf9-b760-06ed442c2763","Type":"ContainerStarted","Data":"0e2aece74978d1238f0f96d917711a9a60ea13ddbca829f255aee202db6e8209"} Apr 17 09:25:05.759519 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:05.759481 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"80b874e6-5e93-4cf9-b760-06ed442c2763","Type":"ContainerStarted","Data":"4c521163bbfc744683973f957bffa3793f74ab963905b3312f9eff83cc41e68c"} Apr 17 09:25:05.759519 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:05.759489 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"80b874e6-5e93-4cf9-b760-06ed442c2763","Type":"ContainerStarted","Data":"8c13cbedafa3c722b392d0c2171f8f9110737be4ef284c2442f65405ab89717e"} Apr 17 09:25:05.759519 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:05.759498 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"80b874e6-5e93-4cf9-b760-06ed442c2763","Type":"ContainerStarted","Data":"517043aeeeb5492c05cecf5a84ac91eeccc82d72ac0733f68e677afaf7f70201"} Apr 17 09:25:05.759519 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:05.759506 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"80b874e6-5e93-4cf9-b760-06ed442c2763","Type":"ContainerStarted","Data":"cf5946a8e4e6b584fbd1204267dd9ad2410e59921da50e1bd9503c940507fbab"} Apr 17 09:25:05.787028 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:05.786981 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.7869670279999998 podStartE2EDuration="1.786967028s" podCreationTimestamp="2026-04-17 09:25:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:25:05.786573224 +0000 UTC m=+250.382625155" watchObservedRunningTime="2026-04-17 09:25:05.786967028 +0000 UTC m=+250.383018923" Apr 17 09:25:05.968768 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:05.968739 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6cdc27f-c3ed-4ada-96ec-2ddae988178b" path="/var/lib/kubelet/pods/f6cdc27f-c3ed-4ada-96ec-2ddae988178b/volumes" Apr 17 09:25:07.744943 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:07.744904 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs\") pod \"network-metrics-daemon-22cz6\" (UID: \"fba6f7ca-a68b-4315-91fd-d249cb9d13d1\") " pod="openshift-multus/network-metrics-daemon-22cz6" Apr 17 09:25:07.747301 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:07.747278 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba6f7ca-a68b-4315-91fd-d249cb9d13d1-metrics-certs\") pod \"network-metrics-daemon-22cz6\" (UID: \"fba6f7ca-a68b-4315-91fd-d249cb9d13d1\") " pod="openshift-multus/network-metrics-daemon-22cz6" Apr 17 09:25:07.870620 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:07.870589 2574 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-8wjkc\"" Apr 17 09:25:07.878642 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:07.878619 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-22cz6" Apr 17 09:25:07.993639 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:07.993615 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-22cz6"] Apr 17 09:25:07.995668 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:25:07.995612 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfba6f7ca_a68b_4315_91fd_d249cb9d13d1.slice/crio-187dc0cfb976004bf4e6b4a73b9cf590ce941616f7f4786cb7ca9642cb24a186 WatchSource:0}: Error finding container 187dc0cfb976004bf4e6b4a73b9cf590ce941616f7f4786cb7ca9642cb24a186: Status 404 returned error can't find the container with id 187dc0cfb976004bf4e6b4a73b9cf590ce941616f7f4786cb7ca9642cb24a186 Apr 17 09:25:08.770029 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:08.769991 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-22cz6" event={"ID":"fba6f7ca-a68b-4315-91fd-d249cb9d13d1","Type":"ContainerStarted","Data":"187dc0cfb976004bf4e6b4a73b9cf590ce941616f7f4786cb7ca9642cb24a186"} Apr 17 09:25:09.411015 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:09.410990 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:25:09.774622 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:09.774560 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-22cz6" event={"ID":"fba6f7ca-a68b-4315-91fd-d249cb9d13d1","Type":"ContainerStarted","Data":"5ab644bf1110cea7a8dad2d362967cdaf49ce53f213eb4b57ed4c3f6dbed0626"} Apr 17 09:25:09.774622 ip-10-0-138-237 kubenswrapper[2574]: 
I0417 09:25:09.774591 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-22cz6" event={"ID":"fba6f7ca-a68b-4315-91fd-d249cb9d13d1","Type":"ContainerStarted","Data":"161c9a7ed9a2f5e114a13b75b709aec2a5d2f235fe87fc99a1e91ea41e540b7d"} Apr 17 09:25:09.806232 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:09.806166 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-22cz6" podStartSLOduration=252.833206078 podStartE2EDuration="4m13.806151861s" podCreationTimestamp="2026-04-17 09:20:56 +0000 UTC" firstStartedPulling="2026-04-17 09:25:07.997360186 +0000 UTC m=+252.593412057" lastFinishedPulling="2026-04-17 09:25:08.970305969 +0000 UTC m=+253.566357840" observedRunningTime="2026-04-17 09:25:09.803641947 +0000 UTC m=+254.399693839" watchObservedRunningTime="2026-04-17 09:25:09.806151861 +0000 UTC m=+254.402203752" Apr 17 09:25:55.854978 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:55.854948 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rb9kg_19bff221-f968-4a84-9891-8578f50203f2/ovn-acl-logging/0.log" Apr 17 09:25:55.855551 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:55.855524 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rb9kg_19bff221-f968-4a84-9891-8578f50203f2/ovn-acl-logging/0.log" Apr 17 09:25:55.862582 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:25:55.862565 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 09:26:04.411248 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:26:04.411209 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:26:04.426142 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:26:04.426119 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:26:04.943889 
ip-10-0-138-237 kubenswrapper[2574]: I0417 09:26:04.943861 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 09:28:18.251963 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:28:18.251876 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-b8z96"] Apr 17 09:28:18.255364 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:28:18.255341 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-b8z96" Apr 17 09:28:18.257893 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:28:18.257871 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\"" Apr 17 09:28:18.258014 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:28:18.257896 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\"" Apr 17 09:28:18.259024 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:28:18.259004 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-operator-dockercfg-ck7pl\"" Apr 17 09:28:18.261481 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:28:18.261463 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-b8z96"] Apr 17 09:28:18.272973 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:28:18.272936 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a0386386-2475-49d6-aa85-18eae48e0dab-tmp\") pod \"jobset-operator-747c5859c7-b8z96\" (UID: \"a0386386-2475-49d6-aa85-18eae48e0dab\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-b8z96" Apr 17 09:28:18.273082 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:28:18.273003 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq95p\" (UniqueName: \"kubernetes.io/projected/a0386386-2475-49d6-aa85-18eae48e0dab-kube-api-access-jq95p\") pod \"jobset-operator-747c5859c7-b8z96\" (UID: \"a0386386-2475-49d6-aa85-18eae48e0dab\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-b8z96" Apr 17 09:28:18.373755 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:28:18.373726 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a0386386-2475-49d6-aa85-18eae48e0dab-tmp\") pod \"jobset-operator-747c5859c7-b8z96\" (UID: \"a0386386-2475-49d6-aa85-18eae48e0dab\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-b8z96" Apr 17 09:28:18.373861 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:28:18.373772 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jq95p\" (UniqueName: \"kubernetes.io/projected/a0386386-2475-49d6-aa85-18eae48e0dab-kube-api-access-jq95p\") pod \"jobset-operator-747c5859c7-b8z96\" (UID: \"a0386386-2475-49d6-aa85-18eae48e0dab\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-b8z96" Apr 17 09:28:18.374054 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:28:18.374036 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a0386386-2475-49d6-aa85-18eae48e0dab-tmp\") pod \"jobset-operator-747c5859c7-b8z96\" (UID: \"a0386386-2475-49d6-aa85-18eae48e0dab\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-b8z96" Apr 17 09:28:18.381845 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:28:18.381826 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq95p\" (UniqueName: \"kubernetes.io/projected/a0386386-2475-49d6-aa85-18eae48e0dab-kube-api-access-jq95p\") pod \"jobset-operator-747c5859c7-b8z96\" (UID: 
\"a0386386-2475-49d6-aa85-18eae48e0dab\") " pod="openshift-jobset-operator/jobset-operator-747c5859c7-b8z96" Apr 17 09:28:18.579781 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:28:18.579725 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-operator-747c5859c7-b8z96" Apr 17 09:28:18.689647 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:28:18.689618 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-operator-747c5859c7-b8z96"] Apr 17 09:28:18.692647 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:28:18.692615 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0386386_2475_49d6_aa85_18eae48e0dab.slice/crio-5681135b42089efbb997f59b981c3c4fccba729dd117f6ad70c31680bfc3b713 WatchSource:0}: Error finding container 5681135b42089efbb997f59b981c3c4fccba729dd117f6ad70c31680bfc3b713: Status 404 returned error can't find the container with id 5681135b42089efbb997f59b981c3c4fccba729dd117f6ad70c31680bfc3b713 Apr 17 09:28:18.693923 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:28:18.693909 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 09:28:19.302998 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:28:19.302959 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-b8z96" event={"ID":"a0386386-2475-49d6-aa85-18eae48e0dab","Type":"ContainerStarted","Data":"5681135b42089efbb997f59b981c3c4fccba729dd117f6ad70c31680bfc3b713"} Apr 17 09:28:21.310553 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:28:21.310523 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-operator-747c5859c7-b8z96" event={"ID":"a0386386-2475-49d6-aa85-18eae48e0dab","Type":"ContainerStarted","Data":"92e331079e5a8ebf8a60e92b51f26f6733171b8a22adf967af1d520cb47dd9c4"} Apr 17 
09:28:21.326063 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:28:21.326012 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-operator-747c5859c7-b8z96" podStartSLOduration=0.872190115 podStartE2EDuration="3.325996346s" podCreationTimestamp="2026-04-17 09:28:18 +0000 UTC" firstStartedPulling="2026-04-17 09:28:18.69402047 +0000 UTC m=+443.290072344" lastFinishedPulling="2026-04-17 09:28:21.1478267 +0000 UTC m=+445.743878575" observedRunningTime="2026-04-17 09:28:21.324849447 +0000 UTC m=+445.920901341" watchObservedRunningTime="2026-04-17 09:28:21.325996346 +0000 UTC m=+445.922048239" Apr 17 09:29:58.710685 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:29:58.710653 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xkrgt/must-gather-2dmdl"] Apr 17 09:29:58.712799 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:29:58.712783 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xkrgt/must-gather-2dmdl" Apr 17 09:29:58.715274 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:29:58.715254 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xkrgt\"/\"default-dockercfg-v97xl\"" Apr 17 09:29:58.715395 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:29:58.715260 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xkrgt\"/\"kube-root-ca.crt\"" Apr 17 09:29:58.715395 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:29:58.715302 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xkrgt\"/\"openshift-service-ca.crt\"" Apr 17 09:29:58.722485 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:29:58.722466 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xkrgt/must-gather-2dmdl"] Apr 17 09:29:58.797367 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:29:58.797345 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/69bed0d7-8888-48fb-89a4-57cbaff2a542-must-gather-output\") pod \"must-gather-2dmdl\" (UID: \"69bed0d7-8888-48fb-89a4-57cbaff2a542\") " pod="openshift-must-gather-xkrgt/must-gather-2dmdl" Apr 17 09:29:58.797469 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:29:58.797394 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw8pp\" (UniqueName: \"kubernetes.io/projected/69bed0d7-8888-48fb-89a4-57cbaff2a542-kube-api-access-tw8pp\") pod \"must-gather-2dmdl\" (UID: \"69bed0d7-8888-48fb-89a4-57cbaff2a542\") " pod="openshift-must-gather-xkrgt/must-gather-2dmdl" Apr 17 09:29:58.898200 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:29:58.898163 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tw8pp\" (UniqueName: \"kubernetes.io/projected/69bed0d7-8888-48fb-89a4-57cbaff2a542-kube-api-access-tw8pp\") pod \"must-gather-2dmdl\" (UID: \"69bed0d7-8888-48fb-89a4-57cbaff2a542\") " pod="openshift-must-gather-xkrgt/must-gather-2dmdl" Apr 17 09:29:58.898295 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:29:58.898223 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/69bed0d7-8888-48fb-89a4-57cbaff2a542-must-gather-output\") pod \"must-gather-2dmdl\" (UID: \"69bed0d7-8888-48fb-89a4-57cbaff2a542\") " pod="openshift-must-gather-xkrgt/must-gather-2dmdl" Apr 17 09:29:58.898480 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:29:58.898463 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/69bed0d7-8888-48fb-89a4-57cbaff2a542-must-gather-output\") pod \"must-gather-2dmdl\" (UID: \"69bed0d7-8888-48fb-89a4-57cbaff2a542\") " 
pod="openshift-must-gather-xkrgt/must-gather-2dmdl" Apr 17 09:29:58.905642 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:29:58.905612 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw8pp\" (UniqueName: \"kubernetes.io/projected/69bed0d7-8888-48fb-89a4-57cbaff2a542-kube-api-access-tw8pp\") pod \"must-gather-2dmdl\" (UID: \"69bed0d7-8888-48fb-89a4-57cbaff2a542\") " pod="openshift-must-gather-xkrgt/must-gather-2dmdl" Apr 17 09:29:59.022117 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:29:59.022057 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xkrgt/must-gather-2dmdl" Apr 17 09:29:59.133095 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:29:59.133068 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xkrgt/must-gather-2dmdl"] Apr 17 09:29:59.136019 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:29:59.135991 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69bed0d7_8888_48fb_89a4_57cbaff2a542.slice/crio-996dbec171eeb0a340356cb1e18b5281447022094feab81ef23d5d65db6c471a WatchSource:0}: Error finding container 996dbec171eeb0a340356cb1e18b5281447022094feab81ef23d5d65db6c471a: Status 404 returned error can't find the container with id 996dbec171eeb0a340356cb1e18b5281447022094feab81ef23d5d65db6c471a Apr 17 09:29:59.581044 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:29:59.581013 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xkrgt/must-gather-2dmdl" event={"ID":"69bed0d7-8888-48fb-89a4-57cbaff2a542","Type":"ContainerStarted","Data":"996dbec171eeb0a340356cb1e18b5281447022094feab81ef23d5d65db6c471a"} Apr 17 09:30:00.587797 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:00.587750 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xkrgt/must-gather-2dmdl" 
event={"ID":"69bed0d7-8888-48fb-89a4-57cbaff2a542","Type":"ContainerStarted","Data":"c101ef509e1d65e26aaaa079d0c208992ef13926e0daf820fb892f04a64dfc86"} Apr 17 09:30:00.587797 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:00.587803 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xkrgt/must-gather-2dmdl" event={"ID":"69bed0d7-8888-48fb-89a4-57cbaff2a542","Type":"ContainerStarted","Data":"2311f5e0672c9eaf0767456f605c9aa39ea2fa6a0bc05ee38c3c6ed0ca4b7ae5"} Apr 17 09:30:00.603114 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:00.603060 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xkrgt/must-gather-2dmdl" podStartSLOduration=1.7705064959999999 podStartE2EDuration="2.603045312s" podCreationTimestamp="2026-04-17 09:29:58 +0000 UTC" firstStartedPulling="2026-04-17 09:29:59.13779766 +0000 UTC m=+543.733849535" lastFinishedPulling="2026-04-17 09:29:59.970336478 +0000 UTC m=+544.566388351" observedRunningTime="2026-04-17 09:30:00.601708317 +0000 UTC m=+545.197760209" watchObservedRunningTime="2026-04-17 09:30:00.603045312 +0000 UTC m=+545.199097244" Apr 17 09:30:01.298734 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:01.298705 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-tp8sk_c3c029a5-8abb-404f-bd4f-0b2cbf2dfe11/global-pull-secret-syncer/0.log" Apr 17 09:30:01.362880 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:01.362850 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-fgpzd_ce7cf71b-3181-4cb7-84c1-caec23780d1c/konnectivity-agent/0.log" Apr 17 09:30:01.447520 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:01.447489 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-237.ec2.internal_9ca6567298fdecb7125a67e335ee762b/haproxy/0.log" Apr 17 09:30:04.825878 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:04.825796 2574 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-2xhdl_d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39/kube-state-metrics/0.log"
Apr 17 09:30:04.847500 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:04.847464 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-2xhdl_d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39/kube-rbac-proxy-main/0.log"
Apr 17 09:30:04.866969 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:04.866939 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-2xhdl_d60dd8b0-ebdb-4af4-9f01-1bfe1c043b39/kube-rbac-proxy-self/0.log"
Apr 17 09:30:04.915979 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:04.915952 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-rcfh9_ce41555e-e114-4c1c-bce1-00dec4a79c09/monitoring-plugin/0.log"
Apr 17 09:30:05.005471 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:05.005447 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h622q_6d362b3e-0a77-4adf-ae6b-f51342f9fb8c/node-exporter/0.log"
Apr 17 09:30:05.027829 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:05.027796 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h622q_6d362b3e-0a77-4adf-ae6b-f51342f9fb8c/kube-rbac-proxy/0.log"
Apr 17 09:30:05.046713 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:05.046677 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-h622q_6d362b3e-0a77-4adf-ae6b-f51342f9fb8c/init-textfile/0.log"
Apr 17 09:30:05.215165 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:05.215138 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_80b874e6-5e93-4cf9-b760-06ed442c2763/prometheus/0.log"
Apr 17 09:30:05.235406 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:05.235370 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_80b874e6-5e93-4cf9-b760-06ed442c2763/config-reloader/0.log"
Apr 17 09:30:05.256748 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:05.256720 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_80b874e6-5e93-4cf9-b760-06ed442c2763/thanos-sidecar/0.log"
Apr 17 09:30:05.277319 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:05.277296 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_80b874e6-5e93-4cf9-b760-06ed442c2763/kube-rbac-proxy-web/0.log"
Apr 17 09:30:05.297016 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:05.296991 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_80b874e6-5e93-4cf9-b760-06ed442c2763/kube-rbac-proxy/0.log"
Apr 17 09:30:05.327085 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:05.327058 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_80b874e6-5e93-4cf9-b760-06ed442c2763/kube-rbac-proxy-thanos/0.log"
Apr 17 09:30:05.357780 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:05.357749 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_80b874e6-5e93-4cf9-b760-06ed442c2763/init-config-reloader/0.log"
Apr 17 09:30:05.389409 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:05.389379 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-nnbft_976b87df-daa5-4000-84e8-1d40b45adac8/prometheus-operator/0.log"
Apr 17 09:30:05.412347 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:05.412318 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-nnbft_976b87df-daa5-4000-84e8-1d40b45adac8/kube-rbac-proxy/0.log"
Apr 17 09:30:06.618160 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:06.618131 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-4c5bp_c68efb67-a1eb-4e0b-9af1-47c6e61f4d10/networking-console-plugin/0.log"
Apr 17 09:30:07.612643 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:07.612607 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xkrgt/perf-node-gather-daemonset-78vdz"]
Apr 17 09:30:07.615968 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:07.615946 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xkrgt/perf-node-gather-daemonset-78vdz"
Apr 17 09:30:07.623937 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:07.623906 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xkrgt/perf-node-gather-daemonset-78vdz"]
Apr 17 09:30:07.668048 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:07.668007 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-zjszk_32be7a79-e2e8-447f-9b7b-731ca24adef9/volume-data-source-validator/0.log"
Apr 17 09:30:07.680718 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:07.680692 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ba152c57-2f80-4b71-b8f8-a45554ae5e23-proc\") pod \"perf-node-gather-daemonset-78vdz\" (UID: \"ba152c57-2f80-4b71-b8f8-a45554ae5e23\") " pod="openshift-must-gather-xkrgt/perf-node-gather-daemonset-78vdz"
Apr 17 09:30:07.680846 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:07.680723 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ba152c57-2f80-4b71-b8f8-a45554ae5e23-lib-modules\") pod \"perf-node-gather-daemonset-78vdz\" (UID: \"ba152c57-2f80-4b71-b8f8-a45554ae5e23\") " pod="openshift-must-gather-xkrgt/perf-node-gather-daemonset-78vdz"
Apr 17 09:30:07.680846 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:07.680746 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgwjj\" (UniqueName: \"kubernetes.io/projected/ba152c57-2f80-4b71-b8f8-a45554ae5e23-kube-api-access-hgwjj\") pod \"perf-node-gather-daemonset-78vdz\" (UID: \"ba152c57-2f80-4b71-b8f8-a45554ae5e23\") " pod="openshift-must-gather-xkrgt/perf-node-gather-daemonset-78vdz"
Apr 17 09:30:07.680846 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:07.680774 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ba152c57-2f80-4b71-b8f8-a45554ae5e23-podres\") pod \"perf-node-gather-daemonset-78vdz\" (UID: \"ba152c57-2f80-4b71-b8f8-a45554ae5e23\") " pod="openshift-must-gather-xkrgt/perf-node-gather-daemonset-78vdz"
Apr 17 09:30:07.680955 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:07.680865 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ba152c57-2f80-4b71-b8f8-a45554ae5e23-sys\") pod \"perf-node-gather-daemonset-78vdz\" (UID: \"ba152c57-2f80-4b71-b8f8-a45554ae5e23\") " pod="openshift-must-gather-xkrgt/perf-node-gather-daemonset-78vdz"
Apr 17 09:30:07.781256 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:07.781226 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ba152c57-2f80-4b71-b8f8-a45554ae5e23-sys\") pod \"perf-node-gather-daemonset-78vdz\" (UID: \"ba152c57-2f80-4b71-b8f8-a45554ae5e23\") " pod="openshift-must-gather-xkrgt/perf-node-gather-daemonset-78vdz"
Apr 17 09:30:07.781403 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:07.781271 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ba152c57-2f80-4b71-b8f8-a45554ae5e23-proc\") pod \"perf-node-gather-daemonset-78vdz\" (UID: \"ba152c57-2f80-4b71-b8f8-a45554ae5e23\") " pod="openshift-must-gather-xkrgt/perf-node-gather-daemonset-78vdz"
Apr 17 09:30:07.781403 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:07.781296 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ba152c57-2f80-4b71-b8f8-a45554ae5e23-lib-modules\") pod \"perf-node-gather-daemonset-78vdz\" (UID: \"ba152c57-2f80-4b71-b8f8-a45554ae5e23\") " pod="openshift-must-gather-xkrgt/perf-node-gather-daemonset-78vdz"
Apr 17 09:30:07.781403 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:07.781333 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hgwjj\" (UniqueName: \"kubernetes.io/projected/ba152c57-2f80-4b71-b8f8-a45554ae5e23-kube-api-access-hgwjj\") pod \"perf-node-gather-daemonset-78vdz\" (UID: \"ba152c57-2f80-4b71-b8f8-a45554ae5e23\") " pod="openshift-must-gather-xkrgt/perf-node-gather-daemonset-78vdz"
Apr 17 09:30:07.781403 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:07.781350 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ba152c57-2f80-4b71-b8f8-a45554ae5e23-sys\") pod \"perf-node-gather-daemonset-78vdz\" (UID: \"ba152c57-2f80-4b71-b8f8-a45554ae5e23\") " pod="openshift-must-gather-xkrgt/perf-node-gather-daemonset-78vdz"
Apr 17 09:30:07.781403 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:07.781360 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/ba152c57-2f80-4b71-b8f8-a45554ae5e23-proc\") pod \"perf-node-gather-daemonset-78vdz\" (UID: \"ba152c57-2f80-4b71-b8f8-a45554ae5e23\") " pod="openshift-must-gather-xkrgt/perf-node-gather-daemonset-78vdz"
Apr 17 09:30:07.781403 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:07.781367 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ba152c57-2f80-4b71-b8f8-a45554ae5e23-podres\") pod \"perf-node-gather-daemonset-78vdz\" (UID: \"ba152c57-2f80-4b71-b8f8-a45554ae5e23\") " pod="openshift-must-gather-xkrgt/perf-node-gather-daemonset-78vdz"
Apr 17 09:30:07.781588 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:07.781441 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/ba152c57-2f80-4b71-b8f8-a45554ae5e23-podres\") pod \"perf-node-gather-daemonset-78vdz\" (UID: \"ba152c57-2f80-4b71-b8f8-a45554ae5e23\") " pod="openshift-must-gather-xkrgt/perf-node-gather-daemonset-78vdz"
Apr 17 09:30:07.781588 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:07.781475 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ba152c57-2f80-4b71-b8f8-a45554ae5e23-lib-modules\") pod \"perf-node-gather-daemonset-78vdz\" (UID: \"ba152c57-2f80-4b71-b8f8-a45554ae5e23\") " pod="openshift-must-gather-xkrgt/perf-node-gather-daemonset-78vdz"
Apr 17 09:30:07.788781 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:07.788751 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgwjj\" (UniqueName: \"kubernetes.io/projected/ba152c57-2f80-4b71-b8f8-a45554ae5e23-kube-api-access-hgwjj\") pod \"perf-node-gather-daemonset-78vdz\" (UID: \"ba152c57-2f80-4b71-b8f8-a45554ae5e23\") " pod="openshift-must-gather-xkrgt/perf-node-gather-daemonset-78vdz"
Apr 17 09:30:07.931529 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:07.931486 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xkrgt/perf-node-gather-daemonset-78vdz"
Apr 17 09:30:08.074402 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:08.074231 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xkrgt/perf-node-gather-daemonset-78vdz"]
Apr 17 09:30:08.076525 ip-10-0-138-237 kubenswrapper[2574]: W0417 09:30:08.076498 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podba152c57_2f80_4b71_b8f8_a45554ae5e23.slice/crio-3b88815f323234fc8c0523b1359805f05c81d2d46cf3b36c24d1642a26b589c1 WatchSource:0}: Error finding container 3b88815f323234fc8c0523b1359805f05c81d2d46cf3b36c24d1642a26b589c1: Status 404 returned error can't find the container with id 3b88815f323234fc8c0523b1359805f05c81d2d46cf3b36c24d1642a26b589c1
Apr 17 09:30:08.343298 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:08.343272 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zkvsl_7281a4b6-76d6-494b-98e2-8fd1f322c7de/dns/0.log"
Apr 17 09:30:08.362633 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:08.362609 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-zkvsl_7281a4b6-76d6-494b-98e2-8fd1f322c7de/kube-rbac-proxy/0.log"
Apr 17 09:30:08.426799 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:08.426778 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-g5ls5_0755186c-8ac0-47fe-abc7-dd4eae84ad55/dns-node-resolver/0.log"
Apr 17 09:30:08.617702 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:08.617616 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xkrgt/perf-node-gather-daemonset-78vdz" event={"ID":"ba152c57-2f80-4b71-b8f8-a45554ae5e23","Type":"ContainerStarted","Data":"dc66b9b2332a2956fd28e748890e94f11b63ef3b92e73e401485e01f2df3bcc3"}
Apr 17 09:30:08.617702 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:08.617650 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xkrgt/perf-node-gather-daemonset-78vdz" event={"ID":"ba152c57-2f80-4b71-b8f8-a45554ae5e23","Type":"ContainerStarted","Data":"3b88815f323234fc8c0523b1359805f05c81d2d46cf3b36c24d1642a26b589c1"}
Apr 17 09:30:08.617702 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:08.617678 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xkrgt/perf-node-gather-daemonset-78vdz"
Apr 17 09:30:08.645254 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:08.645209 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xkrgt/perf-node-gather-daemonset-78vdz" podStartSLOduration=1.645166159 podStartE2EDuration="1.645166159s" podCreationTimestamp="2026-04-17 09:30:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 09:30:08.643812785 +0000 UTC m=+553.239864678" watchObservedRunningTime="2026-04-17 09:30:08.645166159 +0000 UTC m=+553.241218055"
Apr 17 09:30:08.870894 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:08.870793 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-l7tg5_7a3125fc-e8c4-420c-8d7b-684643355422/node-ca/0.log"
Apr 17 09:30:09.583116 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:09.583088 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-59cd84dcb8-fhdx4_74b4cdd2-7175-4d47-9486-0863bdb1bdb2/router/0.log"
Apr 17 09:30:09.928298 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:09.928273 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-v2rkj_2721fe42-279c-4536-9769-411e4918ceac/serve-healthcheck-canary/0.log"
Apr 17 09:30:10.223462 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:10.223389 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-sfz45_51eeb544-5d28-4c8c-8577-6f932bfee2ce/insights-operator/0.log"
Apr 17 09:30:10.224944 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:10.224921 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-sfz45_51eeb544-5d28-4c8c-8577-6f932bfee2ce/insights-operator/1.log"
Apr 17 09:30:10.241824 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:10.241807 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d5b4r_a38ec03e-4cba-4d52-80e2-d579913e7f31/kube-rbac-proxy/0.log"
Apr 17 09:30:10.259617 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:10.259593 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d5b4r_a38ec03e-4cba-4d52-80e2-d579913e7f31/exporter/0.log"
Apr 17 09:30:10.278127 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:10.278099 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-d5b4r_a38ec03e-4cba-4d52-80e2-d579913e7f31/extractor/0.log"
Apr 17 09:30:11.856338 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:11.856312 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-operator-747c5859c7-b8z96_a0386386-2475-49d6-aa85-18eae48e0dab/jobset-operator/0.log"
Apr 17 09:30:14.558417 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:14.558392 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-f9crx_5b0284d6-e5f1-458f-a124-d9b4696c61af/migrator/0.log"
Apr 17 09:30:14.578490 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:14.578462 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-f9crx_5b0284d6-e5f1-458f-a124-d9b4696c61af/graceful-termination/0.log"
Apr 17 09:30:14.631657 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:14.631629 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xkrgt/perf-node-gather-daemonset-78vdz"
Apr 17 09:30:15.709872 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:15.709826 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g62mf_34bc662a-193e-440f-9c2e-1dee8a208524/kube-multus-additional-cni-plugins/0.log"
Apr 17 09:30:15.728648 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:15.728621 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g62mf_34bc662a-193e-440f-9c2e-1dee8a208524/egress-router-binary-copy/0.log"
Apr 17 09:30:15.746618 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:15.746555 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g62mf_34bc662a-193e-440f-9c2e-1dee8a208524/cni-plugins/0.log"
Apr 17 09:30:15.767232 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:15.767202 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g62mf_34bc662a-193e-440f-9c2e-1dee8a208524/bond-cni-plugin/0.log"
Apr 17 09:30:15.785469 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:15.785403 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g62mf_34bc662a-193e-440f-9c2e-1dee8a208524/routeoverride-cni/0.log"
Apr 17 09:30:15.808049 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:15.808020 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g62mf_34bc662a-193e-440f-9c2e-1dee8a208524/whereabouts-cni-bincopy/0.log"
Apr 17 09:30:15.831307 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:15.831286 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g62mf_34bc662a-193e-440f-9c2e-1dee8a208524/whereabouts-cni/0.log"
Apr 17 09:30:16.001824 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:16.001752 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-c8z79_e2050949-863c-4e07-8b7f-adfdaf82601d/kube-multus/0.log"
Apr 17 09:30:16.050803 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:16.050770 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-22cz6_fba6f7ca-a68b-4315-91fd-d249cb9d13d1/network-metrics-daemon/0.log"
Apr 17 09:30:16.069463 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:16.069439 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-22cz6_fba6f7ca-a68b-4315-91fd-d249cb9d13d1/kube-rbac-proxy/0.log"
Apr 17 09:30:17.411266 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:17.411226 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rb9kg_19bff221-f968-4a84-9891-8578f50203f2/ovn-controller/0.log"
Apr 17 09:30:17.425397 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:17.425361 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rb9kg_19bff221-f968-4a84-9891-8578f50203f2/ovn-acl-logging/0.log"
Apr 17 09:30:17.430500 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:17.430481 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rb9kg_19bff221-f968-4a84-9891-8578f50203f2/ovn-acl-logging/1.log"
Apr 17 09:30:17.453942 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:17.453911 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rb9kg_19bff221-f968-4a84-9891-8578f50203f2/kube-rbac-proxy-node/0.log"
Apr 17 09:30:17.502385 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:17.502363 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rb9kg_19bff221-f968-4a84-9891-8578f50203f2/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 09:30:17.517188 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:17.517158 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rb9kg_19bff221-f968-4a84-9891-8578f50203f2/northd/0.log"
Apr 17 09:30:17.534983 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:17.534960 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rb9kg_19bff221-f968-4a84-9891-8578f50203f2/nbdb/0.log"
Apr 17 09:30:17.553248 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:17.553224 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rb9kg_19bff221-f968-4a84-9891-8578f50203f2/sbdb/0.log"
Apr 17 09:30:17.705198 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:17.705106 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rb9kg_19bff221-f968-4a84-9891-8578f50203f2/ovnkube-controller/0.log"
Apr 17 09:30:18.656434 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:18.656410 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-v6qts_da8df46a-d006-4e3a-8a95-df428038ed39/network-check-target-container/0.log"
Apr 17 09:30:19.397934 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:19.397906 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-2g9gn_2a16b37e-cd01-4bbd-9f94-16d59a30ae97/iptables-alerter/0.log"
Apr 17 09:30:20.020711 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:20.020680 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-nl7c6_a0dde629-4b00-4d8e-8f44-daa979a1e1a8/tuned/0.log"
Apr 17 09:30:22.360046 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:22.360017 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-gmmqr_824a0058-01c3-4126-965f-c9f5a5d55e99/service-ca-operator/1.log"
Apr 17 09:30:22.361744 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:22.361687 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-gmmqr_824a0058-01c3-4126-965f-c9f5a5d55e99/service-ca-operator/0.log"
Apr 17 09:30:22.666442 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:22.666410 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca_service-ca-865cb79987-44k6p_35dfea14-84cc-4879-ad5e-6c2cc44d00de/service-ca-controller/0.log"
Apr 17 09:30:23.042905 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:23.042834 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-c7s55_03476f8a-15ce-445d-b484-102c5da8fbed/csi-driver/0.log"
Apr 17 09:30:23.060641 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:23.060618 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-c7s55_03476f8a-15ce-445d-b484-102c5da8fbed/csi-node-driver-registrar/0.log"
Apr 17 09:30:23.077210 ip-10-0-138-237 kubenswrapper[2574]: I0417 09:30:23.077185 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-c7s55_03476f8a-15ce-445d-b484-102c5da8fbed/csi-liveness-probe/0.log"