Apr 22 17:53:27.080128 ip-10-0-132-106 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 17:53:27.525090 ip-10-0-132-106 kubenswrapper[2566]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:53:27.525090 ip-10-0-132-106 kubenswrapper[2566]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 17:53:27.525090 ip-10-0-132-106 kubenswrapper[2566]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:53:27.525090 ip-10-0-132-106 kubenswrapper[2566]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 17:53:27.525090 ip-10-0-132-106 kubenswrapper[2566]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 17:53:27.526805 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.526726 2566 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 17:53:27.530621 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530602 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:53:27.530621 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530617 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:53:27.530621 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530622 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:53:27.530621 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530626 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:53:27.530778 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530629 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:53:27.530778 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530632 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:53:27.530778 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530635 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:53:27.530778 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530638 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:53:27.530778 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530641 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:53:27.530778 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530644 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:53:27.530778 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530647 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:53:27.530778 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530649 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:53:27.530778 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530652 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:53:27.530778 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530655 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:53:27.530778 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530657 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:53:27.530778 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530660 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:53:27.530778 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530662 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:53:27.530778 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530665 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:53:27.530778 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530669 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:53:27.530778 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530674 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:53:27.530778 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530677 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:53:27.530778 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530679 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:53:27.530778 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530682 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:53:27.531250 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530685 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:53:27.531250 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530688 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:53:27.531250 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530690 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:53:27.531250 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530693 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:53:27.531250 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530695 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:53:27.531250 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530698 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:53:27.531250 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530701 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:53:27.531250 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530703 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:53:27.531250 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530706 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:53:27.531250 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530708 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:53:27.531250 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530711 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:53:27.531250 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530714 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:53:27.531250 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530716 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:53:27.531250 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530719 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:53:27.531250 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530722 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:53:27.531250 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530725 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:53:27.531250 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530728 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:53:27.531250 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530731 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:53:27.531250 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530733 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:53:27.531250 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530736 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:53:27.531734 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530739 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:53:27.531734 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530741 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:53:27.531734 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530744 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:53:27.531734 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530747 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:53:27.531734 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530750 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:53:27.531734 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530752 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:53:27.531734 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530755 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:53:27.531734 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530757 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:53:27.531734 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530760 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:53:27.531734 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530763 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:53:27.531734 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530766 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:53:27.531734 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530768 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:53:27.531734 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530770 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:53:27.531734 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530774 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:53:27.531734 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530776 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:53:27.531734 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530778 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:53:27.531734 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530781 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:53:27.531734 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530784 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:53:27.531734 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530787 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:53:27.531734 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530790 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:53:27.532222 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530793 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:53:27.532222 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530795 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:53:27.532222 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530799 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:53:27.532222 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530803 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:53:27.532222 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530805 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:53:27.532222 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530809 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:53:27.532222 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530812 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:53:27.532222 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530815 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:53:27.532222 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530818 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:53:27.532222 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530821 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:53:27.532222 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530823 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:53:27.532222 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530826 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:53:27.532222 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530828 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:53:27.532222 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530831 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:53:27.532222 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530833 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:53:27.532222 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530843 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:53:27.532222 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530846 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:53:27.532222 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530848 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:53:27.532222 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530851 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:53:27.532222 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530853 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:53:27.532732 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530856 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:53:27.532732 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530874 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:53:27.532732 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.530877 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:53:27.532732 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531266 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 17:53:27.532732 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531271 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 17:53:27.532732 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531273 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 17:53:27.532732 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531276 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 17:53:27.532732 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531279 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 17:53:27.532732 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531282 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 17:53:27.532732 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531285 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 17:53:27.532732 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531287 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 17:53:27.532732 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531290 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 17:53:27.532732 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531293 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 17:53:27.532732 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531295 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 17:53:27.532732 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531298 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 17:53:27.532732 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531301 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 17:53:27.532732 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531303 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 17:53:27.532732 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531306 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 17:53:27.532732 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531309 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 17:53:27.532732 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531312 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 17:53:27.533228 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531315 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 17:53:27.533228 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531318 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 17:53:27.533228 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531321 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 17:53:27.533228 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531324 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 17:53:27.533228 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531327 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 17:53:27.533228 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531329 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 17:53:27.533228 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531332 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 17:53:27.533228 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531340 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 17:53:27.533228 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531343 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 17:53:27.533228 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531345 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 17:53:27.533228 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531348 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 17:53:27.533228 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531350 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 17:53:27.533228 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531353 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 17:53:27.533228 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531355 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 17:53:27.533228 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531358 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 17:53:27.533228 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531360 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 17:53:27.533228 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531365 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 17:53:27.533228 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531368 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 17:53:27.533228 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531371 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 17:53:27.533688 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531374 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 17:53:27.533688 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531377 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 17:53:27.533688 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531380 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 17:53:27.533688 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531383 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 17:53:27.533688 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531386 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 17:53:27.533688 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531389 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 17:53:27.533688 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531392 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 17:53:27.533688 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531395 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 17:53:27.533688 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531398 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 17:53:27.533688 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531400 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 17:53:27.533688 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531403 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 17:53:27.533688 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531406 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 17:53:27.533688 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531408 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 17:53:27.533688 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531411 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 17:53:27.533688 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531414 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 17:53:27.533688 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531416 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 17:53:27.533688 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531419 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 17:53:27.533688 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531422 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 17:53:27.533688 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531425 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 17:53:27.533688 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531427 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 17:53:27.534198 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531430 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 17:53:27.534198 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531434 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 17:53:27.534198 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531436 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 17:53:27.534198 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531439 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 17:53:27.534198 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531442 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 17:53:27.534198 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531444 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 17:53:27.534198 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531447 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 17:53:27.534198 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531450 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 17:53:27.534198 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531452 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 17:53:27.534198 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531454 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 17:53:27.534198 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531457 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 17:53:27.534198 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531459 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 17:53:27.534198 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531462 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 17:53:27.534198 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531464 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 17:53:27.534198 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531467 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 17:53:27.534198 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531469 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 17:53:27.534198 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531472 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 17:53:27.534198 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531474 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 17:53:27.534198 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531477 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 17:53:27.534198 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531480 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 17:53:27.534694 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531483 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 17:53:27.534694 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531486 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 17:53:27.534694 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531488 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 17:53:27.534694 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531492 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 17:53:27.534694 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531494 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 17:53:27.534694 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531497 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 17:53:27.534694 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531499 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 17:53:27.534694 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531502 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 17:53:27.534694 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531505 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 17:53:27.534694 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.531507 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 17:53:27.534694 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533384 2566 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 17:53:27.534694 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533392 2566 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 17:53:27.534694 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533402 2566 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 17:53:27.534694 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533406 2566 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 17:53:27.534694 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533413 2566 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 17:53:27.534694 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533417 2566 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 17:53:27.534694 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533421 2566 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 17:53:27.534694 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533426 2566 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 17:53:27.534694 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533429 2566 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 17:53:27.534694 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533432 2566 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 17:53:27.534694 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533436 2566 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 17:53:27.535219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533439 2566 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 17:53:27.535219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533442 2566 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 17:53:27.535219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533445 2566 flags.go:64] FLAG: --cgroup-root=""
Apr 22 17:53:27.535219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533448 2566 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 17:53:27.535219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533451 2566 flags.go:64] FLAG: --client-ca-file=""
Apr 22 17:53:27.535219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533454 2566 flags.go:64] FLAG: --cloud-config=""
Apr 22 17:53:27.535219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533457 2566 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 17:53:27.535219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533459 2566 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 17:53:27.535219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533464 2566 flags.go:64] FLAG: --cluster-domain=""
Apr 22 17:53:27.535219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533467 2566 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 17:53:27.535219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533470 2566 flags.go:64] FLAG: --config-dir=""
Apr 22 17:53:27.535219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533473 2566 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 17:53:27.535219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533476 2566 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 17:53:27.535219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533480 2566 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 17:53:27.535219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533483 2566 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 17:53:27.535219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533486 2566 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 17:53:27.535219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533490 2566 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 17:53:27.535219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533492 2566 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 17:53:27.535219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533495 2566 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 17:53:27.535219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533499 2566 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 17:53:27.535219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533502 2566 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 17:53:27.535219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533505 2566 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 17:53:27.535219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533509 2566 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 17:53:27.535219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533512 2566 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 17:53:27.535219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533514 2566 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533517 2566 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533521 2566 flags.go:64] FLAG: --enable-server="true"
Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533524 2566 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533528 2566 flags.go:64] FLAG: --event-burst="100"
Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533531 2566 flags.go:64] FLAG: --event-qps="50"
Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533535 2566 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533538 2566 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533541 2566 flags.go:64] FLAG: --eviction-hard=""
Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533549 2566 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533552 2566 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533555 2566 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533558 2566 flags.go:64] FLAG: --eviction-soft=""
Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533561 2566 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533564 2566 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533567 2566 flags.go:64]
FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533570 2566 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533573 2566 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533575 2566 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533578 2566 flags.go:64] FLAG: --feature-gates="" Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533583 2566 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533586 2566 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533588 2566 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533592 2566 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533595 2566 flags.go:64] FLAG: --healthz-port="10248" Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533598 2566 flags.go:64] FLAG: --help="false" Apr 22 17:53:27.535812 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533601 2566 flags.go:64] FLAG: --hostname-override="ip-10-0-132-106.ec2.internal" Apr 22 17:53:27.536450 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533604 2566 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 17:53:27.536450 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533606 2566 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 17:53:27.536450 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533609 2566 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 17:53:27.536450 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533613 2566 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 17:53:27.536450 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533616 2566 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 17:53:27.536450 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533619 2566 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 17:53:27.536450 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533622 2566 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 17:53:27.536450 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533625 2566 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 17:53:27.536450 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533628 2566 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 17:53:27.536450 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533631 2566 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 17:53:27.536450 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533634 2566 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 17:53:27.536450 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533637 2566 flags.go:64] FLAG: --kube-reserved="" Apr 22 17:53:27.536450 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533640 2566 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 17:53:27.536450 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533643 2566 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 17:53:27.536450 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533646 2566 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 17:53:27.536450 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533649 2566 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 17:53:27.536450 ip-10-0-132-106 
kubenswrapper[2566]: I0422 17:53:27.533652 2566 flags.go:64] FLAG: --lock-file="" Apr 22 17:53:27.536450 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533655 2566 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 17:53:27.536450 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533658 2566 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 17:53:27.536450 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533661 2566 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 17:53:27.536450 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533666 2566 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 17:53:27.536450 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533669 2566 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 17:53:27.536450 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533672 2566 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 17:53:27.536450 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533675 2566 flags.go:64] FLAG: --logging-format="text" Apr 22 17:53:27.537123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533678 2566 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 17:53:27.537123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533681 2566 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 17:53:27.537123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533684 2566 flags.go:64] FLAG: --manifest-url="" Apr 22 17:53:27.537123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533686 2566 flags.go:64] FLAG: --manifest-url-header="" Apr 22 17:53:27.537123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533691 2566 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 17:53:27.537123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533694 2566 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 17:53:27.537123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533699 2566 flags.go:64] FLAG: --max-pods="110" Apr 22 
17:53:27.537123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533702 2566 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 17:53:27.537123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533705 2566 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 17:53:27.537123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533708 2566 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 17:53:27.537123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533711 2566 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 17:53:27.537123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533714 2566 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 17:53:27.537123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533717 2566 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 17:53:27.537123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533720 2566 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 17:53:27.537123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533726 2566 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 17:53:27.537123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533729 2566 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 17:53:27.537123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533733 2566 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 17:53:27.537123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533736 2566 flags.go:64] FLAG: --pod-cidr="" Apr 22 17:53:27.537123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533739 2566 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 17:53:27.537123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533745 2566 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 17:53:27.537123 ip-10-0-132-106 kubenswrapper[2566]: I0422 
17:53:27.533748 2566 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 17:53:27.537123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533751 2566 flags.go:64] FLAG: --pods-per-core="0" Apr 22 17:53:27.537123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533754 2566 flags.go:64] FLAG: --port="10250" Apr 22 17:53:27.537123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533756 2566 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533759 2566 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-08b298c8d50975d4c" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533763 2566 flags.go:64] FLAG: --qos-reserved="" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533766 2566 flags.go:64] FLAG: --read-only-port="10255" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533769 2566 flags.go:64] FLAG: --register-node="true" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533771 2566 flags.go:64] FLAG: --register-schedulable="true" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533774 2566 flags.go:64] FLAG: --register-with-taints="" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533778 2566 flags.go:64] FLAG: --registry-burst="10" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533781 2566 flags.go:64] FLAG: --registry-qps="5" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533784 2566 flags.go:64] FLAG: --reserved-cpus="" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533787 2566 flags.go:64] FLAG: --reserved-memory="" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533790 2566 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533793 2566 
flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533796 2566 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533799 2566 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533802 2566 flags.go:64] FLAG: --runonce="false" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533805 2566 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533808 2566 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533810 2566 flags.go:64] FLAG: --seccomp-default="false" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533813 2566 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533816 2566 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533819 2566 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533822 2566 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533825 2566 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533827 2566 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533830 2566 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 17:53:27.537726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533833 2566 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 
17:53:27.538365 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533836 2566 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 17:53:27.538365 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533839 2566 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 17:53:27.538365 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533842 2566 flags.go:64] FLAG: --system-cgroups="" Apr 22 17:53:27.538365 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533845 2566 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 17:53:27.538365 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533850 2566 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 17:53:27.538365 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533853 2566 flags.go:64] FLAG: --tls-cert-file="" Apr 22 17:53:27.538365 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533856 2566 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 17:53:27.538365 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533871 2566 flags.go:64] FLAG: --tls-min-version="" Apr 22 17:53:27.538365 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533873 2566 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 17:53:27.538365 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533876 2566 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 17:53:27.538365 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533879 2566 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 17:53:27.538365 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533882 2566 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 17:53:27.538365 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533885 2566 flags.go:64] FLAG: --v="2" Apr 22 17:53:27.538365 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533889 2566 flags.go:64] FLAG: --version="false" Apr 22 17:53:27.538365 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533893 2566 flags.go:64] FLAG: --vmodule="" 
Apr 22 17:53:27.538365 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533897 2566 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 17:53:27.538365 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.533900 2566 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 17:53:27.538365 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.533996 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:53:27.538365 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534000 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:53:27.538365 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534003 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:53:27.538365 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534006 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:53:27.538365 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534009 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:53:27.538365 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534012 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:53:27.538957 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534015 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:53:27.538957 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534018 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:53:27.538957 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534021 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:53:27.538957 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534023 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:53:27.538957 ip-10-0-132-106 
kubenswrapper[2566]: W0422 17:53:27.534026 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:53:27.538957 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534028 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:53:27.538957 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534031 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:53:27.538957 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534033 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:53:27.538957 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534036 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:53:27.538957 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534038 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:53:27.538957 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534041 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:53:27.538957 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534048 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:53:27.538957 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534051 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:53:27.538957 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534053 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:53:27.538957 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534056 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:53:27.538957 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534059 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:53:27.538957 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534061 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 
17:53:27.538957 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534064 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:53:27.538957 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534067 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 17:53:27.538957 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534069 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:53:27.539484 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534072 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:53:27.539484 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534074 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:53:27.539484 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534077 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:53:27.539484 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534079 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:53:27.539484 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534082 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:53:27.539484 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534084 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:53:27.539484 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534087 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:53:27.539484 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534090 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:53:27.539484 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534092 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:53:27.539484 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534095 2566 feature_gate.go:328] unrecognized feature 
gate: OpenShiftPodSecurityAdmission Apr 22 17:53:27.539484 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534097 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:53:27.539484 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534100 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:53:27.539484 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534102 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:53:27.539484 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534105 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:53:27.539484 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534108 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:53:27.539484 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534110 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:53:27.539484 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534112 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:53:27.539484 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534115 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:53:27.539484 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534118 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:53:27.539484 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534120 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:53:27.540006 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534122 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:53:27.540006 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534125 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:53:27.540006 ip-10-0-132-106 
kubenswrapper[2566]: W0422 17:53:27.534128 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:53:27.540006 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534132 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:53:27.540006 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534135 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:53:27.540006 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534138 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:53:27.540006 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534141 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:53:27.540006 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534143 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:53:27.540006 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534146 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:53:27.540006 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534150 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 17:53:27.540006 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534153 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:53:27.540006 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534156 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:53:27.540006 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534159 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:53:27.540006 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534162 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:53:27.540006 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534165 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:53:27.540006 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534167 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:53:27.540006 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534170 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:53:27.540006 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534173 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:53:27.540006 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534175 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:53:27.540472 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534179 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 17:53:27.540472 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534183 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:53:27.540472 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534186 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 17:53:27.540472 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534188 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:53:27.540472 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534191 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 17:53:27.540472 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534193 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:53:27.540472 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534196 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:53:27.540472 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534198 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:53:27.540472 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534201 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:53:27.540472 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534204 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:53:27.540472 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534207 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:53:27.540472 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534209 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:53:27.540472 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534212 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:53:27.540472 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534215 2566 feature_gate.go:328] unrecognized feature 
gate: MachineConfigNodes Apr 22 17:53:27.540472 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534218 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:53:27.540472 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534222 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:53:27.540472 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534226 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:53:27.540472 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534229 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:53:27.540472 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534232 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:53:27.540472 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534234 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:53:27.541008 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.534237 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:53:27.541008 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.534890 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 17:53:27.541537 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.541518 2566 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 17:53:27.541568 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.541537 2566 server.go:532] "Golang settings" GOGC="" 
GOMAXPROCS="" GOTRACEBACK="" Apr 22 17:53:27.541597 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541593 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:53:27.541627 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541600 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 17:53:27.541627 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541605 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:53:27.541627 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541608 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:53:27.541627 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541612 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:53:27.541627 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541615 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:53:27.541627 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541619 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:53:27.541627 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541621 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:53:27.541627 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541624 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:53:27.541627 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541628 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:53:27.541627 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541631 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:53:27.541894 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541634 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 17:53:27.541894 ip-10-0-132-106 
kubenswrapper[2566]: W0422 17:53:27.541637 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 17:53:27.541894 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541640 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:53:27.541894 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541642 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:53:27.541894 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541645 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:53:27.541894 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541647 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:53:27.541894 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541650 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:53:27.541894 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541653 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:53:27.541894 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541655 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:53:27.541894 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541658 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:53:27.541894 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541661 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:53:27.541894 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541663 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:53:27.541894 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541666 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:53:27.541894 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541668 2566 feature_gate.go:328] 
unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:53:27.541894 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541671 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:53:27.541894 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541674 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:53:27.541894 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541677 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:53:27.541894 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541679 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:53:27.541894 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541682 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:53:27.541894 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541684 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:53:27.542386 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541688 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:53:27.542386 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541691 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:53:27.542386 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541693 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:53:27.542386 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541696 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:53:27.542386 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541698 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:53:27.542386 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541701 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:53:27.542386 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541704 2566 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:53:27.542386 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541707 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:53:27.542386 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541709 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:53:27.542386 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541712 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:53:27.542386 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541714 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 17:53:27.542386 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541716 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:53:27.542386 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541719 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:53:27.542386 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541722 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 17:53:27.542386 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541725 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:53:27.542386 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541728 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:53:27.542386 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541730 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:53:27.542386 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541733 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 17:53:27.542386 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541735 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:53:27.542386 ip-10-0-132-106 
kubenswrapper[2566]: W0422 17:53:27.541737 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:53:27.542999 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541740 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 17:53:27.542999 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541742 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:53:27.542999 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541745 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:53:27.542999 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541747 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:53:27.542999 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541750 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:53:27.542999 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541753 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:53:27.542999 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541755 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:53:27.542999 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541757 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:53:27.542999 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541760 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:53:27.542999 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541762 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:53:27.542999 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541765 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:53:27.542999 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541768 2566 
feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:53:27.542999 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541770 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:53:27.542999 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541773 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:53:27.542999 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541775 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:53:27.542999 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541778 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:53:27.542999 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541780 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:53:27.542999 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541783 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:53:27.542999 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541786 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:53:27.543469 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541789 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:53:27.543469 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541791 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:53:27.543469 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541794 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:53:27.543469 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541798 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 17:53:27.543469 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541801 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:53:27.543469 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541804 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:53:27.543469 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541807 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:53:27.543469 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541811 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:53:27.543469 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541814 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:53:27.543469 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541816 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:53:27.543469 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541819 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:53:27.543469 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541822 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:53:27.543469 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541824 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:53:27.543469 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541827 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:53:27.543469 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541829 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:53:27.543469 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541832 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:53:27.543855 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.541836 2566 
feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 17:53:27.543855 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541954 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 17:53:27.543855 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541959 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 17:53:27.543855 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541962 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 17:53:27.543855 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541965 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 17:53:27.543855 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541967 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 17:53:27.543855 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541970 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 17:53:27.543855 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541973 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 17:53:27.543855 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541975 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 17:53:27.543855 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541978 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 17:53:27.543855 ip-10-0-132-106 kubenswrapper[2566]: W0422 
17:53:27.541980 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 17:53:27.543855 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541983 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 17:53:27.543855 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541986 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 17:53:27.543855 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541988 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 17:53:27.543855 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541991 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 17:53:27.543855 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541994 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 17:53:27.544261 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541996 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 17:53:27.544261 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.541998 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 17:53:27.544261 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542001 2566 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 17:53:27.544261 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542003 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 17:53:27.544261 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542006 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 17:53:27.544261 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542008 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 17:53:27.544261 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542011 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 17:53:27.544261 
ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542013 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 17:53:27.544261 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542016 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 17:53:27.544261 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542019 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 17:53:27.544261 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542021 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 17:53:27.544261 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542024 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 17:53:27.544261 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542026 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 17:53:27.544261 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542029 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 17:53:27.544261 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542031 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 17:53:27.544261 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542034 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 17:53:27.544261 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542036 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 17:53:27.544261 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542039 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 17:53:27.544261 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542041 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 17:53:27.544261 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542044 2566 feature_gate.go:328] unrecognized feature gate: 
BootImageSkewEnforcement Apr 22 17:53:27.544743 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542046 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 17:53:27.544743 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542049 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 17:53:27.544743 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542052 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 17:53:27.544743 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542054 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 17:53:27.544743 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542057 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 17:53:27.544743 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542059 2566 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 17:53:27.544743 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542061 2566 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 17:53:27.544743 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542064 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 17:53:27.544743 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542067 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 17:53:27.544743 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542069 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 17:53:27.544743 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542072 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 17:53:27.544743 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542074 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 17:53:27.544743 ip-10-0-132-106 kubenswrapper[2566]: W0422 
17:53:27.542077 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 17:53:27.544743 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542080 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 17:53:27.544743 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542082 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 17:53:27.544743 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542085 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 17:53:27.544743 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542087 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 17:53:27.544743 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542091 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 17:53:27.544743 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542094 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 17:53:27.544743 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542097 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 17:53:27.545248 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542099 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 17:53:27.545248 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542102 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 17:53:27.545248 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542104 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 17:53:27.545248 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542107 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 17:53:27.545248 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542109 2566 feature_gate.go:328] unrecognized feature gate: 
SignatureStores Apr 22 17:53:27.545248 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542112 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 17:53:27.545248 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542114 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 17:53:27.545248 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542117 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 17:53:27.545248 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542119 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 17:53:27.545248 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542122 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 17:53:27.545248 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542124 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 17:53:27.545248 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542127 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 17:53:27.545248 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542131 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 17:53:27.545248 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542135 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 17:53:27.545248 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542137 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 17:53:27.545248 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542140 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 17:53:27.545248 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542142 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 17:53:27.545248 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542145 2566 feature_gate.go:328] unrecognized feature gate: Example Apr 22 17:53:27.545248 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542148 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 17:53:27.545248 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542150 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 17:53:27.545716 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542152 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 17:53:27.545716 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542155 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 17:53:27.545716 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542158 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 17:53:27.545716 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542160 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 17:53:27.545716 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542163 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 17:53:27.545716 ip-10-0-132-106 
kubenswrapper[2566]: W0422 17:53:27.542165 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 17:53:27.545716 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542168 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 17:53:27.545716 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542170 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 17:53:27.545716 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542173 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 17:53:27.545716 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542175 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 17:53:27.545716 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:27.542178 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 17:53:27.545716 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.542183 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 17:53:27.545716 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.542783 2566 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 17:53:27.545716 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.544617 2566 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 17:53:27.545716 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.545712 2566 
server.go:1019] "Starting client certificate rotation" Apr 22 17:53:27.546098 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.545805 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 17:53:27.546098 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.545834 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 17:53:27.571638 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.571619 2566 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 17:53:27.575937 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.575919 2566 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 17:53:27.591325 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.591307 2566 log.go:25] "Validated CRI v1 runtime API" Apr 22 17:53:27.596799 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.596784 2566 log.go:25] "Validated CRI v1 image API" Apr 22 17:53:27.598839 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.598823 2566 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 17:53:27.601174 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.601158 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 17:53:27.602899 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.602882 2566 fs.go:135] Filesystem UUIDs: map[0852c2f3-8934-493e-a572-863900c608b6:/dev/nvme0n1p3 66e42116-ccd8-484a-87e5-1d1cb6fe2b74:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2] Apr 22 17:53:27.602956 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.602900 2566 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 
minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 17:53:27.608839 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.608736 2566 manager.go:217] Machine: {Timestamp:2026-04-22 17:53:27.606519188 +0000 UTC m=+0.406412399 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098977 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2ab737e3f5f0dbf074c7ec5927691a SystemUUID:ec2ab737-e3f5-f0db-f074-c7ec5927691a BootID:8961ade4-7b39-4976-87da-a3089abfe8b8 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:a5:bd:ec:ab:5f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:a5:bd:ec:ab:5f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:32:c4:75:79:b5:37 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 
HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 17:53:27.609278 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.609267 2566 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 22 17:53:27.609351 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.609340 2566 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 17:53:27.610388 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.610368 2566 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 17:53:27.610526 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.610390 2566 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-106.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 17:53:27.610577 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.610534 2566 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 17:53:27.610577 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.610551 2566 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 17:53:27.610577 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.610564 2566 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 17:53:27.611323 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.611313 2566 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 17:53:27.612039 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.612029 2566 state_mem.go:36] "Initialized new in-memory state store" Apr 22 17:53:27.612141 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.612132 2566 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 17:53:27.614667 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.614656 2566 kubelet.go:491] "Attempting to sync node with API server" Apr 22 17:53:27.614702 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.614677 2566 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 17:53:27.614702 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.614689 2566 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 17:53:27.614702 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.614699 2566 kubelet.go:397] "Adding apiserver pod source" Apr 22 17:53:27.614790 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.614708 2566 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 22 17:53:27.615750 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.615739 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 17:53:27.615787 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.615756 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 17:53:27.618732 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.618710 2566 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 17:53:27.619929 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.619916 2566 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 17:53:27.621178 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.621161 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-x22lx" Apr 22 17:53:27.621681 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.621651 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 17:53:27.621681 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.621670 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 17:53:27.621681 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.621676 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 17:53:27.621681 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.621682 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 17:53:27.621681 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.621690 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 17:53:27.622007 ip-10-0-132-106 kubenswrapper[2566]: I0422 
17:53:27.621700 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 17:53:27.622007 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.621709 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 17:53:27.622007 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.621718 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 17:53:27.622007 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.621726 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 17:53:27.622007 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.621731 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 17:53:27.622007 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.621751 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 17:53:27.622007 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.621760 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 17:53:27.622726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.622714 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 17:53:27.622757 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.622727 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 17:53:27.626323 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.626302 2566 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 17:53:27.626416 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.626361 2566 server.go:1295] "Started kubelet" Apr 22 17:53:27.626876 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.626831 2566 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 17:53:27.627144 ip-10-0-132-106 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 17:53:27.627270 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.627183 2566 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 17:53:27.627270 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.627260 2566 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 17:53:27.628302 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.628288 2566 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 17:53:27.628585 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:27.628559 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 17:53:27.628714 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:27.628699 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-106.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 17:53:27.628758 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.628741 2566 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-106.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 17:53:27.630227 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.630210 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-x22lx" Apr 22 17:53:27.630311 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.630252 2566 server.go:317] "Adding debug handlers to 
kubelet server" Apr 22 17:53:27.636660 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.636644 2566 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 17:53:27.636747 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.636703 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 17:53:27.637749 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.637417 2566 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 17:53:27.637749 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.637453 2566 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 17:53:27.637749 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.637467 2566 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 17:53:27.637749 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.637569 2566 reconstruct.go:97] "Volume reconstruction finished" Apr 22 17:53:27.637749 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.637580 2566 reconciler.go:26] "Reconciler: start to sync state" Apr 22 17:53:27.637749 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:27.636753 2566 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-106.ec2.internal.18a8bf5100eb0ab3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-106.ec2.internal,UID:ip-10-0-132-106.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-106.ec2.internal,},FirstTimestamp:2026-04-22 17:53:27.626320563 +0000 UTC m=+0.426213775,LastTimestamp:2026-04-22 17:53:27.626320563 +0000 UTC m=+0.426213775,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-106.ec2.internal,}" Apr 22 17:53:27.638108 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.637763 2566 factory.go:55] Registering systemd factory Apr 22 17:53:27.638108 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.637849 2566 factory.go:223] Registration of the systemd container factory successfully Apr 22 17:53:27.638108 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:27.637886 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-106.ec2.internal\" not found" Apr 22 17:53:27.639110 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.638958 2566 factory.go:153] Registering CRI-O factory Apr 22 17:53:27.639110 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.638976 2566 factory.go:223] Registration of the crio container factory successfully Apr 22 17:53:27.639110 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.639029 2566 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 17:53:27.639298 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.639252 2566 factory.go:103] Registering Raw factory Apr 22 17:53:27.639298 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.639275 2566 manager.go:1196] Started watching for new ooms in manager Apr 22 17:53:27.639941 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.639924 2566 manager.go:319] Starting recovery of all containers Apr 22 17:53:27.646727 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.646705 2566 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:27.649415 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:27.649278 2566 nodelease.go:49] "Failed to get node when trying to set owner ref to 
the node lease" err="nodes \"ip-10-0-132-106.ec2.internal\" not found" node="ip-10-0-132-106.ec2.internal" Apr 22 17:53:27.649669 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.649657 2566 manager.go:324] Recovery completed Apr 22 17:53:27.653741 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.653729 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:53:27.656508 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.656487 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-106.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:27.656607 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.656521 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-106.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:27.656607 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.656542 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-106.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:27.657219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.657202 2566 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 17:53:27.657289 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.657220 2566 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 17:53:27.657289 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.657241 2566 state_mem.go:36] "Initialized new in-memory state store" Apr 22 17:53:27.659349 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.659336 2566 policy_none.go:49] "None policy: Start" Apr 22 17:53:27.659390 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.659352 2566 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 17:53:27.659390 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.659362 2566 state_mem.go:35] "Initializing new in-memory state store" Apr 22 17:53:27.703403 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.703388 2566 
manager.go:341] "Starting Device Plugin manager" Apr 22 17:53:27.726057 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:27.703421 2566 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 17:53:27.726057 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.703434 2566 server.go:85] "Starting device plugin registration server" Apr 22 17:53:27.726057 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.703632 2566 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 17:53:27.726057 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.703643 2566 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 17:53:27.726057 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.703731 2566 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 17:53:27.726057 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.703823 2566 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 17:53:27.726057 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.703833 2566 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 17:53:27.726057 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:27.704153 2566 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 17:53:27.726057 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:27.704182 2566 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-106.ec2.internal\" not found" Apr 22 17:53:27.773312 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.773284 2566 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 22 17:53:27.774413 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.774392 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 17:53:27.774413 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.774415 2566 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 17:53:27.774514 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.774429 2566 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 17:53:27.774514 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.774435 2566 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 17:53:27.774514 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:27.774463 2566 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 17:53:27.776370 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.776325 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:27.803974 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.803957 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:53:27.804700 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.804686 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-106.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:27.804753 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.804715 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-106.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:27.804753 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.804726 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-106.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:27.804753 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.804750 2566 
kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-106.ec2.internal" Apr 22 17:53:27.812024 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.812011 2566 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-106.ec2.internal" Apr 22 17:53:27.812085 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:27.812029 2566 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-106.ec2.internal\": node \"ip-10-0-132-106.ec2.internal\" not found" Apr 22 17:53:27.828568 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:27.828554 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-106.ec2.internal\" not found" Apr 22 17:53:27.875114 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.875090 2566 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-106.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-106.ec2.internal"] Apr 22 17:53:27.875187 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.875155 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:53:27.875930 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.875918 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-106.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:27.875979 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.875944 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-106.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:27.875979 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.875954 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-106.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:27.880631 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.880619 2566 kubelet_node_status.go:413] "Setting 
node annotation to enable volume controller attach/detach" Apr 22 17:53:27.880757 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.880744 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-106.ec2.internal" Apr 22 17:53:27.880794 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.880772 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:53:27.881302 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.881288 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-106.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:27.881378 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.881312 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-106.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:27.881378 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.881326 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-106.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:27.881378 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.881347 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-106.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:27.881378 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.881363 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-106.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:27.881378 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.881372 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-106.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:27.886410 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.886396 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-106.ec2.internal" Apr 22 17:53:27.886453 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.886424 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 17:53:27.887070 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.887053 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-106.ec2.internal" event="NodeHasSufficientMemory" Apr 22 17:53:27.887146 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.887088 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-106.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 17:53:27.887146 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.887102 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-106.ec2.internal" event="NodeHasSufficientPID" Apr 22 17:53:27.900385 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:27.900367 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-106.ec2.internal\" not found" node="ip-10-0-132-106.ec2.internal" Apr 22 17:53:27.905951 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:27.905936 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-106.ec2.internal\" not found" node="ip-10-0-132-106.ec2.internal" Apr 22 17:53:27.928663 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:27.928650 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-106.ec2.internal\" not found" Apr 22 17:53:27.939985 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.939964 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/de01c4bd78187d5743793fda4e118da1-config\") pod 
\"kube-apiserver-proxy-ip-10-0-132-106.ec2.internal\" (UID: \"de01c4bd78187d5743793fda4e118da1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-106.ec2.internal" Apr 22 17:53:27.940063 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.939995 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e0fb4cdf390b8d2f8fff1cf55f8e8acb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-106.ec2.internal\" (UID: \"e0fb4cdf390b8d2f8fff1cf55f8e8acb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-106.ec2.internal" Apr 22 17:53:27.940063 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:27.940025 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0fb4cdf390b8d2f8fff1cf55f8e8acb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-106.ec2.internal\" (UID: \"e0fb4cdf390b8d2f8fff1cf55f8e8acb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-106.ec2.internal" Apr 22 17:53:28.029727 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:28.029686 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-106.ec2.internal\" not found" Apr 22 17:53:28.040120 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.040103 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e0fb4cdf390b8d2f8fff1cf55f8e8acb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-106.ec2.internal\" (UID: \"e0fb4cdf390b8d2f8fff1cf55f8e8acb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-106.ec2.internal" Apr 22 17:53:28.040180 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.040125 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/e0fb4cdf390b8d2f8fff1cf55f8e8acb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-106.ec2.internal\" (UID: \"e0fb4cdf390b8d2f8fff1cf55f8e8acb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-106.ec2.internal" Apr 22 17:53:28.040180 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.040142 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/de01c4bd78187d5743793fda4e118da1-config\") pod \"kube-apiserver-proxy-ip-10-0-132-106.ec2.internal\" (UID: \"de01c4bd78187d5743793fda4e118da1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-106.ec2.internal" Apr 22 17:53:28.040276 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.040184 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/de01c4bd78187d5743793fda4e118da1-config\") pod \"kube-apiserver-proxy-ip-10-0-132-106.ec2.internal\" (UID: \"de01c4bd78187d5743793fda4e118da1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-106.ec2.internal" Apr 22 17:53:28.040276 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.040208 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e0fb4cdf390b8d2f8fff1cf55f8e8acb-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-106.ec2.internal\" (UID: \"e0fb4cdf390b8d2f8fff1cf55f8e8acb\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-106.ec2.internal" Apr 22 17:53:28.040276 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.040227 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0fb4cdf390b8d2f8fff1cf55f8e8acb-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-106.ec2.internal\" (UID: \"e0fb4cdf390b8d2f8fff1cf55f8e8acb\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-106.ec2.internal" Apr 22 17:53:28.130728 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:28.130690 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-106.ec2.internal\" not found" Apr 22 17:53:28.202164 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.202133 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-106.ec2.internal" Apr 22 17:53:28.208621 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.208607 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-106.ec2.internal" Apr 22 17:53:28.231592 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:28.231571 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-106.ec2.internal\" not found" Apr 22 17:53:28.332068 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:28.332011 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-106.ec2.internal\" not found" Apr 22 17:53:28.432553 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:28.432524 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-106.ec2.internal\" not found" Apr 22 17:53:28.480872 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.480849 2566 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:28.492081 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.492056 2566 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 17:53:28.537368 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.537351 2566 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-106.ec2.internal" Apr 22 17:53:28.546002 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.545989 2566 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 17:53:28.546091 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.546077 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 17:53:28.546136 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.546119 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 17:53:28.546136 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.546128 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 17:53:28.546210 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.546141 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 17:53:28.546210 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:28.546154 2566 kubelet.go:3342] "Failed creating a mirror pod" err="Post 
\"https://a1bf10caf35a343cf8ade7e7950838da-5faa5e806963b3f7.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/openshift-machine-config-operator/pods\": read tcp 10.0.132.106:43258->3.227.73.220:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-106.ec2.internal" Apr 22 17:53:28.546210 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.546177 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-106.ec2.internal" Apr 22 17:53:28.564174 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.564155 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 17:53:28.615890 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.615823 2566 apiserver.go:52] "Watching apiserver" Apr 22 17:53:28.627520 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.627501 2566 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 17:53:28.628422 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.628402 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-2vnst","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-106.ec2.internal","openshift-multus/network-metrics-daemon-dv96w","openshift-network-diagnostics/network-check-target-ngcz8","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk","openshift-multus/multus-additional-cni-plugins-zmjph","openshift-multus/multus-cqc2t","openshift-network-operator/iptables-alerter-rhw5j","openshift-ovn-kubernetes/ovnkube-node-9z9pj","kube-system/konnectivity-agent-lfws2","kube-system/kube-apiserver-proxy-ip-10-0-132-106.ec2.internal","openshift-cluster-node-tuning-operator/tuned-55lxs","openshift-dns/node-resolver-9kn68"] Apr 22 17:53:28.630596 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.630577 
2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cqc2t" Apr 22 17:53:28.631774 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.631759 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv96w" Apr 22 17:53:28.631919 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:28.631895 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv96w" podUID="92650e2d-54ea-4904-8ee5-235164ed2949" Apr 22 17:53:28.633097 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.633082 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngcz8" Apr 22 17:53:28.633147 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:28.633133 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ngcz8" podUID="6fb74441-7ec5-4482-ad08-21d23adeeb37" Apr 22 17:53:28.634074 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.634053 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 17:48:27 +0000 UTC" deadline="2027-10-01 19:12:01.685548389 +0000 UTC" Apr 22 17:53:28.634115 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.634073 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12649h18m33.051476961s" Apr 22 17:53:28.634398 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.634380 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk" Apr 22 17:53:28.635513 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.635491 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 17:53:28.635513 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.635504 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 17:53:28.635625 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.635491 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 17:53:28.635881 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.635854 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zmjph" Apr 22 17:53:28.636002 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.635985 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 17:53:28.636696 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.636683 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-2frrq\"" Apr 22 17:53:28.636820 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.636810 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 17:53:28.637534 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.637304 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 17:53:28.637534 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.637406 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 17:53:28.637534 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.637418 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 17:53:28.637534 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.637482 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-2vnst" Apr 22 17:53:28.637534 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.637535 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-cxxtn\"" Apr 22 17:53:28.638558 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.638537 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 17:53:28.638687 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.638674 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 17:53:28.639148 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.639134 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rhw5j" Apr 22 17:53:28.639912 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.639892 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-h2rxh\"" Apr 22 17:53:28.640342 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.640328 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 17:53:28.640402 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.640390 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 17:53:28.640601 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.640589 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 17:53:28.640938 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.640919 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.641149 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.641051 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5mp6q\"" Apr 22 17:53:28.641494 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.641479 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 17:53:28.641494 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.641486 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 17:53:28.642171 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.642154 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-9ml8k\"" Apr 22 17:53:28.642258 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.642161 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 17:53:28.642732 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.642714 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-lfws2" Apr 22 17:53:28.643616 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.643453 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 17:53:28.643616 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.643494 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c437ff89-37eb-4bee-a67b-f2918685eee5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zmjph\" (UID: \"c437ff89-37eb-4bee-a67b-f2918685eee5\") " pod="openshift-multus/multus-additional-cni-plugins-zmjph" Apr 22 17:53:28.643767 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.643735 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs\") pod \"network-metrics-daemon-dv96w\" (UID: \"92650e2d-54ea-4904-8ee5-235164ed2949\") " pod="openshift-multus/network-metrics-daemon-dv96w" Apr 22 17:53:28.643825 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.643774 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d947eb97-b29b-4a7e-bece-a9253fffdcd0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jclhk\" (UID: \"d947eb97-b29b-4a7e-bece-a9253fffdcd0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk" Apr 22 17:53:28.643825 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.643800 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d947eb97-b29b-4a7e-bece-a9253fffdcd0-sys-fs\") pod \"aws-ebs-csi-driver-node-jclhk\" (UID: \"d947eb97-b29b-4a7e-bece-a9253fffdcd0\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk" Apr 22 17:53:28.643931 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.643826 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-multus-socket-dir-parent\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t" Apr 22 17:53:28.643931 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.643877 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-etc-kubernetes\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t" Apr 22 17:53:28.644098 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.644079 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 17:53:28.644155 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.643985 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xznzm\" (UniqueName: \"kubernetes.io/projected/d947eb97-b29b-4a7e-bece-a9253fffdcd0-kube-api-access-xznzm\") pod \"aws-ebs-csi-driver-node-jclhk\" (UID: \"d947eb97-b29b-4a7e-bece-a9253fffdcd0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk" Apr 22 17:53:28.644155 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.644145 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d947eb97-b29b-4a7e-bece-a9253fffdcd0-socket-dir\") pod \"aws-ebs-csi-driver-node-jclhk\" (UID: \"d947eb97-b29b-4a7e-bece-a9253fffdcd0\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk" Apr 22 17:53:28.645630 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.644161 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 17:53:28.645630 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.644184 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dgsd\" (UniqueName: \"kubernetes.io/projected/90a8af9e-a3b4-4682-86d6-985e15148048-kube-api-access-2dgsd\") pod \"node-ca-2vnst\" (UID: \"90a8af9e-a3b4-4682-86d6-985e15148048\") " pod="openshift-image-registry/node-ca-2vnst" Apr 22 17:53:28.645630 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.644218 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-multus-cni-dir\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t" Apr 22 17:53:28.645630 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.644248 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-cnibin\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t" Apr 22 17:53:28.645630 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.644276 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9e89ff1c-b604-4ae4-a756-badea52f84ef-multus-daemon-config\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t" Apr 22 17:53:28.645630 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.644305 
2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d947eb97-b29b-4a7e-bece-a9253fffdcd0-device-dir\") pod \"aws-ebs-csi-driver-node-jclhk\" (UID: \"d947eb97-b29b-4a7e-bece-a9253fffdcd0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk" Apr 22 17:53:28.645630 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.644326 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 17:53:28.645630 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.644334 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d947eb97-b29b-4a7e-bece-a9253fffdcd0-etc-selinux\") pod \"aws-ebs-csi-driver-node-jclhk\" (UID: \"d947eb97-b29b-4a7e-bece-a9253fffdcd0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk" Apr 22 17:53:28.645630 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.644538 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/90a8af9e-a3b4-4682-86d6-985e15148048-serviceca\") pod \"node-ca-2vnst\" (UID: \"90a8af9e-a3b4-4682-86d6-985e15148048\") " pod="openshift-image-registry/node-ca-2vnst" Apr 22 17:53:28.645630 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.644661 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-system-cni-dir\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t" Apr 22 17:53:28.645630 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.644880 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-host-var-lib-cni-bin\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t" Apr 22 17:53:28.645630 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.645060 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-t78px\"" Apr 22 17:53:28.645630 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.645040 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c437ff89-37eb-4bee-a67b-f2918685eee5-system-cni-dir\") pod \"multus-additional-cni-plugins-zmjph\" (UID: \"c437ff89-37eb-4bee-a67b-f2918685eee5\") " pod="openshift-multus/multus-additional-cni-plugins-zmjph" Apr 22 17:53:28.645630 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.645186 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 17:53:28.645630 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.645205 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-695v6\" (UniqueName: \"kubernetes.io/projected/c437ff89-37eb-4bee-a67b-f2918685eee5-kube-api-access-695v6\") pod \"multus-additional-cni-plugins-zmjph\" (UID: \"c437ff89-37eb-4bee-a67b-f2918685eee5\") " pod="openshift-multus/multus-additional-cni-plugins-zmjph" Apr 22 17:53:28.645630 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.645250 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0d845f70-52b2-4607-be37-ce8250614a3f-iptables-alerter-script\") pod \"iptables-alerter-rhw5j\" (UID: \"0d845f70-52b2-4607-be37-ce8250614a3f\") " 
pod="openshift-network-operator/iptables-alerter-rhw5j" Apr 22 17:53:28.645630 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.645480 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-host-run-k8s-cni-cncf-io\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t" Apr 22 17:53:28.645630 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.645475 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.645630 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.645538 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-host-var-lib-kubelet\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t" Apr 22 17:53:28.646535 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.645578 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-multus-conf-dir\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t" Apr 22 17:53:28.646535 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.645614 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 17:53:28.646535 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.645647 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r5tt\" (UniqueName: 
\"kubernetes.io/projected/92650e2d-54ea-4904-8ee5-235164ed2949-kube-api-access-8r5tt\") pod \"network-metrics-daemon-dv96w\" (UID: \"92650e2d-54ea-4904-8ee5-235164ed2949\") " pod="openshift-multus/network-metrics-daemon-dv96w" Apr 22 17:53:28.646535 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.645695 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9e89ff1c-b604-4ae4-a756-badea52f84ef-cni-binary-copy\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t" Apr 22 17:53:28.646535 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.645729 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-host-run-netns\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t" Apr 22 17:53:28.646535 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.645764 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-host-var-lib-cni-multus\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t" Apr 22 17:53:28.646535 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.645801 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-hostroot\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t" Apr 22 17:53:28.646535 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.645838 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgck7\" (UniqueName: \"kubernetes.io/projected/9e89ff1c-b604-4ae4-a756-badea52f84ef-kube-api-access-xgck7\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t" Apr 22 17:53:28.646535 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.645889 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c437ff89-37eb-4bee-a67b-f2918685eee5-cnibin\") pod \"multus-additional-cni-plugins-zmjph\" (UID: \"c437ff89-37eb-4bee-a67b-f2918685eee5\") " pod="openshift-multus/multus-additional-cni-plugins-zmjph" Apr 22 17:53:28.646535 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.645921 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0d845f70-52b2-4607-be37-ce8250614a3f-host-slash\") pod \"iptables-alerter-rhw5j\" (UID: \"0d845f70-52b2-4607-be37-ce8250614a3f\") " pod="openshift-network-operator/iptables-alerter-rhw5j" Apr 22 17:53:28.646535 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.645954 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9mp2\" (UniqueName: \"kubernetes.io/projected/0d845f70-52b2-4607-be37-ce8250614a3f-kube-api-access-g9mp2\") pod \"iptables-alerter-rhw5j\" (UID: \"0d845f70-52b2-4607-be37-ce8250614a3f\") " pod="openshift-network-operator/iptables-alerter-rhw5j" Apr 22 17:53:28.646535 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.646128 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-host-run-multus-certs\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " 
pod="openshift-multus/multus-cqc2t" Apr 22 17:53:28.646535 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.646385 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 17:53:28.647155 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.646658 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-os-release\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t" Apr 22 17:53:28.647155 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.646698 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c437ff89-37eb-4bee-a67b-f2918685eee5-os-release\") pod \"multus-additional-cni-plugins-zmjph\" (UID: \"c437ff89-37eb-4bee-a67b-f2918685eee5\") " pod="openshift-multus/multus-additional-cni-plugins-zmjph" Apr 22 17:53:28.647155 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.646731 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll6tj\" (UniqueName: \"kubernetes.io/projected/6fb74441-7ec5-4482-ad08-21d23adeeb37-kube-api-access-ll6tj\") pod \"network-check-target-ngcz8\" (UID: \"6fb74441-7ec5-4482-ad08-21d23adeeb37\") " pod="openshift-network-diagnostics/network-check-target-ngcz8" Apr 22 17:53:28.647155 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.646760 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90a8af9e-a3b4-4682-86d6-985e15148048-host\") pod \"node-ca-2vnst\" (UID: \"90a8af9e-a3b4-4682-86d6-985e15148048\") " pod="openshift-image-registry/node-ca-2vnst" Apr 22 17:53:28.647155 ip-10-0-132-106 kubenswrapper[2566]: I0422 
17:53:28.646785 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d947eb97-b29b-4a7e-bece-a9253fffdcd0-registration-dir\") pod \"aws-ebs-csi-driver-node-jclhk\" (UID: \"d947eb97-b29b-4a7e-bece-a9253fffdcd0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk"
Apr 22 17:53:28.647155 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.646815 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c437ff89-37eb-4bee-a67b-f2918685eee5-cni-binary-copy\") pod \"multus-additional-cni-plugins-zmjph\" (UID: \"c437ff89-37eb-4bee-a67b-f2918685eee5\") " pod="openshift-multus/multus-additional-cni-plugins-zmjph"
Apr 22 17:53:28.647155 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.646893 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c437ff89-37eb-4bee-a67b-f2918685eee5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zmjph\" (UID: \"c437ff89-37eb-4bee-a67b-f2918685eee5\") " pod="openshift-multus/multus-additional-cni-plugins-zmjph"
Apr 22 17:53:28.647155 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.646933 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c437ff89-37eb-4bee-a67b-f2918685eee5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zmjph\" (UID: \"c437ff89-37eb-4bee-a67b-f2918685eee5\") " pod="openshift-multus/multus-additional-cni-plugins-zmjph"
Apr 22 17:53:28.647508 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.647281 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-v965b\""
Apr 22 17:53:28.647508 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.647380 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 17:53:28.648928 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.648906 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jnfvt\""
Apr 22 17:53:28.649016 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.648937 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:53:28.649016 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.648971 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 17:53:28.649158 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.649144 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9kn68"
Apr 22 17:53:28.651220 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.651206 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 17:53:28.651502 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.651491 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 17:53:28.653413 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.653397 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 17:53:28.654057 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.654045 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-mwpxm\""
Apr 22 17:53:28.682485 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.682471 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-85fph"
Apr 22 17:53:28.691518 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.691505 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-85fph"
Apr 22 17:53:28.738131 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.738115 2566 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 17:53:28.747429 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.747412 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d947eb97-b29b-4a7e-bece-a9253fffdcd0-sys-fs\") pod \"aws-ebs-csi-driver-node-jclhk\" (UID: \"d947eb97-b29b-4a7e-bece-a9253fffdcd0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk"
Apr 22 17:53:28.747507 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.747437 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-etc-modprobe-d\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs"
Apr 22 17:53:28.747507 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.747454 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-lib-modules\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs"
Apr 22 17:53:28.747507 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.747468 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkn6b\" (UniqueName: \"kubernetes.io/projected/abf52151-090a-4499-9e78-eebbda08114e-kube-api-access-bkn6b\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs"
Apr 22 17:53:28.747598 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.747537 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d947eb97-b29b-4a7e-bece-a9253fffdcd0-sys-fs\") pod \"aws-ebs-csi-driver-node-jclhk\" (UID: \"d947eb97-b29b-4a7e-bece-a9253fffdcd0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk"
Apr 22 17:53:28.747598 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.747568 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bfef47bf-9dff-44e0-8b1a-1397bf347548-hosts-file\") pod \"node-resolver-9kn68\" (UID: \"bfef47bf-9dff-44e0-8b1a-1397bf347548\") " pod="openshift-dns/node-resolver-9kn68"
Apr 22 17:53:28.747598 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.747587 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-etc-kubernetes\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.747695 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.747603 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-var-lib-openvswitch\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj"
Apr 22 17:53:28.747695 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.747621 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-etc-kubernetes\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs"
Apr 22 17:53:28.747695 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.747643 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-etc-kubernetes\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.747695 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.747638 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c437ff89-37eb-4bee-a67b-f2918685eee5-system-cni-dir\") pod \"multus-additional-cni-plugins-zmjph\" (UID: \"c437ff89-37eb-4bee-a67b-f2918685eee5\") " pod="openshift-multus/multus-additional-cni-plugins-zmjph"
Apr 22 17:53:28.747848 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.747693 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d947eb97-b29b-4a7e-bece-a9253fffdcd0-socket-dir\") pod \"aws-ebs-csi-driver-node-jclhk\" (UID: \"d947eb97-b29b-4a7e-bece-a9253fffdcd0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk"
Apr 22 17:53:28.747848 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.747704 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c437ff89-37eb-4bee-a67b-f2918685eee5-system-cni-dir\") pod \"multus-additional-cni-plugins-zmjph\" (UID: \"c437ff89-37eb-4bee-a67b-f2918685eee5\") " pod="openshift-multus/multus-additional-cni-plugins-zmjph"
Apr 22 17:53:28.747848 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.747722 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2dgsd\" (UniqueName: \"kubernetes.io/projected/90a8af9e-a3b4-4682-86d6-985e15148048-kube-api-access-2dgsd\") pod \"node-ca-2vnst\" (UID: \"90a8af9e-a3b4-4682-86d6-985e15148048\") " pod="openshift-image-registry/node-ca-2vnst"
Apr 22 17:53:28.747848 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.747753 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-cnibin\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.747848 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.747770 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9e89ff1c-b604-4ae4-a756-badea52f84ef-multus-daemon-config\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.747848 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.747800 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d947eb97-b29b-4a7e-bece-a9253fffdcd0-socket-dir\") pod \"aws-ebs-csi-driver-node-jclhk\" (UID: \"d947eb97-b29b-4a7e-bece-a9253fffdcd0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk"
Apr 22 17:53:28.747848 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.747838 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-cnibin\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.748096 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.747895 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d947eb97-b29b-4a7e-bece-a9253fffdcd0-device-dir\") pod \"aws-ebs-csi-driver-node-jclhk\" (UID: \"d947eb97-b29b-4a7e-bece-a9253fffdcd0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk"
Apr 22 17:53:28.748096 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.747926 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1685d860-db2e-419d-b016-516c1932fa2f-ovnkube-config\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj"
Apr 22 17:53:28.748096 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.747960 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/90a8af9e-a3b4-4682-86d6-985e15148048-serviceca\") pod \"node-ca-2vnst\" (UID: \"90a8af9e-a3b4-4682-86d6-985e15148048\") " pod="openshift-image-registry/node-ca-2vnst"
Apr 22 17:53:28.748096 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.747969 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d947eb97-b29b-4a7e-bece-a9253fffdcd0-device-dir\") pod \"aws-ebs-csi-driver-node-jclhk\" (UID: \"d947eb97-b29b-4a7e-bece-a9253fffdcd0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk"
Apr 22 17:53:28.748096 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.747997 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-system-cni-dir\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.748096 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748024 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-695v6\" (UniqueName: \"kubernetes.io/projected/c437ff89-37eb-4bee-a67b-f2918685eee5-kube-api-access-695v6\") pod \"multus-additional-cni-plugins-zmjph\" (UID: \"c437ff89-37eb-4bee-a67b-f2918685eee5\") " pod="openshift-multus/multus-additional-cni-plugins-zmjph"
Apr 22 17:53:28.748096 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748057 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-host-cni-bin\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj"
Apr 22 17:53:28.748405 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748092 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/da9cec39-924f-4446-be6f-25108e9c58ba-konnectivity-ca\") pod \"konnectivity-agent-lfws2\" (UID: \"da9cec39-924f-4446-be6f-25108e9c58ba\") " pod="kube-system/konnectivity-agent-lfws2"
Apr 22 17:53:28.748405 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748102 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-system-cni-dir\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.748405 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748122 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-run\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs"
Apr 22 17:53:28.748405 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748138 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wcqn\" (UniqueName: \"kubernetes.io/projected/bfef47bf-9dff-44e0-8b1a-1397bf347548-kube-api-access-4wcqn\") pod \"node-resolver-9kn68\" (UID: \"bfef47bf-9dff-44e0-8b1a-1397bf347548\") " pod="openshift-dns/node-resolver-9kn68"
Apr 22 17:53:28.748405 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748164 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-host-var-lib-kubelet\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.748405 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748187 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-multus-conf-dir\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.748405 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748209 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-host\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs"
Apr 22 17:53:28.748405 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748232 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9e89ff1c-b604-4ae4-a756-badea52f84ef-cni-binary-copy\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.748405 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748253 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-hostroot\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.748405 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748276 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-systemd-units\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj"
Apr 22 17:53:28.748405 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748299 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-host-run-netns\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj"
Apr 22 17:53:28.748405 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748324 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-node-log\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj"
Apr 22 17:53:28.748405 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748339 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/abf52151-090a-4499-9e78-eebbda08114e-etc-tuned\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs"
Apr 22 17:53:28.748405 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748357 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bfef47bf-9dff-44e0-8b1a-1397bf347548-tmp-dir\") pod \"node-resolver-9kn68\" (UID: \"bfef47bf-9dff-44e0-8b1a-1397bf347548\") " pod="openshift-dns/node-resolver-9kn68"
Apr 22 17:53:28.748405 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748378 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-host-run-multus-certs\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.748999 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748439 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-host-run-multus-certs\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.748999 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748443 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/90a8af9e-a3b4-4682-86d6-985e15148048-serviceca\") pod \"node-ca-2vnst\" (UID: \"90a8af9e-a3b4-4682-86d6-985e15148048\") " pod="openshift-image-registry/node-ca-2vnst"
Apr 22 17:53:28.748999 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748354 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-hostroot\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.748999 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748494 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-os-release\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.748999 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748510 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c437ff89-37eb-4bee-a67b-f2918685eee5-os-release\") pod \"multus-additional-cni-plugins-zmjph\" (UID: \"c437ff89-37eb-4bee-a67b-f2918685eee5\") " pod="openshift-multus/multus-additional-cni-plugins-zmjph"
Apr 22 17:53:28.748999 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748538 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-host-slash\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj"
Apr 22 17:53:28.748999 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748552 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-etc-openvswitch\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj"
Apr 22 17:53:28.748999 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748629 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c437ff89-37eb-4bee-a67b-f2918685eee5-os-release\") pod \"multus-additional-cni-plugins-zmjph\" (UID: \"c437ff89-37eb-4bee-a67b-f2918685eee5\") " pod="openshift-multus/multus-additional-cni-plugins-zmjph"
Apr 22 17:53:28.748999 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748672 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ll6tj\" (UniqueName: \"kubernetes.io/projected/6fb74441-7ec5-4482-ad08-21d23adeeb37-kube-api-access-ll6tj\") pod \"network-check-target-ngcz8\" (UID: \"6fb74441-7ec5-4482-ad08-21d23adeeb37\") " pod="openshift-network-diagnostics/network-check-target-ngcz8"
Apr 22 17:53:28.748999 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748705 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-os-release\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.748999 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748719 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-host-var-lib-kubelet\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.748999 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748745 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-multus-conf-dir\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.748999 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748797 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c437ff89-37eb-4bee-a67b-f2918685eee5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zmjph\" (UID: \"c437ff89-37eb-4bee-a67b-f2918685eee5\") " pod="openshift-multus/multus-additional-cni-plugins-zmjph"
Apr 22 17:53:28.748999 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748845 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-run-openvswitch\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj"
Apr 22 17:53:28.748999 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748898 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9e89ff1c-b604-4ae4-a756-badea52f84ef-cni-binary-copy\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.748999 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748903 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1685d860-db2e-419d-b016-516c1932fa2f-ovn-node-metrics-cert\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj"
Apr 22 17:53:28.748999 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748944 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/da9cec39-924f-4446-be6f-25108e9c58ba-agent-certs\") pod \"konnectivity-agent-lfws2\" (UID: \"da9cec39-924f-4446-be6f-25108e9c58ba\") " pod="kube-system/konnectivity-agent-lfws2"
Apr 22 17:53:28.749587 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748968 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/abf52151-090a-4499-9e78-eebbda08114e-tmp\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs"
Apr 22 17:53:28.749587 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748988 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9e89ff1c-b604-4ae4-a756-badea52f84ef-multus-daemon-config\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.749587 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.748996 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs\") pod \"network-metrics-daemon-dv96w\" (UID: \"92650e2d-54ea-4904-8ee5-235164ed2949\") " pod="openshift-multus/network-metrics-daemon-dv96w"
Apr 22 17:53:28.749587 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749020 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-host-cni-netd\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj"
Apr 22 17:53:28.749587 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:28.749108 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:28.749587 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749136 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-etc-sysctl-conf\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs"
Apr 22 17:53:28.749587 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:28.749160 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs podName:92650e2d-54ea-4904-8ee5-235164ed2949 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:29.249145631 +0000 UTC m=+2.049038851 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs") pod "network-metrics-daemon-dv96w" (UID: "92650e2d-54ea-4904-8ee5-235164ed2949") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:28.749587 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749223 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-var-lib-kubelet\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs"
Apr 22 17:53:28.749587 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749248 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-multus-socket-dir-parent\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.749587 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749264 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xznzm\" (UniqueName: \"kubernetes.io/projected/d947eb97-b29b-4a7e-bece-a9253fffdcd0-kube-api-access-xznzm\") pod \"aws-ebs-csi-driver-node-jclhk\" (UID: \"d947eb97-b29b-4a7e-bece-a9253fffdcd0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk"
Apr 22 17:53:28.749587 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749269 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c437ff89-37eb-4bee-a67b-f2918685eee5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zmjph\" (UID: \"c437ff89-37eb-4bee-a67b-f2918685eee5\") " pod="openshift-multus/multus-additional-cni-plugins-zmjph"
Apr 22 17:53:28.749587 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749282 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1685d860-db2e-419d-b016-516c1932fa2f-ovnkube-script-lib\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj"
Apr 22 17:53:28.749587 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749303 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6j62\" (UniqueName: \"kubernetes.io/projected/1685d860-db2e-419d-b016-516c1932fa2f-kube-api-access-h6j62\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj"
Apr 22 17:53:28.749587 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749322 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-multus-cni-dir\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.749587 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749330 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-multus-socket-dir-parent\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.749587 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749337 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d947eb97-b29b-4a7e-bece-a9253fffdcd0-etc-selinux\") pod \"aws-ebs-csi-driver-node-jclhk\" (UID: \"d947eb97-b29b-4a7e-bece-a9253fffdcd0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk"
Apr 22 17:53:28.750257 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749354 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-etc-sysconfig\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs"
Apr 22 17:53:28.750257 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749375 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-host-var-lib-cni-bin\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.750257 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749399 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0d845f70-52b2-4607-be37-ce8250614a3f-iptables-alerter-script\") pod \"iptables-alerter-rhw5j\" (UID: \"0d845f70-52b2-4607-be37-ce8250614a3f\") " pod="openshift-network-operator/iptables-alerter-rhw5j"
Apr 22 17:53:28.750257 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749423 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-log-socket\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj"
Apr 22 17:53:28.750257 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749440 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-multus-cni-dir\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.750257 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749449 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d947eb97-b29b-4a7e-bece-a9253fffdcd0-etc-selinux\") pod \"aws-ebs-csi-driver-node-jclhk\" (UID: \"d947eb97-b29b-4a7e-bece-a9253fffdcd0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk"
Apr 22 17:53:28.750257 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749447 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-host-var-lib-cni-bin\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.750257 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749489 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-host-run-k8s-cni-cncf-io\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.750257 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749450 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-host-run-k8s-cni-cncf-io\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.750257 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749554 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8r5tt\" (UniqueName: \"kubernetes.io/projected/92650e2d-54ea-4904-8ee5-235164ed2949-kube-api-access-8r5tt\") pod \"network-metrics-daemon-dv96w\" (UID: \"92650e2d-54ea-4904-8ee5-235164ed2949\") " pod="openshift-multus/network-metrics-daemon-dv96w"
Apr 22 17:53:28.750257 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749574 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-etc-systemd\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs"
Apr 22 17:53:28.750257 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749588 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-sys\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs"
Apr 22 17:53:28.750257 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749611 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-host-run-netns\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t"
Apr 22 17:53:28.750257 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749643
2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-host-run-netns\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t" Apr 22 17:53:28.750257 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749647 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-host-var-lib-cni-multus\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t" Apr 22 17:53:28.750257 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749671 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9e89ff1c-b604-4ae4-a756-badea52f84ef-host-var-lib-cni-multus\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t" Apr 22 17:53:28.750257 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749680 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgck7\" (UniqueName: \"kubernetes.io/projected/9e89ff1c-b604-4ae4-a756-badea52f84ef-kube-api-access-xgck7\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t" Apr 22 17:53:28.750784 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749705 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c437ff89-37eb-4bee-a67b-f2918685eee5-cnibin\") pod \"multus-additional-cni-plugins-zmjph\" (UID: \"c437ff89-37eb-4bee-a67b-f2918685eee5\") " pod="openshift-multus/multus-additional-cni-plugins-zmjph" Apr 22 17:53:28.750784 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749730 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0d845f70-52b2-4607-be37-ce8250614a3f-host-slash\") pod \"iptables-alerter-rhw5j\" (UID: \"0d845f70-52b2-4607-be37-ce8250614a3f\") " pod="openshift-network-operator/iptables-alerter-rhw5j" Apr 22 17:53:28.750784 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749757 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9mp2\" (UniqueName: \"kubernetes.io/projected/0d845f70-52b2-4607-be37-ce8250614a3f-kube-api-access-g9mp2\") pod \"iptables-alerter-rhw5j\" (UID: \"0d845f70-52b2-4607-be37-ce8250614a3f\") " pod="openshift-network-operator/iptables-alerter-rhw5j" Apr 22 17:53:28.750784 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749769 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c437ff89-37eb-4bee-a67b-f2918685eee5-cnibin\") pod \"multus-additional-cni-plugins-zmjph\" (UID: \"c437ff89-37eb-4bee-a67b-f2918685eee5\") " pod="openshift-multus/multus-additional-cni-plugins-zmjph" Apr 22 17:53:28.750784 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749785 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-run-systemd\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.750784 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749808 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0d845f70-52b2-4607-be37-ce8250614a3f-host-slash\") pod \"iptables-alerter-rhw5j\" (UID: \"0d845f70-52b2-4607-be37-ce8250614a3f\") " pod="openshift-network-operator/iptables-alerter-rhw5j" Apr 22 17:53:28.750784 
ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749815 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1685d860-db2e-419d-b016-516c1932fa2f-env-overrides\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.750784 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749846 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.750784 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749892 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-etc-sysctl-d\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.750784 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749898 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0d845f70-52b2-4607-be37-ce8250614a3f-iptables-alerter-script\") pod \"iptables-alerter-rhw5j\" (UID: \"0d845f70-52b2-4607-be37-ce8250614a3f\") " pod="openshift-network-operator/iptables-alerter-rhw5j" Apr 22 17:53:28.750784 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749925 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90a8af9e-a3b4-4682-86d6-985e15148048-host\") pod \"node-ca-2vnst\" (UID: 
\"90a8af9e-a3b4-4682-86d6-985e15148048\") " pod="openshift-image-registry/node-ca-2vnst" Apr 22 17:53:28.750784 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749951 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d947eb97-b29b-4a7e-bece-a9253fffdcd0-registration-dir\") pod \"aws-ebs-csi-driver-node-jclhk\" (UID: \"d947eb97-b29b-4a7e-bece-a9253fffdcd0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk" Apr 22 17:53:28.750784 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.749999 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c437ff89-37eb-4bee-a67b-f2918685eee5-cni-binary-copy\") pod \"multus-additional-cni-plugins-zmjph\" (UID: \"c437ff89-37eb-4bee-a67b-f2918685eee5\") " pod="openshift-multus/multus-additional-cni-plugins-zmjph" Apr 22 17:53:28.750784 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.750021 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c437ff89-37eb-4bee-a67b-f2918685eee5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zmjph\" (UID: \"c437ff89-37eb-4bee-a67b-f2918685eee5\") " pod="openshift-multus/multus-additional-cni-plugins-zmjph" Apr 22 17:53:28.750784 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.750037 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c437ff89-37eb-4bee-a67b-f2918685eee5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zmjph\" (UID: \"c437ff89-37eb-4bee-a67b-f2918685eee5\") " pod="openshift-multus/multus-additional-cni-plugins-zmjph" Apr 22 17:53:28.750784 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.750057 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-host-kubelet\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.750784 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.750035 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/90a8af9e-a3b4-4682-86d6-985e15148048-host\") pod \"node-ca-2vnst\" (UID: \"90a8af9e-a3b4-4682-86d6-985e15148048\") " pod="openshift-image-registry/node-ca-2vnst" Apr 22 17:53:28.751314 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.750074 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-run-ovn\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.751314 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.750059 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d947eb97-b29b-4a7e-bece-a9253fffdcd0-registration-dir\") pod \"aws-ebs-csi-driver-node-jclhk\" (UID: \"d947eb97-b29b-4a7e-bece-a9253fffdcd0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk" Apr 22 17:53:28.751314 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.750116 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-host-run-ovn-kubernetes\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.751314 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.750144 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d947eb97-b29b-4a7e-bece-a9253fffdcd0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jclhk\" (UID: \"d947eb97-b29b-4a7e-bece-a9253fffdcd0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk" Apr 22 17:53:28.751314 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.750210 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d947eb97-b29b-4a7e-bece-a9253fffdcd0-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jclhk\" (UID: \"d947eb97-b29b-4a7e-bece-a9253fffdcd0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk" Apr 22 17:53:28.751314 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.750202 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c437ff89-37eb-4bee-a67b-f2918685eee5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zmjph\" (UID: \"c437ff89-37eb-4bee-a67b-f2918685eee5\") " pod="openshift-multus/multus-additional-cni-plugins-zmjph" Apr 22 17:53:28.751314 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.750478 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c437ff89-37eb-4bee-a67b-f2918685eee5-cni-binary-copy\") pod \"multus-additional-cni-plugins-zmjph\" (UID: \"c437ff89-37eb-4bee-a67b-f2918685eee5\") " pod="openshift-multus/multus-additional-cni-plugins-zmjph" Apr 22 17:53:28.751314 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.750517 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c437ff89-37eb-4bee-a67b-f2918685eee5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zmjph\" (UID: \"c437ff89-37eb-4bee-a67b-f2918685eee5\") " 
pod="openshift-multus/multus-additional-cni-plugins-zmjph" Apr 22 17:53:28.755503 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:28.755440 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:53:28.755503 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:28.755462 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:53:28.755503 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:28.755471 2566 projected.go:194] Error preparing data for projected volume kube-api-access-ll6tj for pod openshift-network-diagnostics/network-check-target-ngcz8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:28.755654 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:28.755543 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fb74441-7ec5-4482-ad08-21d23adeeb37-kube-api-access-ll6tj podName:6fb74441-7ec5-4482-ad08-21d23adeeb37 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:29.255525436 +0000 UTC m=+2.055418650 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ll6tj" (UniqueName: "kubernetes.io/projected/6fb74441-7ec5-4482-ad08-21d23adeeb37-kube-api-access-ll6tj") pod "network-check-target-ngcz8" (UID: "6fb74441-7ec5-4482-ad08-21d23adeeb37") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:53:28.757053 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.757032 2566 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 17:53:28.760446 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.760425 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-695v6\" (UniqueName: \"kubernetes.io/projected/c437ff89-37eb-4bee-a67b-f2918685eee5-kube-api-access-695v6\") pod \"multus-additional-cni-plugins-zmjph\" (UID: \"c437ff89-37eb-4bee-a67b-f2918685eee5\") " pod="openshift-multus/multus-additional-cni-plugins-zmjph" Apr 22 17:53:28.760446 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.760441 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dgsd\" (UniqueName: \"kubernetes.io/projected/90a8af9e-a3b4-4682-86d6-985e15148048-kube-api-access-2dgsd\") pod \"node-ca-2vnst\" (UID: \"90a8af9e-a3b4-4682-86d6-985e15148048\") " pod="openshift-image-registry/node-ca-2vnst" Apr 22 17:53:28.765314 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.765293 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xznzm\" (UniqueName: \"kubernetes.io/projected/d947eb97-b29b-4a7e-bece-a9253fffdcd0-kube-api-access-xznzm\") pod \"aws-ebs-csi-driver-node-jclhk\" (UID: \"d947eb97-b29b-4a7e-bece-a9253fffdcd0\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk" Apr 22 17:53:28.765423 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.765405 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgck7\" (UniqueName: \"kubernetes.io/projected/9e89ff1c-b604-4ae4-a756-badea52f84ef-kube-api-access-xgck7\") pod \"multus-cqc2t\" (UID: \"9e89ff1c-b604-4ae4-a756-badea52f84ef\") " pod="openshift-multus/multus-cqc2t" Apr 22 17:53:28.766141 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.766114 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9mp2\" (UniqueName: 
\"kubernetes.io/projected/0d845f70-52b2-4607-be37-ce8250614a3f-kube-api-access-g9mp2\") pod \"iptables-alerter-rhw5j\" (UID: \"0d845f70-52b2-4607-be37-ce8250614a3f\") " pod="openshift-network-operator/iptables-alerter-rhw5j" Apr 22 17:53:28.766702 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.766683 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r5tt\" (UniqueName: \"kubernetes.io/projected/92650e2d-54ea-4904-8ee5-235164ed2949-kube-api-access-8r5tt\") pod \"network-metrics-daemon-dv96w\" (UID: \"92650e2d-54ea-4904-8ee5-235164ed2949\") " pod="openshift-multus/network-metrics-daemon-dv96w" Apr 22 17:53:28.850885 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.850851 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1685d860-db2e-419d-b016-516c1932fa2f-ovnkube-script-lib\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.850885 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.850890 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6j62\" (UniqueName: \"kubernetes.io/projected/1685d860-db2e-419d-b016-516c1932fa2f-kube-api-access-h6j62\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.851116 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.850906 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-etc-sysconfig\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.851116 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.850923 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-log-socket\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.851116 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.850938 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-etc-systemd\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.851116 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.850960 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-sys\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.851116 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.850989 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-run-systemd\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.851116 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851002 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-log-socket\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.851116 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851014 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-etc-systemd\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.851116 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851003 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-etc-sysconfig\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.851116 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851042 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-run-systemd\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.851116 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851011 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1685d860-db2e-419d-b016-516c1932fa2f-env-overrides\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.851116 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851050 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-sys\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.851116 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851092 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.851633 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851120 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-etc-sysctl-d\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.851633 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851149 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.851633 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851158 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-host-kubelet\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.851633 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851183 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-run-ovn\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.851633 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851211 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-host-run-ovn-kubernetes\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.851633 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851226 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-host-kubelet\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.851633 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851237 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-etc-modprobe-d\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.851633 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851249 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-etc-sysctl-d\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.851633 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851261 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-lib-modules\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.851633 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851281 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-run-ovn\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.851633 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851285 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkn6b\" (UniqueName: \"kubernetes.io/projected/abf52151-090a-4499-9e78-eebbda08114e-kube-api-access-bkn6b\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.851633 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851300 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-host-run-ovn-kubernetes\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.851633 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851327 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bfef47bf-9dff-44e0-8b1a-1397bf347548-hosts-file\") pod \"node-resolver-9kn68\" (UID: \"bfef47bf-9dff-44e0-8b1a-1397bf347548\") " pod="openshift-dns/node-resolver-9kn68" Apr 22 17:53:28.851633 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851353 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-etc-modprobe-d\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.851633 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851357 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-var-lib-openvswitch\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.851633 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851384 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bfef47bf-9dff-44e0-8b1a-1397bf347548-hosts-file\") pod \"node-resolver-9kn68\" (UID: \"bfef47bf-9dff-44e0-8b1a-1397bf347548\") " pod="openshift-dns/node-resolver-9kn68" Apr 22 17:53:28.851633 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851395 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1685d860-db2e-419d-b016-516c1932fa2f-env-overrides\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.851633 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851387 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-etc-kubernetes\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.852407 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851442 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1685d860-db2e-419d-b016-516c1932fa2f-ovnkube-script-lib\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.852407 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851446 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1685d860-db2e-419d-b016-516c1932fa2f-ovnkube-config\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.852407 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851406 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-lib-modules\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.852407 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851417 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-var-lib-openvswitch\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.852407 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851479 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-etc-kubernetes\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.852407 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851507 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-host-cni-bin\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.852407 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851520 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-host-cni-bin\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.852407 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851538 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/da9cec39-924f-4446-be6f-25108e9c58ba-konnectivity-ca\") pod \"konnectivity-agent-lfws2\" (UID: \"da9cec39-924f-4446-be6f-25108e9c58ba\") " pod="kube-system/konnectivity-agent-lfws2" Apr 22 17:53:28.852407 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851563 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-run\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.852407 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851589 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wcqn\" (UniqueName: \"kubernetes.io/projected/bfef47bf-9dff-44e0-8b1a-1397bf347548-kube-api-access-4wcqn\") pod \"node-resolver-9kn68\" (UID: \"bfef47bf-9dff-44e0-8b1a-1397bf347548\") " pod="openshift-dns/node-resolver-9kn68" Apr 22 17:53:28.852407 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851614 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-host\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.852407 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851638 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-systemd-units\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.852407 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851645 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-run\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.852407 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851660 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-host-run-netns\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.852407 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851683 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-node-log\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.852407 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851707 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/abf52151-090a-4499-9e78-eebbda08114e-etc-tuned\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.852407 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851715 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-host-run-netns\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.852407 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851680 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-host\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.853268 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851738 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bfef47bf-9dff-44e0-8b1a-1397bf347548-tmp-dir\") pod \"node-resolver-9kn68\" (UID: \"bfef47bf-9dff-44e0-8b1a-1397bf347548\") " pod="openshift-dns/node-resolver-9kn68" Apr 22 17:53:28.853268 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851680 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-systemd-units\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.853268 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851764 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-node-log\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.853268 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851802 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-host-slash\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.853268 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851827 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-etc-openvswitch\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.853268 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851881 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-run-openvswitch\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.853268 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851890 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1685d860-db2e-419d-b016-516c1932fa2f-ovnkube-config\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.853268 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851909 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1685d860-db2e-419d-b016-516c1932fa2f-ovn-node-metrics-cert\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.853268 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851931 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-etc-openvswitch\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.853268 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851935 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/da9cec39-924f-4446-be6f-25108e9c58ba-agent-certs\") pod \"konnectivity-agent-lfws2\" (UID: \"da9cec39-924f-4446-be6f-25108e9c58ba\") " pod="kube-system/konnectivity-agent-lfws2" Apr 22 17:53:28.853268 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851881 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-host-slash\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.853268 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851939 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-run-openvswitch\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.853268 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851962 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/abf52151-090a-4499-9e78-eebbda08114e-tmp\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.853268 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.851997 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-host-cni-netd\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.853268 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.852028 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-etc-sysctl-conf\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.853268 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.852047 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/da9cec39-924f-4446-be6f-25108e9c58ba-konnectivity-ca\") pod \"konnectivity-agent-lfws2\" (UID: \"da9cec39-924f-4446-be6f-25108e9c58ba\") " pod="kube-system/konnectivity-agent-lfws2" Apr 22 17:53:28.853268 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.852053 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-var-lib-kubelet\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.853268 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.852055 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/bfef47bf-9dff-44e0-8b1a-1397bf347548-tmp-dir\") pod \"node-resolver-9kn68\" (UID: \"bfef47bf-9dff-44e0-8b1a-1397bf347548\") " pod="openshift-dns/node-resolver-9kn68" Apr 22 17:53:28.854181 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.852118 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/1685d860-db2e-419d-b016-516c1932fa2f-host-cni-netd\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.854181 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.852167 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-etc-sysctl-conf\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.854181 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.852172 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/abf52151-090a-4499-9e78-eebbda08114e-var-lib-kubelet\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.854181 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.854092 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/abf52151-090a-4499-9e78-eebbda08114e-etc-tuned\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.854453 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.854211 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/da9cec39-924f-4446-be6f-25108e9c58ba-agent-certs\") pod \"konnectivity-agent-lfws2\" (UID: \"da9cec39-924f-4446-be6f-25108e9c58ba\") " pod="kube-system/konnectivity-agent-lfws2" Apr 22 17:53:28.854453 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.854268 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/1685d860-db2e-419d-b016-516c1932fa2f-ovn-node-metrics-cert\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.855290 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.855274 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/abf52151-090a-4499-9e78-eebbda08114e-tmp\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.863236 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.863217 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6j62\" (UniqueName: \"kubernetes.io/projected/1685d860-db2e-419d-b016-516c1932fa2f-kube-api-access-h6j62\") pod \"ovnkube-node-9z9pj\" (UID: \"1685d860-db2e-419d-b016-516c1932fa2f\") " pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:28.863443 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.863428 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkn6b\" (UniqueName: \"kubernetes.io/projected/abf52151-090a-4499-9e78-eebbda08114e-kube-api-access-bkn6b\") pod \"tuned-55lxs\" (UID: \"abf52151-090a-4499-9e78-eebbda08114e\") " pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:28.863668 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.863653 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wcqn\" (UniqueName: \"kubernetes.io/projected/bfef47bf-9dff-44e0-8b1a-1397bf347548-kube-api-access-4wcqn\") pod \"node-resolver-9kn68\" (UID: \"bfef47bf-9dff-44e0-8b1a-1397bf347548\") " pod="openshift-dns/node-resolver-9kn68" Apr 22 17:53:28.905566 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:28.905528 2566 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde01c4bd78187d5743793fda4e118da1.slice/crio-354921718e1b340c92812bf6105ebde6d70bc7002f6e5dfbf8eab7a279a27e4b WatchSource:0}: Error finding container 354921718e1b340c92812bf6105ebde6d70bc7002f6e5dfbf8eab7a279a27e4b: Status 404 returned error can't find the container with id 354921718e1b340c92812bf6105ebde6d70bc7002f6e5dfbf8eab7a279a27e4b Apr 22 17:53:28.905888 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:28.905855 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0fb4cdf390b8d2f8fff1cf55f8e8acb.slice/crio-8bce79b64789068f6ccd62d3491bde827cd176deb693f7a6b0a4c8a83fb6c23e WatchSource:0}: Error finding container 8bce79b64789068f6ccd62d3491bde827cd176deb693f7a6b0a4c8a83fb6c23e: Status 404 returned error can't find the container with id 8bce79b64789068f6ccd62d3491bde827cd176deb693f7a6b0a4c8a83fb6c23e Apr 22 17:53:28.910488 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.910470 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 17:53:28.964365 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.964342 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-cqc2t" Apr 22 17:53:28.969429 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:28.969408 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e89ff1c_b604_4ae4_a756_badea52f84ef.slice/crio-ca6464b4789f673116f3716ff6ca1098b968a41b05fe9781856f512c6c502228 WatchSource:0}: Error finding container ca6464b4789f673116f3716ff6ca1098b968a41b05fe9781856f512c6c502228: Status 404 returned error can't find the container with id ca6464b4789f673116f3716ff6ca1098b968a41b05fe9781856f512c6c502228 Apr 22 17:53:28.975306 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.975290 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk" Apr 22 17:53:28.980445 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:28.980425 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd947eb97_b29b_4a7e_bece_a9253fffdcd0.slice/crio-5eb1a6a85025835ff5a89a973ff89f61b80e035ce7f039922a82064a2be656e9 WatchSource:0}: Error finding container 5eb1a6a85025835ff5a89a973ff89f61b80e035ce7f039922a82064a2be656e9: Status 404 returned error can't find the container with id 5eb1a6a85025835ff5a89a973ff89f61b80e035ce7f039922a82064a2be656e9 Apr 22 17:53:28.988313 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:28.988298 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zmjph" Apr 22 17:53:28.994651 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:28.994630 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc437ff89_37eb_4bee_a67b_f2918685eee5.slice/crio-e53b1bdfe55c2cfc792bb352830490f43cbdf1dd49b5fd17156080e2f723c702 WatchSource:0}: Error finding container e53b1bdfe55c2cfc792bb352830490f43cbdf1dd49b5fd17156080e2f723c702: Status 404 returned error can't find the container with id e53b1bdfe55c2cfc792bb352830490f43cbdf1dd49b5fd17156080e2f723c702 Apr 22 17:53:29.003391 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:29.003374 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2vnst" Apr 22 17:53:29.008271 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:29.008254 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90a8af9e_a3b4_4682_86d6_985e15148048.slice/crio-12f4d5af9da04290d815e2661c28f7a9d355f37d3f370ff7392967345fe1b507 WatchSource:0}: Error finding container 12f4d5af9da04290d815e2661c28f7a9d355f37d3f370ff7392967345fe1b507: Status 404 returned error can't find the container with id 12f4d5af9da04290d815e2661c28f7a9d355f37d3f370ff7392967345fe1b507 Apr 22 17:53:29.020119 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:29.020102 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-rhw5j" Apr 22 17:53:29.024993 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:29.024973 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d845f70_52b2_4607_be37_ce8250614a3f.slice/crio-56d4b1560534eeeba327fe9c39dbd7ac17020b764eaa445ceb52733afb9e756b WatchSource:0}: Error finding container 56d4b1560534eeeba327fe9c39dbd7ac17020b764eaa445ceb52733afb9e756b: Status 404 returned error can't find the container with id 56d4b1560534eeeba327fe9c39dbd7ac17020b764eaa445ceb52733afb9e756b Apr 22 17:53:29.038262 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:29.038246 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:29.043288 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:29.043270 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1685d860_db2e_419d_b016_516c1932fa2f.slice/crio-4a114ad1eb99451e9c9951c61ec2e1d15a0e207df846f3ec87214b88d2626688 WatchSource:0}: Error finding container 4a114ad1eb99451e9c9951c61ec2e1d15a0e207df846f3ec87214b88d2626688: Status 404 returned error can't find the container with id 4a114ad1eb99451e9c9951c61ec2e1d15a0e207df846f3ec87214b88d2626688 Apr 22 17:53:29.054662 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:29.054647 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-lfws2" Apr 22 17:53:29.060409 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:29.060390 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda9cec39_924f_4446_be6f_25108e9c58ba.slice/crio-f07b5c0fc6beebb28e2888750330ff917abbde6a36ceee6ab6fd672d76c7f11d WatchSource:0}: Error finding container f07b5c0fc6beebb28e2888750330ff917abbde6a36ceee6ab6fd672d76c7f11d: Status 404 returned error can't find the container with id f07b5c0fc6beebb28e2888750330ff917abbde6a36ceee6ab6fd672d76c7f11d Apr 22 17:53:29.076951 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:29.076929 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-55lxs" Apr 22 17:53:29.081437 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:29.081413 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9kn68" Apr 22 17:53:29.081893 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:29.081823 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabf52151_090a_4499_9e78_eebbda08114e.slice/crio-f72944ba88bbc711ffd8d5e0d20360430e3399b2b59b4b396f58a5d36e157dce WatchSource:0}: Error finding container f72944ba88bbc711ffd8d5e0d20360430e3399b2b59b4b396f58a5d36e157dce: Status 404 returned error can't find the container with id f72944ba88bbc711ffd8d5e0d20360430e3399b2b59b4b396f58a5d36e157dce Apr 22 17:53:29.086650 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:53:29.086632 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfef47bf_9dff_44e0_8b1a_1397bf347548.slice/crio-4bb05a80fa636a8f94573fe762691dcd847b3ed348a8aad361e3db47f6330326 WatchSource:0}: Error finding container 
4bb05a80fa636a8f94573fe762691dcd847b3ed348a8aad361e3db47f6330326: Status 404 returned error can't find the container with id 4bb05a80fa636a8f94573fe762691dcd847b3ed348a8aad361e3db47f6330326 Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:29.088014 2566 kuberuntime_manager.go:1358] "Unhandled Error" err=< Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:331caf7efdfbf739b1585570e0004ebd8b5301a6977fbc7b2c64a07475354bc8,Command:[/bin/bash -c #!/bin/bash Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: set -uo pipefail Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: HOSTS_FILE="/etc/hosts" Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: TEMP_FILE="/tmp/hosts.tmp" Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: IFS=', ' read -r -a services <<< "${SERVICES}" Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: # Make a temporary file with the old hosts file's attributes. Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: echo "Failed to preserve hosts file. Exiting." 
Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: exit 1 Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: fi Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: while true; do Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: declare -A svc_ips Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: for svc in "${services[@]}"; do Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: # Fetch service IP from cluster dns if present. We make several tries Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: # support UDP loadbalancers and require reaching DNS through TCP. Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: for i in ${!cmds[*]} Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: do Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: ips=($(eval "${cmds[i]}")) Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: svc_ips["${svc}"]="${ips[@]}" Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: break Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: fi Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: done Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: done Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: # Update /etc/hosts only if we get valid service IPs Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: # Stale entries could exist in /etc/hosts if the service is deleted Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: if [[ -n "${svc_ips[*]-}" ]]; then Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: # Only continue rebuilding the hosts entries if its original content is preserved Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: sleep 60 & wait Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: continue Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: fi Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: # Append resolver entries for services Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: rc=0 Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: for svc in "${!svc_ips[@]}"; do Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: for ip in ${svc_ips[${svc}]}; do Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: done Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: done Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: if [[ $rc -ne 0 ]]; then Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: sleep 60 & wait Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: continue Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: fi Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: # Replace /etc/hosts with our modified version if needed Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: # TEMP_FILE is not 
removed to avoid file create/delete and attributes copy churn
Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: fi
Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: sleep 60 & wait
Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: unset svc_ips
Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: done
Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:172.31.0.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tmp-dir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4wcqn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-9kn68_openshift-dns(bfef47bf-9dff-44e0-8b1a-1397bf347548): ErrImagePull: pull QPS exceeded
Apr 22 17:53:29.088037 ip-10-0-132-106 kubenswrapper[2566]: > logger="UnhandledError"
Apr 22 17:53:29.089880 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:29.089134 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with ErrImagePull: \"pull QPS exceeded\"" pod="openshift-dns/node-resolver-9kn68" podUID="bfef47bf-9dff-44e0-8b1a-1397bf347548"
Apr 22 17:53:29.254430 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:29.254380 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs\") pod \"network-metrics-daemon-dv96w\" (UID: \"92650e2d-54ea-4904-8ee5-235164ed2949\") " pod="openshift-multus/network-metrics-daemon-dv96w"
Apr 22 17:53:29.254516 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:29.254492 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:29.254563 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:29.254548 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs podName:92650e2d-54ea-4904-8ee5-235164ed2949 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:30.254527153 +0000 UTC m=+3.054420355 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs") pod "network-metrics-daemon-dv96w" (UID: "92650e2d-54ea-4904-8ee5-235164ed2949") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:29.354997 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:29.354969 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ll6tj\" (UniqueName: \"kubernetes.io/projected/6fb74441-7ec5-4482-ad08-21d23adeeb37-kube-api-access-ll6tj\") pod \"network-check-target-ngcz8\" (UID: \"6fb74441-7ec5-4482-ad08-21d23adeeb37\") " pod="openshift-network-diagnostics/network-check-target-ngcz8"
Apr 22 17:53:29.355101 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:29.355068 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:53:29.355101 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:29.355089 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:53:29.355101 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:29.355101 2566 projected.go:194] Error preparing data for projected volume kube-api-access-ll6tj for pod openshift-network-diagnostics/network-check-target-ngcz8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:29.355253 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:29.355158 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fb74441-7ec5-4482-ad08-21d23adeeb37-kube-api-access-ll6tj podName:6fb74441-7ec5-4482-ad08-21d23adeeb37 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:30.355140343 +0000 UTC m=+3.155033545 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ll6tj" (UniqueName: "kubernetes.io/projected/6fb74441-7ec5-4482-ad08-21d23adeeb37-kube-api-access-ll6tj") pod "network-check-target-ngcz8" (UID: "6fb74441-7ec5-4482-ad08-21d23adeeb37") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:29.693462 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:29.693375 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:48:28 +0000 UTC" deadline="2027-09-20 15:16:30.741755806 +0000 UTC"
Apr 22 17:53:29.693462 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:29.693414 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12381h23m1.04834544s"
Apr 22 17:53:29.777679 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:29.777646 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngcz8"
Apr 22 17:53:29.777842 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:29.777794 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngcz8" podUID="6fb74441-7ec5-4482-ad08-21d23adeeb37"
Apr 22 17:53:29.807123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:29.807058 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-55lxs" event={"ID":"abf52151-090a-4499-9e78-eebbda08114e","Type":"ContainerStarted","Data":"f72944ba88bbc711ffd8d5e0d20360430e3399b2b59b4b396f58a5d36e157dce"}
Apr 22 17:53:29.815433 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:29.815407 2566 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:53:29.823581 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:29.823553 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rhw5j" event={"ID":"0d845f70-52b2-4607-be37-ce8250614a3f","Type":"ContainerStarted","Data":"56d4b1560534eeeba327fe9c39dbd7ac17020b764eaa445ceb52733afb9e756b"}
Apr 22 17:53:29.837676 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:29.837650 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2vnst" event={"ID":"90a8af9e-a3b4-4682-86d6-985e15148048","Type":"ContainerStarted","Data":"12f4d5af9da04290d815e2661c28f7a9d355f37d3f370ff7392967345fe1b507"}
Apr 22 17:53:29.848153 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:29.848126 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zmjph" event={"ID":"c437ff89-37eb-4bee-a67b-f2918685eee5","Type":"ContainerStarted","Data":"e53b1bdfe55c2cfc792bb352830490f43cbdf1dd49b5fd17156080e2f723c702"}
Apr 22 17:53:29.864081 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:29.864039 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-106.ec2.internal" event={"ID":"e0fb4cdf390b8d2f8fff1cf55f8e8acb","Type":"ContainerStarted","Data":"8bce79b64789068f6ccd62d3491bde827cd176deb693f7a6b0a4c8a83fb6c23e"}
Apr 22 17:53:29.881875 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:29.881835 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-106.ec2.internal" event={"ID":"de01c4bd78187d5743793fda4e118da1","Type":"ContainerStarted","Data":"354921718e1b340c92812bf6105ebde6d70bc7002f6e5dfbf8eab7a279a27e4b"}
Apr 22 17:53:29.892776 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:29.892738 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lfws2" event={"ID":"da9cec39-924f-4446-be6f-25108e9c58ba","Type":"ContainerStarted","Data":"f07b5c0fc6beebb28e2888750330ff917abbde6a36ceee6ab6fd672d76c7f11d"}
Apr 22 17:53:29.927028 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:29.926997 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" event={"ID":"1685d860-db2e-419d-b016-516c1932fa2f","Type":"ContainerStarted","Data":"4a114ad1eb99451e9c9951c61ec2e1d15a0e207df846f3ec87214b88d2626688"}
Apr 22 17:53:29.930708 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:29.930682 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk" event={"ID":"d947eb97-b29b-4a7e-bece-a9253fffdcd0","Type":"ContainerStarted","Data":"5eb1a6a85025835ff5a89a973ff89f61b80e035ce7f039922a82064a2be656e9"}
Apr 22 17:53:29.945093 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:29.945025 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cqc2t" event={"ID":"9e89ff1c-b604-4ae4-a756-badea52f84ef","Type":"ContainerStarted","Data":"ca6464b4789f673116f3716ff6ca1098b968a41b05fe9781856f512c6c502228"}
Apr 22 17:53:29.952698 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:29.952672 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9kn68" event={"ID":"bfef47bf-9dff-44e0-8b1a-1397bf347548","Type":"ContainerStarted","Data":"4bb05a80fa636a8f94573fe762691dcd847b3ed348a8aad361e3db47f6330326"}
Apr 22 17:53:29.964642 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:29.964612 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:331caf7efdfbf739b1585570e0004ebd8b5301a6977fbc7b2c64a07475354bc8\\\": ErrImagePull: pull QPS exceeded\"" pod="openshift-dns/node-resolver-9kn68" podUID="bfef47bf-9dff-44e0-8b1a-1397bf347548"
Apr 22 17:53:29.973272 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:29.973252 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:53:30.266814 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:30.266737 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs\") pod \"network-metrics-daemon-dv96w\" (UID: \"92650e2d-54ea-4904-8ee5-235164ed2949\") " pod="openshift-multus/network-metrics-daemon-dv96w"
Apr 22 17:53:30.267000 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:30.266911 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:30.267000 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:30.266982 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs podName:92650e2d-54ea-4904-8ee5-235164ed2949 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:32.266962356 +0000 UTC m=+5.066855577 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs") pod "network-metrics-daemon-dv96w" (UID: "92650e2d-54ea-4904-8ee5-235164ed2949") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:30.367453 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:30.367410 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ll6tj\" (UniqueName: \"kubernetes.io/projected/6fb74441-7ec5-4482-ad08-21d23adeeb37-kube-api-access-ll6tj\") pod \"network-check-target-ngcz8\" (UID: \"6fb74441-7ec5-4482-ad08-21d23adeeb37\") " pod="openshift-network-diagnostics/network-check-target-ngcz8"
Apr 22 17:53:30.367617 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:30.367570 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:53:30.367617 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:30.367586 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:53:30.367617 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:30.367598 2566 projected.go:194] Error preparing data for projected volume kube-api-access-ll6tj for pod openshift-network-diagnostics/network-check-target-ngcz8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:30.367765 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:30.367658 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fb74441-7ec5-4482-ad08-21d23adeeb37-kube-api-access-ll6tj podName:6fb74441-7ec5-4482-ad08-21d23adeeb37 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:32.367633479 +0000 UTC m=+5.167526683 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ll6tj" (UniqueName: "kubernetes.io/projected/6fb74441-7ec5-4482-ad08-21d23adeeb37-kube-api-access-ll6tj") pod "network-check-target-ngcz8" (UID: "6fb74441-7ec5-4482-ad08-21d23adeeb37") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:30.694675 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:30.694569 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 17:48:28 +0000 UTC" deadline="2028-01-28 15:39:20.434326334 +0000 UTC"
Apr 22 17:53:30.694675 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:30.694607 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15501h45m49.739723029s"
Apr 22 17:53:30.751190 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:30.750837 2566 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:53:30.775629 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:30.775171 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv96w"
Apr 22 17:53:30.775629 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:30.775302 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv96w" podUID="92650e2d-54ea-4904-8ee5-235164ed2949"
Apr 22 17:53:30.865215 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:30.865189 2566 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 17:53:31.777415 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:31.775310 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngcz8"
Apr 22 17:53:31.777415 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:31.775438 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngcz8" podUID="6fb74441-7ec5-4482-ad08-21d23adeeb37"
Apr 22 17:53:32.283855 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:32.283801 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs\") pod \"network-metrics-daemon-dv96w\" (UID: \"92650e2d-54ea-4904-8ee5-235164ed2949\") " pod="openshift-multus/network-metrics-daemon-dv96w"
Apr 22 17:53:32.284055 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:32.283958 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:32.284055 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:32.284026 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs podName:92650e2d-54ea-4904-8ee5-235164ed2949 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:36.284007648 +0000 UTC m=+9.083900859 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs") pod "network-metrics-daemon-dv96w" (UID: "92650e2d-54ea-4904-8ee5-235164ed2949") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:32.384986 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:32.384950 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ll6tj\" (UniqueName: \"kubernetes.io/projected/6fb74441-7ec5-4482-ad08-21d23adeeb37-kube-api-access-ll6tj\") pod \"network-check-target-ngcz8\" (UID: \"6fb74441-7ec5-4482-ad08-21d23adeeb37\") " pod="openshift-network-diagnostics/network-check-target-ngcz8"
Apr 22 17:53:32.385147 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:32.385116 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:53:32.385147 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:32.385135 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:53:32.385223 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:32.385149 2566 projected.go:194] Error preparing data for projected volume kube-api-access-ll6tj for pod openshift-network-diagnostics/network-check-target-ngcz8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:32.385223 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:32.385211 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fb74441-7ec5-4482-ad08-21d23adeeb37-kube-api-access-ll6tj podName:6fb74441-7ec5-4482-ad08-21d23adeeb37 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:36.385191835 +0000 UTC m=+9.185085039 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ll6tj" (UniqueName: "kubernetes.io/projected/6fb74441-7ec5-4482-ad08-21d23adeeb37-kube-api-access-ll6tj") pod "network-check-target-ngcz8" (UID: "6fb74441-7ec5-4482-ad08-21d23adeeb37") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:32.775267 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:32.775189 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv96w"
Apr 22 17:53:32.775434 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:32.775321 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv96w" podUID="92650e2d-54ea-4904-8ee5-235164ed2949"
Apr 22 17:53:33.776131 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:33.776099 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngcz8"
Apr 22 17:53:33.776565 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:33.776232 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngcz8" podUID="6fb74441-7ec5-4482-ad08-21d23adeeb37"
Apr 22 17:53:34.774816 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:34.774787 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv96w"
Apr 22 17:53:34.774996 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:34.774933 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv96w" podUID="92650e2d-54ea-4904-8ee5-235164ed2949"
Apr 22 17:53:35.777468 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:35.777436 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngcz8"
Apr 22 17:53:35.777910 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:35.777554 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngcz8" podUID="6fb74441-7ec5-4482-ad08-21d23adeeb37"
Apr 22 17:53:36.319231 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:36.319168 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs\") pod \"network-metrics-daemon-dv96w\" (UID: \"92650e2d-54ea-4904-8ee5-235164ed2949\") " pod="openshift-multus/network-metrics-daemon-dv96w"
Apr 22 17:53:36.319399 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:36.319288 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:36.319399 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:36.319371 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs podName:92650e2d-54ea-4904-8ee5-235164ed2949 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:44.31934911 +0000 UTC m=+17.119242312 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs") pod "network-metrics-daemon-dv96w" (UID: "92650e2d-54ea-4904-8ee5-235164ed2949") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:36.419708 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:36.419673 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ll6tj\" (UniqueName: \"kubernetes.io/projected/6fb74441-7ec5-4482-ad08-21d23adeeb37-kube-api-access-ll6tj\") pod \"network-check-target-ngcz8\" (UID: \"6fb74441-7ec5-4482-ad08-21d23adeeb37\") " pod="openshift-network-diagnostics/network-check-target-ngcz8"
Apr 22 17:53:36.419901 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:36.419852 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:53:36.419901 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:36.419890 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:53:36.419901 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:36.419903 2566 projected.go:194] Error preparing data for projected volume kube-api-access-ll6tj for pod openshift-network-diagnostics/network-check-target-ngcz8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:36.420058 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:36.419958 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fb74441-7ec5-4482-ad08-21d23adeeb37-kube-api-access-ll6tj podName:6fb74441-7ec5-4482-ad08-21d23adeeb37 nodeName:}" failed. No retries permitted until 2026-04-22 17:53:44.419940043 +0000 UTC m=+17.219833247 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ll6tj" (UniqueName: "kubernetes.io/projected/6fb74441-7ec5-4482-ad08-21d23adeeb37-kube-api-access-ll6tj") pod "network-check-target-ngcz8" (UID: "6fb74441-7ec5-4482-ad08-21d23adeeb37") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:36.775382 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:36.775299 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv96w"
Apr 22 17:53:36.775545 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:36.775437 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv96w" podUID="92650e2d-54ea-4904-8ee5-235164ed2949"
Apr 22 17:53:37.776033 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:37.775855 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngcz8"
Apr 22 17:53:37.776033 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:37.775982 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngcz8" podUID="6fb74441-7ec5-4482-ad08-21d23adeeb37"
Apr 22 17:53:38.775283 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:38.775250 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv96w"
Apr 22 17:53:38.775451 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:38.775369 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv96w" podUID="92650e2d-54ea-4904-8ee5-235164ed2949"
Apr 22 17:53:39.775403 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:39.775370 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngcz8"
Apr 22 17:53:39.775817 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:39.775472 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngcz8" podUID="6fb74441-7ec5-4482-ad08-21d23adeeb37"
Apr 22 17:53:40.775432 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:40.775404 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv96w"
Apr 22 17:53:40.775902 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:40.775539 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv96w" podUID="92650e2d-54ea-4904-8ee5-235164ed2949"
Apr 22 17:53:41.775132 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:41.775097 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngcz8"
Apr 22 17:53:41.775373 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:41.775212 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngcz8" podUID="6fb74441-7ec5-4482-ad08-21d23adeeb37"
Apr 22 17:53:42.775172 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:42.775139 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv96w"
Apr 22 17:53:42.775575 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:42.775259 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv96w" podUID="92650e2d-54ea-4904-8ee5-235164ed2949"
Apr 22 17:53:43.774912 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:43.774880 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngcz8"
Apr 22 17:53:43.775085 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:43.774997 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngcz8" podUID="6fb74441-7ec5-4482-ad08-21d23adeeb37"
Apr 22 17:53:44.380887 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:44.380839 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs\") pod \"network-metrics-daemon-dv96w\" (UID: \"92650e2d-54ea-4904-8ee5-235164ed2949\") " pod="openshift-multus/network-metrics-daemon-dv96w"
Apr 22 17:53:44.381292 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:44.380989 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:44.381292 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:44.381061 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs podName:92650e2d-54ea-4904-8ee5-235164ed2949 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:00.381042788 +0000 UTC m=+33.180936003 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs") pod "network-metrics-daemon-dv96w" (UID: "92650e2d-54ea-4904-8ee5-235164ed2949") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 17:53:44.481925 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:44.481891 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ll6tj\" (UniqueName: \"kubernetes.io/projected/6fb74441-7ec5-4482-ad08-21d23adeeb37-kube-api-access-ll6tj\") pod \"network-check-target-ngcz8\" (UID: \"6fb74441-7ec5-4482-ad08-21d23adeeb37\") " pod="openshift-network-diagnostics/network-check-target-ngcz8"
Apr 22 17:53:44.482098 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:44.482036 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 17:53:44.482098 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:44.482050 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 17:53:44.482098 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:44.482060 2566 projected.go:194] Error preparing data for projected volume kube-api-access-ll6tj for pod openshift-network-diagnostics/network-check-target-ngcz8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:44.482243 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:44.482111 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fb74441-7ec5-4482-ad08-21d23adeeb37-kube-api-access-ll6tj podName:6fb74441-7ec5-4482-ad08-21d23adeeb37 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:00.482094599 +0000 UTC m=+33.281987797 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ll6tj" (UniqueName: "kubernetes.io/projected/6fb74441-7ec5-4482-ad08-21d23adeeb37-kube-api-access-ll6tj") pod "network-check-target-ngcz8" (UID: "6fb74441-7ec5-4482-ad08-21d23adeeb37") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 17:53:44.775561 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:44.775488 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv96w"
Apr 22 17:53:44.775702 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:44.775598 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv96w" podUID="92650e2d-54ea-4904-8ee5-235164ed2949"
Apr 22 17:53:45.775140 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:45.775104 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngcz8"
Apr 22 17:53:45.775572 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:45.775234 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngcz8" podUID="6fb74441-7ec5-4482-ad08-21d23adeeb37"
Apr 22 17:53:46.775430 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:46.775373 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv96w"
Apr 22 17:53:46.775717 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:46.775470 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv96w" podUID="92650e2d-54ea-4904-8ee5-235164ed2949"
Apr 22 17:53:47.776651 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:47.776352 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngcz8"
Apr 22 17:53:47.777368 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:47.776757 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-ngcz8" podUID="6fb74441-7ec5-4482-ad08-21d23adeeb37" Apr 22 17:53:47.992866 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:47.992838 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9kn68" event={"ID":"bfef47bf-9dff-44e0-8b1a-1397bf347548","Type":"ContainerStarted","Data":"f81d0f9205814a70ce5ffe362843b3a97d083752d4af1a4addaed9c14967d7ea"} Apr 22 17:53:47.996625 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:47.996600 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-55lxs" event={"ID":"abf52151-090a-4499-9e78-eebbda08114e","Type":"ContainerStarted","Data":"fcda18437c5ff188d5ff01481f12d8964f702c84e6e2f420f6377bd74a2e783a"} Apr 22 17:53:47.997795 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:47.997774 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2vnst" event={"ID":"90a8af9e-a3b4-4682-86d6-985e15148048","Type":"ContainerStarted","Data":"6bb604e226ee86aa79fc97e806ac344ad0dafb0fc866db79a54519774c1aff36"} Apr 22 17:53:48.000384 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:48.000359 2566 generic.go:358] "Generic (PLEG): container finished" podID="c437ff89-37eb-4bee-a67b-f2918685eee5" containerID="4c11887b9eda1926fc073e19a5a4596a945f14f59b3c134256fd9c566f1867b3" exitCode=0 Apr 22 17:53:48.000491 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:48.000450 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zmjph" event={"ID":"c437ff89-37eb-4bee-a67b-f2918685eee5","Type":"ContainerDied","Data":"4c11887b9eda1926fc073e19a5a4596a945f14f59b3c134256fd9c566f1867b3"} Apr 22 17:53:48.002220 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:48.002200 2566 generic.go:358] "Generic (PLEG): container finished" podID="e0fb4cdf390b8d2f8fff1cf55f8e8acb" containerID="e662896be0ae719b38598d27a301f823f23af90b07b0197af050ae2da1c6227e" 
exitCode=0 Apr 22 17:53:48.002307 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:48.002271 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-106.ec2.internal" event={"ID":"e0fb4cdf390b8d2f8fff1cf55f8e8acb","Type":"ContainerDied","Data":"e662896be0ae719b38598d27a301f823f23af90b07b0197af050ae2da1c6227e"} Apr 22 17:53:48.003578 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:48.003549 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-106.ec2.internal" event={"ID":"de01c4bd78187d5743793fda4e118da1","Type":"ContainerStarted","Data":"7f29a7196fed8690eaf7ed85a81e1b513483c3994240c4d12c3ca302850844eb"} Apr 22 17:53:48.004811 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:48.004787 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lfws2" event={"ID":"da9cec39-924f-4446-be6f-25108e9c58ba","Type":"ContainerStarted","Data":"bdcb368694b105200c1bda836af028691e144c235d434e29372321b59c451829"} Apr 22 17:53:48.007022 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:48.007005 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 17:53:48.007327 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:48.007309 2566 generic.go:358] "Generic (PLEG): container finished" podID="1685d860-db2e-419d-b016-516c1932fa2f" containerID="01087cdfee4c881d2beee0a4e00bafbbcc1197c91523cd0ca10d6373598414fa" exitCode=1 Apr 22 17:53:48.007398 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:48.007355 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" event={"ID":"1685d860-db2e-419d-b016-516c1932fa2f","Type":"ContainerStarted","Data":"c88eccf1fffb761d199bc055077367d2de17bc4a8f8b25ff4efdaca6a1846eb4"} Apr 22 17:53:48.007398 ip-10-0-132-106 kubenswrapper[2566]: I0422 
17:53:48.007379 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" event={"ID":"1685d860-db2e-419d-b016-516c1932fa2f","Type":"ContainerStarted","Data":"19273cdcb535fedf761a1aca50fc3ca222208f9942cd485ff0affb1f39564c97"} Apr 22 17:53:48.007398 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:48.007392 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" event={"ID":"1685d860-db2e-419d-b016-516c1932fa2f","Type":"ContainerStarted","Data":"8d714b171e7c9628f4c2030ce16417cf734dea57d1bcbd4379f22d4448dd5613"} Apr 22 17:53:48.007518 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:48.007402 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" event={"ID":"1685d860-db2e-419d-b016-516c1932fa2f","Type":"ContainerStarted","Data":"1e855d17bebb17cee412bd2544fd057805cfb10dab6e2ded8125dd5a4d86c1c5"} Apr 22 17:53:48.007518 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:48.007411 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" event={"ID":"1685d860-db2e-419d-b016-516c1932fa2f","Type":"ContainerDied","Data":"01087cdfee4c881d2beee0a4e00bafbbcc1197c91523cd0ca10d6373598414fa"} Apr 22 17:53:48.007518 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:48.007420 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" event={"ID":"1685d860-db2e-419d-b016-516c1932fa2f","Type":"ContainerStarted","Data":"52832ff33aeeee65388b7418f877433fe74407bc6556b72cb2d3c528c2a64920"} Apr 22 17:53:48.008455 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:48.008416 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9kn68" podStartSLOduration=-9223372015.84637 podStartE2EDuration="21.00840538s" podCreationTimestamp="2026-04-22 17:53:27 +0000 UTC" firstStartedPulling="2026-04-22 17:53:29.087901537 
+0000 UTC m=+1.887794735" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:53:48.008172056 +0000 UTC m=+20.808065275" watchObservedRunningTime="2026-04-22 17:53:48.00840538 +0000 UTC m=+20.808298600" Apr 22 17:53:48.008600 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:48.008582 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk" event={"ID":"d947eb97-b29b-4a7e-bece-a9253fffdcd0","Type":"ContainerStarted","Data":"0e5792fdb655289a68aea674d7ed86072318ab6a41232b7d25c7d628df579044"} Apr 22 17:53:48.009672 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:48.009655 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cqc2t" event={"ID":"9e89ff1c-b604-4ae4-a756-badea52f84ef","Type":"ContainerStarted","Data":"ec6a01a7c9c66ea5ef64e1e73c2960059262159db4a5379a123a51eb2315a187"} Apr 22 17:53:48.022357 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:48.022325 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2vnst" podStartSLOduration=3.21361958 podStartE2EDuration="21.022315929s" podCreationTimestamp="2026-04-22 17:53:27 +0000 UTC" firstStartedPulling="2026-04-22 17:53:29.009640757 +0000 UTC m=+1.809533956" lastFinishedPulling="2026-04-22 17:53:46.818337098 +0000 UTC m=+19.618230305" observedRunningTime="2026-04-22 17:53:48.021962114 +0000 UTC m=+20.821855335" watchObservedRunningTime="2026-04-22 17:53:48.022315929 +0000 UTC m=+20.822209149" Apr 22 17:53:48.035380 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:48.035351 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-106.ec2.internal" podStartSLOduration=20.035341976 podStartE2EDuration="20.035341976s" podCreationTimestamp="2026-04-22 17:53:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-04-22 17:53:48.035102801 +0000 UTC m=+20.834996022" watchObservedRunningTime="2026-04-22 17:53:48.035341976 +0000 UTC m=+20.835235196" Apr 22 17:53:48.072322 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:48.072253 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-lfws2" podStartSLOduration=8.120793343 podStartE2EDuration="21.072243403s" podCreationTimestamp="2026-04-22 17:53:27 +0000 UTC" firstStartedPulling="2026-04-22 17:53:29.061463155 +0000 UTC m=+1.861356356" lastFinishedPulling="2026-04-22 17:53:42.012913218 +0000 UTC m=+14.812806416" observedRunningTime="2026-04-22 17:53:48.072070102 +0000 UTC m=+20.871963321" watchObservedRunningTime="2026-04-22 17:53:48.072243403 +0000 UTC m=+20.872136623" Apr 22 17:53:48.091742 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:48.091700 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-55lxs" podStartSLOduration=3.345778561 podStartE2EDuration="21.091690805s" podCreationTimestamp="2026-04-22 17:53:27 +0000 UTC" firstStartedPulling="2026-04-22 17:53:29.083991478 +0000 UTC m=+1.883884691" lastFinishedPulling="2026-04-22 17:53:46.829903726 +0000 UTC m=+19.629796935" observedRunningTime="2026-04-22 17:53:48.091360173 +0000 UTC m=+20.891253392" watchObservedRunningTime="2026-04-22 17:53:48.091690805 +0000 UTC m=+20.891584026" Apr 22 17:53:48.108816 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:48.108477 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-cqc2t" podStartSLOduration=2.945807011 podStartE2EDuration="21.108466045s" podCreationTimestamp="2026-04-22 17:53:27 +0000 UTC" firstStartedPulling="2026-04-22 17:53:28.970928339 +0000 UTC m=+1.770821536" lastFinishedPulling="2026-04-22 17:53:47.133587366 +0000 UTC m=+19.933480570" observedRunningTime="2026-04-22 17:53:48.108294515 +0000 UTC m=+20.908187736" 
watchObservedRunningTime="2026-04-22 17:53:48.108466045 +0000 UTC m=+20.908359266" Apr 22 17:53:48.758430 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:48.758405 2566 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 17:53:48.775255 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:48.775234 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv96w" Apr 22 17:53:48.775359 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:48.775338 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv96w" podUID="92650e2d-54ea-4904-8ee5-235164ed2949" Apr 22 17:53:49.012952 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:49.012872 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rhw5j" event={"ID":"0d845f70-52b2-4607-be37-ce8250614a3f","Type":"ContainerStarted","Data":"f2fa55020d1c489084f909a17bfd53af7272d6f96ca7dbf696d1221a99f90007"} Apr 22 17:53:49.014771 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:49.014694 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk" event={"ID":"d947eb97-b29b-4a7e-bece-a9253fffdcd0","Type":"ContainerStarted","Data":"8d1532290b2c40c8990df72288533fb7584101a55902799928ace5bb4d69fef6"} Apr 22 17:53:49.028023 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:49.027981 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-rhw5j" podStartSLOduration=4.226576438 podStartE2EDuration="22.027968525s" 
podCreationTimestamp="2026-04-22 17:53:27 +0000 UTC" firstStartedPulling="2026-04-22 17:53:29.026412818 +0000 UTC m=+1.826306020" lastFinishedPulling="2026-04-22 17:53:46.8278049 +0000 UTC m=+19.627698107" observedRunningTime="2026-04-22 17:53:49.02776326 +0000 UTC m=+21.827656484" watchObservedRunningTime="2026-04-22 17:53:49.027968525 +0000 UTC m=+21.827861745" Apr 22 17:53:49.162227 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:49.162195 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-lfws2" Apr 22 17:53:49.714779 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:49.714675 2566 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T17:53:48.758426867Z","UUID":"2416daed-1bac-441a-a31d-588a9dd05c8b","Handler":null,"Name":"","Endpoint":""} Apr 22 17:53:49.716554 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:49.716535 2566 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 17:53:49.716657 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:49.716563 2566 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 17:53:49.774884 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:49.774822 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngcz8" Apr 22 17:53:49.775012 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:49.774974 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngcz8" podUID="6fb74441-7ec5-4482-ad08-21d23adeeb37" Apr 22 17:53:50.020959 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:50.020891 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 17:53:50.021374 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:50.021299 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" event={"ID":"1685d860-db2e-419d-b016-516c1932fa2f","Type":"ContainerStarted","Data":"9eed42fb5a3b71f316d8f0ab51bc220b7e4b113edfc896cf2f8942fe8e7802c4"} Apr 22 17:53:50.023407 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:50.023382 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk" event={"ID":"d947eb97-b29b-4a7e-bece-a9253fffdcd0","Type":"ContainerStarted","Data":"423bdbb0b09f4743c90333a97f7dc9f7e14bcfb9030fe94d4faa2d0ac3acef08"} Apr 22 17:53:50.025142 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:50.025117 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-106.ec2.internal" event={"ID":"e0fb4cdf390b8d2f8fff1cf55f8e8acb","Type":"ContainerStarted","Data":"af421bad3460109d6349eabfe3e6877730e63335b19c3867d348370cfbb2c28f"} Apr 22 17:53:50.040381 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:50.040335 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jclhk" podStartSLOduration=2.286961175 podStartE2EDuration="23.040321618s" podCreationTimestamp="2026-04-22 17:53:27 +0000 UTC" firstStartedPulling="2026-04-22 17:53:28.981741579 +0000 UTC m=+1.781634778" lastFinishedPulling="2026-04-22 17:53:49.735102022 +0000 UTC m=+22.534995221" observedRunningTime="2026-04-22 
17:53:50.040031466 +0000 UTC m=+22.839924684" watchObservedRunningTime="2026-04-22 17:53:50.040321618 +0000 UTC m=+22.840214838" Apr 22 17:53:50.054174 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:50.054132 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-106.ec2.internal" podStartSLOduration=22.054117062 podStartE2EDuration="22.054117062s" podCreationTimestamp="2026-04-22 17:53:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:53:50.053940178 +0000 UTC m=+22.853833398" watchObservedRunningTime="2026-04-22 17:53:50.054117062 +0000 UTC m=+22.854010283" Apr 22 17:53:50.774885 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:50.774844 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv96w" Apr 22 17:53:50.775061 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:50.774962 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv96w" podUID="92650e2d-54ea-4904-8ee5-235164ed2949" Apr 22 17:53:51.741654 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:51.741624 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-lfws2" Apr 22 17:53:51.742259 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:51.742233 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-lfws2" Apr 22 17:53:51.775248 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:51.775218 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngcz8" Apr 22 17:53:51.775373 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:51.775324 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngcz8" podUID="6fb74441-7ec5-4482-ad08-21d23adeeb37" Apr 22 17:53:52.031933 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:52.031781 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 17:53:52.032548 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:52.032319 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" event={"ID":"1685d860-db2e-419d-b016-516c1932fa2f","Type":"ContainerStarted","Data":"d9deff03089e3d1745a353f3ebba5c3cb184ab60070d2b6199a86ffaf5cfb8e1"} Apr 22 17:53:52.032740 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:52.032710 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:52.032740 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:52.032737 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:52.033226 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:52.032751 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:52.033226 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:52.032976 2566 scope.go:117] "RemoveContainer" containerID="01087cdfee4c881d2beee0a4e00bafbbcc1197c91523cd0ca10d6373598414fa" Apr 
22 17:53:52.036502 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:52.036484 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-lfws2" Apr 22 17:53:52.049982 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:52.049852 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:52.050433 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:52.050418 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" Apr 22 17:53:52.775323 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:52.775297 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv96w" Apr 22 17:53:52.775896 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:52.775388 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dv96w" podUID="92650e2d-54ea-4904-8ee5-235164ed2949" Apr 22 17:53:53.040439 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:53.040369 2566 generic.go:358] "Generic (PLEG): container finished" podID="c437ff89-37eb-4bee-a67b-f2918685eee5" containerID="eb768a7526be34870b864980f203f64d2d89e9ea28f7ab40806ed76999e9aae9" exitCode=0 Apr 22 17:53:53.040566 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:53.040454 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zmjph" event={"ID":"c437ff89-37eb-4bee-a67b-f2918685eee5","Type":"ContainerDied","Data":"eb768a7526be34870b864980f203f64d2d89e9ea28f7ab40806ed76999e9aae9"} Apr 22 17:53:53.043930 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:53.043913 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 17:53:53.044238 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:53.044219 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" event={"ID":"1685d860-db2e-419d-b016-516c1932fa2f","Type":"ContainerStarted","Data":"ba0665416f4f9167e2a8eb133140077be76e313df3294a966889fea5aabb3b47"} Apr 22 17:53:53.090155 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:53.089974 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj" podStartSLOduration=8.262437545 podStartE2EDuration="26.089957385s" podCreationTimestamp="2026-04-22 17:53:27 +0000 UTC" firstStartedPulling="2026-04-22 17:53:29.044618788 +0000 UTC m=+1.844511990" lastFinishedPulling="2026-04-22 17:53:46.872138618 +0000 UTC m=+19.672031830" observedRunningTime="2026-04-22 17:53:53.089350777 +0000 UTC m=+25.889243998" watchObservedRunningTime="2026-04-22 17:53:53.089957385 +0000 UTC m=+25.889850606" Apr 22 17:53:53.775401 ip-10-0-132-106 
kubenswrapper[2566]: I0422 17:53:53.775263 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngcz8" Apr 22 17:53:53.775818 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:53.775472 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngcz8" podUID="6fb74441-7ec5-4482-ad08-21d23adeeb37" Apr 22 17:53:53.866480 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:53.866456 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dv96w"] Apr 22 17:53:53.866618 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:53.866604 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv96w" Apr 22 17:53:53.866722 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:53.866702 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-dv96w" podUID="92650e2d-54ea-4904-8ee5-235164ed2949" Apr 22 17:53:53.869062 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:53.869042 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ngcz8"] Apr 22 17:53:54.048171 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:54.048147 2566 generic.go:358] "Generic (PLEG): container finished" podID="c437ff89-37eb-4bee-a67b-f2918685eee5" containerID="0b1e8556e1e7681207a61e54749e2d048c83ae79af45ebdc42a1fc5852fe68c1" exitCode=0 Apr 22 17:53:54.048316 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:54.048219 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngcz8" Apr 22 17:53:54.048316 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:54.048219 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zmjph" event={"ID":"c437ff89-37eb-4bee-a67b-f2918685eee5","Type":"ContainerDied","Data":"0b1e8556e1e7681207a61e54749e2d048c83ae79af45ebdc42a1fc5852fe68c1"} Apr 22 17:53:54.048471 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:54.048448 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ngcz8" podUID="6fb74441-7ec5-4482-ad08-21d23adeeb37" Apr 22 17:53:55.052428 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:55.052394 2566 generic.go:358] "Generic (PLEG): container finished" podID="c437ff89-37eb-4bee-a67b-f2918685eee5" containerID="31e748266a5c3c4a0cf8cb71d8dead595ea4e00abb4c1a85b4242b86b570be54" exitCode=0 Apr 22 17:53:55.052801 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:55.052467 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zmjph" event={"ID":"c437ff89-37eb-4bee-a67b-f2918685eee5","Type":"ContainerDied","Data":"31e748266a5c3c4a0cf8cb71d8dead595ea4e00abb4c1a85b4242b86b570be54"} Apr 22 17:53:55.774929 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:55.774893 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv96w" Apr 22 17:53:55.775088 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:55.774946 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngcz8" Apr 22 17:53:55.775088 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:55.775036 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv96w" podUID="92650e2d-54ea-4904-8ee5-235164ed2949" Apr 22 17:53:55.775468 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:55.775404 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngcz8" podUID="6fb74441-7ec5-4482-ad08-21d23adeeb37" Apr 22 17:53:57.775944 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:57.775913 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv96w" Apr 22 17:53:57.776378 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:57.776018 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv96w" podUID="92650e2d-54ea-4904-8ee5-235164ed2949" Apr 22 17:53:57.776378 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:57.776052 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngcz8" Apr 22 17:53:57.776378 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:57.776147 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngcz8" podUID="6fb74441-7ec5-4482-ad08-21d23adeeb37" Apr 22 17:53:59.775212 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:59.775174 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngcz8" Apr 22 17:53:59.775615 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:53:59.775174 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv96w" Apr 22 17:53:59.775615 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:59.775310 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ngcz8" podUID="6fb74441-7ec5-4482-ad08-21d23adeeb37" Apr 22 17:53:59.776674 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:53:59.775745 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dv96w" podUID="92650e2d-54ea-4904-8ee5-235164ed2949" Apr 22 17:54:00.008150 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.008075 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-106.ec2.internal" event="NodeReady" Apr 22 17:54:00.008286 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.008204 2566 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 17:54:00.059726 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.059684 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rbrk7"] Apr 22 17:54:00.092015 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.091992 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hmcpt"] Apr 22 17:54:00.092173 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.092137 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rbrk7" Apr 22 17:54:00.094903 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.094880 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 17:54:00.095012 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.094916 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 17:54:00.095168 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.095149 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-frn4n\"" Apr 22 17:54:00.095244 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.095168 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 17:54:00.103346 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.103328 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rbrk7"] Apr 22 17:54:00.103462 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.103361 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hmcpt"] Apr 22 17:54:00.103507 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.103466 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-hmcpt" Apr 22 17:54:00.106355 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.106335 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-r5nxd\"" Apr 22 17:54:00.106456 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.106369 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 17:54:00.106635 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.106616 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 17:54:00.195823 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.195791 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls\") pod \"dns-default-hmcpt\" (UID: \"705dd2ce-2ac7-4745-a314-14e119a14624\") " pod="openshift-dns/dns-default-hmcpt" Apr 22 17:54:00.195999 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.195882 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l64xj\" (UniqueName: \"kubernetes.io/projected/705dd2ce-2ac7-4745-a314-14e119a14624-kube-api-access-l64xj\") pod \"dns-default-hmcpt\" (UID: \"705dd2ce-2ac7-4745-a314-14e119a14624\") " pod="openshift-dns/dns-default-hmcpt" Apr 22 17:54:00.195999 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.195942 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/705dd2ce-2ac7-4745-a314-14e119a14624-config-volume\") pod \"dns-default-hmcpt\" (UID: \"705dd2ce-2ac7-4745-a314-14e119a14624\") " pod="openshift-dns/dns-default-hmcpt" Apr 22 17:54:00.196106 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.196038 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert\") pod \"ingress-canary-rbrk7\" (UID: \"aa418e11-9a0e-463a-8262-e078bca4e7a8\") " pod="openshift-ingress-canary/ingress-canary-rbrk7" Apr 22 17:54:00.196106 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.196070 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwxfq\" (UniqueName: \"kubernetes.io/projected/aa418e11-9a0e-463a-8262-e078bca4e7a8-kube-api-access-lwxfq\") pod \"ingress-canary-rbrk7\" (UID: \"aa418e11-9a0e-463a-8262-e078bca4e7a8\") " pod="openshift-ingress-canary/ingress-canary-rbrk7" Apr 22 17:54:00.196223 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.196130 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/705dd2ce-2ac7-4745-a314-14e119a14624-tmp-dir\") pod \"dns-default-hmcpt\" (UID: \"705dd2ce-2ac7-4745-a314-14e119a14624\") " pod="openshift-dns/dns-default-hmcpt" Apr 22 17:54:00.297225 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.297153 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls\") pod \"dns-default-hmcpt\" (UID: \"705dd2ce-2ac7-4745-a314-14e119a14624\") " pod="openshift-dns/dns-default-hmcpt" Apr 22 17:54:00.297225 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.297197 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l64xj\" (UniqueName: \"kubernetes.io/projected/705dd2ce-2ac7-4745-a314-14e119a14624-kube-api-access-l64xj\") pod \"dns-default-hmcpt\" (UID: \"705dd2ce-2ac7-4745-a314-14e119a14624\") " pod="openshift-dns/dns-default-hmcpt" Apr 22 17:54:00.297225 ip-10-0-132-106 kubenswrapper[2566]: 
I0422 17:54:00.297227 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/705dd2ce-2ac7-4745-a314-14e119a14624-config-volume\") pod \"dns-default-hmcpt\" (UID: \"705dd2ce-2ac7-4745-a314-14e119a14624\") " pod="openshift-dns/dns-default-hmcpt" Apr 22 17:54:00.297476 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.297266 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert\") pod \"ingress-canary-rbrk7\" (UID: \"aa418e11-9a0e-463a-8262-e078bca4e7a8\") " pod="openshift-ingress-canary/ingress-canary-rbrk7" Apr 22 17:54:00.297476 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.297304 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwxfq\" (UniqueName: \"kubernetes.io/projected/aa418e11-9a0e-463a-8262-e078bca4e7a8-kube-api-access-lwxfq\") pod \"ingress-canary-rbrk7\" (UID: \"aa418e11-9a0e-463a-8262-e078bca4e7a8\") " pod="openshift-ingress-canary/ingress-canary-rbrk7" Apr 22 17:54:00.297476 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:00.297326 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:54:00.297476 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.297343 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/705dd2ce-2ac7-4745-a314-14e119a14624-tmp-dir\") pod \"dns-default-hmcpt\" (UID: \"705dd2ce-2ac7-4745-a314-14e119a14624\") " pod="openshift-dns/dns-default-hmcpt" Apr 22 17:54:00.297476 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:00.297393 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls podName:705dd2ce-2ac7-4745-a314-14e119a14624 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:54:00.797372031 +0000 UTC m=+33.597265240 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls") pod "dns-default-hmcpt" (UID: "705dd2ce-2ac7-4745-a314-14e119a14624") : secret "dns-default-metrics-tls" not found Apr 22 17:54:00.297476 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:00.297418 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:54:00.297476 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:00.297470 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert podName:aa418e11-9a0e-463a-8262-e078bca4e7a8 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:00.797454837 +0000 UTC m=+33.597348044 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert") pod "ingress-canary-rbrk7" (UID: "aa418e11-9a0e-463a-8262-e078bca4e7a8") : secret "canary-serving-cert" not found Apr 22 17:54:00.297993 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.297968 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/705dd2ce-2ac7-4745-a314-14e119a14624-tmp-dir\") pod \"dns-default-hmcpt\" (UID: \"705dd2ce-2ac7-4745-a314-14e119a14624\") " pod="openshift-dns/dns-default-hmcpt" Apr 22 17:54:00.298229 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.298205 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/705dd2ce-2ac7-4745-a314-14e119a14624-config-volume\") pod \"dns-default-hmcpt\" (UID: \"705dd2ce-2ac7-4745-a314-14e119a14624\") " pod="openshift-dns/dns-default-hmcpt" Apr 22 17:54:00.309289 ip-10-0-132-106 
kubenswrapper[2566]: I0422 17:54:00.309177 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l64xj\" (UniqueName: \"kubernetes.io/projected/705dd2ce-2ac7-4745-a314-14e119a14624-kube-api-access-l64xj\") pod \"dns-default-hmcpt\" (UID: \"705dd2ce-2ac7-4745-a314-14e119a14624\") " pod="openshift-dns/dns-default-hmcpt" Apr 22 17:54:00.309383 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.309289 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwxfq\" (UniqueName: \"kubernetes.io/projected/aa418e11-9a0e-463a-8262-e078bca4e7a8-kube-api-access-lwxfq\") pod \"ingress-canary-rbrk7\" (UID: \"aa418e11-9a0e-463a-8262-e078bca4e7a8\") " pod="openshift-ingress-canary/ingress-canary-rbrk7" Apr 22 17:54:00.398173 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.398146 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs\") pod \"network-metrics-daemon-dv96w\" (UID: \"92650e2d-54ea-4904-8ee5-235164ed2949\") " pod="openshift-multus/network-metrics-daemon-dv96w" Apr 22 17:54:00.398316 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:00.398296 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:54:00.398379 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:00.398369 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs podName:92650e2d-54ea-4904-8ee5-235164ed2949 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:32.398350529 +0000 UTC m=+65.198243746 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs") pod "network-metrics-daemon-dv96w" (UID: "92650e2d-54ea-4904-8ee5-235164ed2949") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 17:54:00.499237 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.499212 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ll6tj\" (UniqueName: \"kubernetes.io/projected/6fb74441-7ec5-4482-ad08-21d23adeeb37-kube-api-access-ll6tj\") pod \"network-check-target-ngcz8\" (UID: \"6fb74441-7ec5-4482-ad08-21d23adeeb37\") " pod="openshift-network-diagnostics/network-check-target-ngcz8" Apr 22 17:54:00.499422 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:00.499403 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 17:54:00.499470 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:00.499429 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 17:54:00.499470 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:00.499445 2566 projected.go:194] Error preparing data for projected volume kube-api-access-ll6tj for pod openshift-network-diagnostics/network-check-target-ngcz8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:54:00.499569 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:00.499499 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6fb74441-7ec5-4482-ad08-21d23adeeb37-kube-api-access-ll6tj podName:6fb74441-7ec5-4482-ad08-21d23adeeb37 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:54:32.499482957 +0000 UTC m=+65.299376155 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-ll6tj" (UniqueName: "kubernetes.io/projected/6fb74441-7ec5-4482-ad08-21d23adeeb37-kube-api-access-ll6tj") pod "network-check-target-ngcz8" (UID: "6fb74441-7ec5-4482-ad08-21d23adeeb37") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 17:54:00.801288 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.801258 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert\") pod \"ingress-canary-rbrk7\" (UID: \"aa418e11-9a0e-463a-8262-e078bca4e7a8\") " pod="openshift-ingress-canary/ingress-canary-rbrk7" Apr 22 17:54:00.801700 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:00.801311 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls\") pod \"dns-default-hmcpt\" (UID: \"705dd2ce-2ac7-4745-a314-14e119a14624\") " pod="openshift-dns/dns-default-hmcpt" Apr 22 17:54:00.801700 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:00.801393 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:54:00.801700 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:00.801412 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:54:00.801700 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:00.801444 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls podName:705dd2ce-2ac7-4745-a314-14e119a14624 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:54:01.80142854 +0000 UTC m=+34.601321739 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls") pod "dns-default-hmcpt" (UID: "705dd2ce-2ac7-4745-a314-14e119a14624") : secret "dns-default-metrics-tls" not found Apr 22 17:54:00.801700 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:00.801475 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert podName:aa418e11-9a0e-463a-8262-e078bca4e7a8 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:01.801456436 +0000 UTC m=+34.601349637 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert") pod "ingress-canary-rbrk7" (UID: "aa418e11-9a0e-463a-8262-e078bca4e7a8") : secret "canary-serving-cert" not found Apr 22 17:54:01.775254 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:01.775225 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngcz8" Apr 22 17:54:01.775495 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:01.775225 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv96w" Apr 22 17:54:01.778350 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:01.778327 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 17:54:01.778456 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:01.778358 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 17:54:01.778456 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:01.778358 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 17:54:01.779714 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:01.779693 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jtfq9\"" Apr 22 17:54:01.779815 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:01.779729 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qsx25\"" Apr 22 17:54:01.808400 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:01.808377 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls\") pod \"dns-default-hmcpt\" (UID: \"705dd2ce-2ac7-4745-a314-14e119a14624\") " pod="openshift-dns/dns-default-hmcpt" Apr 22 17:54:01.808736 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:01.808422 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert\") pod \"ingress-canary-rbrk7\" (UID: \"aa418e11-9a0e-463a-8262-e078bca4e7a8\") " pod="openshift-ingress-canary/ingress-canary-rbrk7" Apr 22 17:54:01.808736 ip-10-0-132-106 kubenswrapper[2566]: 
E0422 17:54:01.808496 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:54:01.808736 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:01.808505 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:54:01.808736 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:01.808547 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls podName:705dd2ce-2ac7-4745-a314-14e119a14624 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:03.808532947 +0000 UTC m=+36.608426145 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls") pod "dns-default-hmcpt" (UID: "705dd2ce-2ac7-4745-a314-14e119a14624") : secret "dns-default-metrics-tls" not found Apr 22 17:54:01.808736 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:01.808560 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert podName:aa418e11-9a0e-463a-8262-e078bca4e7a8 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:03.808554922 +0000 UTC m=+36.608448119 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert") pod "ingress-canary-rbrk7" (UID: "aa418e11-9a0e-463a-8262-e078bca4e7a8") : secret "canary-serving-cert" not found Apr 22 17:54:02.069304 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:02.069234 2566 generic.go:358] "Generic (PLEG): container finished" podID="c437ff89-37eb-4bee-a67b-f2918685eee5" containerID="c9e043e03a99ab264a6fb2fed67e96ec78c82f16c20d5ff460110133bb776c51" exitCode=0 Apr 22 17:54:02.069442 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:02.069312 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zmjph" event={"ID":"c437ff89-37eb-4bee-a67b-f2918685eee5","Type":"ContainerDied","Data":"c9e043e03a99ab264a6fb2fed67e96ec78c82f16c20d5ff460110133bb776c51"} Apr 22 17:54:03.073914 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:03.073882 2566 generic.go:358] "Generic (PLEG): container finished" podID="c437ff89-37eb-4bee-a67b-f2918685eee5" containerID="5adfe5c30e459285bdcbc5b4bae88cef06c6365d5ccfe0d1305ab0c345291347" exitCode=0 Apr 22 17:54:03.073914 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:03.073922 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zmjph" event={"ID":"c437ff89-37eb-4bee-a67b-f2918685eee5","Type":"ContainerDied","Data":"5adfe5c30e459285bdcbc5b4bae88cef06c6365d5ccfe0d1305ab0c345291347"} Apr 22 17:54:03.820078 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:03.820050 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls\") pod \"dns-default-hmcpt\" (UID: \"705dd2ce-2ac7-4745-a314-14e119a14624\") " pod="openshift-dns/dns-default-hmcpt" Apr 22 17:54:03.820217 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:03.820095 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert\") pod \"ingress-canary-rbrk7\" (UID: \"aa418e11-9a0e-463a-8262-e078bca4e7a8\") " pod="openshift-ingress-canary/ingress-canary-rbrk7" Apr 22 17:54:03.820217 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:03.820181 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:54:03.820217 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:03.820199 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:54:03.820316 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:03.820236 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls podName:705dd2ce-2ac7-4745-a314-14e119a14624 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:07.820220615 +0000 UTC m=+40.620113813 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls") pod "dns-default-hmcpt" (UID: "705dd2ce-2ac7-4745-a314-14e119a14624") : secret "dns-default-metrics-tls" not found Apr 22 17:54:03.820316 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:03.820251 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert podName:aa418e11-9a0e-463a-8262-e078bca4e7a8 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:07.820244348 +0000 UTC m=+40.620137546 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert") pod "ingress-canary-rbrk7" (UID: "aa418e11-9a0e-463a-8262-e078bca4e7a8") : secret "canary-serving-cert" not found Apr 22 17:54:04.078306 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:04.078231 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zmjph" event={"ID":"c437ff89-37eb-4bee-a67b-f2918685eee5","Type":"ContainerStarted","Data":"da900a77fb43ec3cdbea39ff3f08fa0a456e25f4cc6ccfe4d5b62ec0f25677d2"} Apr 22 17:54:04.109816 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:04.109769 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zmjph" podStartSLOduration=5.133267861 podStartE2EDuration="37.10975374s" podCreationTimestamp="2026-04-22 17:53:27 +0000 UTC" firstStartedPulling="2026-04-22 17:53:28.996022051 +0000 UTC m=+1.795915249" lastFinishedPulling="2026-04-22 17:54:00.972507931 +0000 UTC m=+33.772401128" observedRunningTime="2026-04-22 17:54:04.108397907 +0000 UTC m=+36.908291126" watchObservedRunningTime="2026-04-22 17:54:04.10975374 +0000 UTC m=+36.909646957" Apr 22 17:54:07.847579 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:07.847547 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls\") pod \"dns-default-hmcpt\" (UID: \"705dd2ce-2ac7-4745-a314-14e119a14624\") " pod="openshift-dns/dns-default-hmcpt" Apr 22 17:54:07.847960 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:07.847594 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert\") pod \"ingress-canary-rbrk7\" (UID: \"aa418e11-9a0e-463a-8262-e078bca4e7a8\") " 
pod="openshift-ingress-canary/ingress-canary-rbrk7" Apr 22 17:54:07.847960 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:07.847688 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:54:07.847960 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:07.847746 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls podName:705dd2ce-2ac7-4745-a314-14e119a14624 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:15.847730877 +0000 UTC m=+48.647624074 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls") pod "dns-default-hmcpt" (UID: "705dd2ce-2ac7-4745-a314-14e119a14624") : secret "dns-default-metrics-tls" not found Apr 22 17:54:07.847960 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:07.847696 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:54:07.847960 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:07.847823 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert podName:aa418e11-9a0e-463a-8262-e078bca4e7a8 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:15.847810003 +0000 UTC m=+48.647703201 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert") pod "ingress-canary-rbrk7" (UID: "aa418e11-9a0e-463a-8262-e078bca4e7a8") : secret "canary-serving-cert" not found Apr 22 17:54:15.910262 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:15.910216 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls\") pod \"dns-default-hmcpt\" (UID: \"705dd2ce-2ac7-4745-a314-14e119a14624\") " pod="openshift-dns/dns-default-hmcpt" Apr 22 17:54:15.910756 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:15.910273 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert\") pod \"ingress-canary-rbrk7\" (UID: \"aa418e11-9a0e-463a-8262-e078bca4e7a8\") " pod="openshift-ingress-canary/ingress-canary-rbrk7" Apr 22 17:54:15.910756 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:15.910353 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 17:54:15.910756 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:15.910357 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 17:54:15.910756 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:15.910413 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert podName:aa418e11-9a0e-463a-8262-e078bca4e7a8 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:31.910397254 +0000 UTC m=+64.710290453 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert") pod "ingress-canary-rbrk7" (UID: "aa418e11-9a0e-463a-8262-e078bca4e7a8") : secret "canary-serving-cert" not found
Apr 22 17:54:15.910756 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:15.910448 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls podName:705dd2ce-2ac7-4745-a314-14e119a14624 nodeName:}" failed. No retries permitted until 2026-04-22 17:54:31.910422306 +0000 UTC m=+64.710315507 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls") pod "dns-default-hmcpt" (UID: "705dd2ce-2ac7-4745-a314-14e119a14624") : secret "dns-default-metrics-tls" not found
Apr 22 17:54:24.065007 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:24.064979 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9z9pj"
Apr 22 17:54:32.011018 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:32.010985 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls\") pod \"dns-default-hmcpt\" (UID: \"705dd2ce-2ac7-4745-a314-14e119a14624\") " pod="openshift-dns/dns-default-hmcpt"
Apr 22 17:54:32.011471 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:32.011030 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert\") pod \"ingress-canary-rbrk7\" (UID: \"aa418e11-9a0e-463a-8262-e078bca4e7a8\") " pod="openshift-ingress-canary/ingress-canary-rbrk7"
Apr 22 17:54:32.011471 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:32.011120 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:54:32.011471 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:32.011121 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:54:32.011471 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:32.011168 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert podName:aa418e11-9a0e-463a-8262-e078bca4e7a8 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:04.011154846 +0000 UTC m=+96.811048044 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert") pod "ingress-canary-rbrk7" (UID: "aa418e11-9a0e-463a-8262-e078bca4e7a8") : secret "canary-serving-cert" not found
Apr 22 17:54:32.011471 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:32.011180 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls podName:705dd2ce-2ac7-4745-a314-14e119a14624 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:04.011174642 +0000 UTC m=+96.811067839 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls") pod "dns-default-hmcpt" (UID: "705dd2ce-2ac7-4745-a314-14e119a14624") : secret "dns-default-metrics-tls" not found
Apr 22 17:54:32.413146 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:32.413115 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs\") pod \"network-metrics-daemon-dv96w\" (UID: \"92650e2d-54ea-4904-8ee5-235164ed2949\") " pod="openshift-multus/network-metrics-daemon-dv96w"
Apr 22 17:54:32.415841 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:32.415824 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 17:54:32.424170 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:32.424153 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 17:54:32.424248 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:54:32.424208 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs podName:92650e2d-54ea-4904-8ee5-235164ed2949 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:36.424190356 +0000 UTC m=+129.224083554 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs") pod "network-metrics-daemon-dv96w" (UID: "92650e2d-54ea-4904-8ee5-235164ed2949") : secret "metrics-daemon-secret" not found
Apr 22 17:54:32.513866 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:32.513845 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ll6tj\" (UniqueName: \"kubernetes.io/projected/6fb74441-7ec5-4482-ad08-21d23adeeb37-kube-api-access-ll6tj\") pod \"network-check-target-ngcz8\" (UID: \"6fb74441-7ec5-4482-ad08-21d23adeeb37\") " pod="openshift-network-diagnostics/network-check-target-ngcz8"
Apr 22 17:54:32.518928 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:32.518912 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 17:54:32.527834 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:32.527817 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 17:54:32.537736 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:32.537712 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll6tj\" (UniqueName: \"kubernetes.io/projected/6fb74441-7ec5-4482-ad08-21d23adeeb37-kube-api-access-ll6tj\") pod \"network-check-target-ngcz8\" (UID: \"6fb74441-7ec5-4482-ad08-21d23adeeb37\") " pod="openshift-network-diagnostics/network-check-target-ngcz8"
Apr 22 17:54:32.688266 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:32.688208 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jtfq9\""
Apr 22 17:54:32.696572 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:32.696554 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ngcz8"
Apr 22 17:54:32.846901 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:32.846874 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ngcz8"]
Apr 22 17:54:32.850405 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:54:32.850380 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fb74441_7ec5_4482_ad08_21d23adeeb37.slice/crio-7b18c83f1694b68d1c8c60d95558ab988dccee37136f1a523aca54f3264ee8ed WatchSource:0}: Error finding container 7b18c83f1694b68d1c8c60d95558ab988dccee37136f1a523aca54f3264ee8ed: Status 404 returned error can't find the container with id 7b18c83f1694b68d1c8c60d95558ab988dccee37136f1a523aca54f3264ee8ed
Apr 22 17:54:33.133847 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:33.133777 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ngcz8" event={"ID":"6fb74441-7ec5-4482-ad08-21d23adeeb37","Type":"ContainerStarted","Data":"7b18c83f1694b68d1c8c60d95558ab988dccee37136f1a523aca54f3264ee8ed"}
Apr 22 17:54:36.140927 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:36.140895 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ngcz8" event={"ID":"6fb74441-7ec5-4482-ad08-21d23adeeb37","Type":"ContainerStarted","Data":"8709170defb5f42e32193f8550dd8804c39037dc6d067d165063ae40931ac1f3"}
Apr 22 17:54:36.141309 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:36.141012 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-ngcz8"
Apr 22 17:54:36.156974 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:54:36.156924 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-ngcz8" podStartSLOduration=66.503921747 podStartE2EDuration="1m9.156913261s" podCreationTimestamp="2026-04-22 17:53:27 +0000 UTC" firstStartedPulling="2026-04-22 17:54:32.8522027 +0000 UTC m=+65.652095898" lastFinishedPulling="2026-04-22 17:54:35.505194211 +0000 UTC m=+68.305087412" observedRunningTime="2026-04-22 17:54:36.156391647 +0000 UTC m=+68.956284860" watchObservedRunningTime="2026-04-22 17:54:36.156913261 +0000 UTC m=+68.956806481"
Apr 22 17:55:04.017482 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:04.017433 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert\") pod \"ingress-canary-rbrk7\" (UID: \"aa418e11-9a0e-463a-8262-e078bca4e7a8\") " pod="openshift-ingress-canary/ingress-canary-rbrk7"
Apr 22 17:55:04.017966 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:04.017524 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls\") pod \"dns-default-hmcpt\" (UID: \"705dd2ce-2ac7-4745-a314-14e119a14624\") " pod="openshift-dns/dns-default-hmcpt"
Apr 22 17:55:04.017966 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:04.017583 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 17:55:04.017966 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:04.017629 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 17:55:04.017966 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:04.017662 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert podName:aa418e11-9a0e-463a-8262-e078bca4e7a8 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:08.017645647 +0000 UTC m=+160.817538865 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert") pod "ingress-canary-rbrk7" (UID: "aa418e11-9a0e-463a-8262-e078bca4e7a8") : secret "canary-serving-cert" not found
Apr 22 17:55:04.017966 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:04.017691 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls podName:705dd2ce-2ac7-4745-a314-14e119a14624 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:08.017674148 +0000 UTC m=+160.817567347 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls") pod "dns-default-hmcpt" (UID: "705dd2ce-2ac7-4745-a314-14e119a14624") : secret "dns-default-metrics-tls" not found
Apr 22 17:55:07.144997 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:07.144970 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-ngcz8"
Apr 22 17:55:33.543051 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.543008 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-2fnvq"]
Apr 22 17:55:33.545843 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.545826 2566 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-2fnvq"
Apr 22 17:55:33.548834 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.548810 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 22 17:55:33.548939 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.548851 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 17:55:33.550084 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.550043 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 22 17:55:33.550173 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.550066 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-5gtd2\""
Apr 22 17:55:33.550173 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.550115 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 17:55:33.557370 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.557345 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-2fnvq"]
Apr 22 17:55:33.558843 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.558824 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 22 17:55:33.620149 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.620123 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9af35975-ba53-433d-8c3b-454c55c4ffd7-snapshots\") pod \"insights-operator-585dfdc468-2fnvq\" (UID: \"9af35975-ba53-433d-8c3b-454c55c4ffd7\") " pod="openshift-insights/insights-operator-585dfdc468-2fnvq"
Apr 22 17:55:33.620245 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.620169 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9af35975-ba53-433d-8c3b-454c55c4ffd7-service-ca-bundle\") pod \"insights-operator-585dfdc468-2fnvq\" (UID: \"9af35975-ba53-433d-8c3b-454c55c4ffd7\") " pod="openshift-insights/insights-operator-585dfdc468-2fnvq"
Apr 22 17:55:33.620245 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.620223 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9af35975-ba53-433d-8c3b-454c55c4ffd7-tmp\") pod \"insights-operator-585dfdc468-2fnvq\" (UID: \"9af35975-ba53-433d-8c3b-454c55c4ffd7\") " pod="openshift-insights/insights-operator-585dfdc468-2fnvq"
Apr 22 17:55:33.620330 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.620262 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jmxv\" (UniqueName: \"kubernetes.io/projected/9af35975-ba53-433d-8c3b-454c55c4ffd7-kube-api-access-8jmxv\") pod \"insights-operator-585dfdc468-2fnvq\" (UID: \"9af35975-ba53-433d-8c3b-454c55c4ffd7\") " pod="openshift-insights/insights-operator-585dfdc468-2fnvq"
Apr 22 17:55:33.620330 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.620297 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9af35975-ba53-433d-8c3b-454c55c4ffd7-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-2fnvq\" (UID: \"9af35975-ba53-433d-8c3b-454c55c4ffd7\") " pod="openshift-insights/insights-operator-585dfdc468-2fnvq"
Apr 22 17:55:33.620330 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.620317 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9af35975-ba53-433d-8c3b-454c55c4ffd7-serving-cert\") pod \"insights-operator-585dfdc468-2fnvq\" (UID: \"9af35975-ba53-433d-8c3b-454c55c4ffd7\") " pod="openshift-insights/insights-operator-585dfdc468-2fnvq"
Apr 22 17:55:33.648785 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.648760 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-mrr8j"]
Apr 22 17:55:33.651415 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.651401 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-wplrv"]
Apr 22 17:55:33.651544 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.651528 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j"
Apr 22 17:55:33.654437 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.654421 2566 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wplrv"
Apr 22 17:55:33.654603 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.654575 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:55:33.654719 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.654596 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 22 17:55:33.654843 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.654825 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 22 17:55:33.655071 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.655055 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-7s7s7\""
Apr 22 17:55:33.655346 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.655332 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 22 17:55:33.659234 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.659214 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 22 17:55:33.659330 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.659249 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-j5q25\""
Apr 22 17:55:33.659330 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.659218 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 17:55:33.659330 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.659288 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 17:55:33.660111 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.660096 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 22 17:55:33.662209 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.662189 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 22 17:55:33.671012 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.670993 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-mrr8j"]
Apr 22 17:55:33.671922 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.671902 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-wplrv"]
Apr 22 17:55:33.720656 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.720635 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2rt2\" (UniqueName: \"kubernetes.io/projected/84a18cd9-ceac-4cc8-972b-92e3c17a262b-kube-api-access-f2rt2\") pod \"console-operator-9d4b6777b-mrr8j\" (UID: \"84a18cd9-ceac-4cc8-972b-92e3c17a262b\") " pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j"
Apr 22 17:55:33.720757 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.720662 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4f93890b-e678-4591-8604-de9e0ff75905-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-wplrv\" (UID: \"4f93890b-e678-4591-8604-de9e0ff75905\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wplrv"
Apr 22 17:55:33.720757 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.720686 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jmxv\" (UniqueName: \"kubernetes.io/projected/9af35975-ba53-433d-8c3b-454c55c4ffd7-kube-api-access-8jmxv\") pod \"insights-operator-585dfdc468-2fnvq\" (UID: \"9af35975-ba53-433d-8c3b-454c55c4ffd7\") " pod="openshift-insights/insights-operator-585dfdc468-2fnvq"
Apr 22 17:55:33.720757 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.720703 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnrwr\" (UniqueName: \"kubernetes.io/projected/4f93890b-e678-4591-8604-de9e0ff75905-kube-api-access-gnrwr\") pod \"cluster-monitoring-operator-75587bd455-wplrv\" (UID: \"4f93890b-e678-4591-8604-de9e0ff75905\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wplrv"
Apr 22 17:55:33.720905 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.720830 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84a18cd9-ceac-4cc8-972b-92e3c17a262b-config\") pod \"console-operator-9d4b6777b-mrr8j\" (UID: \"84a18cd9-ceac-4cc8-972b-92e3c17a262b\") " pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j"
Apr 22 17:55:33.720946 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.720901 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9af35975-ba53-433d-8c3b-454c55c4ffd7-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-2fnvq\" (UID: \"9af35975-ba53-433d-8c3b-454c55c4ffd7\") " pod="openshift-insights/insights-operator-585dfdc468-2fnvq"
Apr 22 17:55:33.720946 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.720924 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84a18cd9-ceac-4cc8-972b-92e3c17a262b-trusted-ca\") pod \"console-operator-9d4b6777b-mrr8j\" (UID: \"84a18cd9-ceac-4cc8-972b-92e3c17a262b\") " pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j"
Apr 22 17:55:33.720946 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.720944 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9af35975-ba53-433d-8c3b-454c55c4ffd7-serving-cert\") pod \"insights-operator-585dfdc468-2fnvq\" (UID: \"9af35975-ba53-433d-8c3b-454c55c4ffd7\") " pod="openshift-insights/insights-operator-585dfdc468-2fnvq"
Apr 22 17:55:33.721114 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.720966 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9af35975-ba53-433d-8c3b-454c55c4ffd7-snapshots\") pod \"insights-operator-585dfdc468-2fnvq\" (UID: \"9af35975-ba53-433d-8c3b-454c55c4ffd7\") " pod="openshift-insights/insights-operator-585dfdc468-2fnvq"
Apr 22 17:55:33.721114 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.721011 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4f93890b-e678-4591-8604-de9e0ff75905-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wplrv\" (UID: \"4f93890b-e678-4591-8604-de9e0ff75905\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wplrv"
Apr 22 17:55:33.721114 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.721042 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9af35975-ba53-433d-8c3b-454c55c4ffd7-service-ca-bundle\") pod \"insights-operator-585dfdc468-2fnvq\" (UID: \"9af35975-ba53-433d-8c3b-454c55c4ffd7\") " pod="openshift-insights/insights-operator-585dfdc468-2fnvq"
Apr 22 17:55:33.721114 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.721068 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9af35975-ba53-433d-8c3b-454c55c4ffd7-tmp\") pod \"insights-operator-585dfdc468-2fnvq\" (UID: \"9af35975-ba53-433d-8c3b-454c55c4ffd7\") " pod="openshift-insights/insights-operator-585dfdc468-2fnvq"
Apr 22 17:55:33.721114 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.721095 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84a18cd9-ceac-4cc8-972b-92e3c17a262b-serving-cert\") pod \"console-operator-9d4b6777b-mrr8j\" (UID: \"84a18cd9-ceac-4cc8-972b-92e3c17a262b\") " pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j"
Apr 22 17:55:33.721569 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.721547 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9af35975-ba53-433d-8c3b-454c55c4ffd7-tmp\") pod \"insights-operator-585dfdc468-2fnvq\" (UID: \"9af35975-ba53-433d-8c3b-454c55c4ffd7\") " pod="openshift-insights/insights-operator-585dfdc468-2fnvq"
Apr 22 17:55:33.721669 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.721650 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9af35975-ba53-433d-8c3b-454c55c4ffd7-snapshots\") pod \"insights-operator-585dfdc468-2fnvq\" (UID: \"9af35975-ba53-433d-8c3b-454c55c4ffd7\") " pod="openshift-insights/insights-operator-585dfdc468-2fnvq"
Apr 22 17:55:33.721669 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.721661 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9af35975-ba53-433d-8c3b-454c55c4ffd7-service-ca-bundle\") pod \"insights-operator-585dfdc468-2fnvq\" (UID: \"9af35975-ba53-433d-8c3b-454c55c4ffd7\") " pod="openshift-insights/insights-operator-585dfdc468-2fnvq"
Apr 22 17:55:33.721979 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.721959 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9af35975-ba53-433d-8c3b-454c55c4ffd7-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-2fnvq\" (UID: \"9af35975-ba53-433d-8c3b-454c55c4ffd7\") " pod="openshift-insights/insights-operator-585dfdc468-2fnvq"
Apr 22 17:55:33.723239 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.723213 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9af35975-ba53-433d-8c3b-454c55c4ffd7-serving-cert\") pod \"insights-operator-585dfdc468-2fnvq\" (UID: \"9af35975-ba53-433d-8c3b-454c55c4ffd7\") " pod="openshift-insights/insights-operator-585dfdc468-2fnvq"
Apr 22 17:55:33.728458 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.728436 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jmxv\" (UniqueName: \"kubernetes.io/projected/9af35975-ba53-433d-8c3b-454c55c4ffd7-kube-api-access-8jmxv\") pod \"insights-operator-585dfdc468-2fnvq\" (UID: \"9af35975-ba53-433d-8c3b-454c55c4ffd7\") " pod="openshift-insights/insights-operator-585dfdc468-2fnvq"
Apr 22 17:55:33.740374 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.740356 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ctlkb"]
Apr 22 17:55:33.744074 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.744060 2566 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ctlkb"
Apr 22 17:55:33.747057 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.747037 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 22 17:55:33.747057 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.747049 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 22 17:55:33.747211 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.747080 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-926vz\""
Apr 22 17:55:33.747211 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.747082 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 22 17:55:33.747211 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.747127 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-56975f448b-wh8bp"]
Apr 22 17:55:33.749725 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.749708 2566 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-ingress/router-default-56975f448b-wh8bp"
Apr 22 17:55:33.752439 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.752422 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 22 17:55:33.753113 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.753091 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 22 17:55:33.753575 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.753560 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 22 17:55:33.754008 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.753989 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 22 17:55:33.754337 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.754321 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 22 17:55:33.754536 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.754512 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ctlkb"]
Apr 22 17:55:33.754895 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.754876 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-5fvs6\""
Apr 22 17:55:33.755915 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.755897 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 22 17:55:33.762874 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.762840 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-56975f448b-wh8bp"]
Apr 22 17:55:33.822278 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.822209 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnrwr\" (UniqueName: \"kubernetes.io/projected/4f93890b-e678-4591-8604-de9e0ff75905-kube-api-access-gnrwr\") pod \"cluster-monitoring-operator-75587bd455-wplrv\" (UID: \"4f93890b-e678-4591-8604-de9e0ff75905\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wplrv"
Apr 22 17:55:33.822278 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.822242 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84a18cd9-ceac-4cc8-972b-92e3c17a262b-config\") pod \"console-operator-9d4b6777b-mrr8j\" (UID: \"84a18cd9-ceac-4cc8-972b-92e3c17a262b\") " pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j"
Apr 22 17:55:33.822278 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.822264 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-default-certificate\") pod \"router-default-56975f448b-wh8bp\" (UID: \"c355c33b-68d6-4bb8-aeda-28cdf82e8d61\") " pod="openshift-ingress/router-default-56975f448b-wh8bp"
Apr 22 17:55:33.822518 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.822288 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-stats-auth\") pod \"router-default-56975f448b-wh8bp\" (UID: \"c355c33b-68d6-4bb8-aeda-28cdf82e8d61\") " pod="openshift-ingress/router-default-56975f448b-wh8bp"
Apr 22 17:55:33.822518 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.822352 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/92750ce1-a0fa-4514-b9b0-09845663619d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ctlkb\" (UID: \"92750ce1-a0fa-4514-b9b0-09845663619d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ctlkb"
Apr 22 17:55:33.822518 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.822385 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpqpl\" (UniqueName: \"kubernetes.io/projected/92750ce1-a0fa-4514-b9b0-09845663619d-kube-api-access-fpqpl\") pod \"cluster-samples-operator-6dc5bdb6b4-ctlkb\" (UID: \"92750ce1-a0fa-4514-b9b0-09845663619d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ctlkb"
Apr 22 17:55:33.822518 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.822408 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84a18cd9-ceac-4cc8-972b-92e3c17a262b-trusted-ca\") pod \"console-operator-9d4b6777b-mrr8j\" (UID: \"84a18cd9-ceac-4cc8-972b-92e3c17a262b\") " pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j"
Apr 22 17:55:33.822518 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.822464 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4f93890b-e678-4591-8604-de9e0ff75905-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wplrv\" (UID: \"4f93890b-e678-4591-8604-de9e0ff75905\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wplrv"
Apr 22 17:55:33.822518 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.822489 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-service-ca-bundle\") pod \"router-default-56975f448b-wh8bp\" (UID: \"c355c33b-68d6-4bb8-aeda-28cdf82e8d61\") " pod="openshift-ingress/router-default-56975f448b-wh8bp"
Apr 22 17:55:33.822518 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.822516 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbfd5\" (UniqueName: \"kubernetes.io/projected/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-kube-api-access-lbfd5\") pod \"router-default-56975f448b-wh8bp\" (UID: \"c355c33b-68d6-4bb8-aeda-28cdf82e8d61\") " pod="openshift-ingress/router-default-56975f448b-wh8bp"
Apr 22 17:55:33.822833 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:33.822625 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 17:55:33.822833 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.822666 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-metrics-certs\") pod \"router-default-56975f448b-wh8bp\" (UID: \"c355c33b-68d6-4bb8-aeda-28cdf82e8d61\") " pod="openshift-ingress/router-default-56975f448b-wh8bp"
Apr 22 17:55:33.822833 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:33.822698 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f93890b-e678-4591-8604-de9e0ff75905-cluster-monitoring-operator-tls podName:4f93890b-e678-4591-8604-de9e0ff75905 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:34.32267806 +0000 UTC m=+127.122571278 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4f93890b-e678-4591-8604-de9e0ff75905-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-wplrv" (UID: "4f93890b-e678-4591-8604-de9e0ff75905") : secret "cluster-monitoring-operator-tls" not found Apr 22 17:55:33.822833 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.822761 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84a18cd9-ceac-4cc8-972b-92e3c17a262b-serving-cert\") pod \"console-operator-9d4b6777b-mrr8j\" (UID: \"84a18cd9-ceac-4cc8-972b-92e3c17a262b\") " pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j" Apr 22 17:55:33.822833 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.822791 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2rt2\" (UniqueName: \"kubernetes.io/projected/84a18cd9-ceac-4cc8-972b-92e3c17a262b-kube-api-access-f2rt2\") pod \"console-operator-9d4b6777b-mrr8j\" (UID: \"84a18cd9-ceac-4cc8-972b-92e3c17a262b\") " pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j" Apr 22 17:55:33.822833 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.822823 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4f93890b-e678-4591-8604-de9e0ff75905-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-wplrv\" (UID: \"4f93890b-e678-4591-8604-de9e0ff75905\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wplrv" Apr 22 17:55:33.823173 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.823139 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84a18cd9-ceac-4cc8-972b-92e3c17a262b-config\") pod \"console-operator-9d4b6777b-mrr8j\" (UID: 
\"84a18cd9-ceac-4cc8-972b-92e3c17a262b\") " pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j" Apr 22 17:55:33.823379 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.823359 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/84a18cd9-ceac-4cc8-972b-92e3c17a262b-trusted-ca\") pod \"console-operator-9d4b6777b-mrr8j\" (UID: \"84a18cd9-ceac-4cc8-972b-92e3c17a262b\") " pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j" Apr 22 17:55:33.823471 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.823452 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/4f93890b-e678-4591-8604-de9e0ff75905-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-wplrv\" (UID: \"4f93890b-e678-4591-8604-de9e0ff75905\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wplrv" Apr 22 17:55:33.825053 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.825035 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84a18cd9-ceac-4cc8-972b-92e3c17a262b-serving-cert\") pod \"console-operator-9d4b6777b-mrr8j\" (UID: \"84a18cd9-ceac-4cc8-972b-92e3c17a262b\") " pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j" Apr 22 17:55:33.830829 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.830807 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnrwr\" (UniqueName: \"kubernetes.io/projected/4f93890b-e678-4591-8604-de9e0ff75905-kube-api-access-gnrwr\") pod \"cluster-monitoring-operator-75587bd455-wplrv\" (UID: \"4f93890b-e678-4591-8604-de9e0ff75905\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wplrv" Apr 22 17:55:33.831303 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.831286 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-f2rt2\" (UniqueName: \"kubernetes.io/projected/84a18cd9-ceac-4cc8-972b-92e3c17a262b-kube-api-access-f2rt2\") pod \"console-operator-9d4b6777b-mrr8j\" (UID: \"84a18cd9-ceac-4cc8-972b-92e3c17a262b\") " pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j" Apr 22 17:55:33.856140 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.856121 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-2fnvq" Apr 22 17:55:33.923270 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.923242 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/92750ce1-a0fa-4514-b9b0-09845663619d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ctlkb\" (UID: \"92750ce1-a0fa-4514-b9b0-09845663619d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ctlkb" Apr 22 17:55:33.923354 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.923286 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpqpl\" (UniqueName: \"kubernetes.io/projected/92750ce1-a0fa-4514-b9b0-09845663619d-kube-api-access-fpqpl\") pod \"cluster-samples-operator-6dc5bdb6b4-ctlkb\" (UID: \"92750ce1-a0fa-4514-b9b0-09845663619d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ctlkb" Apr 22 17:55:33.923392 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.923364 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-service-ca-bundle\") pod \"router-default-56975f448b-wh8bp\" (UID: \"c355c33b-68d6-4bb8-aeda-28cdf82e8d61\") " pod="openshift-ingress/router-default-56975f448b-wh8bp" Apr 22 17:55:33.923450 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.923392 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lbfd5\" (UniqueName: \"kubernetes.io/projected/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-kube-api-access-lbfd5\") pod \"router-default-56975f448b-wh8bp\" (UID: \"c355c33b-68d6-4bb8-aeda-28cdf82e8d61\") " pod="openshift-ingress/router-default-56975f448b-wh8bp" Apr 22 17:55:33.923450 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.923425 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-metrics-certs\") pod \"router-default-56975f448b-wh8bp\" (UID: \"c355c33b-68d6-4bb8-aeda-28cdf82e8d61\") " pod="openshift-ingress/router-default-56975f448b-wh8bp" Apr 22 17:55:33.923545 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.923474 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-default-certificate\") pod \"router-default-56975f448b-wh8bp\" (UID: \"c355c33b-68d6-4bb8-aeda-28cdf82e8d61\") " pod="openshift-ingress/router-default-56975f448b-wh8bp" Apr 22 17:55:33.923545 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.923497 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-stats-auth\") pod \"router-default-56975f448b-wh8bp\" (UID: \"c355c33b-68d6-4bb8-aeda-28cdf82e8d61\") " pod="openshift-ingress/router-default-56975f448b-wh8bp" Apr 22 17:55:33.924063 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:33.923761 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 17:55:33.924063 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:33.923780 2566 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-service-ca-bundle podName:c355c33b-68d6-4bb8-aeda-28cdf82e8d61 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:34.423758149 +0000 UTC m=+127.223651366 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-service-ca-bundle") pod "router-default-56975f448b-wh8bp" (UID: "c355c33b-68d6-4bb8-aeda-28cdf82e8d61") : configmap references non-existent config key: service-ca.crt Apr 22 17:55:33.924063 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:33.923816 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-metrics-certs podName:c355c33b-68d6-4bb8-aeda-28cdf82e8d61 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:34.423800866 +0000 UTC m=+127.223694077 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-metrics-certs") pod "router-default-56975f448b-wh8bp" (UID: "c355c33b-68d6-4bb8-aeda-28cdf82e8d61") : secret "router-metrics-certs-default" not found Apr 22 17:55:33.924063 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:33.923874 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 17:55:33.924063 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:33.923921 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92750ce1-a0fa-4514-b9b0-09845663619d-samples-operator-tls podName:92750ce1-a0fa-4514-b9b0-09845663619d nodeName:}" failed. No retries permitted until 2026-04-22 17:55:34.423910859 +0000 UTC m=+127.223804062 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/92750ce1-a0fa-4514-b9b0-09845663619d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ctlkb" (UID: "92750ce1-a0fa-4514-b9b0-09845663619d") : secret "samples-operator-tls" not found Apr 22 17:55:33.926117 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.926091 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-default-certificate\") pod \"router-default-56975f448b-wh8bp\" (UID: \"c355c33b-68d6-4bb8-aeda-28cdf82e8d61\") " pod="openshift-ingress/router-default-56975f448b-wh8bp" Apr 22 17:55:33.926418 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.926401 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-stats-auth\") pod \"router-default-56975f448b-wh8bp\" (UID: \"c355c33b-68d6-4bb8-aeda-28cdf82e8d61\") " pod="openshift-ingress/router-default-56975f448b-wh8bp" Apr 22 17:55:33.932886 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.932826 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpqpl\" (UniqueName: \"kubernetes.io/projected/92750ce1-a0fa-4514-b9b0-09845663619d-kube-api-access-fpqpl\") pod \"cluster-samples-operator-6dc5bdb6b4-ctlkb\" (UID: \"92750ce1-a0fa-4514-b9b0-09845663619d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ctlkb" Apr 22 17:55:33.932886 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.932877 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbfd5\" (UniqueName: \"kubernetes.io/projected/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-kube-api-access-lbfd5\") pod \"router-default-56975f448b-wh8bp\" (UID: \"c355c33b-68d6-4bb8-aeda-28cdf82e8d61\") " 
pod="openshift-ingress/router-default-56975f448b-wh8bp" Apr 22 17:55:33.961815 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.961790 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j" Apr 22 17:55:33.964500 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:33.964477 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-2fnvq"] Apr 22 17:55:33.968188 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:55:33.968167 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9af35975_ba53_433d_8c3b_454c55c4ffd7.slice/crio-4e2a1f035312ffad90c8086cda4dfe71a3ba05c5bc89b534b2f9cc47549f3463 WatchSource:0}: Error finding container 4e2a1f035312ffad90c8086cda4dfe71a3ba05c5bc89b534b2f9cc47549f3463: Status 404 returned error can't find the container with id 4e2a1f035312ffad90c8086cda4dfe71a3ba05c5bc89b534b2f9cc47549f3463 Apr 22 17:55:34.076215 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:34.076151 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-mrr8j"] Apr 22 17:55:34.079831 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:55:34.079800 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84a18cd9_ceac_4cc8_972b_92e3c17a262b.slice/crio-af880b9a326d3ec3e4533cc50ae88754b86966d018a71d17c5f7acd34c784c93 WatchSource:0}: Error finding container af880b9a326d3ec3e4533cc50ae88754b86966d018a71d17c5f7acd34c784c93: Status 404 returned error can't find the container with id af880b9a326d3ec3e4533cc50ae88754b86966d018a71d17c5f7acd34c784c93 Apr 22 17:55:34.243990 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:34.243951 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j" event={"ID":"84a18cd9-ceac-4cc8-972b-92e3c17a262b","Type":"ContainerStarted","Data":"af880b9a326d3ec3e4533cc50ae88754b86966d018a71d17c5f7acd34c784c93"} Apr 22 17:55:34.244778 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:34.244755 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-2fnvq" event={"ID":"9af35975-ba53-433d-8c3b-454c55c4ffd7","Type":"ContainerStarted","Data":"4e2a1f035312ffad90c8086cda4dfe71a3ba05c5bc89b534b2f9cc47549f3463"} Apr 22 17:55:34.326582 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:34.326517 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4f93890b-e678-4591-8604-de9e0ff75905-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wplrv\" (UID: \"4f93890b-e678-4591-8604-de9e0ff75905\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wplrv" Apr 22 17:55:34.326681 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:34.326662 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 17:55:34.326732 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:34.326723 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f93890b-e678-4591-8604-de9e0ff75905-cluster-monitoring-operator-tls podName:4f93890b-e678-4591-8604-de9e0ff75905 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:35.326708825 +0000 UTC m=+128.126602026 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4f93890b-e678-4591-8604-de9e0ff75905-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-wplrv" (UID: "4f93890b-e678-4591-8604-de9e0ff75905") : secret "cluster-monitoring-operator-tls" not found Apr 22 17:55:34.427463 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:34.427436 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/92750ce1-a0fa-4514-b9b0-09845663619d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ctlkb\" (UID: \"92750ce1-a0fa-4514-b9b0-09845663619d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ctlkb" Apr 22 17:55:34.427574 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:34.427495 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-service-ca-bundle\") pod \"router-default-56975f448b-wh8bp\" (UID: \"c355c33b-68d6-4bb8-aeda-28cdf82e8d61\") " pod="openshift-ingress/router-default-56975f448b-wh8bp" Apr 22 17:55:34.427574 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:34.427518 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-metrics-certs\") pod \"router-default-56975f448b-wh8bp\" (UID: \"c355c33b-68d6-4bb8-aeda-28cdf82e8d61\") " pod="openshift-ingress/router-default-56975f448b-wh8bp" Apr 22 17:55:34.427673 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:34.427602 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 17:55:34.427673 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:34.427605 2566 secret.go:189] Couldn't get secret 
openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 17:55:34.427673 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:34.427646 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-metrics-certs podName:c355c33b-68d6-4bb8-aeda-28cdf82e8d61 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:35.427634126 +0000 UTC m=+128.227527324 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-metrics-certs") pod "router-default-56975f448b-wh8bp" (UID: "c355c33b-68d6-4bb8-aeda-28cdf82e8d61") : secret "router-metrics-certs-default" not found Apr 22 17:55:34.427673 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:34.427659 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-service-ca-bundle podName:c355c33b-68d6-4bb8-aeda-28cdf82e8d61 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:35.427653209 +0000 UTC m=+128.227546407 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-service-ca-bundle") pod "router-default-56975f448b-wh8bp" (UID: "c355c33b-68d6-4bb8-aeda-28cdf82e8d61") : configmap references non-existent config key: service-ca.crt Apr 22 17:55:34.427673 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:34.427668 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92750ce1-a0fa-4514-b9b0-09845663619d-samples-operator-tls podName:92750ce1-a0fa-4514-b9b0-09845663619d nodeName:}" failed. No retries permitted until 2026-04-22 17:55:35.427663387 +0000 UTC m=+128.227556584 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/92750ce1-a0fa-4514-b9b0-09845663619d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ctlkb" (UID: "92750ce1-a0fa-4514-b9b0-09845663619d") : secret "samples-operator-tls" not found Apr 22 17:55:35.335079 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:35.335043 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4f93890b-e678-4591-8604-de9e0ff75905-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wplrv\" (UID: \"4f93890b-e678-4591-8604-de9e0ff75905\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wplrv" Apr 22 17:55:35.335514 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:35.335162 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 17:55:35.335514 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:35.335215 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f93890b-e678-4591-8604-de9e0ff75905-cluster-monitoring-operator-tls podName:4f93890b-e678-4591-8604-de9e0ff75905 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:37.335201409 +0000 UTC m=+130.135094607 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4f93890b-e678-4591-8604-de9e0ff75905-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-wplrv" (UID: "4f93890b-e678-4591-8604-de9e0ff75905") : secret "cluster-monitoring-operator-tls" not found Apr 22 17:55:35.436129 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:35.436092 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-service-ca-bundle\") pod \"router-default-56975f448b-wh8bp\" (UID: \"c355c33b-68d6-4bb8-aeda-28cdf82e8d61\") " pod="openshift-ingress/router-default-56975f448b-wh8bp" Apr 22 17:55:35.436309 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:35.436148 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-metrics-certs\") pod \"router-default-56975f448b-wh8bp\" (UID: \"c355c33b-68d6-4bb8-aeda-28cdf82e8d61\") " pod="openshift-ingress/router-default-56975f448b-wh8bp" Apr 22 17:55:35.436309 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:35.436255 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 17:55:35.436309 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:35.436279 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-service-ca-bundle podName:c355c33b-68d6-4bb8-aeda-28cdf82e8d61 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:37.436255604 +0000 UTC m=+130.236148803 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-service-ca-bundle") pod "router-default-56975f448b-wh8bp" (UID: "c355c33b-68d6-4bb8-aeda-28cdf82e8d61") : configmap references non-existent config key: service-ca.crt Apr 22 17:55:35.436469 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:35.436311 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-metrics-certs podName:c355c33b-68d6-4bb8-aeda-28cdf82e8d61 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:37.436300316 +0000 UTC m=+130.236193530 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-metrics-certs") pod "router-default-56975f448b-wh8bp" (UID: "c355c33b-68d6-4bb8-aeda-28cdf82e8d61") : secret "router-metrics-certs-default" not found Apr 22 17:55:35.436469 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:35.436364 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/92750ce1-a0fa-4514-b9b0-09845663619d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ctlkb\" (UID: \"92750ce1-a0fa-4514-b9b0-09845663619d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ctlkb" Apr 22 17:55:35.436469 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:35.436447 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 17:55:35.436629 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:35.436489 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92750ce1-a0fa-4514-b9b0-09845663619d-samples-operator-tls podName:92750ce1-a0fa-4514-b9b0-09845663619d nodeName:}" failed. 
No retries permitted until 2026-04-22 17:55:37.436480813 +0000 UTC m=+130.236374011 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/92750ce1-a0fa-4514-b9b0-09845663619d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ctlkb" (UID: "92750ce1-a0fa-4514-b9b0-09845663619d") : secret "samples-operator-tls" not found Apr 22 17:55:36.445129 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:36.445095 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs\") pod \"network-metrics-daemon-dv96w\" (UID: \"92650e2d-54ea-4904-8ee5-235164ed2949\") " pod="openshift-multus/network-metrics-daemon-dv96w" Apr 22 17:55:36.445484 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:36.445227 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 17:55:36.445484 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:36.445287 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs podName:92650e2d-54ea-4904-8ee5-235164ed2949 nodeName:}" failed. No retries permitted until 2026-04-22 17:57:38.445269432 +0000 UTC m=+251.245162650 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs") pod "network-metrics-daemon-dv96w" (UID: "92650e2d-54ea-4904-8ee5-235164ed2949") : secret "metrics-daemon-secret" not found Apr 22 17:55:37.251501 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:37.251469 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-2fnvq" event={"ID":"9af35975-ba53-433d-8c3b-454c55c4ffd7","Type":"ContainerStarted","Data":"d650d108f4859b06a704bf7058072dabcb734cab8d2a015f00a437b5b9c8aad8"} Apr 22 17:55:37.252983 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:37.252964 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/0.log" Apr 22 17:55:37.253101 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:37.252997 2566 generic.go:358] "Generic (PLEG): container finished" podID="84a18cd9-ceac-4cc8-972b-92e3c17a262b" containerID="d214ca9968ac09c49c698ab12d5e0e037d2883c71eb82324b422042f5596908d" exitCode=255 Apr 22 17:55:37.253101 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:37.253036 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j" event={"ID":"84a18cd9-ceac-4cc8-972b-92e3c17a262b","Type":"ContainerDied","Data":"d214ca9968ac09c49c698ab12d5e0e037d2883c71eb82324b422042f5596908d"} Apr 22 17:55:37.253220 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:37.253208 2566 scope.go:117] "RemoveContainer" containerID="d214ca9968ac09c49c698ab12d5e0e037d2883c71eb82324b422042f5596908d" Apr 22 17:55:37.268080 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:37.268041 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-2fnvq" podStartSLOduration=1.992936644 podStartE2EDuration="4.26802798s" 
podCreationTimestamp="2026-04-22 17:55:33 +0000 UTC" firstStartedPulling="2026-04-22 17:55:33.969962003 +0000 UTC m=+126.769855201" lastFinishedPulling="2026-04-22 17:55:36.245053321 +0000 UTC m=+129.044946537" observedRunningTime="2026-04-22 17:55:37.267206377 +0000 UTC m=+130.067099611" watchObservedRunningTime="2026-04-22 17:55:37.26802798 +0000 UTC m=+130.067921196" Apr 22 17:55:37.351504 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:37.351464 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4f93890b-e678-4591-8604-de9e0ff75905-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wplrv\" (UID: \"4f93890b-e678-4591-8604-de9e0ff75905\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wplrv" Apr 22 17:55:37.351641 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:37.351540 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 17:55:37.351641 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:37.351615 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f93890b-e678-4591-8604-de9e0ff75905-cluster-monitoring-operator-tls podName:4f93890b-e678-4591-8604-de9e0ff75905 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:41.351597019 +0000 UTC m=+134.151490233 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4f93890b-e678-4591-8604-de9e0ff75905-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-wplrv" (UID: "4f93890b-e678-4591-8604-de9e0ff75905") : secret "cluster-monitoring-operator-tls" not found Apr 22 17:55:37.451795 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:37.451769 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-metrics-certs\") pod \"router-default-56975f448b-wh8bp\" (UID: \"c355c33b-68d6-4bb8-aeda-28cdf82e8d61\") " pod="openshift-ingress/router-default-56975f448b-wh8bp" Apr 22 17:55:37.452085 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:37.451834 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/92750ce1-a0fa-4514-b9b0-09845663619d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ctlkb\" (UID: \"92750ce1-a0fa-4514-b9b0-09845663619d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ctlkb" Apr 22 17:55:37.452085 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:37.451918 2566 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 17:55:37.452085 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:37.451966 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 17:55:37.452085 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:37.451986 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-metrics-certs podName:c355c33b-68d6-4bb8-aeda-28cdf82e8d61 nodeName:}" failed. 
No retries permitted until 2026-04-22 17:55:41.451967907 +0000 UTC m=+134.251861126 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-metrics-certs") pod "router-default-56975f448b-wh8bp" (UID: "c355c33b-68d6-4bb8-aeda-28cdf82e8d61") : secret "router-metrics-certs-default" not found Apr 22 17:55:37.452085 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:37.451922 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-service-ca-bundle\") pod \"router-default-56975f448b-wh8bp\" (UID: \"c355c33b-68d6-4bb8-aeda-28cdf82e8d61\") " pod="openshift-ingress/router-default-56975f448b-wh8bp" Apr 22 17:55:37.452085 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:37.452002 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-service-ca-bundle podName:c355c33b-68d6-4bb8-aeda-28cdf82e8d61 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:41.451995503 +0000 UTC m=+134.251888701 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-service-ca-bundle") pod "router-default-56975f448b-wh8bp" (UID: "c355c33b-68d6-4bb8-aeda-28cdf82e8d61") : configmap references non-existent config key: service-ca.crt Apr 22 17:55:37.452085 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:37.452066 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92750ce1-a0fa-4514-b9b0-09845663619d-samples-operator-tls podName:92750ce1-a0fa-4514-b9b0-09845663619d nodeName:}" failed. No retries permitted until 2026-04-22 17:55:41.452044949 +0000 UTC m=+134.251938150 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/92750ce1-a0fa-4514-b9b0-09845663619d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ctlkb" (UID: "92750ce1-a0fa-4514-b9b0-09845663619d") : secret "samples-operator-tls" not found Apr 22 17:55:38.032783 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:38.032748 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-psp4c"] Apr 22 17:55:38.035449 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:38.035435 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-psp4c" Apr 22 17:55:38.038092 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:38.038072 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 17:55:38.038192 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:38.038116 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 17:55:38.039201 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:38.039183 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-cx6s6\"" Apr 22 17:55:38.043369 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:38.043348 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-psp4c"] Apr 22 17:55:38.158105 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:38.158076 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62tph\" (UniqueName: \"kubernetes.io/projected/2f310f35-5bdc-4e57-86df-62e51a3c8cdf-kube-api-access-62tph\") pod 
\"migrator-74bb7799d9-psp4c\" (UID: \"2f310f35-5bdc-4e57-86df-62e51a3c8cdf\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-psp4c" Apr 22 17:55:38.256270 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:38.256242 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/1.log" Apr 22 17:55:38.256586 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:38.256572 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/0.log" Apr 22 17:55:38.256639 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:38.256605 2566 generic.go:358] "Generic (PLEG): container finished" podID="84a18cd9-ceac-4cc8-972b-92e3c17a262b" containerID="41d3416eb7a2b7f28228969a3731d8f3ac812fd0cdfdfdb26a4d3a180222e8ec" exitCode=255 Apr 22 17:55:38.256673 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:38.256633 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j" event={"ID":"84a18cd9-ceac-4cc8-972b-92e3c17a262b","Type":"ContainerDied","Data":"41d3416eb7a2b7f28228969a3731d8f3ac812fd0cdfdfdb26a4d3a180222e8ec"} Apr 22 17:55:38.256706 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:38.256670 2566 scope.go:117] "RemoveContainer" containerID="d214ca9968ac09c49c698ab12d5e0e037d2883c71eb82324b422042f5596908d" Apr 22 17:55:38.256911 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:38.256893 2566 scope.go:117] "RemoveContainer" containerID="41d3416eb7a2b7f28228969a3731d8f3ac812fd0cdfdfdb26a4d3a180222e8ec" Apr 22 17:55:38.257092 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:38.257076 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator 
pod=console-operator-9d4b6777b-mrr8j_openshift-console-operator(84a18cd9-ceac-4cc8-972b-92e3c17a262b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j" podUID="84a18cd9-ceac-4cc8-972b-92e3c17a262b" Apr 22 17:55:38.258762 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:38.258739 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62tph\" (UniqueName: \"kubernetes.io/projected/2f310f35-5bdc-4e57-86df-62e51a3c8cdf-kube-api-access-62tph\") pod \"migrator-74bb7799d9-psp4c\" (UID: \"2f310f35-5bdc-4e57-86df-62e51a3c8cdf\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-psp4c" Apr 22 17:55:38.266489 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:38.266468 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62tph\" (UniqueName: \"kubernetes.io/projected/2f310f35-5bdc-4e57-86df-62e51a3c8cdf-kube-api-access-62tph\") pod \"migrator-74bb7799d9-psp4c\" (UID: \"2f310f35-5bdc-4e57-86df-62e51a3c8cdf\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-psp4c" Apr 22 17:55:38.344268 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:38.344215 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-psp4c" Apr 22 17:55:38.463269 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:38.463240 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-psp4c"] Apr 22 17:55:38.466024 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:55:38.465996 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f310f35_5bdc_4e57_86df_62e51a3c8cdf.slice/crio-50e1a7bef48b33fca846ed2e790cfdd171229911a22ba06cc584535c20c38dd4 WatchSource:0}: Error finding container 50e1a7bef48b33fca846ed2e790cfdd171229911a22ba06cc584535c20c38dd4: Status 404 returned error can't find the container with id 50e1a7bef48b33fca846ed2e790cfdd171229911a22ba06cc584535c20c38dd4 Apr 22 17:55:39.148498 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:39.148468 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9kn68_bfef47bf-9dff-44e0-8b1a-1397bf347548/dns-node-resolver/0.log" Apr 22 17:55:39.260278 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:39.260247 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/1.log" Apr 22 17:55:39.260677 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:39.260657 2566 scope.go:117] "RemoveContainer" containerID="41d3416eb7a2b7f28228969a3731d8f3ac812fd0cdfdfdb26a4d3a180222e8ec" Apr 22 17:55:39.260911 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:39.260890 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-mrr8j_openshift-console-operator(84a18cd9-ceac-4cc8-972b-92e3c17a262b)\"" 
pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j" podUID="84a18cd9-ceac-4cc8-972b-92e3c17a262b" Apr 22 17:55:39.261521 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:39.261490 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-psp4c" event={"ID":"2f310f35-5bdc-4e57-86df-62e51a3c8cdf","Type":"ContainerStarted","Data":"50e1a7bef48b33fca846ed2e790cfdd171229911a22ba06cc584535c20c38dd4"} Apr 22 17:55:39.947648 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:39.947623 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2vnst_90a8af9e-a3b4-4682-86d6-985e15148048/node-ca/0.log" Apr 22 17:55:40.265839 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:40.265760 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-psp4c" event={"ID":"2f310f35-5bdc-4e57-86df-62e51a3c8cdf","Type":"ContainerStarted","Data":"e3a46155a0f8ffffd4b62b5de106745930234191a1bd599e7056d7360fde4834"} Apr 22 17:55:40.265839 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:40.265797 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-psp4c" event={"ID":"2f310f35-5bdc-4e57-86df-62e51a3c8cdf","Type":"ContainerStarted","Data":"a6356d2d72060dca1801dc5e814724bcafb5dc86329801ca5f2373e9df5ecfbc"} Apr 22 17:55:40.282658 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:40.282617 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-psp4c" podStartSLOduration=1.279094215 podStartE2EDuration="2.282604124s" podCreationTimestamp="2026-04-22 17:55:38 +0000 UTC" firstStartedPulling="2026-04-22 17:55:38.468226088 +0000 UTC m=+131.268119286" lastFinishedPulling="2026-04-22 17:55:39.471735997 +0000 UTC m=+132.271629195" observedRunningTime="2026-04-22 17:55:40.281438071 +0000 UTC 
m=+133.081331291" watchObservedRunningTime="2026-04-22 17:55:40.282604124 +0000 UTC m=+133.082497322" Apr 22 17:55:41.384211 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:41.384172 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4f93890b-e678-4591-8604-de9e0ff75905-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wplrv\" (UID: \"4f93890b-e678-4591-8604-de9e0ff75905\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wplrv" Apr 22 17:55:41.384575 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:41.384312 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 17:55:41.384575 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:41.384376 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f93890b-e678-4591-8604-de9e0ff75905-cluster-monitoring-operator-tls podName:4f93890b-e678-4591-8604-de9e0ff75905 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:49.384360155 +0000 UTC m=+142.184253353 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4f93890b-e678-4591-8604-de9e0ff75905-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-wplrv" (UID: "4f93890b-e678-4591-8604-de9e0ff75905") : secret "cluster-monitoring-operator-tls" not found Apr 22 17:55:41.484913 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:41.484889 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/92750ce1-a0fa-4514-b9b0-09845663619d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ctlkb\" (UID: \"92750ce1-a0fa-4514-b9b0-09845663619d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ctlkb" Apr 22 17:55:41.484999 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:41.484945 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-service-ca-bundle\") pod \"router-default-56975f448b-wh8bp\" (UID: \"c355c33b-68d6-4bb8-aeda-28cdf82e8d61\") " pod="openshift-ingress/router-default-56975f448b-wh8bp" Apr 22 17:55:41.484999 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:41.484970 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-metrics-certs\") pod \"router-default-56975f448b-wh8bp\" (UID: \"c355c33b-68d6-4bb8-aeda-28cdf82e8d61\") " pod="openshift-ingress/router-default-56975f448b-wh8bp" Apr 22 17:55:41.485100 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:41.485014 2566 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 17:55:41.485100 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:41.485038 2566 secret.go:189] Couldn't get secret 
openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 17:55:41.485100 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:41.485061 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92750ce1-a0fa-4514-b9b0-09845663619d-samples-operator-tls podName:92750ce1-a0fa-4514-b9b0-09845663619d nodeName:}" failed. No retries permitted until 2026-04-22 17:55:49.485048568 +0000 UTC m=+142.284941767 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/92750ce1-a0fa-4514-b9b0-09845663619d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-ctlkb" (UID: "92750ce1-a0fa-4514-b9b0-09845663619d") : secret "samples-operator-tls" not found Apr 22 17:55:41.485100 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:41.485076 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-metrics-certs podName:c355c33b-68d6-4bb8-aeda-28cdf82e8d61 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:49.48506901 +0000 UTC m=+142.284962208 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-metrics-certs") pod "router-default-56975f448b-wh8bp" (UID: "c355c33b-68d6-4bb8-aeda-28cdf82e8d61") : secret "router-metrics-certs-default" not found Apr 22 17:55:41.485249 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:41.485129 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-service-ca-bundle podName:c355c33b-68d6-4bb8-aeda-28cdf82e8d61 nodeName:}" failed. No retries permitted until 2026-04-22 17:55:49.485110651 +0000 UTC m=+142.285003853 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-service-ca-bundle") pod "router-default-56975f448b-wh8bp" (UID: "c355c33b-68d6-4bb8-aeda-28cdf82e8d61") : configmap references non-existent config key: service-ca.crt Apr 22 17:55:43.962362 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:43.962334 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j" Apr 22 17:55:43.962716 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:43.962366 2566 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j" Apr 22 17:55:43.962716 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:43.962693 2566 scope.go:117] "RemoveContainer" containerID="41d3416eb7a2b7f28228969a3731d8f3ac812fd0cdfdfdb26a4d3a180222e8ec" Apr 22 17:55:43.962909 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:43.962891 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-mrr8j_openshift-console-operator(84a18cd9-ceac-4cc8-972b-92e3c17a262b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j" podUID="84a18cd9-ceac-4cc8-972b-92e3c17a262b" Apr 22 17:55:44.275694 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:44.275632 2566 scope.go:117] "RemoveContainer" containerID="41d3416eb7a2b7f28228969a3731d8f3ac812fd0cdfdfdb26a4d3a180222e8ec" Apr 22 17:55:44.275816 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:44.275791 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator 
pod=console-operator-9d4b6777b-mrr8j_openshift-console-operator(84a18cd9-ceac-4cc8-972b-92e3c17a262b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j" podUID="84a18cd9-ceac-4cc8-972b-92e3c17a262b" Apr 22 17:55:49.448094 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:49.448061 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4f93890b-e678-4591-8604-de9e0ff75905-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wplrv\" (UID: \"4f93890b-e678-4591-8604-de9e0ff75905\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wplrv" Apr 22 17:55:49.448454 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:49.448214 2566 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 17:55:49.448454 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:49.448293 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f93890b-e678-4591-8604-de9e0ff75905-cluster-monitoring-operator-tls podName:4f93890b-e678-4591-8604-de9e0ff75905 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:05.448274296 +0000 UTC m=+158.248167494 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/4f93890b-e678-4591-8604-de9e0ff75905-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-wplrv" (UID: "4f93890b-e678-4591-8604-de9e0ff75905") : secret "cluster-monitoring-operator-tls" not found Apr 22 17:55:49.548958 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:49.548926 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/92750ce1-a0fa-4514-b9b0-09845663619d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ctlkb\" (UID: \"92750ce1-a0fa-4514-b9b0-09845663619d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ctlkb" Apr 22 17:55:49.549104 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:49.548994 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-service-ca-bundle\") pod \"router-default-56975f448b-wh8bp\" (UID: \"c355c33b-68d6-4bb8-aeda-28cdf82e8d61\") " pod="openshift-ingress/router-default-56975f448b-wh8bp" Apr 22 17:55:49.549167 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:49.549147 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-metrics-certs\") pod \"router-default-56975f448b-wh8bp\" (UID: \"c355c33b-68d6-4bb8-aeda-28cdf82e8d61\") " pod="openshift-ingress/router-default-56975f448b-wh8bp" Apr 22 17:55:49.549537 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:49.549521 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-service-ca-bundle\") pod \"router-default-56975f448b-wh8bp\" (UID: 
\"c355c33b-68d6-4bb8-aeda-28cdf82e8d61\") " pod="openshift-ingress/router-default-56975f448b-wh8bp" Apr 22 17:55:49.551355 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:49.551331 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c355c33b-68d6-4bb8-aeda-28cdf82e8d61-metrics-certs\") pod \"router-default-56975f448b-wh8bp\" (UID: \"c355c33b-68d6-4bb8-aeda-28cdf82e8d61\") " pod="openshift-ingress/router-default-56975f448b-wh8bp" Apr 22 17:55:49.551564 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:49.551547 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/92750ce1-a0fa-4514-b9b0-09845663619d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-ctlkb\" (UID: \"92750ce1-a0fa-4514-b9b0-09845663619d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ctlkb" Apr 22 17:55:49.652407 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:49.652380 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ctlkb" Apr 22 17:55:49.660058 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:49.660034 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-56975f448b-wh8bp" Apr 22 17:55:49.781281 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:49.781244 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-56975f448b-wh8bp"] Apr 22 17:55:49.784069 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:55:49.784044 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc355c33b_68d6_4bb8_aeda_28cdf82e8d61.slice/crio-9571a19d36adb757ea25d8396fcc4ce7ed171a58f3bd032015342e911e57c35c WatchSource:0}: Error finding container 9571a19d36adb757ea25d8396fcc4ce7ed171a58f3bd032015342e911e57c35c: Status 404 returned error can't find the container with id 9571a19d36adb757ea25d8396fcc4ce7ed171a58f3bd032015342e911e57c35c Apr 22 17:55:49.795472 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:49.795451 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ctlkb"] Apr 22 17:55:50.289322 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:50.289286 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ctlkb" event={"ID":"92750ce1-a0fa-4514-b9b0-09845663619d","Type":"ContainerStarted","Data":"821b49e0bf274fd7e06a7ab24185b3bbcb1059347ef1978b666a25c4053f3d99"} Apr 22 17:55:50.290469 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:50.290445 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-56975f448b-wh8bp" event={"ID":"c355c33b-68d6-4bb8-aeda-28cdf82e8d61","Type":"ContainerStarted","Data":"c1b724af69a14af826494d7b1b58ca3191f7f6a45dd429502bb94534e82f1118"} Apr 22 17:55:50.290584 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:50.290473 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-56975f448b-wh8bp" 
event={"ID":"c355c33b-68d6-4bb8-aeda-28cdf82e8d61","Type":"ContainerStarted","Data":"9571a19d36adb757ea25d8396fcc4ce7ed171a58f3bd032015342e911e57c35c"} Apr 22 17:55:50.308714 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:50.308669 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-56975f448b-wh8bp" podStartSLOduration=17.308656489 podStartE2EDuration="17.308656489s" podCreationTimestamp="2026-04-22 17:55:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:55:50.308179562 +0000 UTC m=+143.108072784" watchObservedRunningTime="2026-04-22 17:55:50.308656489 +0000 UTC m=+143.108549712" Apr 22 17:55:50.660393 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:50.660363 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-56975f448b-wh8bp" Apr 22 17:55:50.663078 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:50.663057 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-56975f448b-wh8bp" Apr 22 17:55:51.293058 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:51.293033 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-56975f448b-wh8bp" Apr 22 17:55:51.294317 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:51.294293 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-56975f448b-wh8bp" Apr 22 17:55:52.296299 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:52.296261 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ctlkb" event={"ID":"92750ce1-a0fa-4514-b9b0-09845663619d","Type":"ContainerStarted","Data":"5f00950a00beb8cac6cb73d4d0f573272e6edbaaac6326d5145e9ec0ce1ec268"} Apr 22 
17:55:52.296299 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:52.296301 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ctlkb" event={"ID":"92750ce1-a0fa-4514-b9b0-09845663619d","Type":"ContainerStarted","Data":"513358d2ce956157b944c90827b5b3bef4515400a7ada5ff03d3c8c01b9d58d2"} Apr 22 17:55:52.315298 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:52.315250 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-ctlkb" podStartSLOduration=17.718583704 podStartE2EDuration="19.315236629s" podCreationTimestamp="2026-04-22 17:55:33 +0000 UTC" firstStartedPulling="2026-04-22 17:55:49.835192589 +0000 UTC m=+142.635085786" lastFinishedPulling="2026-04-22 17:55:51.43184551 +0000 UTC m=+144.231738711" observedRunningTime="2026-04-22 17:55:52.314111629 +0000 UTC m=+145.114004849" watchObservedRunningTime="2026-04-22 17:55:52.315236629 +0000 UTC m=+145.115129843" Apr 22 17:55:58.775236 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:58.775207 2566 scope.go:117] "RemoveContainer" containerID="41d3416eb7a2b7f28228969a3731d8f3ac812fd0cdfdfdb26a4d3a180222e8ec" Apr 22 17:55:59.314391 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:59.314364 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 17:55:59.314711 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:59.314694 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/1.log" Apr 22 17:55:59.314770 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:59.314727 2566 generic.go:358] "Generic (PLEG): container finished" podID="84a18cd9-ceac-4cc8-972b-92e3c17a262b" 
containerID="e0d1b0aded456d63035abc99af7d8fb6d34f66764c22dcc6f651258a517c275d" exitCode=255 Apr 22 17:55:59.314805 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:59.314770 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j" event={"ID":"84a18cd9-ceac-4cc8-972b-92e3c17a262b","Type":"ContainerDied","Data":"e0d1b0aded456d63035abc99af7d8fb6d34f66764c22dcc6f651258a517c275d"} Apr 22 17:55:59.314805 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:59.314799 2566 scope.go:117] "RemoveContainer" containerID="41d3416eb7a2b7f28228969a3731d8f3ac812fd0cdfdfdb26a4d3a180222e8ec" Apr 22 17:55:59.315104 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:55:59.315088 2566 scope.go:117] "RemoveContainer" containerID="e0d1b0aded456d63035abc99af7d8fb6d34f66764c22dcc6f651258a517c275d" Apr 22 17:55:59.315297 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:55:59.315279 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-mrr8j_openshift-console-operator(84a18cd9-ceac-4cc8-972b-92e3c17a262b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j" podUID="84a18cd9-ceac-4cc8-972b-92e3c17a262b" Apr 22 17:56:00.318113 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:00.318085 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 17:56:03.102606 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:56:03.102572 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-rbrk7" podUID="aa418e11-9a0e-463a-8262-e078bca4e7a8" Apr 22 17:56:03.112730 ip-10-0-132-106 
kubenswrapper[2566]: E0422 17:56:03.112700 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-hmcpt" podUID="705dd2ce-2ac7-4745-a314-14e119a14624" Apr 22 17:56:03.154967 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.154944 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-m9ns5"] Apr 22 17:56:03.159068 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.159049 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-m9ns5" Apr 22 17:56:03.161995 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.161977 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4dkvz\"" Apr 22 17:56:03.165472 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.165451 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 17:56:03.167340 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.167321 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 17:56:03.185351 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.185332 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-m9ns5"] Apr 22 17:56:03.326011 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.325980 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rbrk7" Apr 22 17:56:03.348256 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.348224 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ac83d655-eec6-4d15-b10d-55f5c7a109d0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-m9ns5\" (UID: \"ac83d655-eec6-4d15-b10d-55f5c7a109d0\") " pod="openshift-insights/insights-runtime-extractor-m9ns5" Apr 22 17:56:03.348365 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.348260 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2frk\" (UniqueName: \"kubernetes.io/projected/ac83d655-eec6-4d15-b10d-55f5c7a109d0-kube-api-access-z2frk\") pod \"insights-runtime-extractor-m9ns5\" (UID: \"ac83d655-eec6-4d15-b10d-55f5c7a109d0\") " pod="openshift-insights/insights-runtime-extractor-m9ns5" Apr 22 17:56:03.348365 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.348329 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ac83d655-eec6-4d15-b10d-55f5c7a109d0-crio-socket\") pod \"insights-runtime-extractor-m9ns5\" (UID: \"ac83d655-eec6-4d15-b10d-55f5c7a109d0\") " pod="openshift-insights/insights-runtime-extractor-m9ns5" Apr 22 17:56:03.348441 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.348383 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ac83d655-eec6-4d15-b10d-55f5c7a109d0-data-volume\") pod \"insights-runtime-extractor-m9ns5\" (UID: \"ac83d655-eec6-4d15-b10d-55f5c7a109d0\") " pod="openshift-insights/insights-runtime-extractor-m9ns5" Apr 22 17:56:03.348441 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.348413 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ac83d655-eec6-4d15-b10d-55f5c7a109d0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-m9ns5\" (UID: \"ac83d655-eec6-4d15-b10d-55f5c7a109d0\") " pod="openshift-insights/insights-runtime-extractor-m9ns5" Apr 22 17:56:03.448713 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.448690 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ac83d655-eec6-4d15-b10d-55f5c7a109d0-data-volume\") pod \"insights-runtime-extractor-m9ns5\" (UID: \"ac83d655-eec6-4d15-b10d-55f5c7a109d0\") " pod="openshift-insights/insights-runtime-extractor-m9ns5" Apr 22 17:56:03.448791 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.448743 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ac83d655-eec6-4d15-b10d-55f5c7a109d0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-m9ns5\" (UID: \"ac83d655-eec6-4d15-b10d-55f5c7a109d0\") " pod="openshift-insights/insights-runtime-extractor-m9ns5" Apr 22 17:56:03.448791 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.448775 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ac83d655-eec6-4d15-b10d-55f5c7a109d0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-m9ns5\" (UID: \"ac83d655-eec6-4d15-b10d-55f5c7a109d0\") " pod="openshift-insights/insights-runtime-extractor-m9ns5" Apr 22 17:56:03.448898 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.448798 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2frk\" (UniqueName: \"kubernetes.io/projected/ac83d655-eec6-4d15-b10d-55f5c7a109d0-kube-api-access-z2frk\") pod \"insights-runtime-extractor-m9ns5\" (UID: 
\"ac83d655-eec6-4d15-b10d-55f5c7a109d0\") " pod="openshift-insights/insights-runtime-extractor-m9ns5" Apr 22 17:56:03.448898 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.448835 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ac83d655-eec6-4d15-b10d-55f5c7a109d0-crio-socket\") pod \"insights-runtime-extractor-m9ns5\" (UID: \"ac83d655-eec6-4d15-b10d-55f5c7a109d0\") " pod="openshift-insights/insights-runtime-extractor-m9ns5" Apr 22 17:56:03.448975 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.448932 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ac83d655-eec6-4d15-b10d-55f5c7a109d0-crio-socket\") pod \"insights-runtime-extractor-m9ns5\" (UID: \"ac83d655-eec6-4d15-b10d-55f5c7a109d0\") " pod="openshift-insights/insights-runtime-extractor-m9ns5" Apr 22 17:56:03.449134 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.449112 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ac83d655-eec6-4d15-b10d-55f5c7a109d0-data-volume\") pod \"insights-runtime-extractor-m9ns5\" (UID: \"ac83d655-eec6-4d15-b10d-55f5c7a109d0\") " pod="openshift-insights/insights-runtime-extractor-m9ns5" Apr 22 17:56:03.449355 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.449337 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ac83d655-eec6-4d15-b10d-55f5c7a109d0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-m9ns5\" (UID: \"ac83d655-eec6-4d15-b10d-55f5c7a109d0\") " pod="openshift-insights/insights-runtime-extractor-m9ns5" Apr 22 17:56:03.451179 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.451163 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/ac83d655-eec6-4d15-b10d-55f5c7a109d0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-m9ns5\" (UID: \"ac83d655-eec6-4d15-b10d-55f5c7a109d0\") " pod="openshift-insights/insights-runtime-extractor-m9ns5" Apr 22 17:56:03.462595 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.462573 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2frk\" (UniqueName: \"kubernetes.io/projected/ac83d655-eec6-4d15-b10d-55f5c7a109d0-kube-api-access-z2frk\") pod \"insights-runtime-extractor-m9ns5\" (UID: \"ac83d655-eec6-4d15-b10d-55f5c7a109d0\") " pod="openshift-insights/insights-runtime-extractor-m9ns5" Apr 22 17:56:03.467359 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.467341 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-m9ns5" Apr 22 17:56:03.599849 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.599814 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-m9ns5"] Apr 22 17:56:03.603841 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:56:03.603811 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac83d655_eec6_4d15_b10d_55f5c7a109d0.slice/crio-940d7c76b4761a2e4dd56f6aa7a0b86c12131b50e938e931f629abc70e9bfc0c WatchSource:0}: Error finding container 940d7c76b4761a2e4dd56f6aa7a0b86c12131b50e938e931f629abc70e9bfc0c: Status 404 returned error can't find the container with id 940d7c76b4761a2e4dd56f6aa7a0b86c12131b50e938e931f629abc70e9bfc0c Apr 22 17:56:03.962370 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.962341 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j" Apr 22 17:56:03.962370 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.962373 2566 kubelet.go:2658] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j" Apr 22 17:56:03.962665 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:03.962653 2566 scope.go:117] "RemoveContainer" containerID="e0d1b0aded456d63035abc99af7d8fb6d34f66764c22dcc6f651258a517c275d" Apr 22 17:56:03.962824 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:56:03.962808 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-mrr8j_openshift-console-operator(84a18cd9-ceac-4cc8-972b-92e3c17a262b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j" podUID="84a18cd9-ceac-4cc8-972b-92e3c17a262b" Apr 22 17:56:04.329755 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:04.329679 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m9ns5" event={"ID":"ac83d655-eec6-4d15-b10d-55f5c7a109d0","Type":"ContainerStarted","Data":"186a500d8d63644fe9225335fced5dab206498e022ba485e4119658039b20a85"} Apr 22 17:56:04.329755 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:04.329717 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m9ns5" event={"ID":"ac83d655-eec6-4d15-b10d-55f5c7a109d0","Type":"ContainerStarted","Data":"940d7c76b4761a2e4dd56f6aa7a0b86c12131b50e938e931f629abc70e9bfc0c"} Apr 22 17:56:04.790214 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:56:04.790175 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-dv96w" podUID="92650e2d-54ea-4904-8ee5-235164ed2949" Apr 22 17:56:05.333317 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:05.333278 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-m9ns5" event={"ID":"ac83d655-eec6-4d15-b10d-55f5c7a109d0","Type":"ContainerStarted","Data":"8930f9f791e702a92fa6665d8330d996644f66e54892dcb422743cc398213a96"} Apr 22 17:56:05.463984 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:05.463944 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4f93890b-e678-4591-8604-de9e0ff75905-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wplrv\" (UID: \"4f93890b-e678-4591-8604-de9e0ff75905\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wplrv" Apr 22 17:56:05.466781 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:05.466732 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/4f93890b-e678-4591-8604-de9e0ff75905-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-wplrv\" (UID: \"4f93890b-e678-4591-8604-de9e0ff75905\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wplrv" Apr 22 17:56:05.467013 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:05.466995 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wplrv" Apr 22 17:56:05.859206 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:05.859180 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-wplrv"] Apr 22 17:56:05.861685 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:56:05.861661 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f93890b_e678_4591_8604_de9e0ff75905.slice/crio-e74b2fb96fe2c59943a0e949885cc78ad05ea5dd59835bd2726b1f29361ad765 WatchSource:0}: Error finding container e74b2fb96fe2c59943a0e949885cc78ad05ea5dd59835bd2726b1f29361ad765: Status 404 returned error can't find the container with id e74b2fb96fe2c59943a0e949885cc78ad05ea5dd59835bd2726b1f29361ad765 Apr 22 17:56:06.337197 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:06.337150 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wplrv" event={"ID":"4f93890b-e678-4591-8604-de9e0ff75905","Type":"ContainerStarted","Data":"e74b2fb96fe2c59943a0e949885cc78ad05ea5dd59835bd2726b1f29361ad765"} Apr 22 17:56:06.339135 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:06.339100 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m9ns5" event={"ID":"ac83d655-eec6-4d15-b10d-55f5c7a109d0","Type":"ContainerStarted","Data":"3a7c13e8341583d804c08f25fdd21b3dd84e663b6b22cbb9a4e1e1ceebea1ed5"} Apr 22 17:56:06.362435 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:06.362378 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-m9ns5" podStartSLOduration=1.2504365960000001 podStartE2EDuration="3.362364775s" podCreationTimestamp="2026-04-22 17:56:03 +0000 UTC" firstStartedPulling="2026-04-22 17:56:03.659065553 +0000 UTC m=+156.458958754" 
lastFinishedPulling="2026-04-22 17:56:05.770993722 +0000 UTC m=+158.570886933" observedRunningTime="2026-04-22 17:56:06.361377619 +0000 UTC m=+159.161270833" watchObservedRunningTime="2026-04-22 17:56:06.362364775 +0000 UTC m=+159.162258029" Apr 22 17:56:07.784563 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:07.784537 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5nghc"] Apr 22 17:56:07.787433 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:07.787417 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5nghc" Apr 22 17:56:07.790771 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:07.790747 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-ldqxd\"" Apr 22 17:56:07.790855 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:07.790786 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 22 17:56:07.795385 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:07.795360 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5nghc"] Apr 22 17:56:07.881764 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:07.881740 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f5ebb09d-fd51-4c07-b267-b6a533e95f1f-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-5nghc\" (UID: \"f5ebb09d-fd51-4c07-b267-b6a533e95f1f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5nghc" Apr 22 17:56:07.982287 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:07.982255 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f5ebb09d-fd51-4c07-b267-b6a533e95f1f-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-5nghc\" (UID: \"f5ebb09d-fd51-4c07-b267-b6a533e95f1f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5nghc" Apr 22 17:56:07.984551 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:07.984520 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f5ebb09d-fd51-4c07-b267-b6a533e95f1f-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-5nghc\" (UID: \"f5ebb09d-fd51-4c07-b267-b6a533e95f1f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5nghc" Apr 22 17:56:08.083460 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:08.083372 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert\") pod \"ingress-canary-rbrk7\" (UID: \"aa418e11-9a0e-463a-8262-e078bca4e7a8\") " pod="openshift-ingress-canary/ingress-canary-rbrk7" Apr 22 17:56:08.083460 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:08.083430 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls\") pod \"dns-default-hmcpt\" (UID: \"705dd2ce-2ac7-4745-a314-14e119a14624\") " pod="openshift-dns/dns-default-hmcpt" Apr 22 17:56:08.085734 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:08.085711 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/705dd2ce-2ac7-4745-a314-14e119a14624-metrics-tls\") pod \"dns-default-hmcpt\" (UID: \"705dd2ce-2ac7-4745-a314-14e119a14624\") " pod="openshift-dns/dns-default-hmcpt" Apr 22 17:56:08.085837 ip-10-0-132-106 
kubenswrapper[2566]: I0422 17:56:08.085743 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa418e11-9a0e-463a-8262-e078bca4e7a8-cert\") pod \"ingress-canary-rbrk7\" (UID: \"aa418e11-9a0e-463a-8262-e078bca4e7a8\") " pod="openshift-ingress-canary/ingress-canary-rbrk7" Apr 22 17:56:08.095695 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:08.095666 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5nghc" Apr 22 17:56:08.129814 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:08.129775 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-frn4n\"" Apr 22 17:56:08.137491 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:08.137460 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rbrk7" Apr 22 17:56:08.227722 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:08.227670 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5nghc"] Apr 22 17:56:08.231285 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:56:08.231256 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5ebb09d_fd51_4c07_b267_b6a533e95f1f.slice/crio-36909331e18067dcae0c5bc1b9c688f3251d2ea14c24d40d1e0b8c5c03c45075 WatchSource:0}: Error finding container 36909331e18067dcae0c5bc1b9c688f3251d2ea14c24d40d1e0b8c5c03c45075: Status 404 returned error can't find the container with id 36909331e18067dcae0c5bc1b9c688f3251d2ea14c24d40d1e0b8c5c03c45075 Apr 22 17:56:08.266544 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:08.266516 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rbrk7"] Apr 22 17:56:08.269251 
ip-10-0-132-106 kubenswrapper[2566]: W0422 17:56:08.269225 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa418e11_9a0e_463a_8262_e078bca4e7a8.slice/crio-a681d42d04e6bd2d8f44903e88063fa77cf511f34125eba3703ac21453fb2c10 WatchSource:0}: Error finding container a681d42d04e6bd2d8f44903e88063fa77cf511f34125eba3703ac21453fb2c10: Status 404 returned error can't find the container with id a681d42d04e6bd2d8f44903e88063fa77cf511f34125eba3703ac21453fb2c10 Apr 22 17:56:08.346187 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:08.346101 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wplrv" event={"ID":"4f93890b-e678-4591-8604-de9e0ff75905","Type":"ContainerStarted","Data":"ee1f8d577eb79944d834838474e977dd00cea5dc2aad2c8864ab94d77fb92f62"} Apr 22 17:56:08.347186 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:08.347151 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rbrk7" event={"ID":"aa418e11-9a0e-463a-8262-e078bca4e7a8","Type":"ContainerStarted","Data":"a681d42d04e6bd2d8f44903e88063fa77cf511f34125eba3703ac21453fb2c10"} Apr 22 17:56:08.348092 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:08.348072 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5nghc" event={"ID":"f5ebb09d-fd51-4c07-b267-b6a533e95f1f","Type":"ContainerStarted","Data":"36909331e18067dcae0c5bc1b9c688f3251d2ea14c24d40d1e0b8c5c03c45075"} Apr 22 17:56:08.363734 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:08.363695 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-wplrv" podStartSLOduration=33.914846599 podStartE2EDuration="35.363682896s" podCreationTimestamp="2026-04-22 17:55:33 +0000 UTC" firstStartedPulling="2026-04-22 
17:56:05.863573073 +0000 UTC m=+158.663466278" lastFinishedPulling="2026-04-22 17:56:07.312409377 +0000 UTC m=+160.112302575" observedRunningTime="2026-04-22 17:56:08.362762263 +0000 UTC m=+161.162655483" watchObservedRunningTime="2026-04-22 17:56:08.363682896 +0000 UTC m=+161.163576112" Apr 22 17:56:10.354331 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:10.354237 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rbrk7" event={"ID":"aa418e11-9a0e-463a-8262-e078bca4e7a8","Type":"ContainerStarted","Data":"c4c6cf7c71d84670c1b7d4afa2bb2e77232879b913f9e0e485e5f6cfe86a5b97"} Apr 22 17:56:10.355574 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:10.355553 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5nghc" event={"ID":"f5ebb09d-fd51-4c07-b267-b6a533e95f1f","Type":"ContainerStarted","Data":"6ec5783e9335ccc0d4752564d1338acb9857098f1ff3c4afba7533cb8edc8697"} Apr 22 17:56:10.355727 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:10.355706 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5nghc" Apr 22 17:56:10.360429 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:10.360404 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5nghc" Apr 22 17:56:10.370230 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:10.370182 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rbrk7" podStartSLOduration=128.572905566 podStartE2EDuration="2m10.370168234s" podCreationTimestamp="2026-04-22 17:54:00 +0000 UTC" firstStartedPulling="2026-04-22 17:56:08.271062937 +0000 UTC m=+161.070956135" lastFinishedPulling="2026-04-22 17:56:10.068325602 +0000 UTC m=+162.868218803" 
observedRunningTime="2026-04-22 17:56:10.369584575 +0000 UTC m=+163.169477794" watchObservedRunningTime="2026-04-22 17:56:10.370168234 +0000 UTC m=+163.170061453" Apr 22 17:56:10.384948 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:10.384902 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-5nghc" podStartSLOduration=2.101875192 podStartE2EDuration="3.384884822s" podCreationTimestamp="2026-04-22 17:56:07 +0000 UTC" firstStartedPulling="2026-04-22 17:56:08.233451476 +0000 UTC m=+161.033344691" lastFinishedPulling="2026-04-22 17:56:09.516461108 +0000 UTC m=+162.316354321" observedRunningTime="2026-04-22 17:56:10.384441757 +0000 UTC m=+163.184334990" watchObservedRunningTime="2026-04-22 17:56:10.384884822 +0000 UTC m=+163.184778036" Apr 22 17:56:15.211154 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.211119 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-w4jff"] Apr 22 17:56:15.214463 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.214437 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-b9pjt"] Apr 22 17:56:15.214618 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.214599 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.217253 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.217227 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 17:56:15.217385 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.217259 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 17:56:15.217385 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.217336 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 17:56:15.217480 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.217433 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 17:56:15.217480 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.217455 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-lp8xv\"" Apr 22 17:56:15.218009 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.217994 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9pjt" Apr 22 17:56:15.220488 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.220462 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 22 17:56:15.220612 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.220471 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 22 17:56:15.220612 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.220539 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-k5zpc\"" Apr 22 17:56:15.224780 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.224757 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-b9pjt"] Apr 22 17:56:15.239164 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.239133 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.239321 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.239174 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-node-exporter-tls\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.239321 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.239194 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ecc790c-43bc-443b-b15e-cb2adabc8a2f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-b9pjt\" (UID: \"0ecc790c-43bc-443b-b15e-cb2adabc8a2f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9pjt" Apr 22 17:56:15.239321 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.239223 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-metrics-client-ca\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.239321 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.239294 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0ecc790c-43bc-443b-b15e-cb2adabc8a2f-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-b9pjt\" (UID: \"0ecc790c-43bc-443b-b15e-cb2adabc8a2f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9pjt" Apr 22 17:56:15.239474 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.239336 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-sys\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.239474 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.239361 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-node-exporter-wtmp\") pod \"node-exporter-w4jff\" (UID: 
\"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.239474 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.239432 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-node-exporter-accelerators-collector-config\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.239474 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.239459 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjrtb\" (UniqueName: \"kubernetes.io/projected/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-kube-api-access-jjrtb\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.239635 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.239485 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxhdk\" (UniqueName: \"kubernetes.io/projected/0ecc790c-43bc-443b-b15e-cb2adabc8a2f-kube-api-access-gxhdk\") pod \"openshift-state-metrics-9d44df66c-b9pjt\" (UID: \"0ecc790c-43bc-443b-b15e-cb2adabc8a2f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9pjt" Apr 22 17:56:15.239635 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.239541 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-node-exporter-textfile\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.239635 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.239570 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-root\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.239635 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.239601 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0ecc790c-43bc-443b-b15e-cb2adabc8a2f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-b9pjt\" (UID: \"0ecc790c-43bc-443b-b15e-cb2adabc8a2f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9pjt" Apr 22 17:56:15.340243 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.340199 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-node-exporter-accelerators-collector-config\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.340243 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.340246 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jjrtb\" (UniqueName: \"kubernetes.io/projected/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-kube-api-access-jjrtb\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.340508 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.340279 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxhdk\" (UniqueName: \"kubernetes.io/projected/0ecc790c-43bc-443b-b15e-cb2adabc8a2f-kube-api-access-gxhdk\") 
pod \"openshift-state-metrics-9d44df66c-b9pjt\" (UID: \"0ecc790c-43bc-443b-b15e-cb2adabc8a2f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9pjt" Apr 22 17:56:15.340508 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.340306 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-node-exporter-textfile\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.340508 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.340327 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-root\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.340508 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.340353 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0ecc790c-43bc-443b-b15e-cb2adabc8a2f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-b9pjt\" (UID: \"0ecc790c-43bc-443b-b15e-cb2adabc8a2f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9pjt" Apr 22 17:56:15.340508 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.340409 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-root\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.340508 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.340417 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.340508 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.340486 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-node-exporter-tls\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.340838 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.340515 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ecc790c-43bc-443b-b15e-cb2adabc8a2f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-b9pjt\" (UID: \"0ecc790c-43bc-443b-b15e-cb2adabc8a2f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9pjt" Apr 22 17:56:15.340838 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.340550 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-metrics-client-ca\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.340838 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:56:15.340588 2566 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 17:56:15.340838 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:56:15.340662 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-node-exporter-tls 
podName:2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24 nodeName:}" failed. No retries permitted until 2026-04-22 17:56:15.840641825 +0000 UTC m=+168.640535026 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-node-exporter-tls") pod "node-exporter-w4jff" (UID: "2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24") : secret "node-exporter-tls" not found Apr 22 17:56:15.340838 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.340756 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-node-exporter-textfile\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.340838 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.340592 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0ecc790c-43bc-443b-b15e-cb2adabc8a2f-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-b9pjt\" (UID: \"0ecc790c-43bc-443b-b15e-cb2adabc8a2f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9pjt" Apr 22 17:56:15.340838 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.340814 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-sys\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.341315 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.340851 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-node-exporter-wtmp\") pod \"node-exporter-w4jff\" (UID: 
\"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.341315 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.340926 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-sys\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.341315 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.340971 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-node-exporter-accelerators-collector-config\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.341315 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.341044 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-node-exporter-wtmp\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.341315 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.341246 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0ecc790c-43bc-443b-b15e-cb2adabc8a2f-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-b9pjt\" (UID: \"0ecc790c-43bc-443b-b15e-cb2adabc8a2f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9pjt" Apr 22 17:56:15.341531 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.341433 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-metrics-client-ca\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.342953 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.342928 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.343371 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.343349 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0ecc790c-43bc-443b-b15e-cb2adabc8a2f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-b9pjt\" (UID: \"0ecc790c-43bc-443b-b15e-cb2adabc8a2f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9pjt" Apr 22 17:56:15.343459 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.343440 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0ecc790c-43bc-443b-b15e-cb2adabc8a2f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-b9pjt\" (UID: \"0ecc790c-43bc-443b-b15e-cb2adabc8a2f\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9pjt" Apr 22 17:56:15.349324 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.349299 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxhdk\" (UniqueName: \"kubernetes.io/projected/0ecc790c-43bc-443b-b15e-cb2adabc8a2f-kube-api-access-gxhdk\") pod \"openshift-state-metrics-9d44df66c-b9pjt\" (UID: \"0ecc790c-43bc-443b-b15e-cb2adabc8a2f\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9pjt" Apr 22 17:56:15.349510 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.349489 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjrtb\" (UniqueName: \"kubernetes.io/projected/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-kube-api-access-jjrtb\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.531231 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.531122 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9pjt" Apr 22 17:56:15.649088 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.649053 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-b9pjt"] Apr 22 17:56:15.651899 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:56:15.651852 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ecc790c_43bc_443b_b15e_cb2adabc8a2f.slice/crio-f087bf8a6cc153ef932ba36efec3ab56430725adf4baa66aed05b69256542926 WatchSource:0}: Error finding container f087bf8a6cc153ef932ba36efec3ab56430725adf4baa66aed05b69256542926: Status 404 returned error can't find the container with id f087bf8a6cc153ef932ba36efec3ab56430725adf4baa66aed05b69256542926 Apr 22 17:56:15.845133 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.845008 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-node-exporter-tls\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:15.847392 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:15.847363 2566 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24-node-exporter-tls\") pod \"node-exporter-w4jff\" (UID: \"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24\") " pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:16.125508 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.125424 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-w4jff" Apr 22 17:56:16.133425 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:56:16.133393 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b1d8567_1ee7_40d9_8c1d_dc6ab3b97d24.slice/crio-bec08b2a59554fb7101bc222b89729c581b5b64e720ce13ee0135785ae60caa5 WatchSource:0}: Error finding container bec08b2a59554fb7101bc222b89729c581b5b64e720ce13ee0135785ae60caa5: Status 404 returned error can't find the container with id bec08b2a59554fb7101bc222b89729c581b5b64e720ce13ee0135785ae60caa5 Apr 22 17:56:16.338447 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.338407 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:56:16.342288 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.342264 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.345635 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.345293 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 17:56:16.345635 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.345314 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 17:56:16.345635 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.345370 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 17:56:16.345635 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.345596 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 17:56:16.345635 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.345609 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 17:56:16.345635 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.345612 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 17:56:16.345635 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.345627 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 17:56:16.346168 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.345596 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 17:56:16.346168 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.346015 2566 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-gv4mz\"" Apr 22 17:56:16.351480 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.351450 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 17:56:16.362350 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.362149 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:56:16.372778 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.372741 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9pjt" event={"ID":"0ecc790c-43bc-443b-b15e-cb2adabc8a2f","Type":"ContainerStarted","Data":"9c3b9d14c27ed6eff18663465ccb95b8693a58b2fb0e8e6d2924e659f75f5703"} Apr 22 17:56:16.372778 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.372783 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9pjt" event={"ID":"0ecc790c-43bc-443b-b15e-cb2adabc8a2f","Type":"ContainerStarted","Data":"ec81f803ab91cb36b176adb1b6b32931e5a38f5f5283854ec284ff85a2440bfc"} Apr 22 17:56:16.373036 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.372793 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9pjt" event={"ID":"0ecc790c-43bc-443b-b15e-cb2adabc8a2f","Type":"ContainerStarted","Data":"f087bf8a6cc153ef932ba36efec3ab56430725adf4baa66aed05b69256542926"} Apr 22 17:56:16.374091 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.374052 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-w4jff" event={"ID":"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24","Type":"ContainerStarted","Data":"bec08b2a59554fb7101bc222b89729c581b5b64e720ce13ee0135785ae60caa5"} Apr 22 17:56:16.449622 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.449570 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-config-volume\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.449825 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.449656 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.449825 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.449718 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.449825 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.449755 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.449825 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.449792 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: 
\"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.450058 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.449834 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.450058 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.449908 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-web-config\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.450058 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.449959 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.450058 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.450005 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.450058 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.450045 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.450299 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.450074 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.450299 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.450110 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-config-out\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.450299 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.450136 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnzng\" (UniqueName: \"kubernetes.io/projected/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-kube-api-access-nnzng\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.551584 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.551553 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.551584 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.551595 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.551849 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.551636 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.551849 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.551675 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.551849 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.551706 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-web-config\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.551849 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.551741 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: 
\"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.551849 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.551772 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.551849 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.551813 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.551849 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.551841 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.552218 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.551903 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-config-out\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.552218 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.551931 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nnzng\" (UniqueName: \"kubernetes.io/projected/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-kube-api-access-nnzng\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.552218 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.551979 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-config-volume\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.552218 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.552004 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.553181 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.553144 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.553965 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.553936 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.554629 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.554603 2566 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.556291 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.555044 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.556291 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.555529 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.556291 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.556018 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.556481 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.556399 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.556750 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.556720 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.556879 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.556829 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-web-config\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.557234 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.557178 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-config-volume\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.557314 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.557257 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.557458 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.557420 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-config-out\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.561506 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.561476 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnzng\" (UniqueName: \"kubernetes.io/projected/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-kube-api-access-nnzng\") pod \"alertmanager-main-0\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.665914 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.665873 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:56:16.775511 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.775490 2566 scope.go:117] "RemoveContainer" containerID="e0d1b0aded456d63035abc99af7d8fb6d34f66764c22dcc6f651258a517c275d" Apr 22 17:56:16.775695 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:56:16.775644 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-mrr8j_openshift-console-operator(84a18cd9-ceac-4cc8-972b-92e3c17a262b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j" podUID="84a18cd9-ceac-4cc8-972b-92e3c17a262b" Apr 22 17:56:16.807555 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:16.807505 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:56:16.811427 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:56:16.811391 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f0c3d42_3428_4a03_a4c3_5419b2d9e7a9.slice/crio-f6a03cb4ec03c663546bec68ac956a01c68f711d033221f39684f9d06d0a188a WatchSource:0}: Error finding container f6a03cb4ec03c663546bec68ac956a01c68f711d033221f39684f9d06d0a188a: Status 404 returned error 
can't find the container with id f6a03cb4ec03c663546bec68ac956a01c68f711d033221f39684f9d06d0a188a Apr 22 17:56:17.293064 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.293027 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv"] Apr 22 17:56:17.296601 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.296568 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.299404 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.299379 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 22 17:56:17.299544 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.299411 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 22 17:56:17.299544 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.299384 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 22 17:56:17.299544 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.299464 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 22 17:56:17.299675 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.299577 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-bpmdb\"" Apr 22 17:56:17.299675 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.299617 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 22 17:56:17.300389 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.300370 2566 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-53nmhfu6k5gv9\"" Apr 22 17:56:17.309219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.309192 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv"] Apr 22 17:56:17.358408 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.358372 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/e5bdfb2b-3526-43a6-a2d3-5b997408d869-secret-thanos-querier-tls\") pod \"thanos-querier-5bbd5cd8d-mc4cv\" (UID: \"e5bdfb2b-3526-43a6-a2d3-5b997408d869\") " pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.358790 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.358426 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lqh5\" (UniqueName: \"kubernetes.io/projected/e5bdfb2b-3526-43a6-a2d3-5b997408d869-kube-api-access-2lqh5\") pod \"thanos-querier-5bbd5cd8d-mc4cv\" (UID: \"e5bdfb2b-3526-43a6-a2d3-5b997408d869\") " pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.358790 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.358458 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e5bdfb2b-3526-43a6-a2d3-5b997408d869-secret-grpc-tls\") pod \"thanos-querier-5bbd5cd8d-mc4cv\" (UID: \"e5bdfb2b-3526-43a6-a2d3-5b997408d869\") " pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.358790 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.358498 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e5bdfb2b-3526-43a6-a2d3-5b997408d869-secret-thanos-querier-kube-rbac-proxy\") pod 
\"thanos-querier-5bbd5cd8d-mc4cv\" (UID: \"e5bdfb2b-3526-43a6-a2d3-5b997408d869\") " pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.358790 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.358527 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/e5bdfb2b-3526-43a6-a2d3-5b997408d869-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5bbd5cd8d-mc4cv\" (UID: \"e5bdfb2b-3526-43a6-a2d3-5b997408d869\") " pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.358790 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.358661 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/e5bdfb2b-3526-43a6-a2d3-5b997408d869-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5bbd5cd8d-mc4cv\" (UID: \"e5bdfb2b-3526-43a6-a2d3-5b997408d869\") " pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.358790 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.358702 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e5bdfb2b-3526-43a6-a2d3-5b997408d869-metrics-client-ca\") pod \"thanos-querier-5bbd5cd8d-mc4cv\" (UID: \"e5bdfb2b-3526-43a6-a2d3-5b997408d869\") " pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.358790 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.358738 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e5bdfb2b-3526-43a6-a2d3-5b997408d869-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5bbd5cd8d-mc4cv\" (UID: 
\"e5bdfb2b-3526-43a6-a2d3-5b997408d869\") " pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.378737 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.378693 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9pjt" event={"ID":"0ecc790c-43bc-443b-b15e-cb2adabc8a2f","Type":"ContainerStarted","Data":"280ffd612f4d0bf29f6ed4375e0735d6d865ef43a2156b4be7067c0d906aa474"} Apr 22 17:56:17.380315 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.380285 2566 generic.go:358] "Generic (PLEG): container finished" podID="2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24" containerID="3e85338066dbe2c4f1777c390e684e120aea38e5445d8faad96e9b34f594a923" exitCode=0 Apr 22 17:56:17.380448 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.380372 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-w4jff" event={"ID":"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24","Type":"ContainerDied","Data":"3e85338066dbe2c4f1777c390e684e120aea38e5445d8faad96e9b34f594a923"} Apr 22 17:56:17.381636 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.381617 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9","Type":"ContainerStarted","Data":"f6a03cb4ec03c663546bec68ac956a01c68f711d033221f39684f9d06d0a188a"} Apr 22 17:56:17.402180 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.401820 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-b9pjt" podStartSLOduration=1.447917086 podStartE2EDuration="2.401799939s" podCreationTimestamp="2026-04-22 17:56:15 +0000 UTC" firstStartedPulling="2026-04-22 17:56:15.770252164 +0000 UTC m=+168.570145362" lastFinishedPulling="2026-04-22 17:56:16.724135017 +0000 UTC m=+169.524028215" observedRunningTime="2026-04-22 17:56:17.401425844 +0000 UTC m=+170.201319088" 
watchObservedRunningTime="2026-04-22 17:56:17.401799939 +0000 UTC m=+170.201693157" Apr 22 17:56:17.459160 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.459122 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2lqh5\" (UniqueName: \"kubernetes.io/projected/e5bdfb2b-3526-43a6-a2d3-5b997408d869-kube-api-access-2lqh5\") pod \"thanos-querier-5bbd5cd8d-mc4cv\" (UID: \"e5bdfb2b-3526-43a6-a2d3-5b997408d869\") " pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.459301 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.459195 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e5bdfb2b-3526-43a6-a2d3-5b997408d869-secret-grpc-tls\") pod \"thanos-querier-5bbd5cd8d-mc4cv\" (UID: \"e5bdfb2b-3526-43a6-a2d3-5b997408d869\") " pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.459301 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.459248 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e5bdfb2b-3526-43a6-a2d3-5b997408d869-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5bbd5cd8d-mc4cv\" (UID: \"e5bdfb2b-3526-43a6-a2d3-5b997408d869\") " pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.459408 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.459300 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/e5bdfb2b-3526-43a6-a2d3-5b997408d869-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5bbd5cd8d-mc4cv\" (UID: \"e5bdfb2b-3526-43a6-a2d3-5b997408d869\") " pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.459572 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.459405 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/e5bdfb2b-3526-43a6-a2d3-5b997408d869-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5bbd5cd8d-mc4cv\" (UID: \"e5bdfb2b-3526-43a6-a2d3-5b997408d869\") " pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.459572 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.459433 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e5bdfb2b-3526-43a6-a2d3-5b997408d869-metrics-client-ca\") pod \"thanos-querier-5bbd5cd8d-mc4cv\" (UID: \"e5bdfb2b-3526-43a6-a2d3-5b997408d869\") " pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.459572 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.459478 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e5bdfb2b-3526-43a6-a2d3-5b997408d869-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5bbd5cd8d-mc4cv\" (UID: \"e5bdfb2b-3526-43a6-a2d3-5b997408d869\") " pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.459731 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.459586 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/e5bdfb2b-3526-43a6-a2d3-5b997408d869-secret-thanos-querier-tls\") pod \"thanos-querier-5bbd5cd8d-mc4cv\" (UID: \"e5bdfb2b-3526-43a6-a2d3-5b997408d869\") " pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.461213 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.461180 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e5bdfb2b-3526-43a6-a2d3-5b997408d869-metrics-client-ca\") pod 
\"thanos-querier-5bbd5cd8d-mc4cv\" (UID: \"e5bdfb2b-3526-43a6-a2d3-5b997408d869\") " pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.462599 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.462575 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/e5bdfb2b-3526-43a6-a2d3-5b997408d869-secret-thanos-querier-tls\") pod \"thanos-querier-5bbd5cd8d-mc4cv\" (UID: \"e5bdfb2b-3526-43a6-a2d3-5b997408d869\") " pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.464043 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.463983 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/e5bdfb2b-3526-43a6-a2d3-5b997408d869-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5bbd5cd8d-mc4cv\" (UID: \"e5bdfb2b-3526-43a6-a2d3-5b997408d869\") " pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.464228 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.464114 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e5bdfb2b-3526-43a6-a2d3-5b997408d869-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5bbd5cd8d-mc4cv\" (UID: \"e5bdfb2b-3526-43a6-a2d3-5b997408d869\") " pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.464675 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.464650 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e5bdfb2b-3526-43a6-a2d3-5b997408d869-secret-grpc-tls\") pod \"thanos-querier-5bbd5cd8d-mc4cv\" (UID: \"e5bdfb2b-3526-43a6-a2d3-5b997408d869\") " pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.464768 ip-10-0-132-106 kubenswrapper[2566]: I0422 
17:56:17.464742 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/e5bdfb2b-3526-43a6-a2d3-5b997408d869-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5bbd5cd8d-mc4cv\" (UID: \"e5bdfb2b-3526-43a6-a2d3-5b997408d869\") " pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.465280 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.465257 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e5bdfb2b-3526-43a6-a2d3-5b997408d869-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5bbd5cd8d-mc4cv\" (UID: \"e5bdfb2b-3526-43a6-a2d3-5b997408d869\") " pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.469306 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.469283 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lqh5\" (UniqueName: \"kubernetes.io/projected/e5bdfb2b-3526-43a6-a2d3-5b997408d869-kube-api-access-2lqh5\") pod \"thanos-querier-5bbd5cd8d-mc4cv\" (UID: \"e5bdfb2b-3526-43a6-a2d3-5b997408d869\") " pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.608894 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.608839 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" Apr 22 17:56:17.780088 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.780064 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-hmcpt" Apr 22 17:56:17.782897 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.782870 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-r5nxd\"" Apr 22 17:56:17.790732 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.790710 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hmcpt" Apr 22 17:56:17.991538 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:17.991507 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hmcpt"] Apr 22 17:56:17.994703 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:56:17.994672 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod705dd2ce_2ac7_4745_a314_14e119a14624.slice/crio-c1da2ab88a259fdf3a5a91ff393bd9a94f3fd3ed3b463cdbfa9135ce7afb9ac7 WatchSource:0}: Error finding container c1da2ab88a259fdf3a5a91ff393bd9a94f3fd3ed3b463cdbfa9135ce7afb9ac7: Status 404 returned error can't find the container with id c1da2ab88a259fdf3a5a91ff393bd9a94f3fd3ed3b463cdbfa9135ce7afb9ac7 Apr 22 17:56:18.007349 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:18.007320 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv"] Apr 22 17:56:18.042017 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:56:18.041941 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5bdfb2b_3526_43a6_a2d3_5b997408d869.slice/crio-7dc1078f7d4c02f0faeba88d0d8ecdf4372d04e1888a73b304af4d15ed8e3ba1 WatchSource:0}: Error finding container 7dc1078f7d4c02f0faeba88d0d8ecdf4372d04e1888a73b304af4d15ed8e3ba1: Status 404 returned error can't find the container with id 7dc1078f7d4c02f0faeba88d0d8ecdf4372d04e1888a73b304af4d15ed8e3ba1 Apr 22 17:56:18.385421 ip-10-0-132-106 
kubenswrapper[2566]: I0422 17:56:18.385327 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" event={"ID":"e5bdfb2b-3526-43a6-a2d3-5b997408d869","Type":"ContainerStarted","Data":"7dc1078f7d4c02f0faeba88d0d8ecdf4372d04e1888a73b304af4d15ed8e3ba1"} Apr 22 17:56:18.387223 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:18.387194 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-w4jff" event={"ID":"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24","Type":"ContainerStarted","Data":"304d76df4a18c409a1a683eca82eb8c28043b9e3f6e5af9969bf6c5b16487936"} Apr 22 17:56:18.387325 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:18.387228 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-w4jff" event={"ID":"2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24","Type":"ContainerStarted","Data":"6071e8405f65eb69dfe39f92018e83b3a669323c13399820bfb53485215a735b"} Apr 22 17:56:18.388480 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:18.388456 2566 generic.go:358] "Generic (PLEG): container finished" podID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerID="023d05de644f2b0aede566f62cdf712596ac7b545b8500ef9dfd85635321f48d" exitCode=0 Apr 22 17:56:18.388590 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:18.388515 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9","Type":"ContainerDied","Data":"023d05de644f2b0aede566f62cdf712596ac7b545b8500ef9dfd85635321f48d"} Apr 22 17:56:18.389500 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:18.389475 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hmcpt" event={"ID":"705dd2ce-2ac7-4745-a314-14e119a14624","Type":"ContainerStarted","Data":"c1da2ab88a259fdf3a5a91ff393bd9a94f3fd3ed3b463cdbfa9135ce7afb9ac7"} Apr 22 17:56:18.409790 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:18.409739 
2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-w4jff" podStartSLOduration=2.4066182830000002 podStartE2EDuration="3.409724209s" podCreationTimestamp="2026-04-22 17:56:15 +0000 UTC" firstStartedPulling="2026-04-22 17:56:16.135013613 +0000 UTC m=+168.934906810" lastFinishedPulling="2026-04-22 17:56:17.138119532 +0000 UTC m=+169.938012736" observedRunningTime="2026-04-22 17:56:18.409060212 +0000 UTC m=+171.208953445" watchObservedRunningTime="2026-04-22 17:56:18.409724209 +0000 UTC m=+171.209617666" Apr 22 17:56:18.775002 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:18.774966 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv96w" Apr 22 17:56:19.986952 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:19.986925 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-vcjl7"] Apr 22 17:56:19.990109 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:19.990084 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vcjl7" Apr 22 17:56:19.993088 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:19.993066 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 17:56:19.993191 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:19.993098 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-4xdwp\"" Apr 22 17:56:19.999181 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:19.999160 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-vcjl7"] Apr 22 17:56:20.086619 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.086581 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b4f78a22-a4a4-4801-a800-a8983d8d8a1d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-vcjl7\" (UID: \"b4f78a22-a4a4-4801-a800-a8983d8d8a1d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vcjl7" Apr 22 17:56:20.188048 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.188005 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b4f78a22-a4a4-4801-a800-a8983d8d8a1d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-vcjl7\" (UID: \"b4f78a22-a4a4-4801-a800-a8983d8d8a1d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vcjl7" Apr 22 17:56:20.191348 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.191321 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b4f78a22-a4a4-4801-a800-a8983d8d8a1d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-vcjl7\" (UID: \"b4f78a22-a4a4-4801-a800-a8983d8d8a1d\") " 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vcjl7" Apr 22 17:56:20.300077 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.299985 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vcjl7" Apr 22 17:56:20.428926 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.427352 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6df96bf54d-9v5k4"] Apr 22 17:56:20.434018 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.433995 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.436932 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.436779 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 22 17:56:20.437025 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.436961 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-df9p2\"" Apr 22 17:56:20.437384 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.437307 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 22 17:56:20.437725 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.437603 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 22 17:56:20.437878 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.437825 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 22 17:56:20.438243 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.438053 2566 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 22 17:56:20.443944 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.443919 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6df96bf54d-9v5k4"] Apr 22 17:56:20.452663 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.448562 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 22 17:56:20.491498 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.491472 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d88b8107-8f12-4fa1-aa21-eeaa65216c76-telemeter-client-tls\") pod \"telemeter-client-6df96bf54d-9v5k4\" (UID: \"d88b8107-8f12-4fa1-aa21-eeaa65216c76\") " pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.491655 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.491530 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d88b8107-8f12-4fa1-aa21-eeaa65216c76-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6df96bf54d-9v5k4\" (UID: \"d88b8107-8f12-4fa1-aa21-eeaa65216c76\") " pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.491655 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.491550 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d88b8107-8f12-4fa1-aa21-eeaa65216c76-federate-client-tls\") pod \"telemeter-client-6df96bf54d-9v5k4\" (UID: \"d88b8107-8f12-4fa1-aa21-eeaa65216c76\") " pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.491655 ip-10-0-132-106 kubenswrapper[2566]: I0422 
17:56:20.491604 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d88b8107-8f12-4fa1-aa21-eeaa65216c76-metrics-client-ca\") pod \"telemeter-client-6df96bf54d-9v5k4\" (UID: \"d88b8107-8f12-4fa1-aa21-eeaa65216c76\") " pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.491823 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.491664 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d88b8107-8f12-4fa1-aa21-eeaa65216c76-serving-certs-ca-bundle\") pod \"telemeter-client-6df96bf54d-9v5k4\" (UID: \"d88b8107-8f12-4fa1-aa21-eeaa65216c76\") " pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.491823 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.491703 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d88b8107-8f12-4fa1-aa21-eeaa65216c76-secret-telemeter-client\") pod \"telemeter-client-6df96bf54d-9v5k4\" (UID: \"d88b8107-8f12-4fa1-aa21-eeaa65216c76\") " pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.491823 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.491793 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftzv4\" (UniqueName: \"kubernetes.io/projected/d88b8107-8f12-4fa1-aa21-eeaa65216c76-kube-api-access-ftzv4\") pod \"telemeter-client-6df96bf54d-9v5k4\" (UID: \"d88b8107-8f12-4fa1-aa21-eeaa65216c76\") " pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.492007 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.491841 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/d88b8107-8f12-4fa1-aa21-eeaa65216c76-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6df96bf54d-9v5k4\" (UID: \"d88b8107-8f12-4fa1-aa21-eeaa65216c76\") " pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.577947 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.577896 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-vcjl7"] Apr 22 17:56:20.581534 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:56:20.581503 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4f78a22_a4a4_4801_a800_a8983d8d8a1d.slice/crio-94f39603545c99008ba4cdf24d18b24b7232bdd4a829824d16c0137459f0ad67 WatchSource:0}: Error finding container 94f39603545c99008ba4cdf24d18b24b7232bdd4a829824d16c0137459f0ad67: Status 404 returned error can't find the container with id 94f39603545c99008ba4cdf24d18b24b7232bdd4a829824d16c0137459f0ad67 Apr 22 17:56:20.592424 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.592392 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d88b8107-8f12-4fa1-aa21-eeaa65216c76-telemeter-client-tls\") pod \"telemeter-client-6df96bf54d-9v5k4\" (UID: \"d88b8107-8f12-4fa1-aa21-eeaa65216c76\") " pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.592525 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.592485 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d88b8107-8f12-4fa1-aa21-eeaa65216c76-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6df96bf54d-9v5k4\" (UID: \"d88b8107-8f12-4fa1-aa21-eeaa65216c76\") " pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.592894 ip-10-0-132-106 
kubenswrapper[2566]: I0422 17:56:20.592677 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d88b8107-8f12-4fa1-aa21-eeaa65216c76-federate-client-tls\") pod \"telemeter-client-6df96bf54d-9v5k4\" (UID: \"d88b8107-8f12-4fa1-aa21-eeaa65216c76\") " pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.592894 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.592730 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d88b8107-8f12-4fa1-aa21-eeaa65216c76-metrics-client-ca\") pod \"telemeter-client-6df96bf54d-9v5k4\" (UID: \"d88b8107-8f12-4fa1-aa21-eeaa65216c76\") " pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.592894 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.592789 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d88b8107-8f12-4fa1-aa21-eeaa65216c76-serving-certs-ca-bundle\") pod \"telemeter-client-6df96bf54d-9v5k4\" (UID: \"d88b8107-8f12-4fa1-aa21-eeaa65216c76\") " pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.592894 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.592828 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d88b8107-8f12-4fa1-aa21-eeaa65216c76-secret-telemeter-client\") pod \"telemeter-client-6df96bf54d-9v5k4\" (UID: \"d88b8107-8f12-4fa1-aa21-eeaa65216c76\") " pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.594001 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.593229 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftzv4\" (UniqueName: 
\"kubernetes.io/projected/d88b8107-8f12-4fa1-aa21-eeaa65216c76-kube-api-access-ftzv4\") pod \"telemeter-client-6df96bf54d-9v5k4\" (UID: \"d88b8107-8f12-4fa1-aa21-eeaa65216c76\") " pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.594001 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.593280 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d88b8107-8f12-4fa1-aa21-eeaa65216c76-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6df96bf54d-9v5k4\" (UID: \"d88b8107-8f12-4fa1-aa21-eeaa65216c76\") " pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.594001 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.593686 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d88b8107-8f12-4fa1-aa21-eeaa65216c76-serving-certs-ca-bundle\") pod \"telemeter-client-6df96bf54d-9v5k4\" (UID: \"d88b8107-8f12-4fa1-aa21-eeaa65216c76\") " pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.594001 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.593792 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d88b8107-8f12-4fa1-aa21-eeaa65216c76-metrics-client-ca\") pod \"telemeter-client-6df96bf54d-9v5k4\" (UID: \"d88b8107-8f12-4fa1-aa21-eeaa65216c76\") " pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.594607 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.594573 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d88b8107-8f12-4fa1-aa21-eeaa65216c76-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6df96bf54d-9v5k4\" (UID: \"d88b8107-8f12-4fa1-aa21-eeaa65216c76\") " 
pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.597238 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.597151 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d88b8107-8f12-4fa1-aa21-eeaa65216c76-federate-client-tls\") pod \"telemeter-client-6df96bf54d-9v5k4\" (UID: \"d88b8107-8f12-4fa1-aa21-eeaa65216c76\") " pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.598112 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.598062 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d88b8107-8f12-4fa1-aa21-eeaa65216c76-telemeter-client-tls\") pod \"telemeter-client-6df96bf54d-9v5k4\" (UID: \"d88b8107-8f12-4fa1-aa21-eeaa65216c76\") " pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.598470 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.598431 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d88b8107-8f12-4fa1-aa21-eeaa65216c76-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6df96bf54d-9v5k4\" (UID: \"d88b8107-8f12-4fa1-aa21-eeaa65216c76\") " pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.600204 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.600184 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d88b8107-8f12-4fa1-aa21-eeaa65216c76-secret-telemeter-client\") pod \"telemeter-client-6df96bf54d-9v5k4\" (UID: \"d88b8107-8f12-4fa1-aa21-eeaa65216c76\") " pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.605475 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.605450 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ftzv4\" (UniqueName: \"kubernetes.io/projected/d88b8107-8f12-4fa1-aa21-eeaa65216c76-kube-api-access-ftzv4\") pod \"telemeter-client-6df96bf54d-9v5k4\" (UID: \"d88b8107-8f12-4fa1-aa21-eeaa65216c76\") " pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.762697 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.762649 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" Apr 22 17:56:20.893084 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:20.893058 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6df96bf54d-9v5k4"] Apr 22 17:56:20.895332 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:56:20.895296 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd88b8107_8f12_4fa1_aa21_eeaa65216c76.slice/crio-857ba00e81c3bab694ee2d71b8edc418e2a1279ecac00e55c0c3b839f5ff597d WatchSource:0}: Error finding container 857ba00e81c3bab694ee2d71b8edc418e2a1279ecac00e55c0c3b839f5ff597d: Status 404 returned error can't find the container with id 857ba00e81c3bab694ee2d71b8edc418e2a1279ecac00e55c0c3b839f5ff597d Apr 22 17:56:21.403879 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.403808 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" event={"ID":"e5bdfb2b-3526-43a6-a2d3-5b997408d869","Type":"ContainerStarted","Data":"4c57d8d51dc7f03e4303316f817d889088ad8a2616d92a13cbc42397ba95fc99"} Apr 22 17:56:21.403879 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.403873 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" event={"ID":"e5bdfb2b-3526-43a6-a2d3-5b997408d869","Type":"ContainerStarted","Data":"c51c033870780287ab9f8696f787575ad58e21de26b042c0852d7a5c2419a60b"} Apr 22 17:56:21.404389 ip-10-0-132-106 
kubenswrapper[2566]: I0422 17:56:21.403889 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" event={"ID":"e5bdfb2b-3526-43a6-a2d3-5b997408d869","Type":"ContainerStarted","Data":"232eb21b0111fb06231ba7c20b6dc21797f40bff0fa186466da39968e6e0c68e"} Apr 22 17:56:21.405143 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.405085 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" event={"ID":"d88b8107-8f12-4fa1-aa21-eeaa65216c76","Type":"ContainerStarted","Data":"857ba00e81c3bab694ee2d71b8edc418e2a1279ecac00e55c0c3b839f5ff597d"} Apr 22 17:56:21.408556 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.408527 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9","Type":"ContainerStarted","Data":"477368163e2f2831c2b5072ac473f5a021090bdb2d933eccb2ccb3424362199e"} Apr 22 17:56:21.408678 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.408566 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9","Type":"ContainerStarted","Data":"b5f33e3c59f7e59af6334aeac526e6b1aaf33224671d28ca22b282ed8ac83848"} Apr 22 17:56:21.408678 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.408581 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9","Type":"ContainerStarted","Data":"1e4684f2fa5966512ce843982234a5292157669d4b20c2117bae7f5ab0389a79"} Apr 22 17:56:21.408678 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.408594 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9","Type":"ContainerStarted","Data":"53273b6c71eb93a41a2553fd217f0f697044afe2457724e77778e16ef6ccc915"} Apr 22 17:56:21.408678 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.408606 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9","Type":"ContainerStarted","Data":"354d6a95f38062a52affb9c27f0ec3cbc7908664a9029a799f9d0f5ffb157265"} Apr 22 17:56:21.410749 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.410723 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hmcpt" event={"ID":"705dd2ce-2ac7-4745-a314-14e119a14624","Type":"ContainerStarted","Data":"7601c7ef9a340e8b27503b7071de858481d86b67fdb721dc9cefdc88ce46ab33"} Apr 22 17:56:21.410892 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.410758 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hmcpt" event={"ID":"705dd2ce-2ac7-4745-a314-14e119a14624","Type":"ContainerStarted","Data":"aa3e54fe06b2cb9acd9aa842aeccec5810b9e5389904d9090309e15bddf07abe"} Apr 22 17:56:21.410972 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.410893 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-hmcpt" Apr 22 17:56:21.412254 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.412227 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vcjl7" event={"ID":"b4f78a22-a4a4-4801-a800-a8983d8d8a1d","Type":"ContainerStarted","Data":"94f39603545c99008ba4cdf24d18b24b7232bdd4a829824d16c0137459f0ad67"} Apr 22 17:56:21.433459 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.433389 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hmcpt" podStartSLOduration=139.027219692 podStartE2EDuration="2m21.433369893s" podCreationTimestamp="2026-04-22 17:54:00 
+0000 UTC" firstStartedPulling="2026-04-22 17:56:17.996656825 +0000 UTC m=+170.796550023" lastFinishedPulling="2026-04-22 17:56:20.402807012 +0000 UTC m=+173.202700224" observedRunningTime="2026-04-22 17:56:21.4318872 +0000 UTC m=+174.231780419" watchObservedRunningTime="2026-04-22 17:56:21.433369893 +0000 UTC m=+174.233263114" Apr 22 17:56:21.535065 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.535029 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:56:21.538990 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.538965 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.542872 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.542824 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 17:56:21.542872 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.542849 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 17:56:21.543635 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.543578 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 17:56:21.544010 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.543989 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 17:56:21.544112 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.544034 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 17:56:21.544177 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.544159 2566 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 17:56:21.544278 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.544254 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5jq44qrsr09tb\"" Apr 22 17:56:21.544809 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.544790 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 17:56:21.544911 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.544806 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 17:56:21.544972 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.544919 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 17:56:21.545022 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.544817 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-j7s85\"" Apr 22 17:56:21.545132 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.544809 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 17:56:21.545132 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.545127 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 17:56:21.545228 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.545138 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 17:56:21.546327 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.546299 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 17:56:21.557783 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.557757 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:56:21.602219 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.602174 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.602387 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.602268 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-config\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.602387 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.602299 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnb48\" (UniqueName: \"kubernetes.io/projected/87284001-4dac-4179-9d8d-202143da5c90-kube-api-access-xnb48\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.602387 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.602352 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.602387 ip-10-0-132-106 kubenswrapper[2566]: I0422 
17:56:21.602380 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.602603 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.602440 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.602603 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.602465 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.602603 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.602488 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.602603 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.602522 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/87284001-4dac-4179-9d8d-202143da5c90-config-out\") 
pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.602603 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.602549 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.602603 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.602594 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.602964 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.602617 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.602964 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.602665 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.602964 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.602764 2566 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-web-config\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.602964 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.602835 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.602964 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.602929 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.603192 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.603023 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/87284001-4dac-4179-9d8d-202143da5c90-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.603192 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.603061 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/87284001-4dac-4179-9d8d-202143da5c90-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.703673 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.703628 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/87284001-4dac-4179-9d8d-202143da5c90-config-out\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.703811 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.703678 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.703811 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.703723 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.703811 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.703750 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.703982 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.703819 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.703982 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.703920 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-web-config\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.703982 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.703957 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.704133 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.703982 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.704133 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.704001 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/87284001-4dac-4179-9d8d-202143da5c90-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:56:21.704133 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.704016 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/87284001-4dac-4179-9d8d-202143da5c90-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.704133 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.704037 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.704133 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.704074 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-config\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.704133 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.704093 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnb48\" (UniqueName: \"kubernetes.io/projected/87284001-4dac-4179-9d8d-202143da5c90-kube-api-access-xnb48\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.704133 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.704121 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.704455 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.704147 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.704455 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.704184 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.704455 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.704202 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.704455 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.704217 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.706225 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.705290 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.706225 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.705605 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/87284001-4dac-4179-9d8d-202143da5c90-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.706225 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.705991 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.707273 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.706980 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.707755 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.707641 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.711023 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.710913 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.712714 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.712641 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.713381 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.713331 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-web-config\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.713498 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.713474 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.714185 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.713828 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.714185 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.713956 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.714185 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.714130 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.714472 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.714451 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/87284001-4dac-4179-9d8d-202143da5c90-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.714560 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.714537 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.714755 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.714734 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/87284001-4dac-4179-9d8d-202143da5c90-config-out\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.715078 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.714942 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-config\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.715921 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.715900 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.717526 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.716700 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnb48\" (UniqueName: \"kubernetes.io/projected/87284001-4dac-4179-9d8d-202143da5c90-kube-api-access-xnb48\") pod \"prometheus-k8s-0\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:21.851745 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:21.851705 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:22.419234 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:22.419151 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9","Type":"ContainerStarted","Data":"9ad7199fc857f0594926febf2705cb3f18550c8314d78521afbe2aa62f0ec55f"}
Apr 22 17:56:22.421773 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:22.421744 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" event={"ID":"e5bdfb2b-3526-43a6-a2d3-5b997408d869","Type":"ContainerStarted","Data":"95d9b696b14a9e0fd8c9e4d30b73770c831559609cb3329d86f4053714dfce14"}
Apr 22 17:56:22.421927 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:22.421782 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" event={"ID":"e5bdfb2b-3526-43a6-a2d3-5b997408d869","Type":"ContainerStarted","Data":"f53f37b04eca7022ccb6d097ab8bd18c5c8e32ba9446ffe7a2a4b24ca3d1e35e"}
Apr 22 17:56:22.445537 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:22.445482 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.648376134 podStartE2EDuration="6.44546612s" podCreationTimestamp="2026-04-22 17:56:16 +0000 UTC" firstStartedPulling="2026-04-22 17:56:16.813811232 +0000 UTC m=+169.613704435" lastFinishedPulling="2026-04-22 17:56:21.610901204 +0000 UTC m=+174.410794421" observedRunningTime="2026-04-22 17:56:22.444011252 +0000 UTC m=+175.243904472" watchObservedRunningTime="2026-04-22 17:56:22.44546612 +0000 UTC m=+175.245359340"
Apr 22 17:56:22.771691 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:22.771665 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 17:56:22.773925 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:56:22.773896 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87284001_4dac_4179_9d8d_202143da5c90.slice/crio-795ce6be877d739ad168d06b24ceaf1b3c85037b9b65e9614dff8a4ced02bdfd WatchSource:0}: Error finding container 795ce6be877d739ad168d06b24ceaf1b3c85037b9b65e9614dff8a4ced02bdfd: Status 404 returned error can't find the container with id 795ce6be877d739ad168d06b24ceaf1b3c85037b9b65e9614dff8a4ced02bdfd
Apr 22 17:56:23.426119 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:23.426085 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" event={"ID":"d88b8107-8f12-4fa1-aa21-eeaa65216c76","Type":"ContainerStarted","Data":"7c68c77773a27de26055fad66a73649303cccc3ca6a8cc639c053f5a4e683df0"}
Apr 22 17:56:23.426119 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:23.426125 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" event={"ID":"d88b8107-8f12-4fa1-aa21-eeaa65216c76","Type":"ContainerStarted","Data":"cd36e17a9e5b73d20b22f49bb3e94c1681f694836093494ff51848fdba2e3ad9"}
Apr 22 17:56:23.426654 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:23.426139 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" event={"ID":"d88b8107-8f12-4fa1-aa21-eeaa65216c76","Type":"ContainerStarted","Data":"2f07a983b21f7474115e163d6d37ba3f0d0b9b13c5d292f78939b4f58322291e"}
Apr 22 17:56:23.427530 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:23.427506 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vcjl7" event={"ID":"b4f78a22-a4a4-4801-a800-a8983d8d8a1d","Type":"ContainerStarted","Data":"78a2c9962ca8d71486e7862789a44f398771766be102ff6112c8291e5faa8eee"}
Apr 22 17:56:23.427680 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:23.427668 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vcjl7"
Apr 22 17:56:23.431443 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:23.431405 2566 generic.go:358] "Generic (PLEG): container finished" podID="87284001-4dac-4179-9d8d-202143da5c90" containerID="127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e" exitCode=0
Apr 22 17:56:23.431594 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:23.431449 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87284001-4dac-4179-9d8d-202143da5c90","Type":"ContainerDied","Data":"127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e"}
Apr 22 17:56:23.431594 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:23.431504 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87284001-4dac-4179-9d8d-202143da5c90","Type":"ContainerStarted","Data":"795ce6be877d739ad168d06b24ceaf1b3c85037b9b65e9614dff8a4ced02bdfd"}
Apr 22 17:56:23.438265 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:23.438242 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vcjl7"
Apr 22 17:56:23.439686 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:23.439662 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" event={"ID":"e5bdfb2b-3526-43a6-a2d3-5b997408d869","Type":"ContainerStarted","Data":"cee3fe1ed3296f5569fde2bb72e7882d04a19ceabff6bc6605c76ba1d2f2981b"}
Apr 22 17:56:23.451875 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:23.451804 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6df96bf54d-9v5k4" podStartSLOduration=1.725692577 podStartE2EDuration="3.451790925s" podCreationTimestamp="2026-04-22 17:56:20 +0000 UTC" firstStartedPulling="2026-04-22 17:56:20.89721921 +0000 UTC m=+173.697112408" lastFinishedPulling="2026-04-22 17:56:22.62331755 +0000 UTC m=+175.423210756" observedRunningTime="2026-04-22 17:56:23.449566501 +0000 UTC m=+176.249459735" watchObservedRunningTime="2026-04-22 17:56:23.451790925 +0000 UTC m=+176.251684144"
Apr 22 17:56:23.496934 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:23.496835 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv" podStartSLOduration=2.951327821 podStartE2EDuration="6.496820577s" podCreationTimestamp="2026-04-22 17:56:17 +0000 UTC" firstStartedPulling="2026-04-22 17:56:18.043976948 +0000 UTC m=+170.843870146" lastFinishedPulling="2026-04-22 17:56:21.589469695 +0000 UTC m=+174.389362902" observedRunningTime="2026-04-22 17:56:23.496149678 +0000 UTC m=+176.296042898" watchObservedRunningTime="2026-04-22 17:56:23.496820577 +0000 UTC m=+176.296713797"
Apr 22 17:56:23.512966 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:23.511376 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-vcjl7" podStartSLOduration=2.472947151 podStartE2EDuration="4.511355234s" podCreationTimestamp="2026-04-22 17:56:19 +0000 UTC" firstStartedPulling="2026-04-22 17:56:20.583774524 +0000 UTC m=+173.383667721" lastFinishedPulling="2026-04-22 17:56:22.622182591 +0000 UTC m=+175.422075804" observedRunningTime="2026-04-22 17:56:23.510886978 +0000 UTC m=+176.310780211" watchObservedRunningTime="2026-04-22 17:56:23.511355234 +0000 UTC m=+176.311248455"
Apr 22 17:56:24.443833 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:24.443794 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv"
Apr 22 17:56:25.453176 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:25.453146 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5bbd5cd8d-mc4cv"
Apr 22 17:56:26.453341 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:26.453276 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87284001-4dac-4179-9d8d-202143da5c90","Type":"ContainerStarted","Data":"c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd"}
Apr 22 17:56:26.453341 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:26.453315 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87284001-4dac-4179-9d8d-202143da5c90","Type":"ContainerStarted","Data":"e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49"}
Apr 22 17:56:26.453341 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:26.453329 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87284001-4dac-4179-9d8d-202143da5c90","Type":"ContainerStarted","Data":"232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129"}
Apr 22 17:56:26.453341 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:26.453343 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87284001-4dac-4179-9d8d-202143da5c90","Type":"ContainerStarted","Data":"552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea"}
Apr 22 17:56:26.453748 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:26.453358 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87284001-4dac-4179-9d8d-202143da5c90","Type":"ContainerStarted","Data":"6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c"}
Apr 22 17:56:26.453748 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:26.453372 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87284001-4dac-4179-9d8d-202143da5c90","Type":"ContainerStarted","Data":"9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f"}
Apr 22 17:56:26.486422 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:26.486375 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.854723883 podStartE2EDuration="5.486362022s" podCreationTimestamp="2026-04-22 17:56:21 +0000 UTC" firstStartedPulling="2026-04-22 17:56:23.433155063 +0000 UTC m=+176.233048278" lastFinishedPulling="2026-04-22 17:56:26.064793219 +0000 UTC m=+178.864686417" observedRunningTime="2026-04-22 17:56:26.484262944 +0000 UTC m=+179.284156163" watchObservedRunningTime="2026-04-22 17:56:26.486362022 +0000 UTC m=+179.286255242"
Apr 22 17:56:26.852838 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:26.852807 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:56:31.424263 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:31.424234 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hmcpt"
Apr 22 17:56:31.775195 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:31.775124 2566 scope.go:117] "RemoveContainer" containerID="e0d1b0aded456d63035abc99af7d8fb6d34f66764c22dcc6f651258a517c275d"
Apr 22 17:56:32.472403 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:32.472374 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log"
Apr 22 17:56:32.472731 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:32.472454 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j" event={"ID":"84a18cd9-ceac-4cc8-972b-92e3c17a262b","Type":"ContainerStarted","Data":"33c64296122c43c49339e9ffd4e77cfe16f2f4950daf2990db2e41b35891c463"}
Apr 22 17:56:32.472845 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:32.472814 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j"
Apr 22 17:56:32.491162 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:32.491119 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j" podStartSLOduration=57.325093279 podStartE2EDuration="59.491107685s" podCreationTimestamp="2026-04-22 17:55:33 +0000 UTC" firstStartedPulling="2026-04-22 17:55:34.081555343 +0000 UTC m=+126.881448541" lastFinishedPulling="2026-04-22 17:55:36.247569749 +0000 UTC m=+129.047462947" observedRunningTime="2026-04-22 17:56:32.488958589 +0000 UTC m=+185.288851811" watchObservedRunningTime="2026-04-22 17:56:32.491107685 +0000 UTC m=+185.291000904"
Apr 22 17:56:33.176795 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:33.176769 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-mrr8j"
Apr 22 17:56:57.543510 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:57.543479 2566 generic.go:358] "Generic (PLEG): container finished" podID="9af35975-ba53-433d-8c3b-454c55c4ffd7" containerID="d650d108f4859b06a704bf7058072dabcb734cab8d2a015f00a437b5b9c8aad8" exitCode=0
Apr 22 17:56:57.543904 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:57.543554 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-2fnvq" event={"ID":"9af35975-ba53-433d-8c3b-454c55c4ffd7","Type":"ContainerDied","Data":"d650d108f4859b06a704bf7058072dabcb734cab8d2a015f00a437b5b9c8aad8"}
Apr 22 17:56:57.543904 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:57.543840 2566 scope.go:117] "RemoveContainer" containerID="d650d108f4859b06a704bf7058072dabcb734cab8d2a015f00a437b5b9c8aad8"
Apr 22 17:56:58.548143 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:56:58.548111 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-2fnvq" event={"ID":"9af35975-ba53-433d-8c3b-454c55c4ffd7","Type":"ContainerStarted","Data":"b2f8deae26eaf421363ef10f930b23806f35ae50c3b338669765891bb8e9d4f8"}
Apr 22 17:57:21.852365 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:21.852330 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:21.867781 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:21.867754 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:22.636263 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:22.636236 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:35.673774 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:35.673740 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 17:57:35.674266 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:35.674197 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="alertmanager" containerID="cri-o://354d6a95f38062a52affb9c27f0ec3cbc7908664a9029a799f9d0f5ffb157265" gracePeriod=120
Apr 22 17:57:35.674341 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:35.674278 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="kube-rbac-proxy-metric" containerID="cri-o://477368163e2f2831c2b5072ac473f5a021090bdb2d933eccb2ccb3424362199e" gracePeriod=120
Apr 22 17:57:35.674341 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:35.674321 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="config-reloader" containerID="cri-o://53273b6c71eb93a41a2553fd217f0f697044afe2457724e77778e16ef6ccc915" gracePeriod=120
Apr 22 17:57:35.674521 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:35.674345 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="kube-rbac-proxy" containerID="cri-o://b5f33e3c59f7e59af6334aeac526e6b1aaf33224671d28ca22b282ed8ac83848" gracePeriod=120
Apr 22 17:57:35.674521 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:35.674368 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="prom-label-proxy" containerID="cri-o://9ad7199fc857f0594926febf2705cb3f18550c8314d78521afbe2aa62f0ec55f" gracePeriod=120
Apr 22 17:57:35.674521 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:35.674284 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="kube-rbac-proxy-web" containerID="cri-o://1e4684f2fa5966512ce843982234a5292157669d4b20c2117bae7f5ab0389a79" gracePeriod=120
Apr 22 17:57:36.665916 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:36.665848 2566 generic.go:358] "Generic (PLEG): container finished" podID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerID="9ad7199fc857f0594926febf2705cb3f18550c8314d78521afbe2aa62f0ec55f" exitCode=0
Apr 22 17:57:36.665916 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:36.665896 2566 generic.go:358] "Generic (PLEG): container finished" podID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerID="b5f33e3c59f7e59af6334aeac526e6b1aaf33224671d28ca22b282ed8ac83848" exitCode=0
Apr 22 17:57:36.665916 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:36.665906 2566 generic.go:358] "Generic (PLEG): container finished" podID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerID="53273b6c71eb93a41a2553fd217f0f697044afe2457724e77778e16ef6ccc915" exitCode=0
Apr 22 17:57:36.665916 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:36.665918 2566 generic.go:358] "Generic (PLEG): container finished" podID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerID="354d6a95f38062a52affb9c27f0ec3cbc7908664a9029a799f9d0f5ffb157265" exitCode=0
Apr 22 17:57:36.666180 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:36.665928 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9","Type":"ContainerDied","Data":"9ad7199fc857f0594926febf2705cb3f18550c8314d78521afbe2aa62f0ec55f"}
Apr 22 17:57:36.666180 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:36.665961 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9","Type":"ContainerDied","Data":"b5f33e3c59f7e59af6334aeac526e6b1aaf33224671d28ca22b282ed8ac83848"}
Apr 22 17:57:36.666180 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:36.665970 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9","Type":"ContainerDied","Data":"53273b6c71eb93a41a2553fd217f0f697044afe2457724e77778e16ef6ccc915"}
Apr 22 17:57:36.666180 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:36.665979 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9","Type":"ContainerDied","Data":"354d6a95f38062a52affb9c27f0ec3cbc7908664a9029a799f9d0f5ffb157265"}
Apr 22 17:57:36.913522 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:36.913500 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.010520 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.010436 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-tls-assets\") pod \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") "
Apr 22 17:57:37.010520 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.010476 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-cluster-tls-config\") pod \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") "
Apr 22 17:57:37.010520 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.010504 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-secret-alertmanager-main-tls\") pod \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") "
Apr 22 17:57:37.010789 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.010524 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnzng\" (UniqueName: \"kubernetes.io/projected/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-kube-api-access-nnzng\") pod \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") "
Apr 22 17:57:37.010789 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.010552 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-web-config\") pod \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") "
Apr 22 17:57:37.010789 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.010568 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") "
Apr 22 17:57:37.010789 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.010591 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-secret-alertmanager-kube-rbac-proxy\") pod \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") "
Apr 22 17:57:37.011080 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.011057 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-config-volume\") pod \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") "
Apr 22 17:57:37.011141 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.011111 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-alertmanager-trusted-ca-bundle\") pod \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") "
Apr 22 17:57:37.011196 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.011162 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-metrics-client-ca\") pod \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") "
Apr 22 17:57:37.011254 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.011238 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-alertmanager-main-db\") pod \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") "
Apr 22 17:57:37.011308 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.011267 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-config-out\") pod \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") "
Apr 22 17:57:37.011308 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.011301 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-secret-alertmanager-kube-rbac-proxy-web\") pod \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\" (UID: \"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9\") "
Apr 22 17:57:37.012187 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.011780 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" (UID: "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:57:37.012187 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.011950 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" (UID: "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 17:57:37.012187 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.012051 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" (UID: "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 17:57:37.012187 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.012084 2566 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-alertmanager-main-db\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\""
Apr 22 17:57:37.012187 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.012103 2566 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-metrics-client-ca\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\""
Apr 22 17:57:37.014024 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.013986 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-kube-api-access-nnzng" (OuterVolumeSpecName: "kube-api-access-nnzng") pod "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" (UID: "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9"). InnerVolumeSpecName "kube-api-access-nnzng". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:57:37.014256 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.014225 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" (UID: "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 17:57:37.014338 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.014275 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" (UID: "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 17:57:37.014576 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.014551 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" (UID: "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9"). InnerVolumeSpecName "secret-alertmanager-main-tls".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:57:37.014576 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.014569 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" (UID: "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:57:37.014738 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.014584 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" (UID: "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:57:37.015066 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.015046 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-config-volume" (OuterVolumeSpecName: "config-volume") pod "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" (UID: "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:57:37.015234 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.015211 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-config-out" (OuterVolumeSpecName: "config-out") pod "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" (UID: "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:57:37.018289 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.018182 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" (UID: "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:57:37.024257 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.024235 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-web-config" (OuterVolumeSpecName: "web-config") pod "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" (UID: "2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:57:37.112609 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.112578 2566 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-secret-alertmanager-main-tls\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:37.112609 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.112606 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nnzng\" (UniqueName: \"kubernetes.io/projected/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-kube-api-access-nnzng\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:37.112776 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.112621 2566 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-web-config\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:37.112776 ip-10-0-132-106 kubenswrapper[2566]: I0422 
17:57:37.112635 2566 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:37.112776 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.112650 2566 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:37.112776 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.112664 2566 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-config-volume\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:37.112776 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.112676 2566 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:37.112776 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.112688 2566 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-config-out\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:37.112776 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.112701 2566 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:37.112776 
ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.112714 2566 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-tls-assets\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:37.112776 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.112727 2566 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9-cluster-tls-config\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:37.671930 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.671894 2566 generic.go:358] "Generic (PLEG): container finished" podID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerID="477368163e2f2831c2b5072ac473f5a021090bdb2d933eccb2ccb3424362199e" exitCode=0 Apr 22 17:57:37.671930 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.671919 2566 generic.go:358] "Generic (PLEG): container finished" podID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerID="1e4684f2fa5966512ce843982234a5292157669d4b20c2117bae7f5ab0389a79" exitCode=0 Apr 22 17:57:37.672130 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.671954 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9","Type":"ContainerDied","Data":"477368163e2f2831c2b5072ac473f5a021090bdb2d933eccb2ccb3424362199e"} Apr 22 17:57:37.672130 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.671995 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9","Type":"ContainerDied","Data":"1e4684f2fa5966512ce843982234a5292157669d4b20c2117bae7f5ab0389a79"} Apr 22 17:57:37.672130 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.672007 2566 scope.go:117] "RemoveContainer" 
containerID="9ad7199fc857f0594926febf2705cb3f18550c8314d78521afbe2aa62f0ec55f" Apr 22 17:57:37.672130 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.672007 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9","Type":"ContainerDied","Data":"f6a03cb4ec03c663546bec68ac956a01c68f711d033221f39684f9d06d0a188a"} Apr 22 17:57:37.672130 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.671994 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 17:57:37.681783 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.681548 2566 scope.go:117] "RemoveContainer" containerID="477368163e2f2831c2b5072ac473f5a021090bdb2d933eccb2ccb3424362199e" Apr 22 17:57:37.688893 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.688877 2566 scope.go:117] "RemoveContainer" containerID="b5f33e3c59f7e59af6334aeac526e6b1aaf33224671d28ca22b282ed8ac83848" Apr 22 17:57:37.695698 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.695684 2566 scope.go:117] "RemoveContainer" containerID="1e4684f2fa5966512ce843982234a5292157669d4b20c2117bae7f5ab0389a79" Apr 22 17:57:37.699877 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.699836 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:57:37.702945 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.702927 2566 scope.go:117] "RemoveContainer" containerID="53273b6c71eb93a41a2553fd217f0f697044afe2457724e77778e16ef6ccc915" Apr 22 17:57:37.703610 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.703588 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:57:37.709243 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.709225 2566 scope.go:117] "RemoveContainer" containerID="354d6a95f38062a52affb9c27f0ec3cbc7908664a9029a799f9d0f5ffb157265" Apr 22 
17:57:37.720067 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.720011 2566 scope.go:117] "RemoveContainer" containerID="023d05de644f2b0aede566f62cdf712596ac7b545b8500ef9dfd85635321f48d" Apr 22 17:57:37.727890 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.727873 2566 scope.go:117] "RemoveContainer" containerID="9ad7199fc857f0594926febf2705cb3f18550c8314d78521afbe2aa62f0ec55f" Apr 22 17:57:37.728118 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:57:37.728100 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ad7199fc857f0594926febf2705cb3f18550c8314d78521afbe2aa62f0ec55f\": container with ID starting with 9ad7199fc857f0594926febf2705cb3f18550c8314d78521afbe2aa62f0ec55f not found: ID does not exist" containerID="9ad7199fc857f0594926febf2705cb3f18550c8314d78521afbe2aa62f0ec55f" Apr 22 17:57:37.728159 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.728127 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ad7199fc857f0594926febf2705cb3f18550c8314d78521afbe2aa62f0ec55f"} err="failed to get container status \"9ad7199fc857f0594926febf2705cb3f18550c8314d78521afbe2aa62f0ec55f\": rpc error: code = NotFound desc = could not find container \"9ad7199fc857f0594926febf2705cb3f18550c8314d78521afbe2aa62f0ec55f\": container with ID starting with 9ad7199fc857f0594926febf2705cb3f18550c8314d78521afbe2aa62f0ec55f not found: ID does not exist" Apr 22 17:57:37.728199 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.728161 2566 scope.go:117] "RemoveContainer" containerID="477368163e2f2831c2b5072ac473f5a021090bdb2d933eccb2ccb3424362199e" Apr 22 17:57:37.728415 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:57:37.728386 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"477368163e2f2831c2b5072ac473f5a021090bdb2d933eccb2ccb3424362199e\": container with ID starting with 
477368163e2f2831c2b5072ac473f5a021090bdb2d933eccb2ccb3424362199e not found: ID does not exist" containerID="477368163e2f2831c2b5072ac473f5a021090bdb2d933eccb2ccb3424362199e" Apr 22 17:57:37.728495 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.728412 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"477368163e2f2831c2b5072ac473f5a021090bdb2d933eccb2ccb3424362199e"} err="failed to get container status \"477368163e2f2831c2b5072ac473f5a021090bdb2d933eccb2ccb3424362199e\": rpc error: code = NotFound desc = could not find container \"477368163e2f2831c2b5072ac473f5a021090bdb2d933eccb2ccb3424362199e\": container with ID starting with 477368163e2f2831c2b5072ac473f5a021090bdb2d933eccb2ccb3424362199e not found: ID does not exist" Apr 22 17:57:37.728495 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.728428 2566 scope.go:117] "RemoveContainer" containerID="b5f33e3c59f7e59af6334aeac526e6b1aaf33224671d28ca22b282ed8ac83848" Apr 22 17:57:37.728663 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:57:37.728644 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5f33e3c59f7e59af6334aeac526e6b1aaf33224671d28ca22b282ed8ac83848\": container with ID starting with b5f33e3c59f7e59af6334aeac526e6b1aaf33224671d28ca22b282ed8ac83848 not found: ID does not exist" containerID="b5f33e3c59f7e59af6334aeac526e6b1aaf33224671d28ca22b282ed8ac83848" Apr 22 17:57:37.728708 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.728668 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5f33e3c59f7e59af6334aeac526e6b1aaf33224671d28ca22b282ed8ac83848"} err="failed to get container status \"b5f33e3c59f7e59af6334aeac526e6b1aaf33224671d28ca22b282ed8ac83848\": rpc error: code = NotFound desc = could not find container \"b5f33e3c59f7e59af6334aeac526e6b1aaf33224671d28ca22b282ed8ac83848\": container with ID starting with 
b5f33e3c59f7e59af6334aeac526e6b1aaf33224671d28ca22b282ed8ac83848 not found: ID does not exist" Apr 22 17:57:37.728708 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.728685 2566 scope.go:117] "RemoveContainer" containerID="1e4684f2fa5966512ce843982234a5292157669d4b20c2117bae7f5ab0389a79" Apr 22 17:57:37.728929 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:57:37.728911 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e4684f2fa5966512ce843982234a5292157669d4b20c2117bae7f5ab0389a79\": container with ID starting with 1e4684f2fa5966512ce843982234a5292157669d4b20c2117bae7f5ab0389a79 not found: ID does not exist" containerID="1e4684f2fa5966512ce843982234a5292157669d4b20c2117bae7f5ab0389a79" Apr 22 17:57:37.729021 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.728931 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e4684f2fa5966512ce843982234a5292157669d4b20c2117bae7f5ab0389a79"} err="failed to get container status \"1e4684f2fa5966512ce843982234a5292157669d4b20c2117bae7f5ab0389a79\": rpc error: code = NotFound desc = could not find container \"1e4684f2fa5966512ce843982234a5292157669d4b20c2117bae7f5ab0389a79\": container with ID starting with 1e4684f2fa5966512ce843982234a5292157669d4b20c2117bae7f5ab0389a79 not found: ID does not exist" Apr 22 17:57:37.729021 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.728945 2566 scope.go:117] "RemoveContainer" containerID="53273b6c71eb93a41a2553fd217f0f697044afe2457724e77778e16ef6ccc915" Apr 22 17:57:37.729215 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:57:37.729197 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53273b6c71eb93a41a2553fd217f0f697044afe2457724e77778e16ef6ccc915\": container with ID starting with 53273b6c71eb93a41a2553fd217f0f697044afe2457724e77778e16ef6ccc915 not found: ID does not exist" 
containerID="53273b6c71eb93a41a2553fd217f0f697044afe2457724e77778e16ef6ccc915" Apr 22 17:57:37.729295 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.729221 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53273b6c71eb93a41a2553fd217f0f697044afe2457724e77778e16ef6ccc915"} err="failed to get container status \"53273b6c71eb93a41a2553fd217f0f697044afe2457724e77778e16ef6ccc915\": rpc error: code = NotFound desc = could not find container \"53273b6c71eb93a41a2553fd217f0f697044afe2457724e77778e16ef6ccc915\": container with ID starting with 53273b6c71eb93a41a2553fd217f0f697044afe2457724e77778e16ef6ccc915 not found: ID does not exist" Apr 22 17:57:37.729295 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.729241 2566 scope.go:117] "RemoveContainer" containerID="354d6a95f38062a52affb9c27f0ec3cbc7908664a9029a799f9d0f5ffb157265" Apr 22 17:57:37.729650 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:57:37.729549 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"354d6a95f38062a52affb9c27f0ec3cbc7908664a9029a799f9d0f5ffb157265\": container with ID starting with 354d6a95f38062a52affb9c27f0ec3cbc7908664a9029a799f9d0f5ffb157265 not found: ID does not exist" containerID="354d6a95f38062a52affb9c27f0ec3cbc7908664a9029a799f9d0f5ffb157265" Apr 22 17:57:37.729650 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.729597 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"354d6a95f38062a52affb9c27f0ec3cbc7908664a9029a799f9d0f5ffb157265"} err="failed to get container status \"354d6a95f38062a52affb9c27f0ec3cbc7908664a9029a799f9d0f5ffb157265\": rpc error: code = NotFound desc = could not find container \"354d6a95f38062a52affb9c27f0ec3cbc7908664a9029a799f9d0f5ffb157265\": container with ID starting with 354d6a95f38062a52affb9c27f0ec3cbc7908664a9029a799f9d0f5ffb157265 not found: ID does not exist" Apr 22 
17:57:37.729650 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.729618 2566 scope.go:117] "RemoveContainer" containerID="023d05de644f2b0aede566f62cdf712596ac7b545b8500ef9dfd85635321f48d" Apr 22 17:57:37.729919 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:57:37.729874 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"023d05de644f2b0aede566f62cdf712596ac7b545b8500ef9dfd85635321f48d\": container with ID starting with 023d05de644f2b0aede566f62cdf712596ac7b545b8500ef9dfd85635321f48d not found: ID does not exist" containerID="023d05de644f2b0aede566f62cdf712596ac7b545b8500ef9dfd85635321f48d" Apr 22 17:57:37.729995 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.729927 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"023d05de644f2b0aede566f62cdf712596ac7b545b8500ef9dfd85635321f48d"} err="failed to get container status \"023d05de644f2b0aede566f62cdf712596ac7b545b8500ef9dfd85635321f48d\": rpc error: code = NotFound desc = could not find container \"023d05de644f2b0aede566f62cdf712596ac7b545b8500ef9dfd85635321f48d\": container with ID starting with 023d05de644f2b0aede566f62cdf712596ac7b545b8500ef9dfd85635321f48d not found: ID does not exist" Apr 22 17:57:37.729995 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.729950 2566 scope.go:117] "RemoveContainer" containerID="9ad7199fc857f0594926febf2705cb3f18550c8314d78521afbe2aa62f0ec55f" Apr 22 17:57:37.730209 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.730186 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ad7199fc857f0594926febf2705cb3f18550c8314d78521afbe2aa62f0ec55f"} err="failed to get container status \"9ad7199fc857f0594926febf2705cb3f18550c8314d78521afbe2aa62f0ec55f\": rpc error: code = NotFound desc = could not find container \"9ad7199fc857f0594926febf2705cb3f18550c8314d78521afbe2aa62f0ec55f\": container with ID starting 
with 9ad7199fc857f0594926febf2705cb3f18550c8314d78521afbe2aa62f0ec55f not found: ID does not exist" Apr 22 17:57:37.730209 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.730207 2566 scope.go:117] "RemoveContainer" containerID="477368163e2f2831c2b5072ac473f5a021090bdb2d933eccb2ccb3424362199e" Apr 22 17:57:37.730428 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.730413 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"477368163e2f2831c2b5072ac473f5a021090bdb2d933eccb2ccb3424362199e"} err="failed to get container status \"477368163e2f2831c2b5072ac473f5a021090bdb2d933eccb2ccb3424362199e\": rpc error: code = NotFound desc = could not find container \"477368163e2f2831c2b5072ac473f5a021090bdb2d933eccb2ccb3424362199e\": container with ID starting with 477368163e2f2831c2b5072ac473f5a021090bdb2d933eccb2ccb3424362199e not found: ID does not exist" Apr 22 17:57:37.730428 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.730427 2566 scope.go:117] "RemoveContainer" containerID="b5f33e3c59f7e59af6334aeac526e6b1aaf33224671d28ca22b282ed8ac83848" Apr 22 17:57:37.730653 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.730634 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5f33e3c59f7e59af6334aeac526e6b1aaf33224671d28ca22b282ed8ac83848"} err="failed to get container status \"b5f33e3c59f7e59af6334aeac526e6b1aaf33224671d28ca22b282ed8ac83848\": rpc error: code = NotFound desc = could not find container \"b5f33e3c59f7e59af6334aeac526e6b1aaf33224671d28ca22b282ed8ac83848\": container with ID starting with b5f33e3c59f7e59af6334aeac526e6b1aaf33224671d28ca22b282ed8ac83848 not found: ID does not exist" Apr 22 17:57:37.730723 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.730656 2566 scope.go:117] "RemoveContainer" containerID="1e4684f2fa5966512ce843982234a5292157669d4b20c2117bae7f5ab0389a79" Apr 22 17:57:37.730723 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.730682 
2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 17:57:37.730946 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.730922 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e4684f2fa5966512ce843982234a5292157669d4b20c2117bae7f5ab0389a79"} err="failed to get container status \"1e4684f2fa5966512ce843982234a5292157669d4b20c2117bae7f5ab0389a79\": rpc error: code = NotFound desc = could not find container \"1e4684f2fa5966512ce843982234a5292157669d4b20c2117bae7f5ab0389a79\": container with ID starting with 1e4684f2fa5966512ce843982234a5292157669d4b20c2117bae7f5ab0389a79 not found: ID does not exist" Apr 22 17:57:37.730993 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.730952 2566 scope.go:117] "RemoveContainer" containerID="53273b6c71eb93a41a2553fd217f0f697044afe2457724e77778e16ef6ccc915" Apr 22 17:57:37.731210 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.731190 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53273b6c71eb93a41a2553fd217f0f697044afe2457724e77778e16ef6ccc915"} err="failed to get container status \"53273b6c71eb93a41a2553fd217f0f697044afe2457724e77778e16ef6ccc915\": rpc error: code = NotFound desc = could not find container \"53273b6c71eb93a41a2553fd217f0f697044afe2457724e77778e16ef6ccc915\": container with ID starting with 53273b6c71eb93a41a2553fd217f0f697044afe2457724e77778e16ef6ccc915 not found: ID does not exist" Apr 22 17:57:37.731255 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.731211 2566 scope.go:117] "RemoveContainer" containerID="354d6a95f38062a52affb9c27f0ec3cbc7908664a9029a799f9d0f5ffb157265" Apr 22 17:57:37.731316 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.731289 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="alertmanager" Apr 22 17:57:37.731359 ip-10-0-132-106 
kubenswrapper[2566]: I0422 17:57:37.731325 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="alertmanager" Apr 22 17:57:37.731359 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.731349 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="kube-rbac-proxy-metric" Apr 22 17:57:37.731437 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.731369 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="kube-rbac-proxy-metric" Apr 22 17:57:37.731437 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.731399 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="kube-rbac-proxy" Apr 22 17:57:37.731437 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.731407 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="kube-rbac-proxy" Apr 22 17:57:37.731528 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.731441 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="kube-rbac-proxy-web" Apr 22 17:57:37.731528 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.731458 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="kube-rbac-proxy-web" Apr 22 17:57:37.731528 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.731472 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="prom-label-proxy" Apr 22 17:57:37.731528 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.731474 2566 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"354d6a95f38062a52affb9c27f0ec3cbc7908664a9029a799f9d0f5ffb157265"} err="failed to get container status \"354d6a95f38062a52affb9c27f0ec3cbc7908664a9029a799f9d0f5ffb157265\": rpc error: code = NotFound desc = could not find container \"354d6a95f38062a52affb9c27f0ec3cbc7908664a9029a799f9d0f5ffb157265\": container with ID starting with 354d6a95f38062a52affb9c27f0ec3cbc7908664a9029a799f9d0f5ffb157265 not found: ID does not exist"
Apr 22 17:57:37.731528 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.731505 2566 scope.go:117] "RemoveContainer" containerID="023d05de644f2b0aede566f62cdf712596ac7b545b8500ef9dfd85635321f48d"
Apr 22 17:57:37.731695 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.731480 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="prom-label-proxy"
Apr 22 17:57:37.731695 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.731601 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="init-config-reloader"
Apr 22 17:57:37.731695 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.731621 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="init-config-reloader"
Apr 22 17:57:37.731695 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.731641 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="config-reloader"
Apr 22 17:57:37.731695 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.731655 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="config-reloader"
Apr 22 17:57:37.731845 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.731761 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"023d05de644f2b0aede566f62cdf712596ac7b545b8500ef9dfd85635321f48d"} err="failed to get container status \"023d05de644f2b0aede566f62cdf712596ac7b545b8500ef9dfd85635321f48d\": rpc error: code = NotFound desc = could not find container \"023d05de644f2b0aede566f62cdf712596ac7b545b8500ef9dfd85635321f48d\": container with ID starting with 023d05de644f2b0aede566f62cdf712596ac7b545b8500ef9dfd85635321f48d not found: ID does not exist"
Apr 22 17:57:37.731845 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.731780 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="config-reloader"
Apr 22 17:57:37.731845 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.731791 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="kube-rbac-proxy"
Apr 22 17:57:37.731845 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.731797 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="kube-rbac-proxy-metric"
Apr 22 17:57:37.731845 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.731803 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="prom-label-proxy"
Apr 22 17:57:37.731845 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.731810 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="alertmanager"
Apr 22 17:57:37.731845 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.731816 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" containerName="kube-rbac-proxy-web"
Apr 22 17:57:37.736937 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.736920 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.739546 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.739514 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 22 17:57:37.739546 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.739529 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 22 17:57:37.739701 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.739532 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 22 17:57:37.739701 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.739606 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-gv4mz\""
Apr 22 17:57:37.739701 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.739637 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 22 17:57:37.739815 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.739797 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 22 17:57:37.740013 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.740000 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 22 17:57:37.740088 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.740015 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 22 17:57:37.740139 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.740019 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 22 17:57:37.744688 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.744620 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 22 17:57:37.747717 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.747697 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 17:57:37.781621 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.781595 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9" path="/var/lib/kubelet/pods/2f0c3d42-3428-4a03-a4c3-5419b2d9e7a9/volumes"
Apr 22 17:57:37.821355 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.821336 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/95086d3d-8975-4a72-a006-6c8106d580ae-config-out\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.821467 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.821362 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/95086d3d-8975-4a72-a006-6c8106d580ae-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.821467 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.821385 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/95086d3d-8975-4a72-a006-6c8106d580ae-tls-assets\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.821467 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.821404 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/95086d3d-8975-4a72-a006-6c8106d580ae-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.821578 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.821468 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/95086d3d-8975-4a72-a006-6c8106d580ae-web-config\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.821578 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.821532 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/95086d3d-8975-4a72-a006-6c8106d580ae-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.821578 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.821552 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/95086d3d-8975-4a72-a006-6c8106d580ae-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.821714 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.821634 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/95086d3d-8975-4a72-a006-6c8106d580ae-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.821714 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.821693 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/95086d3d-8975-4a72-a006-6c8106d580ae-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.821818 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.821738 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt4p5\" (UniqueName: \"kubernetes.io/projected/95086d3d-8975-4a72-a006-6c8106d580ae-kube-api-access-vt4p5\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.821818 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.821796 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/95086d3d-8975-4a72-a006-6c8106d580ae-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.821949 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.821827 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/95086d3d-8975-4a72-a006-6c8106d580ae-config-volume\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.821949 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.821892 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95086d3d-8975-4a72-a006-6c8106d580ae-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.923113 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.923052 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/95086d3d-8975-4a72-a006-6c8106d580ae-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.923484 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.923112 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/95086d3d-8975-4a72-a006-6c8106d580ae-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.923484 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.923143 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vt4p5\" (UniqueName: \"kubernetes.io/projected/95086d3d-8975-4a72-a006-6c8106d580ae-kube-api-access-vt4p5\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.923484 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.923181 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/95086d3d-8975-4a72-a006-6c8106d580ae-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.923484 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.923213 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/95086d3d-8975-4a72-a006-6c8106d580ae-config-volume\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.923484 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.923248 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95086d3d-8975-4a72-a006-6c8106d580ae-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.923484 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.923269 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/95086d3d-8975-4a72-a006-6c8106d580ae-config-out\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.923484 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.923287 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/95086d3d-8975-4a72-a006-6c8106d580ae-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.923484 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.923461 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/95086d3d-8975-4a72-a006-6c8106d580ae-tls-assets\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.923902 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.923501 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/95086d3d-8975-4a72-a006-6c8106d580ae-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.923902 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.923535 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/95086d3d-8975-4a72-a006-6c8106d580ae-web-config\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.923902 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.923576 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/95086d3d-8975-4a72-a006-6c8106d580ae-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.923902 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.923604 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/95086d3d-8975-4a72-a006-6c8106d580ae-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.924103 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.923962 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/95086d3d-8975-4a72-a006-6c8106d580ae-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.924267 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.924241 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95086d3d-8975-4a72-a006-6c8106d580ae-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.926151 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.926124 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/95086d3d-8975-4a72-a006-6c8106d580ae-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.926252 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.926211 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/95086d3d-8975-4a72-a006-6c8106d580ae-config-volume\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.926252 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.926225 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/95086d3d-8975-4a72-a006-6c8106d580ae-config-out\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.926391 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.926369 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/95086d3d-8975-4a72-a006-6c8106d580ae-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.926530 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.926506 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/95086d3d-8975-4a72-a006-6c8106d580ae-tls-assets\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.926795 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.926761 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/95086d3d-8975-4a72-a006-6c8106d580ae-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.926948 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.926924 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/95086d3d-8975-4a72-a006-6c8106d580ae-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.927028 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.926993 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/95086d3d-8975-4a72-a006-6c8106d580ae-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.927145 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.927128 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/95086d3d-8975-4a72-a006-6c8106d580ae-web-config\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.928123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.928100 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/95086d3d-8975-4a72-a006-6c8106d580ae-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:37.931351 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:37.931332 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt4p5\" (UniqueName: \"kubernetes.io/projected/95086d3d-8975-4a72-a006-6c8106d580ae-kube-api-access-vt4p5\") pod \"alertmanager-main-0\" (UID: \"95086d3d-8975-4a72-a006-6c8106d580ae\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:38.047704 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:38.047686 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 22 17:57:38.169809 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:38.169701 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 22 17:57:38.172446 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:57:38.172418 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95086d3d_8975_4a72_a006_6c8106d580ae.slice/crio-dd4ecf2fa997d2eac10ca585c99c5c05ab44450fad318db47eff5131bfb7e4da WatchSource:0}: Error finding container dd4ecf2fa997d2eac10ca585c99c5c05ab44450fad318db47eff5131bfb7e4da: Status 404 returned error can't find the container with id dd4ecf2fa997d2eac10ca585c99c5c05ab44450fad318db47eff5131bfb7e4da
Apr 22 17:57:38.529771 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:38.529683 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs\") pod \"network-metrics-daemon-dv96w\" (UID: \"92650e2d-54ea-4904-8ee5-235164ed2949\") " pod="openshift-multus/network-metrics-daemon-dv96w"
Apr 22 17:57:38.531841 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:38.531813 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/92650e2d-54ea-4904-8ee5-235164ed2949-metrics-certs\") pod \"network-metrics-daemon-dv96w\" (UID: \"92650e2d-54ea-4904-8ee5-235164ed2949\") " pod="openshift-multus/network-metrics-daemon-dv96w"
Apr 22 17:57:38.578723 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:38.578697 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qsx25\""
Apr 22 17:57:38.586914 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:38.586894 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dv96w"
Apr 22 17:57:38.677769 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:38.677736 2566 generic.go:358] "Generic (PLEG): container finished" podID="95086d3d-8975-4a72-a006-6c8106d580ae" containerID="956936e3820ce40712f8bf9d09e31637d3ade066329453e263eb082560e9695b" exitCode=0
Apr 22 17:57:38.677946 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:38.677775 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"95086d3d-8975-4a72-a006-6c8106d580ae","Type":"ContainerDied","Data":"956936e3820ce40712f8bf9d09e31637d3ade066329453e263eb082560e9695b"}
Apr 22 17:57:38.677946 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:38.677831 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"95086d3d-8975-4a72-a006-6c8106d580ae","Type":"ContainerStarted","Data":"dd4ecf2fa997d2eac10ca585c99c5c05ab44450fad318db47eff5131bfb7e4da"}
Apr 22 17:57:38.699527 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:38.699464 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dv96w"]
Apr 22 17:57:38.701745 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:57:38.701723 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92650e2d_54ea_4904_8ee5_235164ed2949.slice/crio-94782d0f80b7d5a02f5eeb2d0914799e3dbf5986035b488d79cfd338235743d3 WatchSource:0}: Error finding container 94782d0f80b7d5a02f5eeb2d0914799e3dbf5986035b488d79cfd338235743d3: Status 404 returned error can't find the container with id 94782d0f80b7d5a02f5eeb2d0914799e3dbf5986035b488d79cfd338235743d3
Apr 22 17:57:39.685427 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:39.685394 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"95086d3d-8975-4a72-a006-6c8106d580ae","Type":"ContainerStarted","Data":"ed1e6dc4c4929e8f313ab3b503504ce8bbd0914e10df27f9202d53cf30878da9"}
Apr 22 17:57:39.685809 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:39.685437 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"95086d3d-8975-4a72-a006-6c8106d580ae","Type":"ContainerStarted","Data":"a377d7a0586852fbd490bb1e6f2f8fc941e2225ca5563566f3edad6bf9d9f7ef"}
Apr 22 17:57:39.685809 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:39.685451 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"95086d3d-8975-4a72-a006-6c8106d580ae","Type":"ContainerStarted","Data":"3f91a3582c319f0d4906c712318c7b5dd536a7b7f84974c69c2650232f4ca4af"}
Apr 22 17:57:39.685809 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:39.685460 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"95086d3d-8975-4a72-a006-6c8106d580ae","Type":"ContainerStarted","Data":"45ec4e9a31fab2bf3fbf43eaabd68abac5038b6709400d23cdef6194dcbc983e"}
Apr 22 17:57:39.685809 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:39.685470 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"95086d3d-8975-4a72-a006-6c8106d580ae","Type":"ContainerStarted","Data":"439ede938250c5ef3255f905ba0c47cf7f414bdbdef805bf8bc38677155d199b"}
Apr 22 17:57:39.685809 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:39.685482 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"95086d3d-8975-4a72-a006-6c8106d580ae","Type":"ContainerStarted","Data":"915761c94e7e167be278ffd10b3856bcbc4425499af26e6d831369cc8f961826"}
Apr 22 17:57:39.686599 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:39.686576 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dv96w" event={"ID":"92650e2d-54ea-4904-8ee5-235164ed2949","Type":"ContainerStarted","Data":"94782d0f80b7d5a02f5eeb2d0914799e3dbf5986035b488d79cfd338235743d3"}
Apr 22 17:57:39.746516 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:39.746456 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.746432731 podStartE2EDuration="2.746432731s" podCreationTimestamp="2026-04-22 17:57:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:57:39.745064915 +0000 UTC m=+252.544958135" watchObservedRunningTime="2026-04-22 17:57:39.746432731 +0000 UTC m=+252.546325951"
Apr 22 17:57:40.023510 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.023434 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 17:57:40.024062 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.023888 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="prometheus" containerID="cri-o://9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f" gracePeriod=600
Apr 22 17:57:40.024062 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.023920 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="kube-rbac-proxy" containerID="cri-o://e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49" gracePeriod=600
Apr 22 17:57:40.024062 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.023937 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="kube-rbac-proxy-thanos" containerID="cri-o://c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd" gracePeriod=600
Apr 22 17:57:40.024062 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.023962 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="thanos-sidecar" containerID="cri-o://552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea" gracePeriod=600
Apr 22 17:57:40.024062 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.023965 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="kube-rbac-proxy-web" containerID="cri-o://232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129" gracePeriod=600
Apr 22 17:57:40.024062 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.023983 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="config-reloader" containerID="cri-o://6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c" gracePeriod=600
Apr 22 17:57:40.252502 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.252481 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.347438 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.347370 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-metrics-client-certs\") pod \"87284001-4dac-4179-9d8d-202143da5c90\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") "
Apr 22 17:57:40.347438 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.347403 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"87284001-4dac-4179-9d8d-202143da5c90\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") "
Apr 22 17:57:40.347438 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.347423 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-prometheus-k8s-tls\") pod \"87284001-4dac-4179-9d8d-202143da5c90\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") "
Apr 22 17:57:40.347438 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.347438 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-grpc-tls\") pod \"87284001-4dac-4179-9d8d-202143da5c90\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") "
Apr 22 17:57:40.347735 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.347456 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-prometheus-k8s-rulefiles-0\") pod \"87284001-4dac-4179-9d8d-202143da5c90\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") "
Apr 22 17:57:40.347735 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.347529 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/87284001-4dac-4179-9d8d-202143da5c90-prometheus-k8s-db\") pod \"87284001-4dac-4179-9d8d-202143da5c90\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") "
Apr 22 17:57:40.347735 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.347575 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-prometheus-trusted-ca-bundle\") pod \"87284001-4dac-4179-9d8d-202143da5c90\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") "
Apr 22 17:57:40.347735 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.347634 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-config\") pod \"87284001-4dac-4179-9d8d-202143da5c90\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") "
Apr 22 17:57:40.347735 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.347664 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-configmap-metrics-client-ca\") pod \"87284001-4dac-4179-9d8d-202143da5c90\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") "
Apr 22 17:57:40.347735 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.347695 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-configmap-serving-certs-ca-bundle\") pod \"87284001-4dac-4179-9d8d-202143da5c90\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") "
Apr 22 17:57:40.347735 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.347730 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"87284001-4dac-4179-9d8d-202143da5c90\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") "
Apr 22 17:57:40.348109 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.347759 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-thanos-prometheus-http-client-file\") pod \"87284001-4dac-4179-9d8d-202143da5c90\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") "
Apr 22 17:57:40.348109 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.347786 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-kube-rbac-proxy\") pod \"87284001-4dac-4179-9d8d-202143da5c90\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") "
Apr 22 17:57:40.348109 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.347830 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/87284001-4dac-4179-9d8d-202143da5c90-tls-assets\") pod \"87284001-4dac-4179-9d8d-202143da5c90\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") "
Apr 22 17:57:40.348109 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.347885 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-web-config\") pod \"87284001-4dac-4179-9d8d-202143da5c90\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") "
Apr 22 17:57:40.348109 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.347915 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnb48\" (UniqueName: \"kubernetes.io/projected/87284001-4dac-4179-9d8d-202143da5c90-kube-api-access-xnb48\") pod \"87284001-4dac-4179-9d8d-202143da5c90\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") "
Apr 22 17:57:40.348109 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.347950 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-configmap-kubelet-serving-ca-bundle\") pod \"87284001-4dac-4179-9d8d-202143da5c90\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") "
Apr 22 17:57:40.348109 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.347997 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/87284001-4dac-4179-9d8d-202143da5c90-config-out\") pod \"87284001-4dac-4179-9d8d-202143da5c90\" (UID: \"87284001-4dac-4179-9d8d-202143da5c90\") "
Apr 22 17:57:40.348109 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.348059 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "87284001-4dac-4179-9d8d-202143da5c90" (UID: "87284001-4dac-4179-9d8d-202143da5c90"). InnerVolumeSpecName "prometheus-trusted-ca-bundle".
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:57:40.348612 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.348296 2566 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-prometheus-trusted-ca-bundle\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:40.348612 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.348522 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87284001-4dac-4179-9d8d-202143da5c90-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "87284001-4dac-4179-9d8d-202143da5c90" (UID: "87284001-4dac-4179-9d8d-202143da5c90"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:57:40.349283 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.349017 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "87284001-4dac-4179-9d8d-202143da5c90" (UID: "87284001-4dac-4179-9d8d-202143da5c90"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:57:40.349283 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.349116 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "87284001-4dac-4179-9d8d-202143da5c90" (UID: "87284001-4dac-4179-9d8d-202143da5c90"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:57:40.350066 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.350039 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "87284001-4dac-4179-9d8d-202143da5c90" (UID: "87284001-4dac-4179-9d8d-202143da5c90"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:57:40.350160 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.350092 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "87284001-4dac-4179-9d8d-202143da5c90" (UID: "87284001-4dac-4179-9d8d-202143da5c90"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:57:40.350276 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.350246 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "87284001-4dac-4179-9d8d-202143da5c90" (UID: "87284001-4dac-4179-9d8d-202143da5c90"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:57:40.350403 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.350328 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "87284001-4dac-4179-9d8d-202143da5c90" (UID: "87284001-4dac-4179-9d8d-202143da5c90"). 
InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 17:57:40.350792 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.350761 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "87284001-4dac-4179-9d8d-202143da5c90" (UID: "87284001-4dac-4179-9d8d-202143da5c90"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:57:40.350792 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.350779 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "87284001-4dac-4179-9d8d-202143da5c90" (UID: "87284001-4dac-4179-9d8d-202143da5c90"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:57:40.351103 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.351077 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87284001-4dac-4179-9d8d-202143da5c90-config-out" (OuterVolumeSpecName: "config-out") pod "87284001-4dac-4179-9d8d-202143da5c90" (UID: "87284001-4dac-4179-9d8d-202143da5c90"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 17:57:40.351678 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.351652 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "87284001-4dac-4179-9d8d-202143da5c90" (UID: "87284001-4dac-4179-9d8d-202143da5c90"). 
InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:57:40.352136 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.352113 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "87284001-4dac-4179-9d8d-202143da5c90" (UID: "87284001-4dac-4179-9d8d-202143da5c90"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:57:40.352243 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.352224 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87284001-4dac-4179-9d8d-202143da5c90-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "87284001-4dac-4179-9d8d-202143da5c90" (UID: "87284001-4dac-4179-9d8d-202143da5c90"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:57:40.352495 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.352462 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "87284001-4dac-4179-9d8d-202143da5c90" (UID: "87284001-4dac-4179-9d8d-202143da5c90"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:57:40.352586 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.352570 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87284001-4dac-4179-9d8d-202143da5c90-kube-api-access-xnb48" (OuterVolumeSpecName: "kube-api-access-xnb48") pod "87284001-4dac-4179-9d8d-202143da5c90" (UID: "87284001-4dac-4179-9d8d-202143da5c90"). 
InnerVolumeSpecName "kube-api-access-xnb48". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 17:57:40.352931 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.352917 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-config" (OuterVolumeSpecName: "config") pod "87284001-4dac-4179-9d8d-202143da5c90" (UID: "87284001-4dac-4179-9d8d-202143da5c90"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:57:40.361938 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.361918 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-web-config" (OuterVolumeSpecName: "web-config") pod "87284001-4dac-4179-9d8d-202143da5c90" (UID: "87284001-4dac-4179-9d8d-202143da5c90"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 17:57:40.448715 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.448692 2566 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/87284001-4dac-4179-9d8d-202143da5c90-prometheus-k8s-db\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:40.448715 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.448712 2566 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-config\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:40.448820 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.448728 2566 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-configmap-metrics-client-ca\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:40.448820 ip-10-0-132-106 kubenswrapper[2566]: I0422 
17:57:40.448741 2566 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:40.448820 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.448750 2566 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:40.448820 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.448760 2566 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-thanos-prometheus-http-client-file\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:40.448820 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.448769 2566 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-kube-rbac-proxy\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:40.448820 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.448777 2566 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/87284001-4dac-4179-9d8d-202143da5c90-tls-assets\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:40.448820 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.448785 2566 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-web-config\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:40.448820 ip-10-0-132-106 kubenswrapper[2566]: I0422 
17:57:40.448794 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xnb48\" (UniqueName: \"kubernetes.io/projected/87284001-4dac-4179-9d8d-202143da5c90-kube-api-access-xnb48\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:40.448820 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.448807 2566 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:40.448820 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.448818 2566 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/87284001-4dac-4179-9d8d-202143da5c90-config-out\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:40.449139 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.448827 2566 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-metrics-client-certs\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:40.449139 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.448836 2566 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:40.449139 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.448845 2566 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-prometheus-k8s-tls\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:40.449139 ip-10-0-132-106 
kubenswrapper[2566]: I0422 17:57:40.448854 2566 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/87284001-4dac-4179-9d8d-202143da5c90-secret-grpc-tls\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:40.449139 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.448888 2566 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/87284001-4dac-4179-9d8d-202143da5c90-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 17:57:40.692159 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.692130 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dv96w" event={"ID":"92650e2d-54ea-4904-8ee5-235164ed2949","Type":"ContainerStarted","Data":"a56a2b7df9a7fb7857991df69e1fd2edf777a8c230931a068d8d901e8e9caedc"} Apr 22 17:57:40.692553 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.692166 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dv96w" event={"ID":"92650e2d-54ea-4904-8ee5-235164ed2949","Type":"ContainerStarted","Data":"61120893ef62f5ef1178cbd5e4afa9e3bdb6c2f1fda63f530af4510d0c205c44"} Apr 22 17:57:40.695006 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.694981 2566 generic.go:358] "Generic (PLEG): container finished" podID="87284001-4dac-4179-9d8d-202143da5c90" containerID="c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd" exitCode=0 Apr 22 17:57:40.695006 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.695005 2566 generic.go:358] "Generic (PLEG): container finished" podID="87284001-4dac-4179-9d8d-202143da5c90" containerID="e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49" exitCode=0 Apr 22 17:57:40.695137 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.695018 2566 generic.go:358] "Generic (PLEG): container finished" 
podID="87284001-4dac-4179-9d8d-202143da5c90" containerID="232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129" exitCode=0 Apr 22 17:57:40.695137 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.695029 2566 generic.go:358] "Generic (PLEG): container finished" podID="87284001-4dac-4179-9d8d-202143da5c90" containerID="552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea" exitCode=0 Apr 22 17:57:40.695137 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.695037 2566 generic.go:358] "Generic (PLEG): container finished" podID="87284001-4dac-4179-9d8d-202143da5c90" containerID="6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c" exitCode=0 Apr 22 17:57:40.695137 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.695045 2566 generic.go:358] "Generic (PLEG): container finished" podID="87284001-4dac-4179-9d8d-202143da5c90" containerID="9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f" exitCode=0 Apr 22 17:57:40.695137 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.695043 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87284001-4dac-4179-9d8d-202143da5c90","Type":"ContainerDied","Data":"c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd"} Apr 22 17:57:40.695137 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.695082 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87284001-4dac-4179-9d8d-202143da5c90","Type":"ContainerDied","Data":"e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49"} Apr 22 17:57:40.695137 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.695093 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87284001-4dac-4179-9d8d-202143da5c90","Type":"ContainerDied","Data":"232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129"} Apr 22 17:57:40.695137 ip-10-0-132-106 
kubenswrapper[2566]: I0422 17:57:40.695097 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:57:40.695137 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.695103 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87284001-4dac-4179-9d8d-202143da5c90","Type":"ContainerDied","Data":"552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea"} Apr 22 17:57:40.695137 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.695113 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87284001-4dac-4179-9d8d-202143da5c90","Type":"ContainerDied","Data":"6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c"} Apr 22 17:57:40.695137 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.695123 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87284001-4dac-4179-9d8d-202143da5c90","Type":"ContainerDied","Data":"9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f"} Apr 22 17:57:40.695137 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.695135 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"87284001-4dac-4179-9d8d-202143da5c90","Type":"ContainerDied","Data":"795ce6be877d739ad168d06b24ceaf1b3c85037b9b65e9614dff8a4ced02bdfd"} Apr 22 17:57:40.695137 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.695152 2566 scope.go:117] "RemoveContainer" containerID="c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd" Apr 22 17:57:40.706650 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.706632 2566 scope.go:117] "RemoveContainer" containerID="e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49" Apr 22 17:57:40.708983 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.708936 2566 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-dv96w" podStartSLOduration=252.749340786 podStartE2EDuration="4m13.708921091s" podCreationTimestamp="2026-04-22 17:53:27 +0000 UTC" firstStartedPulling="2026-04-22 17:57:38.703727162 +0000 UTC m=+251.503620362" lastFinishedPulling="2026-04-22 17:57:39.66330747 +0000 UTC m=+252.463200667" observedRunningTime="2026-04-22 17:57:40.707736602 +0000 UTC m=+253.507629822" watchObservedRunningTime="2026-04-22 17:57:40.708921091 +0000 UTC m=+253.508814312" Apr 22 17:57:40.714354 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.714333 2566 scope.go:117] "RemoveContainer" containerID="232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129" Apr 22 17:57:40.720919 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.720904 2566 scope.go:117] "RemoveContainer" containerID="552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea" Apr 22 17:57:40.725151 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.725128 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:57:40.728244 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.728225 2566 scope.go:117] "RemoveContainer" containerID="6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c" Apr 22 17:57:40.729933 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.729915 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:57:40.734659 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.734640 2566 scope.go:117] "RemoveContainer" containerID="9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f" Apr 22 17:57:40.741437 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.741423 2566 scope.go:117] "RemoveContainer" containerID="127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e" Apr 22 17:57:40.747775 ip-10-0-132-106 kubenswrapper[2566]: I0422 
17:57:40.747760 2566 scope.go:117] "RemoveContainer" containerID="c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd" Apr 22 17:57:40.748034 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:57:40.748007 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd\": container with ID starting with c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd not found: ID does not exist" containerID="c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd" Apr 22 17:57:40.748086 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.748033 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd"} err="failed to get container status \"c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd\": rpc error: code = NotFound desc = could not find container \"c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd\": container with ID starting with c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd not found: ID does not exist" Apr 22 17:57:40.748086 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.748054 2566 scope.go:117] "RemoveContainer" containerID="e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49" Apr 22 17:57:40.748292 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:57:40.748273 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49\": container with ID starting with e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49 not found: ID does not exist" containerID="e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49" Apr 22 17:57:40.748327 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.748299 2566 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49"} err="failed to get container status \"e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49\": rpc error: code = NotFound desc = could not find container \"e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49\": container with ID starting with e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49 not found: ID does not exist" Apr 22 17:57:40.748327 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.748318 2566 scope.go:117] "RemoveContainer" containerID="232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129" Apr 22 17:57:40.748566 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:57:40.748549 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129\": container with ID starting with 232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129 not found: ID does not exist" containerID="232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129" Apr 22 17:57:40.748648 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.748572 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129"} err="failed to get container status \"232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129\": rpc error: code = NotFound desc = could not find container \"232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129\": container with ID starting with 232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129 not found: ID does not exist" Apr 22 17:57:40.748648 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.748588 2566 scope.go:117] "RemoveContainer" 
containerID="552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea" Apr 22 17:57:40.748826 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:57:40.748811 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea\": container with ID starting with 552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea not found: ID does not exist" containerID="552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea" Apr 22 17:57:40.748882 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.748830 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea"} err="failed to get container status \"552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea\": rpc error: code = NotFound desc = could not find container \"552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea\": container with ID starting with 552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea not found: ID does not exist" Apr 22 17:57:40.748882 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.748842 2566 scope.go:117] "RemoveContainer" containerID="6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c" Apr 22 17:57:40.749165 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:57:40.749147 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c\": container with ID starting with 6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c not found: ID does not exist" containerID="6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c" Apr 22 17:57:40.749260 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.749170 2566 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c"} err="failed to get container status \"6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c\": rpc error: code = NotFound desc = could not find container \"6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c\": container with ID starting with 6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c not found: ID does not exist" Apr 22 17:57:40.749260 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.749185 2566 scope.go:117] "RemoveContainer" containerID="9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f" Apr 22 17:57:40.749425 ip-10-0-132-106 kubenswrapper[2566]: E0422 17:57:40.749413 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f\": container with ID starting with 9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f not found: ID does not exist" containerID="9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f" Apr 22 17:57:40.749465 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.749428 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f"} err="failed to get container status \"9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f\": rpc error: code = NotFound desc = could not find container \"9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f\": container with ID starting with 9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f not found: ID does not exist" Apr 22 17:57:40.749465 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.749440 2566 scope.go:117] "RemoveContainer" containerID="127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e" Apr 22 17:57:40.749649 
ip-10-0-132-106 kubenswrapper[2566]: E0422 17:57:40.749636 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e\": container with ID starting with 127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e not found: ID does not exist" containerID="127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e" Apr 22 17:57:40.749688 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.749653 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e"} err="failed to get container status \"127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e\": rpc error: code = NotFound desc = could not find container \"127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e\": container with ID starting with 127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e not found: ID does not exist" Apr 22 17:57:40.749688 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.749663 2566 scope.go:117] "RemoveContainer" containerID="c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd" Apr 22 17:57:40.749881 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.749844 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd"} err="failed to get container status \"c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd\": rpc error: code = NotFound desc = could not find container \"c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd\": container with ID starting with c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd not found: ID does not exist" Apr 22 17:57:40.749934 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.749879 2566 scope.go:117] 
"RemoveContainer" containerID="e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49" Apr 22 17:57:40.750107 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.750088 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49"} err="failed to get container status \"e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49\": rpc error: code = NotFound desc = could not find container \"e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49\": container with ID starting with e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49 not found: ID does not exist" Apr 22 17:57:40.750151 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.750111 2566 scope.go:117] "RemoveContainer" containerID="232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129" Apr 22 17:57:40.750296 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.750276 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129"} err="failed to get container status \"232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129\": rpc error: code = NotFound desc = could not find container \"232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129\": container with ID starting with 232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129 not found: ID does not exist" Apr 22 17:57:40.750338 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.750297 2566 scope.go:117] "RemoveContainer" containerID="552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea" Apr 22 17:57:40.750499 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.750478 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea"} err="failed to get container 
status \"552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea\": rpc error: code = NotFound desc = could not find container \"552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea\": container with ID starting with 552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea not found: ID does not exist" Apr 22 17:57:40.750499 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.750493 2566 scope.go:117] "RemoveContainer" containerID="6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c" Apr 22 17:57:40.750675 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.750660 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c"} err="failed to get container status \"6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c\": rpc error: code = NotFound desc = could not find container \"6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c\": container with ID starting with 6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c not found: ID does not exist" Apr 22 17:57:40.750725 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.750678 2566 scope.go:117] "RemoveContainer" containerID="9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f" Apr 22 17:57:40.750883 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.750845 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f"} err="failed to get container status \"9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f\": rpc error: code = NotFound desc = could not find container \"9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f\": container with ID starting with 9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f not found: ID does not exist" Apr 22 17:57:40.750883 ip-10-0-132-106 
kubenswrapper[2566]: I0422 17:57:40.750878 2566 scope.go:117] "RemoveContainer" containerID="127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e" Apr 22 17:57:40.751054 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.751033 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e"} err="failed to get container status \"127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e\": rpc error: code = NotFound desc = could not find container \"127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e\": container with ID starting with 127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e not found: ID does not exist" Apr 22 17:57:40.751103 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.751054 2566 scope.go:117] "RemoveContainer" containerID="c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd" Apr 22 17:57:40.751217 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.751198 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd"} err="failed to get container status \"c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd\": rpc error: code = NotFound desc = could not find container \"c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd\": container with ID starting with c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd not found: ID does not exist" Apr 22 17:57:40.751217 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.751216 2566 scope.go:117] "RemoveContainer" containerID="e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49" Apr 22 17:57:40.751380 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.751355 2566 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49"} err="failed to get container status \"e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49\": rpc error: code = NotFound desc = could not find container \"e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49\": container with ID starting with e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49 not found: ID does not exist" Apr 22 17:57:40.751422 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.751381 2566 scope.go:117] "RemoveContainer" containerID="232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129" Apr 22 17:57:40.751598 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.751571 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129"} err="failed to get container status \"232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129\": rpc error: code = NotFound desc = could not find container \"232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129\": container with ID starting with 232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129 not found: ID does not exist" Apr 22 17:57:40.751649 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.751599 2566 scope.go:117] "RemoveContainer" containerID="552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea" Apr 22 17:57:40.751814 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.751797 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea"} err="failed to get container status \"552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea\": rpc error: code = NotFound desc = could not find container \"552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea\": container with ID starting with 
552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea not found: ID does not exist" Apr 22 17:57:40.751872 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.751815 2566 scope.go:117] "RemoveContainer" containerID="6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c" Apr 22 17:57:40.752048 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.752031 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c"} err="failed to get container status \"6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c\": rpc error: code = NotFound desc = could not find container \"6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c\": container with ID starting with 6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c not found: ID does not exist" Apr 22 17:57:40.752123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.752049 2566 scope.go:117] "RemoveContainer" containerID="9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f" Apr 22 17:57:40.752272 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.752256 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f"} err="failed to get container status \"9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f\": rpc error: code = NotFound desc = could not find container \"9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f\": container with ID starting with 9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f not found: ID does not exist" Apr 22 17:57:40.752315 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.752272 2566 scope.go:117] "RemoveContainer" containerID="127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e" Apr 22 17:57:40.752500 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.752480 2566 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e"} err="failed to get container status \"127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e\": rpc error: code = NotFound desc = could not find container \"127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e\": container with ID starting with 127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e not found: ID does not exist" Apr 22 17:57:40.752545 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.752500 2566 scope.go:117] "RemoveContainer" containerID="c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd" Apr 22 17:57:40.752719 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.752701 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd"} err="failed to get container status \"c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd\": rpc error: code = NotFound desc = could not find container \"c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd\": container with ID starting with c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd not found: ID does not exist" Apr 22 17:57:40.752778 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.752722 2566 scope.go:117] "RemoveContainer" containerID="e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49" Apr 22 17:57:40.753005 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.752988 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49"} err="failed to get container status \"e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49\": rpc error: code = NotFound desc = could not find container 
\"e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49\": container with ID starting with e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49 not found: ID does not exist" Apr 22 17:57:40.753071 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.753007 2566 scope.go:117] "RemoveContainer" containerID="232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129" Apr 22 17:57:40.753218 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.753200 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129"} err="failed to get container status \"232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129\": rpc error: code = NotFound desc = could not find container \"232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129\": container with ID starting with 232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129 not found: ID does not exist" Apr 22 17:57:40.753260 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.753220 2566 scope.go:117] "RemoveContainer" containerID="552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea" Apr 22 17:57:40.753409 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.753393 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea"} err="failed to get container status \"552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea\": rpc error: code = NotFound desc = could not find container \"552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea\": container with ID starting with 552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea not found: ID does not exist" Apr 22 17:57:40.753461 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.753409 2566 scope.go:117] "RemoveContainer" 
containerID="6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c" Apr 22 17:57:40.753604 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.753588 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c"} err="failed to get container status \"6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c\": rpc error: code = NotFound desc = could not find container \"6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c\": container with ID starting with 6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c not found: ID does not exist" Apr 22 17:57:40.753662 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.753605 2566 scope.go:117] "RemoveContainer" containerID="9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f" Apr 22 17:57:40.753803 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.753785 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f"} err="failed to get container status \"9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f\": rpc error: code = NotFound desc = could not find container \"9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f\": container with ID starting with 9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f not found: ID does not exist" Apr 22 17:57:40.753883 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.753805 2566 scope.go:117] "RemoveContainer" containerID="127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e" Apr 22 17:57:40.754009 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.753993 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e"} err="failed to get container status 
\"127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e\": rpc error: code = NotFound desc = could not find container \"127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e\": container with ID starting with 127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e not found: ID does not exist" Apr 22 17:57:40.754078 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.754010 2566 scope.go:117] "RemoveContainer" containerID="c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd" Apr 22 17:57:40.754254 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.754230 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd"} err="failed to get container status \"c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd\": rpc error: code = NotFound desc = could not find container \"c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd\": container with ID starting with c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd not found: ID does not exist" Apr 22 17:57:40.754254 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.754250 2566 scope.go:117] "RemoveContainer" containerID="e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49" Apr 22 17:57:40.754471 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.754443 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49"} err="failed to get container status \"e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49\": rpc error: code = NotFound desc = could not find container \"e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49\": container with ID starting with e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49 not found: ID does not exist" Apr 22 17:57:40.754525 ip-10-0-132-106 
kubenswrapper[2566]: I0422 17:57:40.754476 2566 scope.go:117] "RemoveContainer" containerID="232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129" Apr 22 17:57:40.754723 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.754698 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129"} err="failed to get container status \"232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129\": rpc error: code = NotFound desc = could not find container \"232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129\": container with ID starting with 232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129 not found: ID does not exist" Apr 22 17:57:40.754849 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.754724 2566 scope.go:117] "RemoveContainer" containerID="552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea" Apr 22 17:57:40.755060 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.755029 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea"} err="failed to get container status \"552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea\": rpc error: code = NotFound desc = could not find container \"552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea\": container with ID starting with 552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea not found: ID does not exist" Apr 22 17:57:40.755108 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.755062 2566 scope.go:117] "RemoveContainer" containerID="6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c" Apr 22 17:57:40.755296 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.755276 2566 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c"} err="failed to get container status \"6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c\": rpc error: code = NotFound desc = could not find container \"6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c\": container with ID starting with 6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c not found: ID does not exist" Apr 22 17:57:40.755417 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.755297 2566 scope.go:117] "RemoveContainer" containerID="9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f" Apr 22 17:57:40.755606 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.755574 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f"} err="failed to get container status \"9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f\": rpc error: code = NotFound desc = could not find container \"9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f\": container with ID starting with 9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f not found: ID does not exist" Apr 22 17:57:40.755606 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.755604 2566 scope.go:117] "RemoveContainer" containerID="127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e" Apr 22 17:57:40.755891 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.755854 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e"} err="failed to get container status \"127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e\": rpc error: code = NotFound desc = could not find container \"127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e\": container with ID starting with 
127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e not found: ID does not exist" Apr 22 17:57:40.755891 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.755890 2566 scope.go:117] "RemoveContainer" containerID="c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd" Apr 22 17:57:40.756213 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.756192 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd"} err="failed to get container status \"c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd\": rpc error: code = NotFound desc = could not find container \"c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd\": container with ID starting with c7d0dbad3c698700ffac4cbedfe0a22cbcb89747aeba59dba877414c165c13fd not found: ID does not exist" Apr 22 17:57:40.756213 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.756214 2566 scope.go:117] "RemoveContainer" containerID="e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49" Apr 22 17:57:40.756455 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.756439 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49"} err="failed to get container status \"e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49\": rpc error: code = NotFound desc = could not find container \"e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49\": container with ID starting with e5c77c257fd64040b54c59b8e2edc9c08abe77fb1149e21ddf32c8f7f10a1f49 not found: ID does not exist" Apr 22 17:57:40.756511 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.756456 2566 scope.go:117] "RemoveContainer" containerID="232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129" Apr 22 17:57:40.756617 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.756600 2566 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 17:57:40.756683 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.756664 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129"} err="failed to get container status \"232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129\": rpc error: code = NotFound desc = could not find container \"232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129\": container with ID starting with 232e474d39a25947495550ddc63982e2a264564d7e8ef803427788aa3c90c129 not found: ID does not exist" Apr 22 17:57:40.756734 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.756684 2566 scope.go:117] "RemoveContainer" containerID="552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea" Apr 22 17:57:40.756985 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.756960 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea"} err="failed to get container status \"552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea\": rpc error: code = NotFound desc = could not find container \"552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea\": container with ID starting with 552cf76d9f32b20238d55052c0b71ed5e8e0ec3a6dd6d0e9c9487ebcb61996ea not found: ID does not exist" Apr 22 17:57:40.757080 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.756991 2566 scope.go:117] "RemoveContainer" containerID="6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c" Apr 22 17:57:40.757080 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.756970 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="config-reloader" Apr 22 17:57:40.757080 ip-10-0-132-106 
kubenswrapper[2566]: I0422 17:57:40.757048 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="config-reloader" Apr 22 17:57:40.757205 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.757091 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="thanos-sidecar" Apr 22 17:57:40.757205 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.757100 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="thanos-sidecar" Apr 22 17:57:40.757205 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.757114 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="init-config-reloader" Apr 22 17:57:40.757205 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.757125 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="init-config-reloader" Apr 22 17:57:40.757205 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.757135 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="kube-rbac-proxy" Apr 22 17:57:40.757205 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.757143 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="kube-rbac-proxy" Apr 22 17:57:40.757205 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.757158 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="prometheus" Apr 22 17:57:40.757205 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.757167 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="prometheus" Apr 22 17:57:40.757205 
ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.757177 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="kube-rbac-proxy-thanos"
Apr 22 17:57:40.757205 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.757184 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="kube-rbac-proxy-thanos"
Apr 22 17:57:40.757205 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.757201 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="kube-rbac-proxy-web"
Apr 22 17:57:40.757205 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.757209 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="kube-rbac-proxy-web"
Apr 22 17:57:40.757614 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.757231 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c"} err="failed to get container status \"6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c\": rpc error: code = NotFound desc = could not find container \"6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c\": container with ID starting with 6e67766695d7f5005fa35a87505b4c258f57e988c82d3de06f5d75d69eff965c not found: ID does not exist"
Apr 22 17:57:40.757614 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.757259 2566 scope.go:117] "RemoveContainer" containerID="9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f"
Apr 22 17:57:40.757614 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.757322 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="prometheus"
Apr 22 17:57:40.757614 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.757336 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="kube-rbac-proxy-web"
Apr 22 17:57:40.757614 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.757346 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="kube-rbac-proxy"
Apr 22 17:57:40.757614 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.757357 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="kube-rbac-proxy-thanos"
Apr 22 17:57:40.757614 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.757367 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="config-reloader"
Apr 22 17:57:40.757614 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.757380 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="87284001-4dac-4179-9d8d-202143da5c90" containerName="thanos-sidecar"
Apr 22 17:57:40.757614 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.757509 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f"} err="failed to get container status \"9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f\": rpc error: code = NotFound desc = could not find container \"9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f\": container with ID starting with 9468f48619826a0d78fc0dacadb8e92ee878a8a82451450c2524f521160aac3f not found: ID does not exist"
Apr 22 17:57:40.757614 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.757535 2566 scope.go:117] "RemoveContainer" containerID="127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e"
Apr 22 17:57:40.757981 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.757763 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e"} err="failed to get container status \"127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e\": rpc error: code = NotFound desc = could not find container \"127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e\": container with ID starting with 127b96feb7bde66d1a7b3478aefdf5bf3125cc0f16ba7b2c3ea0134d4c75012e not found: ID does not exist"
Apr 22 17:57:40.763028 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.763012 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.767123 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.767103 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 22 17:57:40.767498 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.767470 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 22 17:57:40.767498 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.767487 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 22 17:57:40.767817 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.767716 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 22 17:57:40.767817 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.767729 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 22 17:57:40.768157 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.767885 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 22 17:57:40.768157 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.767890 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5jq44qrsr09tb\""
Apr 22 17:57:40.768157 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.767919 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 22 17:57:40.768157 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.767939 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 22 17:57:40.768157 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.767887 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 22 17:57:40.768157 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.767943 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-j7s85\""
Apr 22 17:57:40.768157 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.768151 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 22 17:57:40.768495 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.768272 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 22 17:57:40.770713 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.770691 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 22 17:57:40.774800 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.774617 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 22 17:57:40.776406 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.776387 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 17:57:40.851554 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.851524 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.851702 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.851573 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.851761 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.851730 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.851813 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.851784 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-web-config\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.851940 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.851920 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89a4c77d-d08d-4acc-8d82-81ac5653491c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.852058 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.851954 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.852058 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.852002 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89a4c77d-d08d-4acc-8d82-81ac5653491c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.852058 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.852043 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89a4c77d-d08d-4acc-8d82-81ac5653491c-config-out\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.852208 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.852061 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89a4c77d-d08d-4acc-8d82-81ac5653491c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.852315 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.852298 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89a4c77d-d08d-4acc-8d82-81ac5653491c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.852369 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.852323 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/89a4c77d-d08d-4acc-8d82-81ac5653491c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.852470 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.852452 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-config\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.852521 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.852500 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.852554 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.852522 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.852554 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.852538 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.852708 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.852562 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n877h\" (UniqueName: \"kubernetes.io/projected/89a4c77d-d08d-4acc-8d82-81ac5653491c-kube-api-access-n877h\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.852708 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.852611 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/89a4c77d-d08d-4acc-8d82-81ac5653491c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.852787 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.852704 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89a4c77d-d08d-4acc-8d82-81ac5653491c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.953384 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.953331 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.953384 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.953358 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.953384 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.953374 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.953591 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.953391 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n877h\" (UniqueName: \"kubernetes.io/projected/89a4c77d-d08d-4acc-8d82-81ac5653491c-kube-api-access-n877h\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.953591 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.953533 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/89a4c77d-d08d-4acc-8d82-81ac5653491c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.953591 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.953577 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89a4c77d-d08d-4acc-8d82-81ac5653491c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.953731 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.953704 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.953783 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.953751 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.953847 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.953790 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.953847 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.953814 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-web-config\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.953847 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.953844 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89a4c77d-d08d-4acc-8d82-81ac5653491c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.954032 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.953889 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.954032 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.953926 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89a4c77d-d08d-4acc-8d82-81ac5653491c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.954032 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.953957 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89a4c77d-d08d-4acc-8d82-81ac5653491c-config-out\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.954032 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.953981 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89a4c77d-d08d-4acc-8d82-81ac5653491c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.954032 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.954023 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89a4c77d-d08d-4acc-8d82-81ac5653491c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.954270 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.954047 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/89a4c77d-d08d-4acc-8d82-81ac5653491c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.954270 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.954085 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-config\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.954589 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.954564 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89a4c77d-d08d-4acc-8d82-81ac5653491c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.956386 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.956344 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.956481 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.956460 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/89a4c77d-d08d-4acc-8d82-81ac5653491c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.956540 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.956487 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.956716 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.956693 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-config\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.957026 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.957005 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/89a4c77d-d08d-4acc-8d82-81ac5653491c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.957161 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.957140 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89a4c77d-d08d-4acc-8d82-81ac5653491c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.957234 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.957148 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.957772 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.957747 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89a4c77d-d08d-4acc-8d82-81ac5653491c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.958012 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.957990 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/89a4c77d-d08d-4acc-8d82-81ac5653491c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.958398 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.958374 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/89a4c77d-d08d-4acc-8d82-81ac5653491c-config-out\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.958568 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.958545 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.958706 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.958687 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.958765 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.958689 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/89a4c77d-d08d-4acc-8d82-81ac5653491c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.959017 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.959000 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.959314 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.959295 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.959852 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.959833 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/89a4c77d-d08d-4acc-8d82-81ac5653491c-web-config\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:40.961322 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:40.961306 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n877h\" (UniqueName: \"kubernetes.io/projected/89a4c77d-d08d-4acc-8d82-81ac5653491c-kube-api-access-n877h\") pod \"prometheus-k8s-0\" (UID: \"89a4c77d-d08d-4acc-8d82-81ac5653491c\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:41.085561 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:41.085532 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:57:41.222265 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:41.222211 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 22 17:57:41.224825 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:57:41.224799 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89a4c77d_d08d_4acc_8d82_81ac5653491c.slice/crio-39f26ee44099e868ccd26dda79c43b188f9c575f8ccdcaafdefc787372ad1858 WatchSource:0}: Error finding container 39f26ee44099e868ccd26dda79c43b188f9c575f8ccdcaafdefc787372ad1858: Status 404 returned error can't find the container with id 39f26ee44099e868ccd26dda79c43b188f9c575f8ccdcaafdefc787372ad1858
Apr 22 17:57:41.699124 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:41.699091 2566 generic.go:358] "Generic (PLEG): container finished" podID="89a4c77d-d08d-4acc-8d82-81ac5653491c" containerID="8041e955f861df662addeaf564d5386bf15ed106fafa92736fb1bafe0766a189" exitCode=0
Apr 22 17:57:41.699525 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:41.699185 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89a4c77d-d08d-4acc-8d82-81ac5653491c","Type":"ContainerDied","Data":"8041e955f861df662addeaf564d5386bf15ed106fafa92736fb1bafe0766a189"}
Apr 22 17:57:41.699525 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:41.699230 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89a4c77d-d08d-4acc-8d82-81ac5653491c","Type":"ContainerStarted","Data":"39f26ee44099e868ccd26dda79c43b188f9c575f8ccdcaafdefc787372ad1858"}
Apr 22 17:57:41.782845 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:41.782818 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87284001-4dac-4179-9d8d-202143da5c90" path="/var/lib/kubelet/pods/87284001-4dac-4179-9d8d-202143da5c90/volumes"
Apr 22 17:57:42.708165 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:42.708133 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89a4c77d-d08d-4acc-8d82-81ac5653491c","Type":"ContainerStarted","Data":"51fde84017bf6eb58d89bef2f57700453812a12fbfc5507cf886b8246671dd81"}
Apr 22 17:57:42.708165 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:42.708170 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89a4c77d-d08d-4acc-8d82-81ac5653491c","Type":"ContainerStarted","Data":"03acaddb11eea3cfd250505f5cf5630ddf6ba8420f10a14876e1954efaeaf727"}
Apr 22 17:57:42.708614 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:42.708183 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89a4c77d-d08d-4acc-8d82-81ac5653491c","Type":"ContainerStarted","Data":"13b5929b8ff113020e2b1f3e69965af3e88b85a3c24aea82efa50c96eb7f1be4"}
Apr 22 17:57:42.708614 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:42.708195 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89a4c77d-d08d-4acc-8d82-81ac5653491c","Type":"ContainerStarted","Data":"42f642e020453a3a948be518e8ec59c043767495ebce9633721ae3ef46c37363"}
Apr 22 17:57:42.708614 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:42.708208 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89a4c77d-d08d-4acc-8d82-81ac5653491c","Type":"ContainerStarted","Data":"b2657c9a149bea8a2722790872eeb4d5a39b91180d4a5d2b35e7b90b2f11356f"}
Apr 22 17:57:42.708614 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:42.708219 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"89a4c77d-d08d-4acc-8d82-81ac5653491c","Type":"ContainerStarted","Data":"c0a305951da2b128cee52c4507e80b68b1b86275a5b76035a48df891f3dde423"}
Apr 22 17:57:42.738226 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:42.738185 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.738171732 podStartE2EDuration="2.738171732s" podCreationTimestamp="2026-04-22 17:57:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:57:42.736380694 +0000 UTC m=+255.536273913" watchObservedRunningTime="2026-04-22 17:57:42.738171732 +0000 UTC m=+255.538064951"
Apr 22 17:57:46.085691 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:57:46.085656 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 22 17:58:18.884564 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:58:18.884529 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-6k6lt"]
Apr 22 17:58:18.887337 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:58:18.887312 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6k6lt"
Apr 22 17:58:18.890070 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:58:18.890044 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 17:58:18.897800 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:58:18.897775 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6k6lt"]
Apr 22 17:58:18.969653 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:58:18.969615 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a9c27a92-d6b0-4347-a2f9-ee999ae6f4e6-dbus\") pod \"global-pull-secret-syncer-6k6lt\" (UID: \"a9c27a92-d6b0-4347-a2f9-ee999ae6f4e6\") " pod="kube-system/global-pull-secret-syncer-6k6lt"
Apr 22 17:58:18.969847 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:58:18.969675 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a9c27a92-d6b0-4347-a2f9-ee999ae6f4e6-kubelet-config\") pod \"global-pull-secret-syncer-6k6lt\" (UID: \"a9c27a92-d6b0-4347-a2f9-ee999ae6f4e6\") " pod="kube-system/global-pull-secret-syncer-6k6lt"
Apr 22 17:58:18.969847 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:58:18.969755 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a9c27a92-d6b0-4347-a2f9-ee999ae6f4e6-original-pull-secret\") pod \"global-pull-secret-syncer-6k6lt\" (UID: \"a9c27a92-d6b0-4347-a2f9-ee999ae6f4e6\") " pod="kube-system/global-pull-secret-syncer-6k6lt"
Apr 22 17:58:19.071159 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:58:19.071119 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a9c27a92-d6b0-4347-a2f9-ee999ae6f4e6-dbus\") pod \"global-pull-secret-syncer-6k6lt\" (UID: \"a9c27a92-d6b0-4347-a2f9-ee999ae6f4e6\") " pod="kube-system/global-pull-secret-syncer-6k6lt"
Apr 22 17:58:19.071352 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:58:19.071192 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a9c27a92-d6b0-4347-a2f9-ee999ae6f4e6-kubelet-config\") pod \"global-pull-secret-syncer-6k6lt\" (UID: \"a9c27a92-d6b0-4347-a2f9-ee999ae6f4e6\") " pod="kube-system/global-pull-secret-syncer-6k6lt"
Apr 22 17:58:19.071352 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:58:19.071241 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a9c27a92-d6b0-4347-a2f9-ee999ae6f4e6-original-pull-secret\") pod \"global-pull-secret-syncer-6k6lt\" (UID: \"a9c27a92-d6b0-4347-a2f9-ee999ae6f4e6\") " pod="kube-system/global-pull-secret-syncer-6k6lt"
Apr 22 17:58:19.071352 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:58:19.071312 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a9c27a92-d6b0-4347-a2f9-ee999ae6f4e6-dbus\") pod \"global-pull-secret-syncer-6k6lt\" (UID: \"a9c27a92-d6b0-4347-a2f9-ee999ae6f4e6\") " pod="kube-system/global-pull-secret-syncer-6k6lt"
Apr 22 17:58:19.071352 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:58:19.071319 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a9c27a92-d6b0-4347-a2f9-ee999ae6f4e6-kubelet-config\") pod \"global-pull-secret-syncer-6k6lt\" (UID: \"a9c27a92-d6b0-4347-a2f9-ee999ae6f4e6\") " pod="kube-system/global-pull-secret-syncer-6k6lt"
Apr 22 17:58:19.073544 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:58:19.073521 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a9c27a92-d6b0-4347-a2f9-ee999ae6f4e6-original-pull-secret\") pod \"global-pull-secret-syncer-6k6lt\" (UID: \"a9c27a92-d6b0-4347-a2f9-ee999ae6f4e6\") " pod="kube-system/global-pull-secret-syncer-6k6lt"
Apr 22 17:58:19.196628 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:58:19.196581 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-6k6lt"
Apr 22 17:58:19.317314 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:58:19.317284 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-6k6lt"]
Apr 22 17:58:19.319950 ip-10-0-132-106 kubenswrapper[2566]: W0422 17:58:19.319923 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9c27a92_d6b0_4347_a2f9_ee999ae6f4e6.slice/crio-7c9385fb137513537e38644c23b6f3c8807a519091c0a992d84423967c23b487 WatchSource:0}: Error finding container 7c9385fb137513537e38644c23b6f3c8807a519091c0a992d84423967c23b487: Status 404 returned error can't find the container with id 7c9385fb137513537e38644c23b6f3c8807a519091c0a992d84423967c23b487
Apr 22 17:58:19.822932 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:58:19.822897 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6k6lt" event={"ID":"a9c27a92-d6b0-4347-a2f9-ee999ae6f4e6","Type":"ContainerStarted","Data":"7c9385fb137513537e38644c23b6f3c8807a519091c0a992d84423967c23b487"}
Apr 22 17:58:23.837195 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:58:23.837157 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-6k6lt" event={"ID":"a9c27a92-d6b0-4347-a2f9-ee999ae6f4e6","Type":"ContainerStarted","Data":"ea15f943dc54a7a3ee45601140a982d688cc9f9d1534c1ba9803a1c5fe521846"}
Apr 22 17:58:23.852683 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:58:23.852634 2566
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-6k6lt" podStartSLOduration=1.95211512 podStartE2EDuration="5.852621336s" podCreationTimestamp="2026-04-22 17:58:18 +0000 UTC" firstStartedPulling="2026-04-22 17:58:19.321836001 +0000 UTC m=+292.121729198" lastFinishedPulling="2026-04-22 17:58:23.222342216 +0000 UTC m=+296.022235414" observedRunningTime="2026-04-22 17:58:23.851753625 +0000 UTC m=+296.651646840" watchObservedRunningTime="2026-04-22 17:58:23.852621336 +0000 UTC m=+296.652514556" Apr 22 17:58:27.663947 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:58:27.663922 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 17:58:27.664277 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:58:27.663922 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 17:58:27.671310 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:58:27.671292 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 17:58:27.671397 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:58:27.671321 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 17:58:41.085730 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:58:41.085700 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 17:58:41.100838 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:58:41.100817 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 
22 17:58:41.901155 ip-10-0-132-106 kubenswrapper[2566]: I0422 17:58:41.901128 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:03:27.697394 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:27.697349 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 18:03:27.698469 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:27.698447 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 18:03:27.700397 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:27.700374 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 18:03:27.701588 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:27.701561 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 18:03:39.645352 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:39.645317 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-vml2d"] Apr 22 18:03:39.648434 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:39.648418 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-vml2d" Apr 22 18:03:39.651040 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:39.651015 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 18:03:39.652062 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:39.652045 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-ddd5r\"" Apr 22 18:03:39.652140 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:39.652060 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 18:03:39.652140 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:39.652063 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 18:03:39.657529 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:39.657509 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-vml2d"] Apr 22 18:03:39.735394 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:39.735370 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvfrz\" (UniqueName: \"kubernetes.io/projected/46cd4506-abef-4b38-93ac-167376db6638-kube-api-access-tvfrz\") pod \"s3-init-vml2d\" (UID: \"46cd4506-abef-4b38-93ac-167376db6638\") " pod="kserve/s3-init-vml2d" Apr 22 18:03:39.836068 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:39.836038 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tvfrz\" (UniqueName: \"kubernetes.io/projected/46cd4506-abef-4b38-93ac-167376db6638-kube-api-access-tvfrz\") pod \"s3-init-vml2d\" (UID: \"46cd4506-abef-4b38-93ac-167376db6638\") " pod="kserve/s3-init-vml2d" Apr 22 18:03:39.846072 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:39.846047 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvfrz\" 
(UniqueName: \"kubernetes.io/projected/46cd4506-abef-4b38-93ac-167376db6638-kube-api-access-tvfrz\") pod \"s3-init-vml2d\" (UID: \"46cd4506-abef-4b38-93ac-167376db6638\") " pod="kserve/s3-init-vml2d" Apr 22 18:03:39.971720 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:39.971692 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-vml2d" Apr 22 18:03:40.092004 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:40.091981 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-vml2d"] Apr 22 18:03:40.094625 ip-10-0-132-106 kubenswrapper[2566]: W0422 18:03:40.094597 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46cd4506_abef_4b38_93ac_167376db6638.slice/crio-89937f1740ef3b4d3355faf22ad4405a5ba58eac03e8559381fdfb73e1308b37 WatchSource:0}: Error finding container 89937f1740ef3b4d3355faf22ad4405a5ba58eac03e8559381fdfb73e1308b37: Status 404 returned error can't find the container with id 89937f1740ef3b4d3355faf22ad4405a5ba58eac03e8559381fdfb73e1308b37 Apr 22 18:03:40.096260 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:40.096238 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:03:40.779732 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:40.779692 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-vml2d" event={"ID":"46cd4506-abef-4b38-93ac-167376db6638","Type":"ContainerStarted","Data":"89937f1740ef3b4d3355faf22ad4405a5ba58eac03e8559381fdfb73e1308b37"} Apr 22 18:03:44.793976 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:44.793885 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-vml2d" event={"ID":"46cd4506-abef-4b38-93ac-167376db6638","Type":"ContainerStarted","Data":"2da3f24f60d15d4ac90c45d5ad68300be109acd4c7f8953edfb0862a8906b6a6"} Apr 22 18:03:47.803872 ip-10-0-132-106 kubenswrapper[2566]: I0422 
18:03:47.803789 2566 generic.go:358] "Generic (PLEG): container finished" podID="46cd4506-abef-4b38-93ac-167376db6638" containerID="2da3f24f60d15d4ac90c45d5ad68300be109acd4c7f8953edfb0862a8906b6a6" exitCode=0 Apr 22 18:03:47.803872 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:47.803844 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-vml2d" event={"ID":"46cd4506-abef-4b38-93ac-167376db6638","Type":"ContainerDied","Data":"2da3f24f60d15d4ac90c45d5ad68300be109acd4c7f8953edfb0862a8906b6a6"} Apr 22 18:03:48.921370 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:48.921339 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-vml2d" Apr 22 18:03:49.019193 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:49.019169 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvfrz\" (UniqueName: \"kubernetes.io/projected/46cd4506-abef-4b38-93ac-167376db6638-kube-api-access-tvfrz\") pod \"46cd4506-abef-4b38-93ac-167376db6638\" (UID: \"46cd4506-abef-4b38-93ac-167376db6638\") " Apr 22 18:03:49.021223 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:49.021202 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46cd4506-abef-4b38-93ac-167376db6638-kube-api-access-tvfrz" (OuterVolumeSpecName: "kube-api-access-tvfrz") pod "46cd4506-abef-4b38-93ac-167376db6638" (UID: "46cd4506-abef-4b38-93ac-167376db6638"). InnerVolumeSpecName "kube-api-access-tvfrz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:03:49.119979 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:49.119923 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tvfrz\" (UniqueName: \"kubernetes.io/projected/46cd4506-abef-4b38-93ac-167376db6638-kube-api-access-tvfrz\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\"" Apr 22 18:03:49.813001 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:49.812971 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-vml2d" event={"ID":"46cd4506-abef-4b38-93ac-167376db6638","Type":"ContainerDied","Data":"89937f1740ef3b4d3355faf22ad4405a5ba58eac03e8559381fdfb73e1308b37"} Apr 22 18:03:49.813001 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:49.812992 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-vml2d" Apr 22 18:03:49.813158 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:03:49.812999 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89937f1740ef3b4d3355faf22ad4405a5ba58eac03e8559381fdfb73e1308b37" Apr 22 18:08:27.723758 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:08:27.723685 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 18:08:27.724901 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:08:27.724873 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 18:08:27.727222 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:08:27.727202 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 18:08:27.728373 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:08:27.728358 2566 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 18:13:27.750370 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:13:27.750342 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 18:13:27.752724 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:13:27.752703 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 18:13:27.753757 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:13:27.753736 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 18:13:27.761497 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:13:27.761466 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 18:18:27.783694 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:18:27.783578 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 18:18:27.787560 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:18:27.785937 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 18:18:27.787560 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:18:27.786755 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 
18:18:27.788879 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:18:27.788846 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 18:23:27.808773 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:23:27.808664 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 18:23:27.812830 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:23:27.810818 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 18:23:27.812830 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:23:27.811839 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 18:23:27.814199 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:23:27.814180 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 18:28:27.831716 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:28:27.831607 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 18:28:27.835878 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:28:27.834748 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 18:28:27.835878 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:28:27.835181 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 18:28:27.838370 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:28:27.838354 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 18:33:27.856632 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:33:27.856532 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 18:33:27.859448 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:33:27.859427 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 18:33:27.860216 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:33:27.860197 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 18:33:27.863121 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:33:27.863105 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 18:38:27.879635 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:38:27.879537 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 18:38:27.888688 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:38:27.888664 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 18:38:27.889343 ip-10-0-132-106 
kubenswrapper[2566]: I0422 18:38:27.889323 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 18:38:27.892918 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:38:27.892900 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 18:43:27.909875 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:43:27.909829 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 18:43:27.913145 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:43:27.913122 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 18:43:27.914795 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:43:27.914770 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 18:43:27.917855 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:43:27.917838 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 18:48:27.933012 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:48:27.932902 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 18:48:27.937256 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:48:27.936427 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 18:48:27.938583 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:48:27.938564 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 18:48:27.941690 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:48:27.941673 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 18:53:27.955600 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:53:27.955483 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 18:53:27.959883 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:53:27.959075 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 18:53:27.962723 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:53:27.962704 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 18:53:27.965658 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:53:27.965641 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 18:58:27.980211 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:58:27.980108 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 18:58:27.986327 ip-10-0-132-106 
kubenswrapper[2566]: I0422 18:58:27.986303 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 18:58:27.989935 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:58:27.989916 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 18:58:27.992929 ip-10-0-132-106 kubenswrapper[2566]: I0422 18:58:27.992912 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 19:03:28.007157 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:03:28.007041 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 19:03:28.011436 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:03:28.010266 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 19:03:28.014525 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:03:28.014510 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 19:03:28.017508 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:03:28.017493 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 19:08:28.033562 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:08:28.033453 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 19:08:28.037761 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:08:28.036830 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 19:08:28.040891 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:08:28.040852 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 19:08:28.043827 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:08:28.043808 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 19:13:28.056702 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:13:28.056589 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 19:13:28.060424 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:13:28.059696 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 19:13:28.063463 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:13:28.063445 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 19:13:28.066625 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:13:28.066609 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 19:18:28.079630 ip-10-0-132-106 
kubenswrapper[2566]: I0422 19:18:28.079526 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 19:18:28.083608 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:18:28.082811 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 19:18:28.087732 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:18:28.087707 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 19:18:28.090826 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:18:28.090810 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 19:23:28.103349 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:23:28.103247 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 19:23:28.107430 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:23:28.106278 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 19:23:28.111079 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:23:28.111059 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log" Apr 22 19:23:28.113949 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:23:28.113935 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 19:27:04.895875 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:04.895828 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hjdk5/must-gather-pk87w"] Apr 22 19:27:04.896329 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:04.896172 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="46cd4506-abef-4b38-93ac-167376db6638" containerName="s3-init" Apr 22 19:27:04.896329 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:04.896183 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="46cd4506-abef-4b38-93ac-167376db6638" containerName="s3-init" Apr 22 19:27:04.896329 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:04.896251 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="46cd4506-abef-4b38-93ac-167376db6638" containerName="s3-init" Apr 22 19:27:04.899279 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:04.899261 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hjdk5/must-gather-pk87w"
Apr 22 19:27:04.902145 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:04.902121 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hjdk5\"/\"kube-root-ca.crt\""
Apr 22 19:27:04.902336 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:04.902318 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-hjdk5\"/\"openshift-service-ca.crt\""
Apr 22 19:27:04.919431 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:04.919413 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hjdk5/must-gather-pk87w"]
Apr 22 19:27:05.058848 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:05.058817 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4m6h\" (UniqueName: \"kubernetes.io/projected/902dd421-10bd-4cf5-90c2-1d3c598bcecf-kube-api-access-v4m6h\") pod \"must-gather-pk87w\" (UID: \"902dd421-10bd-4cf5-90c2-1d3c598bcecf\") " pod="openshift-must-gather-hjdk5/must-gather-pk87w"
Apr 22 19:27:05.058848 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:05.058855 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/902dd421-10bd-4cf5-90c2-1d3c598bcecf-must-gather-output\") pod \"must-gather-pk87w\" (UID: \"902dd421-10bd-4cf5-90c2-1d3c598bcecf\") " pod="openshift-must-gather-hjdk5/must-gather-pk87w"
Apr 22 19:27:05.159996 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:05.159963 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4m6h\" (UniqueName: \"kubernetes.io/projected/902dd421-10bd-4cf5-90c2-1d3c598bcecf-kube-api-access-v4m6h\") pod \"must-gather-pk87w\" (UID: \"902dd421-10bd-4cf5-90c2-1d3c598bcecf\") " pod="openshift-must-gather-hjdk5/must-gather-pk87w"
Apr 22 19:27:05.159996 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:05.160000 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/902dd421-10bd-4cf5-90c2-1d3c598bcecf-must-gather-output\") pod \"must-gather-pk87w\" (UID: \"902dd421-10bd-4cf5-90c2-1d3c598bcecf\") " pod="openshift-must-gather-hjdk5/must-gather-pk87w"
Apr 22 19:27:05.160282 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:05.160268 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/902dd421-10bd-4cf5-90c2-1d3c598bcecf-must-gather-output\") pod \"must-gather-pk87w\" (UID: \"902dd421-10bd-4cf5-90c2-1d3c598bcecf\") " pod="openshift-must-gather-hjdk5/must-gather-pk87w"
Apr 22 19:27:05.169045 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:05.169017 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4m6h\" (UniqueName: \"kubernetes.io/projected/902dd421-10bd-4cf5-90c2-1d3c598bcecf-kube-api-access-v4m6h\") pod \"must-gather-pk87w\" (UID: \"902dd421-10bd-4cf5-90c2-1d3c598bcecf\") " pod="openshift-must-gather-hjdk5/must-gather-pk87w"
Apr 22 19:27:05.217055 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:05.217030 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hjdk5/must-gather-pk87w"
Apr 22 19:27:05.329326 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:05.329299 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hjdk5/must-gather-pk87w"]
Apr 22 19:27:05.331707 ip-10-0-132-106 kubenswrapper[2566]: W0422 19:27:05.331682 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod902dd421_10bd_4cf5_90c2_1d3c598bcecf.slice/crio-8fdf197b5395d96ecf13f19bdeb4f9f7dd0ff1f4cf687cf35e55068391bae655 WatchSource:0}: Error finding container 8fdf197b5395d96ecf13f19bdeb4f9f7dd0ff1f4cf687cf35e55068391bae655: Status 404 returned error can't find the container with id 8fdf197b5395d96ecf13f19bdeb4f9f7dd0ff1f4cf687cf35e55068391bae655
Apr 22 19:27:05.333662 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:05.333647 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:27:05.556376 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:05.556308 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hjdk5/must-gather-pk87w" event={"ID":"902dd421-10bd-4cf5-90c2-1d3c598bcecf","Type":"ContainerStarted","Data":"8fdf197b5395d96ecf13f19bdeb4f9f7dd0ff1f4cf687cf35e55068391bae655"}
Apr 22 19:27:10.574145 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:10.574103 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hjdk5/must-gather-pk87w" event={"ID":"902dd421-10bd-4cf5-90c2-1d3c598bcecf","Type":"ContainerStarted","Data":"cd9ed53c3307c9cd088c335a1045e8cc8a9184d870bfeb86e290a9bb8f200342"}
Apr 22 19:27:10.574145 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:10.574144 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hjdk5/must-gather-pk87w" event={"ID":"902dd421-10bd-4cf5-90c2-1d3c598bcecf","Type":"ContainerStarted","Data":"c09627ed54ff40f7f99d61253e3beca43daca237cceaa3d6d4881a5198206142"}
Apr 22 19:27:10.591932 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:10.591851 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hjdk5/must-gather-pk87w" podStartSLOduration=2.20076301 podStartE2EDuration="6.591835797s" podCreationTimestamp="2026-04-22 19:27:04 +0000 UTC" firstStartedPulling="2026-04-22 19:27:05.333801826 +0000 UTC m=+5618.133695024" lastFinishedPulling="2026-04-22 19:27:09.724874597 +0000 UTC m=+5622.524767811" observedRunningTime="2026-04-22 19:27:10.590128911 +0000 UTC m=+5623.390022131" watchObservedRunningTime="2026-04-22 19:27:10.591835797 +0000 UTC m=+5623.391729016"
Apr 22 19:27:29.638745 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:29.638696 2566 generic.go:358] "Generic (PLEG): container finished" podID="902dd421-10bd-4cf5-90c2-1d3c598bcecf" containerID="c09627ed54ff40f7f99d61253e3beca43daca237cceaa3d6d4881a5198206142" exitCode=0
Apr 22 19:27:29.639134 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:29.638776 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hjdk5/must-gather-pk87w" event={"ID":"902dd421-10bd-4cf5-90c2-1d3c598bcecf","Type":"ContainerDied","Data":"c09627ed54ff40f7f99d61253e3beca43daca237cceaa3d6d4881a5198206142"}
Apr 22 19:27:29.639134 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:29.639128 2566 scope.go:117] "RemoveContainer" containerID="c09627ed54ff40f7f99d61253e3beca43daca237cceaa3d6d4881a5198206142"
Apr 22 19:27:30.444004 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:30.443972 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hjdk5_must-gather-pk87w_902dd421-10bd-4cf5-90c2-1d3c598bcecf/gather/0.log"
Apr 22 19:27:33.583890 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:33.583844 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-6k6lt_a9c27a92-d6b0-4347-a2f9-ee999ae6f4e6/global-pull-secret-syncer/0.log"
Apr 22 19:27:33.731873 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:33.731827 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-lfws2_da9cec39-924f-4446-be6f-25108e9c58ba/konnectivity-agent/0.log"
Apr 22 19:27:33.836127 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:33.836064 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-106.ec2.internal_de01c4bd78187d5743793fda4e118da1/haproxy/0.log"
Apr 22 19:27:35.855373 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:35.855335 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hjdk5/must-gather-pk87w"]
Apr 22 19:27:35.855839 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:35.855602 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-hjdk5/must-gather-pk87w" podUID="902dd421-10bd-4cf5-90c2-1d3c598bcecf" containerName="copy" containerID="cri-o://cd9ed53c3307c9cd088c335a1045e8cc8a9184d870bfeb86e290a9bb8f200342" gracePeriod=2
Apr 22 19:27:35.859910 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:35.858982 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hjdk5/must-gather-pk87w"]
Apr 22 19:27:35.859910 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:35.859415 2566 status_manager.go:895] "Failed to get status for pod" podUID="902dd421-10bd-4cf5-90c2-1d3c598bcecf" pod="openshift-must-gather-hjdk5/must-gather-pk87w" err="pods \"must-gather-pk87w\" is forbidden: User \"system:node:ip-10-0-132-106.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-hjdk5\": no relationship found between node 'ip-10-0-132-106.ec2.internal' and this object"
Apr 22 19:27:36.084534 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:36.084511 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hjdk5_must-gather-pk87w_902dd421-10bd-4cf5-90c2-1d3c598bcecf/copy/0.log"
Apr 22 19:27:36.084892 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:36.084854 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hjdk5/must-gather-pk87w"
Apr 22 19:27:36.087094 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:36.087069 2566 status_manager.go:895] "Failed to get status for pod" podUID="902dd421-10bd-4cf5-90c2-1d3c598bcecf" pod="openshift-must-gather-hjdk5/must-gather-pk87w" err="pods \"must-gather-pk87w\" is forbidden: User \"system:node:ip-10-0-132-106.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-hjdk5\": no relationship found between node 'ip-10-0-132-106.ec2.internal' and this object"
Apr 22 19:27:36.127536 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:36.127479 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/902dd421-10bd-4cf5-90c2-1d3c598bcecf-must-gather-output\") pod \"902dd421-10bd-4cf5-90c2-1d3c598bcecf\" (UID: \"902dd421-10bd-4cf5-90c2-1d3c598bcecf\") "
Apr 22 19:27:36.127681 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:36.127595 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4m6h\" (UniqueName: \"kubernetes.io/projected/902dd421-10bd-4cf5-90c2-1d3c598bcecf-kube-api-access-v4m6h\") pod \"902dd421-10bd-4cf5-90c2-1d3c598bcecf\" (UID: \"902dd421-10bd-4cf5-90c2-1d3c598bcecf\") "
Apr 22 19:27:36.129421 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:36.129399 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/902dd421-10bd-4cf5-90c2-1d3c598bcecf-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "902dd421-10bd-4cf5-90c2-1d3c598bcecf" (UID: "902dd421-10bd-4cf5-90c2-1d3c598bcecf"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:27:36.129519 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:36.129482 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/902dd421-10bd-4cf5-90c2-1d3c598bcecf-kube-api-access-v4m6h" (OuterVolumeSpecName: "kube-api-access-v4m6h") pod "902dd421-10bd-4cf5-90c2-1d3c598bcecf" (UID: "902dd421-10bd-4cf5-90c2-1d3c598bcecf"). InnerVolumeSpecName "kube-api-access-v4m6h". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:27:36.228854 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:36.228828 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v4m6h\" (UniqueName: \"kubernetes.io/projected/902dd421-10bd-4cf5-90c2-1d3c598bcecf-kube-api-access-v4m6h\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\""
Apr 22 19:27:36.228854 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:36.228852 2566 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/902dd421-10bd-4cf5-90c2-1d3c598bcecf-must-gather-output\") on node \"ip-10-0-132-106.ec2.internal\" DevicePath \"\""
Apr 22 19:27:36.663419 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:36.663390 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hjdk5_must-gather-pk87w_902dd421-10bd-4cf5-90c2-1d3c598bcecf/copy/0.log"
Apr 22 19:27:36.663747 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:36.663721 2566 generic.go:358] "Generic (PLEG): container finished" podID="902dd421-10bd-4cf5-90c2-1d3c598bcecf" containerID="cd9ed53c3307c9cd088c335a1045e8cc8a9184d870bfeb86e290a9bb8f200342" exitCode=143
Apr 22 19:27:36.663822 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:36.663766 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hjdk5/must-gather-pk87w"
Apr 22 19:27:36.663822 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:36.663799 2566 scope.go:117] "RemoveContainer" containerID="cd9ed53c3307c9cd088c335a1045e8cc8a9184d870bfeb86e290a9bb8f200342"
Apr 22 19:27:36.666084 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:36.666057 2566 status_manager.go:895] "Failed to get status for pod" podUID="902dd421-10bd-4cf5-90c2-1d3c598bcecf" pod="openshift-must-gather-hjdk5/must-gather-pk87w" err="pods \"must-gather-pk87w\" is forbidden: User \"system:node:ip-10-0-132-106.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-hjdk5\": no relationship found between node 'ip-10-0-132-106.ec2.internal' and this object"
Apr 22 19:27:36.671717 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:36.671694 2566 scope.go:117] "RemoveContainer" containerID="c09627ed54ff40f7f99d61253e3beca43daca237cceaa3d6d4881a5198206142"
Apr 22 19:27:36.673715 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:36.673694 2566 status_manager.go:895] "Failed to get status for pod" podUID="902dd421-10bd-4cf5-90c2-1d3c598bcecf" pod="openshift-must-gather-hjdk5/must-gather-pk87w" err="pods \"must-gather-pk87w\" is forbidden: User \"system:node:ip-10-0-132-106.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-hjdk5\": no relationship found between node 'ip-10-0-132-106.ec2.internal' and this object"
Apr 22 19:27:36.683377 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:36.683359 2566 scope.go:117] "RemoveContainer" containerID="cd9ed53c3307c9cd088c335a1045e8cc8a9184d870bfeb86e290a9bb8f200342"
Apr 22 19:27:36.683601 ip-10-0-132-106 kubenswrapper[2566]: E0422 19:27:36.683583 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd9ed53c3307c9cd088c335a1045e8cc8a9184d870bfeb86e290a9bb8f200342\": container with ID starting with cd9ed53c3307c9cd088c335a1045e8cc8a9184d870bfeb86e290a9bb8f200342 not found: ID does not exist" containerID="cd9ed53c3307c9cd088c335a1045e8cc8a9184d870bfeb86e290a9bb8f200342"
Apr 22 19:27:36.683641 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:36.683608 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd9ed53c3307c9cd088c335a1045e8cc8a9184d870bfeb86e290a9bb8f200342"} err="failed to get container status \"cd9ed53c3307c9cd088c335a1045e8cc8a9184d870bfeb86e290a9bb8f200342\": rpc error: code = NotFound desc = could not find container \"cd9ed53c3307c9cd088c335a1045e8cc8a9184d870bfeb86e290a9bb8f200342\": container with ID starting with cd9ed53c3307c9cd088c335a1045e8cc8a9184d870bfeb86e290a9bb8f200342 not found: ID does not exist"
Apr 22 19:27:36.683641 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:36.683628 2566 scope.go:117] "RemoveContainer" containerID="c09627ed54ff40f7f99d61253e3beca43daca237cceaa3d6d4881a5198206142"
Apr 22 19:27:36.683817 ip-10-0-132-106 kubenswrapper[2566]: E0422 19:27:36.683802 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c09627ed54ff40f7f99d61253e3beca43daca237cceaa3d6d4881a5198206142\": container with ID starting with c09627ed54ff40f7f99d61253e3beca43daca237cceaa3d6d4881a5198206142 not found: ID does not exist" containerID="c09627ed54ff40f7f99d61253e3beca43daca237cceaa3d6d4881a5198206142"
Apr 22 19:27:36.683853 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:36.683822 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c09627ed54ff40f7f99d61253e3beca43daca237cceaa3d6d4881a5198206142"} err="failed to get container status \"c09627ed54ff40f7f99d61253e3beca43daca237cceaa3d6d4881a5198206142\": rpc error: code = NotFound desc = could not find container \"c09627ed54ff40f7f99d61253e3beca43daca237cceaa3d6d4881a5198206142\": container with ID starting with c09627ed54ff40f7f99d61253e3beca43daca237cceaa3d6d4881a5198206142 not found: ID does not exist"
Apr 22 19:27:37.055682 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.055595 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_95086d3d-8975-4a72-a006-6c8106d580ae/alertmanager/0.log"
Apr 22 19:27:37.079820 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.079790 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_95086d3d-8975-4a72-a006-6c8106d580ae/config-reloader/0.log"
Apr 22 19:27:37.101965 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.101945 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_95086d3d-8975-4a72-a006-6c8106d580ae/kube-rbac-proxy-web/0.log"
Apr 22 19:27:37.126055 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.126037 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_95086d3d-8975-4a72-a006-6c8106d580ae/kube-rbac-proxy/0.log"
Apr 22 19:27:37.148302 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.148281 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_95086d3d-8975-4a72-a006-6c8106d580ae/kube-rbac-proxy-metric/0.log"
Apr 22 19:27:37.169869 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.169835 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_95086d3d-8975-4a72-a006-6c8106d580ae/prom-label-proxy/0.log"
Apr 22 19:27:37.193252 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.193235 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_95086d3d-8975-4a72-a006-6c8106d580ae/init-config-reloader/0.log"
Apr 22 19:27:37.256197 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.256172 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-wplrv_4f93890b-e678-4591-8604-de9e0ff75905/cluster-monitoring-operator/0.log"
Apr 22 19:27:37.389756 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.389695 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-vcjl7_b4f78a22-a4a4-4801-a800-a8983d8d8a1d/monitoring-plugin/0.log"
Apr 22 19:27:37.496343 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.496321 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-w4jff_2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24/node-exporter/0.log"
Apr 22 19:27:37.517657 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.517635 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-w4jff_2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24/kube-rbac-proxy/0.log"
Apr 22 19:27:37.538391 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.538372 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-w4jff_2b1d8567-1ee7-40d9-8c1d-dc6ab3b97d24/init-textfile/0.log"
Apr 22 19:27:37.640841 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.640777 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-b9pjt_0ecc790c-43bc-443b-b15e-cb2adabc8a2f/kube-rbac-proxy-main/0.log"
Apr 22 19:27:37.664136 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.664117 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-b9pjt_0ecc790c-43bc-443b-b15e-cb2adabc8a2f/kube-rbac-proxy-self/0.log"
Apr 22 19:27:37.686648 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.686615 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-b9pjt_0ecc790c-43bc-443b-b15e-cb2adabc8a2f/openshift-state-metrics/0.log"
Apr 22 19:27:37.746969 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.746933 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_89a4c77d-d08d-4acc-8d82-81ac5653491c/prometheus/0.log"
Apr 22 19:27:37.767459 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.767432 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_89a4c77d-d08d-4acc-8d82-81ac5653491c/config-reloader/0.log"
Apr 22 19:27:37.778697 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.778672 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="902dd421-10bd-4cf5-90c2-1d3c598bcecf" path="/var/lib/kubelet/pods/902dd421-10bd-4cf5-90c2-1d3c598bcecf/volumes"
Apr 22 19:27:37.779022 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.778996 2566 status_manager.go:895] "Failed to get status for pod" podUID="902dd421-10bd-4cf5-90c2-1d3c598bcecf" pod="openshift-must-gather-hjdk5/must-gather-pk87w" err="pods \"must-gather-pk87w\" is forbidden: User \"system:node:ip-10-0-132-106.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-hjdk5\": no relationship found between node 'ip-10-0-132-106.ec2.internal' and this object"
Apr 22 19:27:37.788476 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.788454 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_89a4c77d-d08d-4acc-8d82-81ac5653491c/thanos-sidecar/0.log"
Apr 22 19:27:37.807564 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.807546 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_89a4c77d-d08d-4acc-8d82-81ac5653491c/kube-rbac-proxy-web/0.log"
Apr 22 19:27:37.827305 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.827287 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_89a4c77d-d08d-4acc-8d82-81ac5653491c/kube-rbac-proxy/0.log"
Apr 22 19:27:37.849814 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.849796 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_89a4c77d-d08d-4acc-8d82-81ac5653491c/kube-rbac-proxy-thanos/0.log"
Apr 22 19:27:37.872631 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.872602 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_89a4c77d-d08d-4acc-8d82-81ac5653491c/init-config-reloader/0.log"
Apr 22 19:27:37.956539 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.956516 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-5nghc_f5ebb09d-fd51-4c07-b267-b6a533e95f1f/prometheus-operator-admission-webhook/0.log"
Apr 22 19:27:37.987982 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:37.987955 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6df96bf54d-9v5k4_d88b8107-8f12-4fa1-aa21-eeaa65216c76/telemeter-client/0.log"
Apr 22 19:27:38.008963 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:38.008933 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6df96bf54d-9v5k4_d88b8107-8f12-4fa1-aa21-eeaa65216c76/reload/0.log"
Apr 22 19:27:38.029322 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:38.029297 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6df96bf54d-9v5k4_d88b8107-8f12-4fa1-aa21-eeaa65216c76/kube-rbac-proxy/0.log"
Apr 22 19:27:38.055819 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:38.055794 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5bbd5cd8d-mc4cv_e5bdfb2b-3526-43a6-a2d3-5b997408d869/thanos-query/0.log"
Apr 22 19:27:38.075657 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:38.075636 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5bbd5cd8d-mc4cv_e5bdfb2b-3526-43a6-a2d3-5b997408d869/kube-rbac-proxy-web/0.log"
Apr 22 19:27:38.094389 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:38.094372 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5bbd5cd8d-mc4cv_e5bdfb2b-3526-43a6-a2d3-5b997408d869/kube-rbac-proxy/0.log"
Apr 22 19:27:38.116123 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:38.116100 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5bbd5cd8d-mc4cv_e5bdfb2b-3526-43a6-a2d3-5b997408d869/prom-label-proxy/0.log"
Apr 22 19:27:38.134753 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:38.134716 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5bbd5cd8d-mc4cv_e5bdfb2b-3526-43a6-a2d3-5b997408d869/kube-rbac-proxy-rules/0.log"
Apr 22 19:27:38.154481 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:38.154461 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5bbd5cd8d-mc4cv_e5bdfb2b-3526-43a6-a2d3-5b997408d869/kube-rbac-proxy-metrics/0.log"
Apr 22 19:27:39.708126 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:39.708097 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/2.log"
Apr 22 19:27:39.715784 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:39.715757 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-mrr8j_84a18cd9-ceac-4cc8-972b-92e3c17a262b/console-operator/3.log"
Apr 22 19:27:40.587261 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.587173 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xtxx5/perf-node-gather-daemonset-q7nqf"]
Apr 22 19:27:40.587581 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.587564 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="902dd421-10bd-4cf5-90c2-1d3c598bcecf" containerName="gather"
Apr 22 19:27:40.587642 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.587582 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="902dd421-10bd-4cf5-90c2-1d3c598bcecf" containerName="gather"
Apr 22 19:27:40.587642 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.587599 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="902dd421-10bd-4cf5-90c2-1d3c598bcecf" containerName="copy"
Apr 22 19:27:40.587642 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.587605 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="902dd421-10bd-4cf5-90c2-1d3c598bcecf" containerName="copy"
Apr 22 19:27:40.587782 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.587679 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="902dd421-10bd-4cf5-90c2-1d3c598bcecf" containerName="copy"
Apr 22 19:27:40.587782 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.587692 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="902dd421-10bd-4cf5-90c2-1d3c598bcecf" containerName="gather"
Apr 22 19:27:40.590683 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.590661 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-q7nqf"
Apr 22 19:27:40.593260 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.593231 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xtxx5\"/\"openshift-service-ca.crt\""
Apr 22 19:27:40.593260 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.593255 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xtxx5\"/\"default-dockercfg-jqt9f\""
Apr 22 19:27:40.593400 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.593254 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xtxx5\"/\"kube-root-ca.crt\""
Apr 22 19:27:40.599099 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.599080 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xtxx5/perf-node-gather-daemonset-q7nqf"]
Apr 22 19:27:40.667434 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.667415 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/54db7888-0db6-461c-9c73-ac819bb70974-proc\") pod \"perf-node-gather-daemonset-q7nqf\" (UID: \"54db7888-0db6-461c-9c73-ac819bb70974\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-q7nqf"
Apr 22 19:27:40.667548 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.667455 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/54db7888-0db6-461c-9c73-ac819bb70974-sys\") pod \"perf-node-gather-daemonset-q7nqf\" (UID: \"54db7888-0db6-461c-9c73-ac819bb70974\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-q7nqf"
Apr 22 19:27:40.667548 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.667479 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/54db7888-0db6-461c-9c73-ac819bb70974-lib-modules\") pod \"perf-node-gather-daemonset-q7nqf\" (UID: \"54db7888-0db6-461c-9c73-ac819bb70974\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-q7nqf"
Apr 22 19:27:40.667658 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.667558 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2bj9\" (UniqueName: \"kubernetes.io/projected/54db7888-0db6-461c-9c73-ac819bb70974-kube-api-access-v2bj9\") pod \"perf-node-gather-daemonset-q7nqf\" (UID: \"54db7888-0db6-461c-9c73-ac819bb70974\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-q7nqf"
Apr 22 19:27:40.667658 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.667595 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/54db7888-0db6-461c-9c73-ac819bb70974-podres\") pod \"perf-node-gather-daemonset-q7nqf\" (UID: \"54db7888-0db6-461c-9c73-ac819bb70974\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-q7nqf"
Apr 22 19:27:40.768186 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.768160 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/54db7888-0db6-461c-9c73-ac819bb70974-lib-modules\") pod \"perf-node-gather-daemonset-q7nqf\" (UID: \"54db7888-0db6-461c-9c73-ac819bb70974\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-q7nqf"
Apr 22 19:27:40.768542 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.768207 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2bj9\" (UniqueName: \"kubernetes.io/projected/54db7888-0db6-461c-9c73-ac819bb70974-kube-api-access-v2bj9\") pod \"perf-node-gather-daemonset-q7nqf\" (UID: \"54db7888-0db6-461c-9c73-ac819bb70974\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-q7nqf"
Apr 22 19:27:40.768542 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.768237 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/54db7888-0db6-461c-9c73-ac819bb70974-podres\") pod \"perf-node-gather-daemonset-q7nqf\" (UID: \"54db7888-0db6-461c-9c73-ac819bb70974\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-q7nqf"
Apr 22 19:27:40.768542 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.768311 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/54db7888-0db6-461c-9c73-ac819bb70974-proc\") pod \"perf-node-gather-daemonset-q7nqf\" (UID: \"54db7888-0db6-461c-9c73-ac819bb70974\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-q7nqf"
Apr 22 19:27:40.768542 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.768336 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/54db7888-0db6-461c-9c73-ac819bb70974-lib-modules\") pod \"perf-node-gather-daemonset-q7nqf\" (UID: \"54db7888-0db6-461c-9c73-ac819bb70974\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-q7nqf"
Apr 22 19:27:40.768542 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.768337 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/54db7888-0db6-461c-9c73-ac819bb70974-podres\") pod \"perf-node-gather-daemonset-q7nqf\" (UID: \"54db7888-0db6-461c-9c73-ac819bb70974\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-q7nqf"
Apr 22 19:27:40.768542 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.768370 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/54db7888-0db6-461c-9c73-ac819bb70974-sys\") pod \"perf-node-gather-daemonset-q7nqf\" (UID: \"54db7888-0db6-461c-9c73-ac819bb70974\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-q7nqf"
Apr 22 19:27:40.768542 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.768396 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/54db7888-0db6-461c-9c73-ac819bb70974-proc\") pod \"perf-node-gather-daemonset-q7nqf\" (UID: \"54db7888-0db6-461c-9c73-ac819bb70974\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-q7nqf"
Apr 22 19:27:40.768542 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.768465 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/54db7888-0db6-461c-9c73-ac819bb70974-sys\") pod \"perf-node-gather-daemonset-q7nqf\" (UID: \"54db7888-0db6-461c-9c73-ac819bb70974\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-q7nqf"
Apr 22 19:27:40.776670 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.776642 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2bj9\" (UniqueName: \"kubernetes.io/projected/54db7888-0db6-461c-9c73-ac819bb70974-kube-api-access-v2bj9\") pod \"perf-node-gather-daemonset-q7nqf\" (UID: \"54db7888-0db6-461c-9c73-ac819bb70974\") " pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-q7nqf"
Apr 22 19:27:40.901035 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:40.900978 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-q7nqf"
Apr 22 19:27:41.016223 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:41.016193 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xtxx5/perf-node-gather-daemonset-q7nqf"]
Apr 22 19:27:41.018253 ip-10-0-132-106 kubenswrapper[2566]: W0422 19:27:41.018228 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod54db7888_0db6_461c_9c73_ac819bb70974.slice/crio-c77a59fb9338bfec5d952b1108147c4371f34e525deee8a93a5493adc749da18 WatchSource:0}: Error finding container c77a59fb9338bfec5d952b1108147c4371f34e525deee8a93a5493adc749da18: Status 404 returned error can't find the container with id c77a59fb9338bfec5d952b1108147c4371f34e525deee8a93a5493adc749da18
Apr 22 19:27:41.107691 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:41.107668 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hmcpt_705dd2ce-2ac7-4745-a314-14e119a14624/dns/0.log"
Apr 22 19:27:41.126248 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:41.126230 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hmcpt_705dd2ce-2ac7-4745-a314-14e119a14624/kube-rbac-proxy/0.log"
Apr 22 19:27:41.206740 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:41.206716 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9kn68_bfef47bf-9dff-44e0-8b1a-1397bf347548/dns-node-resolver/0.log"
Apr 22 19:27:41.632155 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:41.632096 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2vnst_90a8af9e-a3b4-4682-86d6-985e15148048/node-ca/0.log"
Apr 22 19:27:41.679778 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:41.679750 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-q7nqf"
event={"ID":"54db7888-0db6-461c-9c73-ac819bb70974","Type":"ContainerStarted","Data":"204a0a42d38f03f07f693ed223b99ef0c37294673b9e1ee5b55405cc5d19b767"} Apr 22 19:27:41.679778 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:41.679778 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-q7nqf" event={"ID":"54db7888-0db6-461c-9c73-ac819bb70974","Type":"ContainerStarted","Data":"c77a59fb9338bfec5d952b1108147c4371f34e525deee8a93a5493adc749da18"} Apr 22 19:27:41.679971 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:41.679802 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-q7nqf" Apr 22 19:27:41.695414 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:41.695373 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-q7nqf" podStartSLOduration=1.695360135 podStartE2EDuration="1.695360135s" podCreationTimestamp="2026-04-22 19:27:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:27:41.69409898 +0000 UTC m=+5654.493992200" watchObservedRunningTime="2026-04-22 19:27:41.695360135 +0000 UTC m=+5654.495253391" Apr 22 19:27:42.308844 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:42.308815 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-56975f448b-wh8bp_c355c33b-68d6-4bb8-aeda-28cdf82e8d61/router/0.log" Apr 22 19:27:42.662176 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:42.662148 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rbrk7_aa418e11-9a0e-463a-8262-e078bca4e7a8/serve-healthcheck-canary/0.log" Apr 22 19:27:43.003479 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:43.003412 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-2fnvq_9af35975-ba53-433d-8c3b-454c55c4ffd7/insights-operator/0.log" Apr 22 19:27:43.005543 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:43.005520 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-2fnvq_9af35975-ba53-433d-8c3b-454c55c4ffd7/insights-operator/1.log" Apr 22 19:27:43.086264 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:43.086234 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-m9ns5_ac83d655-eec6-4d15-b10d-55f5c7a109d0/kube-rbac-proxy/0.log" Apr 22 19:27:43.104850 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:43.104831 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-m9ns5_ac83d655-eec6-4d15-b10d-55f5c7a109d0/exporter/0.log" Apr 22 19:27:43.124675 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:43.124659 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-m9ns5_ac83d655-eec6-4d15-b10d-55f5c7a109d0/extractor/0.log" Apr 22 19:27:45.150800 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:45.150768 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-vml2d_46cd4506-abef-4b38-93ac-167376db6638/s3-init/0.log" Apr 22 19:27:47.693287 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:47.693256 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xtxx5/perf-node-gather-daemonset-q7nqf" Apr 22 19:27:48.813288 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:48.813255 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-psp4c_2f310f35-5bdc-4e57-86df-62e51a3c8cdf/migrator/0.log" Apr 22 19:27:48.831383 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:48.831353 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-psp4c_2f310f35-5bdc-4e57-86df-62e51a3c8cdf/graceful-termination/0.log" Apr 22 19:27:50.579821 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:50.579760 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zmjph_c437ff89-37eb-4bee-a67b-f2918685eee5/kube-multus-additional-cni-plugins/0.log" Apr 22 19:27:50.598762 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:50.598740 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zmjph_c437ff89-37eb-4bee-a67b-f2918685eee5/egress-router-binary-copy/0.log" Apr 22 19:27:50.616594 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:50.616574 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zmjph_c437ff89-37eb-4bee-a67b-f2918685eee5/cni-plugins/0.log" Apr 22 19:27:50.636727 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:50.636703 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zmjph_c437ff89-37eb-4bee-a67b-f2918685eee5/bond-cni-plugin/0.log" Apr 22 19:27:50.655013 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:50.654995 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zmjph_c437ff89-37eb-4bee-a67b-f2918685eee5/routeoverride-cni/0.log" Apr 22 19:27:50.673147 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:50.673122 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zmjph_c437ff89-37eb-4bee-a67b-f2918685eee5/whereabouts-cni-bincopy/0.log" Apr 22 19:27:50.692971 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:50.692949 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zmjph_c437ff89-37eb-4bee-a67b-f2918685eee5/whereabouts-cni/0.log" 
Apr 22 19:27:50.753509 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:50.753489 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cqc2t_9e89ff1c-b604-4ae4-a756-badea52f84ef/kube-multus/0.log" Apr 22 19:27:50.772205 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:50.772169 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dv96w_92650e2d-54ea-4904-8ee5-235164ed2949/network-metrics-daemon/0.log" Apr 22 19:27:50.794770 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:50.794752 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dv96w_92650e2d-54ea-4904-8ee5-235164ed2949/kube-rbac-proxy/0.log" Apr 22 19:27:51.647406 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:51.647380 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-controller/0.log" Apr 22 19:27:51.663487 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:51.663464 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/0.log" Apr 22 19:27:51.709773 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:51.709746 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovn-acl-logging/1.log" Apr 22 19:27:51.735845 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:51.735797 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/kube-rbac-proxy-node/0.log" Apr 22 19:27:51.758412 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:51.758391 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 19:27:51.777740 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:51.777724 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/northd/0.log" Apr 22 19:27:51.798483 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:51.798467 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/nbdb/0.log" Apr 22 19:27:51.818525 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:51.818481 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/sbdb/0.log" Apr 22 19:27:51.971512 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:51.971487 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9z9pj_1685d860-db2e-419d-b016-516c1932fa2f/ovnkube-controller/0.log" Apr 22 19:27:53.479198 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:53.479172 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-ngcz8_6fb74441-7ec5-4482-ad08-21d23adeeb37/network-check-target-container/0.log" Apr 22 19:27:54.371332 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:54.371303 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-rhw5j_0d845f70-52b2-4607-be37-ce8250614a3f/iptables-alerter/0.log" Apr 22 19:27:54.981759 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:54.981728 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-55lxs_abf52151-090a-4499-9e78-eebbda08114e/tuned/0.log" Apr 22 19:27:56.747071 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:56.747034 2566 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-ctlkb_92750ce1-a0fa-4514-b9b0-09845663619d/cluster-samples-operator/0.log" Apr 22 19:27:56.762428 ip-10-0-132-106 kubenswrapper[2566]: I0422 19:27:56.762394 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-ctlkb_92750ce1-a0fa-4514-b9b0-09845663619d/cluster-samples-operator-watch/0.log"