Apr 17 16:17:03.986774 ip-10-0-132-179 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 17 16:17:03.986785 ip-10-0-132-179 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 17 16:17:03.986795 ip-10-0-132-179 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 17 16:17:03.987075 ip-10-0-132-179 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 17 16:17:14.211069 ip-10-0-132-179 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 17 16:17:14.211085 ip-10-0-132-179 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot e7ff1f4fd2854364b429b4233e93f2ec --
Apr 17 16:19:42.555367 ip-10-0-132-179 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 16:19:42.936420 ip-10-0-132-179 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:19:42.936420 ip-10-0-132-179 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 16:19:42.936420 ip-10-0-132-179 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:19:42.936420 ip-10-0-132-179 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 16:19:42.936420 ip-10-0-132-179 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 16:19:42.937762 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.937622    2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 16:19:42.940652 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940635    2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:19:42.940652 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940651    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:19:42.940717 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940655    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:19:42.940717 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940658    2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:19:42.940717 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940661    2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:19:42.940717 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940664    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:19:42.940717 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940667    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:19:42.940717 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940670    2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:19:42.940717 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940672    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:19:42.940717 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940675    2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:19:42.940717 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940678    2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:19:42.940717 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940680    2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:19:42.940717 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940683    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:19:42.940717 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940685    2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:19:42.940717 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940688    2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:19:42.940717 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940695    2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:19:42.940717 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940698    2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:19:42.940717 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940702    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:19:42.940717 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940704    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:19:42.940717 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940707    2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:19:42.940717 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940710    2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:19:42.940717 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940714    2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:19:42.941230 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940718    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:19:42.941230 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940721    2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:19:42.941230 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940725    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:19:42.941230 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940728    2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:19:42.941230 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940731    2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:19:42.941230 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940734    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:19:42.941230 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940737    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:19:42.941230 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940740    2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:19:42.941230 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940743    2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:19:42.941230 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940746    2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:19:42.941230 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940749    2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:19:42.941230 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940751    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:19:42.941230 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940760    2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:19:42.941230 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940763    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:19:42.941230 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940766    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:19:42.941230 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940769    2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:19:42.941230 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940771    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:19:42.941230 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940774    2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:19:42.941230 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940776    2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:19:42.941230 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940779    2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:19:42.941826 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940782    2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:19:42.941826 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940784    2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:19:42.941826 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940787    2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:19:42.941826 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940790    2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:19:42.941826 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940792    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:19:42.941826 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940795    2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:19:42.941826 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940797    2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:19:42.941826 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940800    2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:19:42.941826 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940802    2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:19:42.941826 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940805    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:19:42.941826 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940807    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:19:42.941826 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940810    2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:19:42.941826 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940812    2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:19:42.941826 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940815    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:19:42.941826 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940818    2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:19:42.941826 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940821    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:19:42.941826 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940824    2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:19:42.941826 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940826    2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:19:42.941826 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940829    2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:19:42.941826 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940831    2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:19:42.942334 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940834    2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:19:42.942334 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940837    2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:19:42.942334 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940839    2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:19:42.942334 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940842    2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:19:42.942334 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940859    2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:19:42.942334 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940862    2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:19:42.942334 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940865    2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:19:42.942334 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940867    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:19:42.942334 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940872    2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:19:42.942334 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940876    2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:19:42.942334 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940878    2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:19:42.942334 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940881    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:19:42.942334 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940883    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:19:42.942334 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940886    2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:19:42.942334 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940889    2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:19:42.942334 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940891    2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:19:42.942334 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940894    2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:19:42.942334 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940897    2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:19:42.942334 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940899    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:19:42.942334 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940901    2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:19:42.942849 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940904    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:19:42.942849 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940906    2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:19:42.942849 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940909    2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:19:42.942849 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.940912    2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:19:42.942849 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941339    2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:19:42.942849 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941345    2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:19:42.942849 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941348    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:19:42.942849 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941351    2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:19:42.942849 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941354    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:19:42.942849 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941357    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:19:42.942849 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941360    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:19:42.942849 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941362    2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:19:42.942849 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941365    2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:19:42.942849 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941368    2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:19:42.942849 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941371    2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:19:42.942849 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941373    2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:19:42.942849 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941376    2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:19:42.942849 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941379    2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:19:42.942849 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941382    2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:19:42.942849 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941386    2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:19:42.943359 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941390    2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:19:42.943359 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941392    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:19:42.943359 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941395    2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:19:42.943359 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941398    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:19:42.943359 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941400    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:19:42.943359 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941403    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:19:42.943359 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941405    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:19:42.943359 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941408    2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:19:42.943359 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941410    2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:19:42.943359 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941413    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:19:42.943359 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941415    2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:19:42.943359 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941418    2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:19:42.943359 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941422    2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:19:42.943359 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941426    2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:19:42.943359 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941429    2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:19:42.943359 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941432    2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:19:42.943359 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941435    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:19:42.943359 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941438    2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:19:42.943359 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941442    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:19:42.943828 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941445    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:19:42.943828 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941447    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:19:42.943828 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941450    2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:19:42.943828 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941452    2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:19:42.943828 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941455    2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:19:42.943828 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941457    2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:19:42.943828 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941460    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:19:42.943828 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941462    2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:19:42.943828 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941465    2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:19:42.943828 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941467    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:19:42.943828 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941470    2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:19:42.943828 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941473    2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:19:42.943828 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941476    2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:19:42.943828 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941478    2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:19:42.943828 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941481    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:19:42.943828 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941484    2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:19:42.943828 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941486    2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:19:42.943828 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941489    2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:19:42.943828 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941492    2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:19:42.943828 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941494    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:19:42.944336 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941497    2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:19:42.944336 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941499    2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:19:42.944336 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941501    2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:19:42.944336 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941504    2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:19:42.944336 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941506    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:19:42.944336 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941509    2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:19:42.944336 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941512    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:19:42.944336 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941514    2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:19:42.944336 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941517    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:19:42.944336 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941520    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:19:42.944336 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941523    2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:19:42.944336 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941525    2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:19:42.944336 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941528    2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:19:42.944336 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941530    2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:19:42.944336 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941533    2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:19:42.944336 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941535    2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:19:42.944336 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941539    2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:19:42.944336 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941541    2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:19:42.944336 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941544    2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:19:42.944336 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941546    2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:19:42.944818 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941549    2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:19:42.944818 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941551    2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:19:42.944818 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941554    2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:19:42.944818 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941557    2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:19:42.944818 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941559    2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:19:42.944818 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941562    2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:19:42.944818 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941564    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:19:42.944818 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941567    2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:19:42.944818 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941570    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:19:42.944818 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941572    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:19:42.944818 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.941574    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:19:42.944818 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941660    2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 16:19:42.944818 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941670    2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 16:19:42.944818 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941682    2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 16:19:42.944818 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941690    2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 16:19:42.944818 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941697    2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 16:19:42.944818 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941701    2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 16:19:42.944818 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941705    2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 16:19:42.944818 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941710    2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 16:19:42.944818 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941713    2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 16:19:42.944818 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941716    2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 16:19:42.945361 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941719    2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 16:19:42.945361 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941723    2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 16:19:42.945361 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941727    2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 16:19:42.945361 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941730    2572 flags.go:64] FLAG: --cgroup-root=""
Apr 17 16:19:42.945361 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941733    2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 16:19:42.945361 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941736    2572 flags.go:64] FLAG: --client-ca-file=""
Apr 17 16:19:42.945361 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941739    2572 flags.go:64] FLAG: --cloud-config=""
Apr 17 16:19:42.945361 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941741    2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 16:19:42.945361 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941744    2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 16:19:42.945361 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941751    2572 flags.go:64] FLAG: --cluster-domain=""
Apr 17 16:19:42.945361 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941754    2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 16:19:42.945361 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941757    2572 flags.go:64] FLAG: --config-dir=""
Apr 17 16:19:42.945361 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941760    2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 16:19:42.945361 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941764    2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 16:19:42.945361 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941768    2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 16:19:42.945361 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941774    2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 16:19:42.945361 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941777    2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 16:19:42.945361 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941781    2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 16:19:42.945361 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941787    2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 16:19:42.945361 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941790    2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 16:19:42.945361 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941793    2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 16:19:42.945361 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941796    2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 16:19:42.945361 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941799    2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 16:19:42.945361 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941803    2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 16:19:42.945361 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941807    2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 16:19:42.945990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941810    2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 16:19:42.945990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941813    2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 16:19:42.945990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941816    2572 flags.go:64] FLAG: --enable-server="true"
Apr 17 16:19:42.945990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941819    2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 16:19:42.945990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941823    2572 flags.go:64] FLAG: --event-burst="100"
Apr 17 16:19:42.945990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941827    2572 flags.go:64] FLAG: --event-qps="50"
Apr 17 16:19:42.945990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941830    2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 16:19:42.945990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941833    2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 16:19:42.945990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941836    2572 flags.go:64] FLAG: --eviction-hard=""
Apr 17 16:19:42.945990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941840    2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 16:19:42.945990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941843    2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 16:19:42.945990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941846    2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 16:19:42.945990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941849    2572 flags.go:64] FLAG: --eviction-soft=""
Apr 17 16:19:42.945990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941852    2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 16:19:42.945990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941854    2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 16:19:42.945990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941857    2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 16:19:42.945990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941860    2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 16:19:42.945990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941863    2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 16:19:42.945990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941867    2572 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 16:19:42.945990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941869    2572 flags.go:64] FLAG: --feature-gates=""
Apr 17 16:19:42.945990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941873    2572 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 16:19:42.945990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941876    2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 16:19:42.945990 ip-10-0-132-179
kubenswrapper[2572]: I0417 16:19:42.941879 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 16:19:42.945990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941883 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 16:19:42.945990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941886 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 17 16:19:42.945990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941890 2572 flags.go:64] FLAG: --help="false" Apr 17 16:19:42.946639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941893 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-132-179.ec2.internal" Apr 17 16:19:42.946639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941896 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 16:19:42.946639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941900 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 16:19:42.946639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941903 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 16:19:42.946639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941906 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 16:19:42.946639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941910 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 16:19:42.946639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941913 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 16:19:42.946639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941916 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 16:19:42.946639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941919 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 16:19:42.946639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941922 2572 flags.go:64] FLAG: 
--kube-api-burst="100" Apr 17 16:19:42.946639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941924 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 16:19:42.946639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941928 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 16:19:42.946639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941930 2572 flags.go:64] FLAG: --kube-reserved="" Apr 17 16:19:42.946639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941934 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 16:19:42.946639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941936 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 16:19:42.946639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941939 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 16:19:42.946639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941942 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 16:19:42.946639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941945 2572 flags.go:64] FLAG: --lock-file="" Apr 17 16:19:42.946639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941948 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 16:19:42.946639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941950 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 16:19:42.946639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941953 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 16:19:42.946639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941959 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 16:19:42.946639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941962 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 16:19:42.947219 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941965 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 16:19:42.947219 ip-10-0-132-179 kubenswrapper[2572]: 
I0417 16:19:42.941967 2572 flags.go:64] FLAG: --logging-format="text" Apr 17 16:19:42.947219 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941970 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 16:19:42.947219 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941974 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 16:19:42.947219 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941976 2572 flags.go:64] FLAG: --manifest-url="" Apr 17 16:19:42.947219 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941979 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 17 16:19:42.947219 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941984 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 16:19:42.947219 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941988 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 16:19:42.947219 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941992 2572 flags.go:64] FLAG: --max-pods="110" Apr 17 16:19:42.947219 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941995 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 16:19:42.947219 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.941999 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 16:19:42.947219 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942002 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 16:19:42.947219 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942004 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 16:19:42.947219 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942007 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 16:19:42.947219 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942010 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 16:19:42.947219 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942013 2572 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 16:19:42.947219 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942021 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 16:19:42.947219 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942024 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 16:19:42.947219 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942027 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 16:19:42.947219 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942029 2572 flags.go:64] FLAG: --pod-cidr="" Apr 17 16:19:42.947219 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942032 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 16:19:42.947219 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942038 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 16:19:42.947219 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942049 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 16:19:42.947219 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942053 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942055 2572 flags.go:64] FLAG: --port="10250" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942059 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942062 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0932ab7c84f486089" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942065 2572 flags.go:64] FLAG: --qos-reserved="" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942068 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942084 
2572 flags.go:64] FLAG: --register-node="true" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942087 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942091 2572 flags.go:64] FLAG: --register-with-taints="" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942094 2572 flags.go:64] FLAG: --registry-burst="10" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942097 2572 flags.go:64] FLAG: --registry-qps="5" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942100 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942103 2572 flags.go:64] FLAG: --reserved-memory="" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942107 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942110 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942113 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942116 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942121 2572 flags.go:64] FLAG: --runonce="false" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942124 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942128 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942131 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 
16:19:42.942134 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942137 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942140 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942143 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942146 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 16:19:42.947825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942149 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 16:19:42.948469 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942151 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 16:19:42.948469 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942154 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 16:19:42.948469 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942157 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 16:19:42.948469 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942160 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 16:19:42.948469 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942163 2572 flags.go:64] FLAG: --system-cgroups="" Apr 17 16:19:42.948469 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942166 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 16:19:42.948469 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942172 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 16:19:42.948469 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942175 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 17 16:19:42.948469 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942178 2572 flags.go:64] FLAG: 
--tls-cipher-suites="[]" Apr 17 16:19:42.948469 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942182 2572 flags.go:64] FLAG: --tls-min-version="" Apr 17 16:19:42.948469 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942185 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 16:19:42.948469 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942188 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 16:19:42.948469 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942191 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 16:19:42.948469 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942194 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 16:19:42.948469 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942197 2572 flags.go:64] FLAG: --v="2" Apr 17 16:19:42.948469 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942202 2572 flags.go:64] FLAG: --version="false" Apr 17 16:19:42.948469 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942205 2572 flags.go:64] FLAG: --vmodule="" Apr 17 16:19:42.948469 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942210 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 16:19:42.948469 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.942213 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 16:19:42.948469 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942321 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 16:19:42.948469 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942325 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 16:19:42.948469 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942333 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 16:19:42.948469 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942337 2572 feature_gate.go:328] 
unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 16:19:42.948469 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942340 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 16:19:42.949059 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942343 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 16:19:42.949059 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942346 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 16:19:42.949059 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942349 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 16:19:42.949059 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942351 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 16:19:42.949059 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942354 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 16:19:42.949059 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942357 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 16:19:42.949059 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942360 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 16:19:42.949059 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942362 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 16:19:42.949059 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942365 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 16:19:42.949059 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942368 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 16:19:42.949059 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942371 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 16:19:42.949059 
ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942373 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 16:19:42.949059 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942376 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 16:19:42.949059 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942379 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 16:19:42.949059 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942381 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 16:19:42.949059 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942384 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 16:19:42.949059 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942387 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 16:19:42.949059 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942389 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 16:19:42.949059 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942392 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 16:19:42.949059 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942394 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 16:19:42.949575 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942397 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 16:19:42.949575 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942401 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 16:19:42.949575 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942404 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 16:19:42.949575 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942407 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 16:19:42.949575 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942410 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 16:19:42.949575 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942413 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 16:19:42.949575 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942416 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 16:19:42.949575 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942418 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 16:19:42.949575 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942421 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 16:19:42.949575 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942426 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 16:19:42.949575 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942430 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 16:19:42.949575 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942432 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 16:19:42.949575 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942441 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 16:19:42.949575 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942444 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 16:19:42.949575 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942446 2572 
feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 16:19:42.949575 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942449 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 16:19:42.949575 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942451 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 16:19:42.949575 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942454 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 16:19:42.949575 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942456 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 16:19:42.949575 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942459 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 16:19:42.950087 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942461 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 16:19:42.950087 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942464 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 16:19:42.950087 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942466 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 16:19:42.950087 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942469 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 16:19:42.950087 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942472 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 16:19:42.950087 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942474 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 16:19:42.950087 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942477 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 16:19:42.950087 ip-10-0-132-179 kubenswrapper[2572]: W0417 
16:19:42.942479 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 16:19:42.950087 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942482 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 16:19:42.950087 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942485 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 16:19:42.950087 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942487 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 16:19:42.950087 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942490 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 16:19:42.950087 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942492 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 16:19:42.950087 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942495 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 16:19:42.950087 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942497 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 16:19:42.950087 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942500 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 16:19:42.950087 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942502 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 16:19:42.950087 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942505 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 16:19:42.950087 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942507 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 16:19:42.950087 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942515 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 
16:19:42.950576 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942518 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 16:19:42.950576 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942522 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 16:19:42.950576 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942526 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 16:19:42.950576 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942529 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 16:19:42.950576 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942532 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 16:19:42.950576 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942541 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 16:19:42.950576 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942544 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 16:19:42.950576 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942546 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 16:19:42.950576 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942549 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 16:19:42.950576 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942552 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 16:19:42.950576 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942554 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 16:19:42.950576 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942557 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 16:19:42.950576 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942561 2572 feature_gate.go:349] 
Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:19:42.950576 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942564 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:19:42.950576 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942567 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:19:42.950576 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942569 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:19:42.950576 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942572 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:19:42.950576 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942575 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:19:42.950576 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942578 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:19:42.950576 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942580 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:19:42.951148 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.942583 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:19:42.951148 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.943129 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 16:19:42.951148 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.950115 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 16:19:42.951148 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.950136 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 16:19:42.951148 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950186 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:19:42.951148 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950191 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:19:42.951148 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950202 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:19:42.951148 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950205 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:19:42.951148 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950208 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:19:42.951148 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950211 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:19:42.951148 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950214 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:19:42.951148 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950217 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:19:42.951148 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950219 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:19:42.951148 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950222 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:19:42.951148 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950225 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:19:42.951148 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950228 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:19:42.951600 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950230 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:19:42.951600 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950233 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:19:42.951600 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950235 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:19:42.951600 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950238 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:19:42.951600 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950241 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:19:42.951600 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950243 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:19:42.951600 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950246 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:19:42.951600 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950249 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:19:42.951600 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950251 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:19:42.951600 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950254 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:19:42.951600 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950256 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:19:42.951600 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950259 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:19:42.951600 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950262 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:19:42.951600 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950265 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:19:42.951600 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950268 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:19:42.951600 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950270 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:19:42.951600 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950273 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:19:42.951600 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950276 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:19:42.951600 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950279 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:19:42.952065 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950288 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:19:42.952065 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950291 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:19:42.952065 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950294 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:19:42.952065 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950296 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:19:42.952065 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950299 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:19:42.952065 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950301 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:19:42.952065 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950304 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:19:42.952065 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950307 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:19:42.952065 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950309 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:19:42.952065 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950312 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:19:42.952065 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950314 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:19:42.952065 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950317 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:19:42.952065 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950320 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:19:42.952065 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950322 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:19:42.952065 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950325 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:19:42.952065 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950327 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:19:42.952065 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950330 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:19:42.952065 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950333 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:19:42.952065 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950336 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:19:42.952065 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950338 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:19:42.952556 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950341 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:19:42.952556 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950343 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:19:42.952556 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950346 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:19:42.952556 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950348 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:19:42.952556 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950350 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:19:42.952556 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950353 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:19:42.952556 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950356 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:19:42.952556 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950358 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:19:42.952556 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950360 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:19:42.952556 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950363 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:19:42.952556 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950367 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:19:42.952556 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950369 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:19:42.952556 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950374 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:19:42.952556 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950390 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:19:42.952556 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950394 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:19:42.952556 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950397 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:19:42.952556 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950400 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:19:42.952556 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950403 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:19:42.952556 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950406 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:19:42.952556 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950408 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:19:42.953201 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950411 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:19:42.953201 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950413 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:19:42.953201 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950416 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:19:42.953201 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950418 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:19:42.953201 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950421 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:19:42.953201 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950424 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:19:42.953201 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950426 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:19:42.953201 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950429 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:19:42.953201 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950431 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:19:42.953201 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950434 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:19:42.953201 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950436 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:19:42.953201 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950439 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:19:42.953201 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950443 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:19:42.953201 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950447 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:19:42.953201 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950449 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:19:42.953763 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.950455 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 16:19:42.953763 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950577 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 16:19:42.953763 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950582 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 16:19:42.953763 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950586 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 16:19:42.953763 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950588 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 16:19:42.953763 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950591 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 16:19:42.953763 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950594 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 16:19:42.953763 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950597 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 16:19:42.953763 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950600 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 16:19:42.953763 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950603 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 16:19:42.953763 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950605 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 16:19:42.953763 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950613 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 16:19:42.953763 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950616 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 16:19:42.953763 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950619 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 16:19:42.953763 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950621 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 16:19:42.954200 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950624 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 16:19:42.954200 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950626 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 16:19:42.954200 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950629 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 16:19:42.954200 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950631 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 16:19:42.954200 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950634 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 16:19:42.954200 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950637 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 16:19:42.954200 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950639 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 16:19:42.954200 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950641 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 16:19:42.954200 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950644 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 16:19:42.954200 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950646 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 16:19:42.954200 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950649 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 16:19:42.954200 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950651 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 16:19:42.954200 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950654 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 16:19:42.954200 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950656 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 16:19:42.954200 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950659 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 16:19:42.954200 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950661 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 16:19:42.954200 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950664 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 16:19:42.954200 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950666 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 16:19:42.954200 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950669 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 16:19:42.954671 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950671 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 16:19:42.954671 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950674 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 16:19:42.954671 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950677 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 16:19:42.954671 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950680 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 16:19:42.954671 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950684 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 16:19:42.954671 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950687 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 16:19:42.954671 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950690 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 16:19:42.954671 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950693 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 16:19:42.954671 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950696 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 16:19:42.954671 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950698 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 16:19:42.954671 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950706 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 16:19:42.954671 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950709 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 16:19:42.954671 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950712 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 16:19:42.954671 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950714 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 16:19:42.954671 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950717 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 16:19:42.954671 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950719 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 16:19:42.954671 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950722 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 16:19:42.954671 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950724 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 16:19:42.954671 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950727 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 16:19:42.955248 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950729 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 16:19:42.955248 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950732 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 16:19:42.955248 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950734 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 16:19:42.955248 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950737 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 16:19:42.955248 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950739 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 16:19:42.955248 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950741 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 16:19:42.955248 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950744 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 16:19:42.955248 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950746 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 16:19:42.955248 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950749 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 16:19:42.955248 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950751 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 16:19:42.955248 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950754 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 16:19:42.955248 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950756 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 16:19:42.955248 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950759 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 16:19:42.955248 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950761 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 16:19:42.955248 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950764 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 16:19:42.955248 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950766 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 16:19:42.955248 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950769 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 16:19:42.955248 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950772 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 16:19:42.955248 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950775 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 16:19:42.955248 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950777 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 16:19:42.955731 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950780 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 16:19:42.955731 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950782 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 16:19:42.955731 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950785 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 16:19:42.955731 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950787 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 16:19:42.955731 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950795 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 16:19:42.955731 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950798 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 16:19:42.955731 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950800 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 16:19:42.955731 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950803 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 16:19:42.955731 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950805 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 16:19:42.955731 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950808 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 16:19:42.955731 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950810 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 16:19:42.955731 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950813 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 16:19:42.955731 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950815 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 16:19:42.955731 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:42.950819 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 16:19:42.955731 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.950824 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 16:19:42.955731 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.951442 2572 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 16:19:42.956147 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.955214 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 16:19:42.956200 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.956188 2572 server.go:1019] "Starting client certificate rotation"
Apr 17 16:19:42.956301 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.956285 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 16:19:42.956330 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.956321 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 16:19:42.978569 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.978546 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 16:19:42.981736 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.981715 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 16:19:42.995486 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:42.995459 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 17 16:19:43.001243 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.001228 2572 log.go:25] "Validated CRI v1 image API"
Apr 17 16:19:43.002602 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.002586 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 16:19:43.006456 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.006422 2572 fs.go:135] Filesystem UUIDs: map[0461e62a-c00f-4243-baca-15d3e00ac40d:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 f386f3d6-7a87-43d1-85b2-5bc483ca5477:/dev/nvme0n1p4]
Apr 17 16:19:43.006532 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.006454 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 16:19:43.008556 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.008535 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 16:19:43.012689 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.012572 2572 manager.go:217] Machine: {Timestamp:2026-04-17 16:19:43.011472955 +0000 UTC m=+0.351961025 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099072 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2e21e017e42b2b2a9b2cfac1047520 SystemUUID:ec2e21e0-17e4-2b2b-2a9b-2cfac1047520 BootID:e7ff1f4f-d285-4364-b429-b4233e93f2ec Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:3d:38:29:ad:85 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:3d:38:29:ad:85 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:f6:e8:74:28:9c:32 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 16:19:43.012689 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.012678 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 16:19:43.012820 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.012767 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 16:19:43.014151 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.014125 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 16:19:43.014295 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.014154 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-179.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 16:19:43.014344 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.014308 2572 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 16:19:43.014344 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.014317 2572 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 16:19:43.014344 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.014330 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 16:19:43.014425 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.014346 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 16:19:43.015867 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.015855 2572 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 16:19:43.015975 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.015965 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 16:19:43.018420 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.018407 2572 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 16:19:43.018467 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.018424 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 16:19:43.019201 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.019191 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 16:19:43.019253 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.019207 2572 kubelet.go:397] "Adding apiserver pod source"
Apr 17 16:19:43.019253 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.019220 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 16:19:43.020310 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.020296 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 16:19:43.020376 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.020316 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 16:19:43.022802 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.022785 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 16:19:43.024241 ip-10-0-132-179
kubenswrapper[2572]: I0417 16:19:43.024228 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 16:19:43.025894 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.025881 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 16:19:43.025958 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.025898 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 16:19:43.025958 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.025907 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 16:19:43.025958 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.025914 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 16:19:43.025958 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.025923 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 16:19:43.025958 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.025931 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 16:19:43.025958 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.025937 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 16:19:43.025958 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.025942 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 16:19:43.025958 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.025949 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 16:19:43.025958 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.025955 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 16:19:43.026222 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.025966 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 
16:19:43.026222 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.025975 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 16:19:43.026680 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.026667 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 16:19:43.026716 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.026683 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 16:19:43.028468 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.028444 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5lg6l" Apr 17 16:19:43.029401 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:43.029370 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 16:19:43.029401 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.029352 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-179.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 16:19:43.029504 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:43.029431 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-179.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 16:19:43.030798 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.030786 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 16:19:43.030837 
ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.030824 2572 server.go:1295] "Started kubelet" Apr 17 16:19:43.030936 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.030908 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 16:19:43.031669 ip-10-0-132-179 systemd[1]: Started Kubernetes Kubelet. Apr 17 16:19:43.031965 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.031445 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 16:19:43.032044 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.032031 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 16:19:43.032268 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.032217 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 17 16:19:43.033497 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.033469 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 16:19:43.036887 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.036858 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5lg6l" Apr 17 16:19:43.039509 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:43.038671 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-179.ec2.internal.18a7314c965b6a04 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-179.ec2.internal,UID:ip-10-0-132-179.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-179.ec2.internal,},FirstTimestamp:2026-04-17 16:19:43.030798852 +0000 UTC 
m=+0.371286921,LastTimestamp:2026-04-17 16:19:43.030798852 +0000 UTC m=+0.371286921,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-179.ec2.internal,}" Apr 17 16:19:43.039638 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.039544 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 16:19:43.039993 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.039977 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 16:19:43.041781 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.041634 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 16:19:43.041862 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.041785 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 16:19:43.041955 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:43.041926 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-179.ec2.internal\" not found" Apr 17 16:19:43.042664 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.042648 2572 factory.go:153] Registering CRI-O factory Apr 17 16:19:43.042769 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.042710 2572 factory.go:223] Registration of the crio container factory successfully Apr 17 16:19:43.042915 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.042901 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 16:19:43.043049 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.043038 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 17 16:19:43.043121 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.043049 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 17 16:19:43.043171 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.043122 2572 factory.go:221] Registration of the containerd container factory failed: 
unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 16:19:43.043171 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.043154 2572 factory.go:55] Registering systemd factory Apr 17 16:19:43.043171 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.043164 2572 factory.go:223] Registration of the systemd container factory successfully Apr 17 16:19:43.043302 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.043222 2572 factory.go:103] Registering Raw factory Apr 17 16:19:43.043302 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.043240 2572 manager.go:1196] Started watching for new ooms in manager Apr 17 16:19:43.044863 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.044342 2572 manager.go:319] Starting recovery of all containers Apr 17 16:19:43.045687 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:43.045659 2572 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 16:19:43.048762 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.048738 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:19:43.049606 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:43.049585 2572 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-132-179.ec2.internal\" not found" node="ip-10-0-132-179.ec2.internal" Apr 17 16:19:43.056353 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.056194 2572 manager.go:324] Recovery completed Apr 17 16:19:43.060547 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.060529 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:19:43.062974 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.062960 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-179.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:19:43.063041 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.062989 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-179.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:19:43.063041 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.063002 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-179.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:19:43.063629 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.063617 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 16:19:43.063670 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.063629 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 16:19:43.063670 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.063645 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 17 16:19:43.065793 
ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.065782 2572 policy_none.go:49] "None policy: Start" Apr 17 16:19:43.065833 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.065798 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 16:19:43.065833 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.065807 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 17 16:19:43.098651 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.098632 2572 manager.go:341] "Starting Device Plugin manager" Apr 17 16:19:43.112371 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:43.098671 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 16:19:43.112371 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.098684 2572 server.go:85] "Starting device plugin registration server" Apr 17 16:19:43.112371 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.098922 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 16:19:43.112371 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.098931 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 16:19:43.112371 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.099254 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 16:19:43.112371 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.099341 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 16:19:43.112371 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.099350 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 16:19:43.112371 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:43.099790 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 17 16:19:43.112371 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:43.099828 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-179.ec2.internal\" not found" Apr 17 16:19:43.172080 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.172027 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 16:19:43.173295 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.173277 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 16:19:43.173347 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.173315 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 16:19:43.173347 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.173338 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 17 16:19:43.173406 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.173353 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 16:19:43.173406 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:43.173390 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 16:19:43.176262 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.176240 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:19:43.199276 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.199225 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:19:43.200291 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.200274 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-179.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:19:43.200356 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.200307 2572 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-179.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:19:43.200356 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.200319 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-179.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:19:43.200356 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.200343 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-179.ec2.internal" Apr 17 16:19:43.207495 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.207481 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-179.ec2.internal" Apr 17 16:19:43.207540 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:43.207503 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-179.ec2.internal\": node \"ip-10-0-132-179.ec2.internal\" not found" Apr 17 16:19:43.224215 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:43.224189 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-179.ec2.internal\" not found" Apr 17 16:19:43.274088 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.274028 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-179.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-179.ec2.internal"] Apr 17 16:19:43.274175 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.274138 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:19:43.275057 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.275042 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-179.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:19:43.275121 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.275084 2572 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-132-179.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:19:43.275121 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.275099 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-179.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:19:43.276316 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.276304 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:19:43.276472 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.276457 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-179.ec2.internal" Apr 17 16:19:43.276508 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.276488 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:19:43.277007 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.276991 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-179.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:19:43.277092 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.277018 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-179.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:19:43.277092 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.277027 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-179.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:19:43.277092 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.276991 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-179.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:19:43.277092 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.277066 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-179.ec2.internal" 
event="NodeHasNoDiskPressure" Apr 17 16:19:43.277224 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.277095 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-179.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:19:43.278165 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.278152 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-179.ec2.internal" Apr 17 16:19:43.278217 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.278176 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 16:19:43.278871 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.278853 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-179.ec2.internal" event="NodeHasSufficientMemory" Apr 17 16:19:43.278945 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.278889 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-179.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 16:19:43.278945 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.278901 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-179.ec2.internal" event="NodeHasSufficientPID" Apr 17 16:19:43.302809 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:43.302779 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-179.ec2.internal\" not found" node="ip-10-0-132-179.ec2.internal" Apr 17 16:19:43.308724 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:43.308706 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-179.ec2.internal\" not found" node="ip-10-0-132-179.ec2.internal" Apr 17 16:19:43.324845 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:43.324824 2572 kubelet_node_status.go:515] "Error getting the current node 
from lister" err="node \"ip-10-0-132-179.ec2.internal\" not found" Apr 17 16:19:43.344347 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.344323 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/79115418117c74aa519bb91d0beaf224-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-179.ec2.internal\" (UID: \"79115418117c74aa519bb91d0beaf224\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-179.ec2.internal" Apr 17 16:19:43.425488 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:43.425461 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-179.ec2.internal\" not found" Apr 17 16:19:43.444754 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.444705 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bcd2509d09ecd17cbbfc28f51d1d45a5-config\") pod \"kube-apiserver-proxy-ip-10-0-132-179.ec2.internal\" (UID: \"bcd2509d09ecd17cbbfc28f51d1d45a5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-179.ec2.internal" Apr 17 16:19:43.444830 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.444788 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/79115418117c74aa519bb91d0beaf224-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-179.ec2.internal\" (UID: \"79115418117c74aa519bb91d0beaf224\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-179.ec2.internal" Apr 17 16:19:43.444830 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.444807 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/79115418117c74aa519bb91d0beaf224-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-132-179.ec2.internal\" (UID: \"79115418117c74aa519bb91d0beaf224\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-179.ec2.internal" Apr 17 16:19:43.444928 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.444855 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/79115418117c74aa519bb91d0beaf224-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-179.ec2.internal\" (UID: \"79115418117c74aa519bb91d0beaf224\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-179.ec2.internal" Apr 17 16:19:43.526193 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:43.526112 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-179.ec2.internal\" not found" Apr 17 16:19:43.545435 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.545409 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/79115418117c74aa519bb91d0beaf224-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-179.ec2.internal\" (UID: \"79115418117c74aa519bb91d0beaf224\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-179.ec2.internal" Apr 17 16:19:43.545514 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.545440 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bcd2509d09ecd17cbbfc28f51d1d45a5-config\") pod \"kube-apiserver-proxy-ip-10-0-132-179.ec2.internal\" (UID: \"bcd2509d09ecd17cbbfc28f51d1d45a5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-179.ec2.internal" Apr 17 16:19:43.545514 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.545479 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bcd2509d09ecd17cbbfc28f51d1d45a5-config\") pod 
\"kube-apiserver-proxy-ip-10-0-132-179.ec2.internal\" (UID: \"bcd2509d09ecd17cbbfc28f51d1d45a5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-179.ec2.internal" Apr 17 16:19:43.545577 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.545515 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/79115418117c74aa519bb91d0beaf224-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-179.ec2.internal\" (UID: \"79115418117c74aa519bb91d0beaf224\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-179.ec2.internal" Apr 17 16:19:43.604551 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.604513 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-179.ec2.internal" Apr 17 16:19:43.611147 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.611128 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-179.ec2.internal"
Apr 17 16:19:43.626813 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:43.626787    2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-179.ec2.internal\" not found"
Apr 17 16:19:43.727542 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:43.727499    2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-179.ec2.internal\" not found"
Apr 17 16:19:43.828045 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:43.827976    2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-179.ec2.internal\" not found"
Apr 17 16:19:43.928492 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:43.928456    2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-179.ec2.internal\" not found"
Apr 17 16:19:43.955902 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.955878    2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 16:19:43.956333 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.956029    2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 16:19:43.956333 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:43.956052    2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 16:19:44.029441 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:44.029404    2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-179.ec2.internal\" not found"
Apr 17 16:19:44.039600 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:44.039562    2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 16:14:43 +0000 UTC" deadline="2027-12-16 16:30:56.639208568 +0000 UTC"
Apr 17 16:19:44.039600 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:44.039594    2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14592h11m12.599618043s"
Apr 17 16:19:44.039762 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:44.039644    2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 16:19:44.049677 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:44.049655    2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 16:19:44.067269 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:44.067248    2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-vgkbl"
Apr 17 16:19:44.073787 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:44.073771    2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:19:44.074726 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:44.074708    2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-vgkbl"
Apr 17 16:19:44.127365 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:44.127336    2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79115418117c74aa519bb91d0beaf224.slice/crio-b81206a3f0dbd6cb5aa5dd00b05c97e0464a0c49adf94710cb5ea89ca2dc5910 WatchSource:0}: Error finding container b81206a3f0dbd6cb5aa5dd00b05c97e0464a0c49adf94710cb5ea89ca2dc5910: Status 404 returned error can't find the container with id b81206a3f0dbd6cb5aa5dd00b05c97e0464a0c49adf94710cb5ea89ca2dc5910
Apr 17 16:19:44.127752 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:44.127736    2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcd2509d09ecd17cbbfc28f51d1d45a5.slice/crio-c15e2c5289aabea7a196b80ce9974e5214dc24ff6a8df4be2f6e2abe7e0a4a7e WatchSource:0}: Error finding container c15e2c5289aabea7a196b80ce9974e5214dc24ff6a8df4be2f6e2abe7e0a4a7e: Status 404 returned error can't find the container with id c15e2c5289aabea7a196b80ce9974e5214dc24ff6a8df4be2f6e2abe7e0a4a7e
Apr 17 16:19:44.132008 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:44.131985    2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 16:19:44.143353 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:44.143320    2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-179.ec2.internal"
Apr 17 16:19:44.153276 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:44.153257    2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 16:19:44.154668 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:44.154652    2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-179.ec2.internal"
Apr 17 16:19:44.163903 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:44.163883    2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 16:19:44.176952 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:44.176912    2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-179.ec2.internal" event={"ID":"bcd2509d09ecd17cbbfc28f51d1d45a5","Type":"ContainerStarted","Data":"c15e2c5289aabea7a196b80ce9974e5214dc24ff6a8df4be2f6e2abe7e0a4a7e"}
Apr 17 16:19:44.177842 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:44.177822    2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-179.ec2.internal" event={"ID":"79115418117c74aa519bb91d0beaf224","Type":"ContainerStarted","Data":"b81206a3f0dbd6cb5aa5dd00b05c97e0464a0c49adf94710cb5ea89ca2dc5910"}
Apr 17 16:19:44.371026 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:44.370961    2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:19:44.922934 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:44.922881    2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 16:19:45.020659 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.020612    2572 apiserver.go:52] "Watching apiserver"
Apr 17 16:19:45.029373 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.029343    2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 16:19:45.029760 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.029730    2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-179.ec2.internal","openshift-multus/multus-additional-cni-plugins-vt8g2","openshift-multus/network-metrics-daemon-j89hr","openshift-network-operator/iptables-alerter-rswrv","kube-system/konnectivity-agent-v6lnz","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz","openshift-cluster-node-tuning-operator/tuned-gx8qb","openshift-image-registry/node-ca-fwdvx","openshift-multus/multus-hgr6r","openshift-network-diagnostics/network-check-target-xnc8v","openshift-ovn-kubernetes/ovnkube-node-nhjqx","kube-system/kube-apiserver-proxy-ip-10-0-132-179.ec2.internal","openshift-dns/node-resolver-s4nbk"]
Apr 17 16:19:45.033157 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.033137    2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gx8qb"
Apr 17 16:19:45.035362 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.035339    2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vt8g2"
Apr 17 16:19:45.035967 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.035843    2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 16:19:45.035967 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.035846    2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:19:45.035967 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.035887    2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-q95ft\""
Apr 17 16:19:45.039424 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.037995    2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j89hr"
Apr 17 16:19:45.039424 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.038028    2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 16:19:45.039424 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:45.038115    2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j89hr" podUID="dbd283d5-ff0b-4c8f-b1be-15a75816e953"
Apr 17 16:19:45.039424 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.038361    2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 16:19:45.039424 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.038749    2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 16:19:45.039424 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.038921    2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 16:19:45.039424 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.039016    2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-np6qb\""
Apr 17 16:19:45.039424 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.039097    2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 16:19:45.043844 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.043789    2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rswrv"
Apr 17 16:19:45.044272 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.044054    2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-v6lnz"
Apr 17 16:19:45.046208 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.046190    2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz"
Apr 17 16:19:45.046382 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.046359    2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 16:19:45.046738 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.046609    2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 16:19:45.046738 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.046627    2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 16:19:45.046738 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.046647    2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 16:19:45.046738 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.046721    2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 16:19:45.046975 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.046931    2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wdh86\""
Apr 17 16:19:45.047087 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.047050    2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-dwqz4\""
Apr 17 16:19:45.048653 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.048635    2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 16:19:45.048781 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.048676    2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-h7pmc\""
Apr 17 16:19:45.048781 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.048639    2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 16:19:45.048978 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.048913    2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 16:19:45.051119 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.050798    2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fwdvx"
Apr 17 16:19:45.051119 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.050874    2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hgr6r"
Apr 17 16:19:45.053269 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.053237    2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 16:19:45.053404 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.053385    2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnc8v"
Apr 17 16:19:45.053503 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:45.053447    2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnc8v" podUID="30245442-5a33-4b64-a9be-b62b496e3e7b"
Apr 17 16:19:45.053564 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.053527    2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 16:19:45.053621 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.053577    2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-vh9w7\""
Apr 17 16:19:45.053811 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.053797    2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 16:19:45.053910 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.053895    2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 16:19:45.053982 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.053962    2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-jx9nj\""
Apr 17 16:19:45.054424 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.054405    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a4d23535-e11c-4204-9246-5539245e51d9-agent-certs\") pod \"konnectivity-agent-v6lnz\" (UID: \"a4d23535-e11c-4204-9246-5539245e51d9\") " pod="kube-system/konnectivity-agent-v6lnz"
Apr 17 16:19:45.054517 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.054438    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-etc-modprobe-d\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb"
Apr 17 16:19:45.054517 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.054462    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-lib-modules\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb"
Apr 17 16:19:45.054517 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.054483    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-host\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb"
Apr 17 16:19:45.054670 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.054514    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a4d23535-e11c-4204-9246-5539245e51d9-konnectivity-ca\") pod \"konnectivity-agent-v6lnz\" (UID: \"a4d23535-e11c-4204-9246-5539245e51d9\") " pod="kube-system/konnectivity-agent-v6lnz"
Apr 17 16:19:45.054670 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.054539    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b3d68729-0bbb-475c-8a72-38489e06e068-tmp\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb"
Apr 17 16:19:45.054670 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.054560    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8de8591f-0659-4b29-abd0-982ba1568fa2-system-cni-dir\") pod \"multus-additional-cni-plugins-vt8g2\" (UID: \"8de8591f-0659-4b29-abd0-982ba1568fa2\") " pod="openshift-multus/multus-additional-cni-plugins-vt8g2"
Apr 17 16:19:45.054670 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.054576    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8de8591f-0659-4b29-abd0-982ba1568fa2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vt8g2\" (UID: \"8de8591f-0659-4b29-abd0-982ba1568fa2\") " pod="openshift-multus/multus-additional-cni-plugins-vt8g2"
Apr 17 16:19:45.054670 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.054597    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8de8591f-0659-4b29-abd0-982ba1568fa2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vt8g2\" (UID: \"8de8591f-0659-4b29-abd0-982ba1568fa2\") " pod="openshift-multus/multus-additional-cni-plugins-vt8g2"
Apr 17 16:19:45.054670 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.054638    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpbkb\" (UniqueName: \"kubernetes.io/projected/b7370709-dff9-4a26-85ce-0d05bcf27a57-kube-api-access-bpbkb\") pod \"iptables-alerter-rswrv\" (UID: \"b7370709-dff9-4a26-85ce-0d05bcf27a57\") " pod="openshift-network-operator/iptables-alerter-rswrv"
Apr 17 16:19:45.054670 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.054667    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-etc-kubernetes\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb"
Apr 17 16:19:45.054991 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.054688    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-sys\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb"
Apr 17 16:19:45.054991 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.054710    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b3d68729-0bbb-475c-8a72-38489e06e068-etc-tuned\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb"
Apr 17 16:19:45.054991 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.054746    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlg7s\" (UniqueName: \"kubernetes.io/projected/b3d68729-0bbb-475c-8a72-38489e06e068-kube-api-access-zlg7s\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb"
Apr 17 16:19:45.054991 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.054770    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-etc-sysconfig\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb"
Apr 17 16:19:45.054991 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.054797    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8de8591f-0659-4b29-abd0-982ba1568fa2-cnibin\") pod \"multus-additional-cni-plugins-vt8g2\" (UID: \"8de8591f-0659-4b29-abd0-982ba1568fa2\") " pod="openshift-multus/multus-additional-cni-plugins-vt8g2"
Apr 17 16:19:45.054991 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.054820    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8de8591f-0659-4b29-abd0-982ba1568fa2-os-release\") pod \"multus-additional-cni-plugins-vt8g2\" (UID: \"8de8591f-0659-4b29-abd0-982ba1568fa2\") " pod="openshift-multus/multus-additional-cni-plugins-vt8g2"
Apr 17 16:19:45.054991 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.054843    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8de8591f-0659-4b29-abd0-982ba1568fa2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vt8g2\" (UID: \"8de8591f-0659-4b29-abd0-982ba1568fa2\") " pod="openshift-multus/multus-additional-cni-plugins-vt8g2"
Apr 17 16:19:45.054991 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.054869    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bznj5\" (UniqueName: \"kubernetes.io/projected/dbd283d5-ff0b-4c8f-b1be-15a75816e953-kube-api-access-bznj5\") pod \"network-metrics-daemon-j89hr\" (UID: \"dbd283d5-ff0b-4c8f-b1be-15a75816e953\") " pod="openshift-multus/network-metrics-daemon-j89hr"
Apr 17 16:19:45.054991 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.054932    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b7370709-dff9-4a26-85ce-0d05bcf27a57-iptables-alerter-script\") pod \"iptables-alerter-rswrv\" (UID: \"b7370709-dff9-4a26-85ce-0d05bcf27a57\") " pod="openshift-network-operator/iptables-alerter-rswrv"
Apr 17 16:19:45.054991 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.054967    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-etc-systemd\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb"
Apr 17 16:19:45.054991 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.054990    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-run\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb"
Apr 17 16:19:45.055484 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.055016    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8de8591f-0659-4b29-abd0-982ba1568fa2-cni-binary-copy\") pod \"multus-additional-cni-plugins-vt8g2\" (UID: \"8de8591f-0659-4b29-abd0-982ba1568fa2\") " pod="openshift-multus/multus-additional-cni-plugins-vt8g2"
Apr 17 16:19:45.055484 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.055054    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-etc-sysctl-conf\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb"
Apr 17 16:19:45.055484 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.055158    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcj8x\" (UniqueName: \"kubernetes.io/projected/8de8591f-0659-4b29-abd0-982ba1568fa2-kube-api-access-hcj8x\") pod \"multus-additional-cni-plugins-vt8g2\" (UID: \"8de8591f-0659-4b29-abd0-982ba1568fa2\") " pod="openshift-multus/multus-additional-cni-plugins-vt8g2"
Apr 17 16:19:45.055484 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.055205    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs\") pod \"network-metrics-daemon-j89hr\" (UID: \"dbd283d5-ff0b-4c8f-b1be-15a75816e953\") " pod="openshift-multus/network-metrics-daemon-j89hr"
Apr 17 16:19:45.055484 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.055232    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-etc-sysctl-d\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb"
Apr 17 16:19:45.055484 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.055268    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-var-lib-kubelet\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb"
Apr 17 16:19:45.055484 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.055292    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b7370709-dff9-4a26-85ce-0d05bcf27a57-host-slash\") pod \"iptables-alerter-rswrv\" (UID: \"b7370709-dff9-4a26-85ce-0d05bcf27a57\") " pod="openshift-network-operator/iptables-alerter-rswrv"
Apr 17 16:19:45.056113 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.056095    2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx"
Apr 17 16:19:45.059240 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.058869    2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-s4nbk"
Apr 17 16:19:45.059556 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.059395    2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 16:19:45.059556 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.059473    2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-tzvxd\""
Apr 17 16:19:45.059705 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.059616    2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 16:19:45.059705 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.059668    2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 16:19:45.060325 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.060229    2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 16:19:45.060514 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.060497    2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 16:19:45.061489 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.061467    2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 16:19:45.062033 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.061842    2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-vl8tq\""
Apr 17 16:19:45.062033 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.061893    2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 16:19:45.062033 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.061911    2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 16:19:45.075440 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.075372    2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:14:44 +0000 UTC" deadline="2027-11-13 18:02:36.350692167 +0000 UTC"
Apr 17 16:19:45.075440 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.075397    2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13801h42m51.27529795s"
Apr 17 16:19:45.144747 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.144722    2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 16:19:45.156142 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156117    2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-etc-sysctl-d\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb"
Apr 17 16:19:45.156289 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156231    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b0030b4f-f856-49ec-87a6-eca6a00291ad-hosts-file\") pod \"node-resolver-s4nbk\" (UID: \"b0030b4f-f856-49ec-87a6-eca6a00291ad\") " pod="openshift-dns/node-resolver-s4nbk"
Apr 17 16:19:45.156289 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156272    2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-etc-sysctl-d\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb"
Apr 17 16:19:45.156396 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156273    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r4m6\" (UniqueName: \"kubernetes.io/projected/30245442-5a33-4b64-a9be-b62b496e3e7b-kube-api-access-5r4m6\") pod \"network-check-target-xnc8v\" (UID: \"30245442-5a33-4b64-a9be-b62b496e3e7b\") " pod="openshift-network-diagnostics/network-check-target-xnc8v"
Apr 17 16:19:45.156396 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156328    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-host-kubelet\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx"
Apr 17 16:19:45.156396 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156352    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-host-run-ovn-kubernetes\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx"
Apr 17 16:19:45.156396 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156382    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx"
Apr 17 16:19:45.156593 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156403    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3e371a5a-2d19-4c74-8b51-d4ac6484410c-ovnkube-config\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx"
Apr 17 16:19:45.156593 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156421    2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a4d23535-e11c-4204-9246-5539245e51d9-agent-certs\") pod \"konnectivity-agent-v6lnz\" (UID: \"a4d23535-e11c-4204-9246-5539245e51d9\") " pod="kube-system/konnectivity-agent-v6lnz"
Apr 17 16:19:45.156593 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156476    2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-etc-modprobe-d\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb"
Apr 17 16:19:45.156593 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156517    2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-lib-modules\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb"
Apr 17 16:19:45.156593 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156542    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-node-log\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx"
Apr 17 16:19:45.156593 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156566    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3e371a5a-2d19-4c74-8b51-d4ac6484410c-env-overrides\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx"
Apr 17 16:19:45.156593 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156590    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-os-release\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r"
Apr 17 16:19:45.156919 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156634    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-multus-conf-dir\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r"
Apr 17 16:19:45.156919 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156647    2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-lib-modules\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb"
Apr 17 16:19:45.156919 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156663    2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a4d23535-e11c-4204-9246-5539245e51d9-konnectivity-ca\") pod \"konnectivity-agent-v6lnz\" (UID: \"a4d23535-e11c-4204-9246-5539245e51d9\") " pod="kube-system/konnectivity-agent-v6lnz"
Apr 17 16:19:45.156919 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156674    2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-etc-modprobe-d\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb"
Apr 17 16:19:45.156919 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156708    2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8de8591f-0659-4b29-abd0-982ba1568fa2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vt8g2\" (UID: \"8de8591f-0659-4b29-abd0-982ba1568fa2\") " pod="openshift-multus/multus-additional-cni-plugins-vt8g2"
Apr 17 16:19:45.156919 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156736    2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-run-ovn\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx"
Apr 17 16:19:45.156919
ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156760 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-host-var-lib-cni-bin\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.156919 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156796 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpbkb\" (UniqueName: \"kubernetes.io/projected/b7370709-dff9-4a26-85ce-0d05bcf27a57-kube-api-access-bpbkb\") pod \"iptables-alerter-rswrv\" (UID: \"b7370709-dff9-4a26-85ce-0d05bcf27a57\") " pod="openshift-network-operator/iptables-alerter-rswrv" Apr 17 16:19:45.156919 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156823 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-etc-kubernetes\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" Apr 17 16:19:45.156919 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156848 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b3d68729-0bbb-475c-8a72-38489e06e068-etc-tuned\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" Apr 17 16:19:45.156919 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156872 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8de8591f-0659-4b29-abd0-982ba1568fa2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vt8g2\" (UID: \"8de8591f-0659-4b29-abd0-982ba1568fa2\") " 
pod="openshift-multus/multus-additional-cni-plugins-vt8g2" Apr 17 16:19:45.156919 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156806 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 16:19:45.156919 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156901 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-etc-kubernetes\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" Apr 17 16:19:45.157372 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156933 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bznj5\" (UniqueName: \"kubernetes.io/projected/dbd283d5-ff0b-4c8f-b1be-15a75816e953-kube-api-access-bznj5\") pod \"network-metrics-daemon-j89hr\" (UID: \"dbd283d5-ff0b-4c8f-b1be-15a75816e953\") " pod="openshift-multus/network-metrics-daemon-j89hr" Apr 17 16:19:45.157372 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156960 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-host-run-netns\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.157372 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.156981 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-hostroot\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.157372 
ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157003 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-etc-sysconfig\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" Apr 17 16:19:45.157372 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157026 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8de8591f-0659-4b29-abd0-982ba1568fa2-cnibin\") pod \"multus-additional-cni-plugins-vt8g2\" (UID: \"8de8591f-0659-4b29-abd0-982ba1568fa2\") " pod="openshift-multus/multus-additional-cni-plugins-vt8g2" Apr 17 16:19:45.157372 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157050 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-host-cni-bin\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.157372 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157089 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/21e63632-a824-43b0-bf94-b6059b8aa0f7-socket-dir\") pod \"aws-ebs-csi-driver-node-v9gbz\" (UID: \"21e63632-a824-43b0-bf94-b6059b8aa0f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" Apr 17 16:19:45.157372 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157114 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-cni-binary-copy\") pod \"multus-hgr6r\" (UID: 
\"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.157372 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157136 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-host-var-lib-cni-multus\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.157372 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157160 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-multus-daemon-config\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.157372 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157187 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-etc-kubernetes\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.157372 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157236 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-etc-systemd\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" Apr 17 16:19:45.157372 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157257 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-run\") pod \"tuned-gx8qb\" (UID: 
\"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" Apr 17 16:19:45.157372 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157279 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a4d23535-e11c-4204-9246-5539245e51d9-konnectivity-ca\") pod \"konnectivity-agent-v6lnz\" (UID: \"a4d23535-e11c-4204-9246-5539245e51d9\") " pod="kube-system/konnectivity-agent-v6lnz" Apr 17 16:19:45.157372 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157285 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-host-run-netns\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.157372 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157338 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-run-systemd\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.157372 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157366 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-log-socket\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.158033 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157385 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-run\") pod 
\"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" Apr 17 16:19:45.158033 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157390 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b0030b4f-f856-49ec-87a6-eca6a00291ad-tmp-dir\") pod \"node-resolver-s4nbk\" (UID: \"b0030b4f-f856-49ec-87a6-eca6a00291ad\") " pod="openshift-dns/node-resolver-s4nbk" Apr 17 16:19:45.158033 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157383 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8de8591f-0659-4b29-abd0-982ba1568fa2-cnibin\") pod \"multus-additional-cni-plugins-vt8g2\" (UID: \"8de8591f-0659-4b29-abd0-982ba1568fa2\") " pod="openshift-multus/multus-additional-cni-plugins-vt8g2" Apr 17 16:19:45.158033 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157417 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/21e63632-a824-43b0-bf94-b6059b8aa0f7-etc-selinux\") pod \"aws-ebs-csi-driver-node-v9gbz\" (UID: \"21e63632-a824-43b0-bf94-b6059b8aa0f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" Apr 17 16:19:45.158033 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157425 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-etc-systemd\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" Apr 17 16:19:45.158033 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157446 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-multus-cni-dir\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.158033 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157480 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-etc-sysconfig\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" Apr 17 16:19:45.158033 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157487 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs\") pod \"network-metrics-daemon-j89hr\" (UID: \"dbd283d5-ff0b-4c8f-b1be-15a75816e953\") " pod="openshift-multus/network-metrics-daemon-j89hr" Apr 17 16:19:45.158033 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157515 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd76r\" (UniqueName: \"kubernetes.io/projected/21e63632-a824-43b0-bf94-b6059b8aa0f7-kube-api-access-wd76r\") pod \"aws-ebs-csi-driver-node-v9gbz\" (UID: \"21e63632-a824-43b0-bf94-b6059b8aa0f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" Apr 17 16:19:45.158033 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157550 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-multus-socket-dir-parent\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.158033 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:45.157612 2572 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:19:45.158033 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157640 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-host-run-multus-certs\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.158033 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:45.157733 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs podName:dbd283d5-ff0b-4c8f-b1be-15a75816e953 nodeName:}" failed. No retries permitted until 2026-04-17 16:19:45.657688204 +0000 UTC m=+2.998176282 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs") pod "network-metrics-daemon-j89hr" (UID: "dbd283d5-ff0b-4c8f-b1be-15a75816e953") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:19:45.158033 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157755 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-var-lib-kubelet\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" Apr 17 16:19:45.158033 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157783 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-run-openvswitch\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.158033 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157810 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ef613c01-eb3a-451b-b2f2-9eee9ab808cc-serviceca\") pod \"node-ca-fwdvx\" (UID: \"ef613c01-eb3a-451b-b2f2-9eee9ab808cc\") " pod="openshift-image-registry/node-ca-fwdvx" Apr 17 16:19:45.158033 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157820 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-var-lib-kubelet\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" Apr 17 16:19:45.158796 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157836 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b7370709-dff9-4a26-85ce-0d05bcf27a57-host-slash\") pod \"iptables-alerter-rswrv\" (UID: \"b7370709-dff9-4a26-85ce-0d05bcf27a57\") " pod="openshift-network-operator/iptables-alerter-rswrv" Apr 17 16:19:45.158796 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157862 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-systemd-units\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.158796 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157899 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/21e63632-a824-43b0-bf94-b6059b8aa0f7-registration-dir\") pod 
\"aws-ebs-csi-driver-node-v9gbz\" (UID: \"21e63632-a824-43b0-bf94-b6059b8aa0f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" Apr 17 16:19:45.158796 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157911 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b7370709-dff9-4a26-85ce-0d05bcf27a57-host-slash\") pod \"iptables-alerter-rswrv\" (UID: \"b7370709-dff9-4a26-85ce-0d05bcf27a57\") " pod="openshift-network-operator/iptables-alerter-rswrv" Apr 17 16:19:45.158796 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157946 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q9sk\" (UniqueName: \"kubernetes.io/projected/ef613c01-eb3a-451b-b2f2-9eee9ab808cc-kube-api-access-2q9sk\") pod \"node-ca-fwdvx\" (UID: \"ef613c01-eb3a-451b-b2f2-9eee9ab808cc\") " pod="openshift-image-registry/node-ca-fwdvx" Apr 17 16:19:45.158796 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157973 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-cnibin\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.158796 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.157996 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-host-var-lib-kubelet\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.158796 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.158022 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-host\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" Apr 17 16:19:45.158796 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.158053 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-host-cni-netd\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.158796 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.158095 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-system-cni-dir\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.158796 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.158146 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5msp\" (UniqueName: \"kubernetes.io/projected/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-kube-api-access-z5msp\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.158796 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.158148 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-host\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" Apr 17 16:19:45.158796 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.158197 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-6drbb\" (UniqueName: \"kubernetes.io/projected/b0030b4f-f856-49ec-87a6-eca6a00291ad-kube-api-access-6drbb\") pod \"node-resolver-s4nbk\" (UID: \"b0030b4f-f856-49ec-87a6-eca6a00291ad\") " pod="openshift-dns/node-resolver-s4nbk" Apr 17 16:19:45.158796 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.158255 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b3d68729-0bbb-475c-8a72-38489e06e068-tmp\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" Apr 17 16:19:45.158796 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.158294 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8de8591f-0659-4b29-abd0-982ba1568fa2-system-cni-dir\") pod \"multus-additional-cni-plugins-vt8g2\" (UID: \"8de8591f-0659-4b29-abd0-982ba1568fa2\") " pod="openshift-multus/multus-additional-cni-plugins-vt8g2" Apr 17 16:19:45.158796 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.158347 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8de8591f-0659-4b29-abd0-982ba1568fa2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vt8g2\" (UID: \"8de8591f-0659-4b29-abd0-982ba1568fa2\") " pod="openshift-multus/multus-additional-cni-plugins-vt8g2" Apr 17 16:19:45.158796 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.158356 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8de8591f-0659-4b29-abd0-982ba1568fa2-system-cni-dir\") pod \"multus-additional-cni-plugins-vt8g2\" (UID: \"8de8591f-0659-4b29-abd0-982ba1568fa2\") " pod="openshift-multus/multus-additional-cni-plugins-vt8g2" Apr 17 16:19:45.159535 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.158416 
2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3e371a5a-2d19-4c74-8b51-d4ac6484410c-ovnkube-script-lib\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.159535 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.158466 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swkxh\" (UniqueName: \"kubernetes.io/projected/3e371a5a-2d19-4c74-8b51-d4ac6484410c-kube-api-access-swkxh\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.159535 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.158520 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/21e63632-a824-43b0-bf94-b6059b8aa0f7-device-dir\") pod \"aws-ebs-csi-driver-node-v9gbz\" (UID: \"21e63632-a824-43b0-bf94-b6059b8aa0f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" Apr 17 16:19:45.159535 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.158554 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-sys\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" Apr 17 16:19:45.159535 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.158669 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-sys\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" Apr 17 
16:19:45.159535 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.158715 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlg7s\" (UniqueName: \"kubernetes.io/projected/b3d68729-0bbb-475c-8a72-38489e06e068-kube-api-access-zlg7s\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" Apr 17 16:19:45.159535 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.159039 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8de8591f-0659-4b29-abd0-982ba1568fa2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vt8g2\" (UID: \"8de8591f-0659-4b29-abd0-982ba1568fa2\") " pod="openshift-multus/multus-additional-cni-plugins-vt8g2" Apr 17 16:19:45.159535 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.159268 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8de8591f-0659-4b29-abd0-982ba1568fa2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vt8g2\" (UID: \"8de8591f-0659-4b29-abd0-982ba1568fa2\") " pod="openshift-multus/multus-additional-cni-plugins-vt8g2" Apr 17 16:19:45.159535 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.159310 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3e371a5a-2d19-4c74-8b51-d4ac6484410c-ovn-node-metrics-cert\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.159535 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.159332 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/8de8591f-0659-4b29-abd0-982ba1568fa2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vt8g2\" (UID: \"8de8591f-0659-4b29-abd0-982ba1568fa2\") " pod="openshift-multus/multus-additional-cni-plugins-vt8g2" Apr 17 16:19:45.159535 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.159337 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21e63632-a824-43b0-bf94-b6059b8aa0f7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-v9gbz\" (UID: \"21e63632-a824-43b0-bf94-b6059b8aa0f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" Apr 17 16:19:45.159535 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.159413 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8de8591f-0659-4b29-abd0-982ba1568fa2-os-release\") pod \"multus-additional-cni-plugins-vt8g2\" (UID: \"8de8591f-0659-4b29-abd0-982ba1568fa2\") " pod="openshift-multus/multus-additional-cni-plugins-vt8g2" Apr 17 16:19:45.159535 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.159446 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-etc-openvswitch\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.159535 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.159472 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-host-run-k8s-cni-cncf-io\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.159535 ip-10-0-132-179 
kubenswrapper[2572]: I0417 16:19:45.159498 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8de8591f-0659-4b29-abd0-982ba1568fa2-os-release\") pod \"multus-additional-cni-plugins-vt8g2\" (UID: \"8de8591f-0659-4b29-abd0-982ba1568fa2\") " pod="openshift-multus/multus-additional-cni-plugins-vt8g2" Apr 17 16:19:45.159535 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.159523 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b7370709-dff9-4a26-85ce-0d05bcf27a57-iptables-alerter-script\") pod \"iptables-alerter-rswrv\" (UID: \"b7370709-dff9-4a26-85ce-0d05bcf27a57\") " pod="openshift-network-operator/iptables-alerter-rswrv" Apr 17 16:19:45.160313 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.159550 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8de8591f-0659-4b29-abd0-982ba1568fa2-cni-binary-copy\") pod \"multus-additional-cni-plugins-vt8g2\" (UID: \"8de8591f-0659-4b29-abd0-982ba1568fa2\") " pod="openshift-multus/multus-additional-cni-plugins-vt8g2" Apr 17 16:19:45.160313 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.159576 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-var-lib-openvswitch\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.160313 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.159599 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/21e63632-a824-43b0-bf94-b6059b8aa0f7-sys-fs\") pod \"aws-ebs-csi-driver-node-v9gbz\" (UID: 
\"21e63632-a824-43b0-bf94-b6059b8aa0f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" Apr 17 16:19:45.160313 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.159622 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef613c01-eb3a-451b-b2f2-9eee9ab808cc-host\") pod \"node-ca-fwdvx\" (UID: \"ef613c01-eb3a-451b-b2f2-9eee9ab808cc\") " pod="openshift-image-registry/node-ca-fwdvx" Apr 17 16:19:45.160313 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.159649 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-etc-sysctl-conf\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" Apr 17 16:19:45.160313 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.159674 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hcj8x\" (UniqueName: \"kubernetes.io/projected/8de8591f-0659-4b29-abd0-982ba1568fa2-kube-api-access-hcj8x\") pod \"multus-additional-cni-plugins-vt8g2\" (UID: \"8de8591f-0659-4b29-abd0-982ba1568fa2\") " pod="openshift-multus/multus-additional-cni-plugins-vt8g2" Apr 17 16:19:45.160313 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.159698 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-host-slash\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.160313 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.159868 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/b3d68729-0bbb-475c-8a72-38489e06e068-etc-sysctl-conf\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" Apr 17 16:19:45.160313 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.160052 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b7370709-dff9-4a26-85ce-0d05bcf27a57-iptables-alerter-script\") pod \"iptables-alerter-rswrv\" (UID: \"b7370709-dff9-4a26-85ce-0d05bcf27a57\") " pod="openshift-network-operator/iptables-alerter-rswrv" Apr 17 16:19:45.160313 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.160107 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8de8591f-0659-4b29-abd0-982ba1568fa2-cni-binary-copy\") pod \"multus-additional-cni-plugins-vt8g2\" (UID: \"8de8591f-0659-4b29-abd0-982ba1568fa2\") " pod="openshift-multus/multus-additional-cni-plugins-vt8g2" Apr 17 16:19:45.160713 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.160686 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b3d68729-0bbb-475c-8a72-38489e06e068-tmp\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" Apr 17 16:19:45.161099 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.160943 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a4d23535-e11c-4204-9246-5539245e51d9-agent-certs\") pod \"konnectivity-agent-v6lnz\" (UID: \"a4d23535-e11c-4204-9246-5539245e51d9\") " pod="kube-system/konnectivity-agent-v6lnz" Apr 17 16:19:45.161654 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.161634 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/b3d68729-0bbb-475c-8a72-38489e06e068-etc-tuned\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" Apr 17 16:19:45.168473 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.168432 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpbkb\" (UniqueName: \"kubernetes.io/projected/b7370709-dff9-4a26-85ce-0d05bcf27a57-kube-api-access-bpbkb\") pod \"iptables-alerter-rswrv\" (UID: \"b7370709-dff9-4a26-85ce-0d05bcf27a57\") " pod="openshift-network-operator/iptables-alerter-rswrv" Apr 17 16:19:45.169207 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.169186 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bznj5\" (UniqueName: \"kubernetes.io/projected/dbd283d5-ff0b-4c8f-b1be-15a75816e953-kube-api-access-bznj5\") pod \"network-metrics-daemon-j89hr\" (UID: \"dbd283d5-ff0b-4c8f-b1be-15a75816e953\") " pod="openshift-multus/network-metrics-daemon-j89hr" Apr 17 16:19:45.172035 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.172012 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlg7s\" (UniqueName: \"kubernetes.io/projected/b3d68729-0bbb-475c-8a72-38489e06e068-kube-api-access-zlg7s\") pod \"tuned-gx8qb\" (UID: \"b3d68729-0bbb-475c-8a72-38489e06e068\") " pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" Apr 17 16:19:45.172821 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.172794 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcj8x\" (UniqueName: \"kubernetes.io/projected/8de8591f-0659-4b29-abd0-982ba1568fa2-kube-api-access-hcj8x\") pod \"multus-additional-cni-plugins-vt8g2\" (UID: \"8de8591f-0659-4b29-abd0-982ba1568fa2\") " pod="openshift-multus/multus-additional-cni-plugins-vt8g2" Apr 17 16:19:45.260917 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.260832 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-node-log\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.260917 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.260882 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3e371a5a-2d19-4c74-8b51-d4ac6484410c-env-overrides\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.261146 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.260941 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-node-log\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.261146 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.260998 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-os-release\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.261146 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261027 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-multus-conf-dir\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.261146 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261054 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-run-ovn\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.261146 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261091 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-host-var-lib-cni-bin\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.261146 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261114 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-multus-conf-dir\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.261146 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261122 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-host-run-netns\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.261491 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261156 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-host-run-netns\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.261491 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261160 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-run-ovn\") 
pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.261491 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261125 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-os-release\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.261491 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261156 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-hostroot\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.261491 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261188 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-hostroot\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.261491 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261206 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-host-cni-bin\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.261491 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261232 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/21e63632-a824-43b0-bf94-b6059b8aa0f7-socket-dir\") pod \"aws-ebs-csi-driver-node-v9gbz\" (UID: \"21e63632-a824-43b0-bf94-b6059b8aa0f7\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" Apr 17 16:19:45.261491 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261243 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-host-cni-bin\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.261491 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261206 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-host-var-lib-cni-bin\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.261491 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261259 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-cni-binary-copy\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.261491 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261283 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-host-var-lib-cni-multus\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.261491 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261306 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-multus-daemon-config\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " 
pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.261491 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261329 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-etc-kubernetes\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.261491 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261328 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3e371a5a-2d19-4c74-8b51-d4ac6484410c-env-overrides\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.261491 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261361 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/21e63632-a824-43b0-bf94-b6059b8aa0f7-socket-dir\") pod \"aws-ebs-csi-driver-node-v9gbz\" (UID: \"21e63632-a824-43b0-bf94-b6059b8aa0f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" Apr 17 16:19:45.261491 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261370 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-host-var-lib-cni-multus\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.261491 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261391 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-etc-kubernetes\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 
17 16:19:45.261491 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261392 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-host-run-netns\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.262314 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261421 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-host-run-netns\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.262314 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261428 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-run-systemd\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.262314 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261454 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-log-socket\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.262314 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261458 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-run-systemd\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.262314 
ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261479 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b0030b4f-f856-49ec-87a6-eca6a00291ad-tmp-dir\") pod \"node-resolver-s4nbk\" (UID: \"b0030b4f-f856-49ec-87a6-eca6a00291ad\") " pod="openshift-dns/node-resolver-s4nbk" Apr 17 16:19:45.262314 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261542 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-log-socket\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.262314 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261577 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/21e63632-a824-43b0-bf94-b6059b8aa0f7-etc-selinux\") pod \"aws-ebs-csi-driver-node-v9gbz\" (UID: \"21e63632-a824-43b0-bf94-b6059b8aa0f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" Apr 17 16:19:45.262314 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261617 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-multus-cni-dir\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.262314 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261657 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wd76r\" (UniqueName: \"kubernetes.io/projected/21e63632-a824-43b0-bf94-b6059b8aa0f7-kube-api-access-wd76r\") pod \"aws-ebs-csi-driver-node-v9gbz\" (UID: \"21e63632-a824-43b0-bf94-b6059b8aa0f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" Apr 
17 16:19:45.262314 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261682 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-multus-socket-dir-parent\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.262314 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261707 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-host-run-multus-certs\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.262314 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261723 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/21e63632-a824-43b0-bf94-b6059b8aa0f7-etc-selinux\") pod \"aws-ebs-csi-driver-node-v9gbz\" (UID: \"21e63632-a824-43b0-bf94-b6059b8aa0f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" Apr 17 16:19:45.262314 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261731 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-run-openvswitch\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.262314 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261756 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ef613c01-eb3a-451b-b2f2-9eee9ab808cc-serviceca\") pod \"node-ca-fwdvx\" (UID: \"ef613c01-eb3a-451b-b2f2-9eee9ab808cc\") " pod="openshift-image-registry/node-ca-fwdvx" 
Apr 17 16:19:45.262314 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261769 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-multus-socket-dir-parent\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.262314 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261780 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-systemd-units\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.262314 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261786 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-multus-cni-dir\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.262314 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261796 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-run-openvswitch\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.263155 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261795 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-host-run-multus-certs\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.263155 ip-10-0-132-179 
kubenswrapper[2572]: I0417 16:19:45.261802 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/21e63632-a824-43b0-bf94-b6059b8aa0f7-registration-dir\") pod \"aws-ebs-csi-driver-node-v9gbz\" (UID: \"21e63632-a824-43b0-bf94-b6059b8aa0f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" Apr 17 16:19:45.263155 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261836 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-systemd-units\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.263155 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261852 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2q9sk\" (UniqueName: \"kubernetes.io/projected/ef613c01-eb3a-451b-b2f2-9eee9ab808cc-kube-api-access-2q9sk\") pod \"node-ca-fwdvx\" (UID: \"ef613c01-eb3a-451b-b2f2-9eee9ab808cc\") " pod="openshift-image-registry/node-ca-fwdvx" Apr 17 16:19:45.263155 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261883 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-cnibin\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.263155 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261911 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-host-var-lib-kubelet\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.263155 ip-10-0-132-179 
kubenswrapper[2572]: I0417 16:19:45.261939 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-host-cni-netd\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.263155 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261954 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-cnibin\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.263155 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261957 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/21e63632-a824-43b0-bf94-b6059b8aa0f7-registration-dir\") pod \"aws-ebs-csi-driver-node-v9gbz\" (UID: \"21e63632-a824-43b0-bf94-b6059b8aa0f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" Apr 17 16:19:45.263155 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261963 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-system-cni-dir\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.263155 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261989 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z5msp\" (UniqueName: \"kubernetes.io/projected/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-kube-api-access-z5msp\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.263155 ip-10-0-132-179 kubenswrapper[2572]: I0417 
16:19:45.262000 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-host-var-lib-kubelet\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.263155 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262013 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6drbb\" (UniqueName: \"kubernetes.io/projected/b0030b4f-f856-49ec-87a6-eca6a00291ad-kube-api-access-6drbb\") pod \"node-resolver-s4nbk\" (UID: \"b0030b4f-f856-49ec-87a6-eca6a00291ad\") " pod="openshift-dns/node-resolver-s4nbk" Apr 17 16:19:45.263155 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.261999 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-host-cni-netd\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.263155 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262051 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3e371a5a-2d19-4c74-8b51-d4ac6484410c-ovnkube-script-lib\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.263155 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262057 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-system-cni-dir\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.263155 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262093 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swkxh\" (UniqueName: \"kubernetes.io/projected/3e371a5a-2d19-4c74-8b51-d4ac6484410c-kube-api-access-swkxh\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.263155 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262131 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/21e63632-a824-43b0-bf94-b6059b8aa0f7-device-dir\") pod \"aws-ebs-csi-driver-node-v9gbz\" (UID: \"21e63632-a824-43b0-bf94-b6059b8aa0f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" Apr 17 16:19:45.263952 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262160 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3e371a5a-2d19-4c74-8b51-d4ac6484410c-ovn-node-metrics-cert\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.263952 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262183 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21e63632-a824-43b0-bf94-b6059b8aa0f7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-v9gbz\" (UID: \"21e63632-a824-43b0-bf94-b6059b8aa0f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" Apr 17 16:19:45.263952 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262211 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-etc-openvswitch\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 
17 16:19:45.263952 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262235 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-host-run-k8s-cni-cncf-io\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.263952 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262261 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-var-lib-openvswitch\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.263952 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262285 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/21e63632-a824-43b0-bf94-b6059b8aa0f7-sys-fs\") pod \"aws-ebs-csi-driver-node-v9gbz\" (UID: \"21e63632-a824-43b0-bf94-b6059b8aa0f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" Apr 17 16:19:45.263952 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262288 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-etc-openvswitch\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.263952 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262307 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef613c01-eb3a-451b-b2f2-9eee9ab808cc-host\") pod \"node-ca-fwdvx\" (UID: \"ef613c01-eb3a-451b-b2f2-9eee9ab808cc\") " pod="openshift-image-registry/node-ca-fwdvx" Apr 
17 16:19:45.263952 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262333 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-host-slash\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.263952 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262337 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-host-run-k8s-cni-cncf-io\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.263952 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262356 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b0030b4f-f856-49ec-87a6-eca6a00291ad-hosts-file\") pod \"node-resolver-s4nbk\" (UID: \"b0030b4f-f856-49ec-87a6-eca6a00291ad\") " pod="openshift-dns/node-resolver-s4nbk" Apr 17 16:19:45.263952 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262341 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/21e63632-a824-43b0-bf94-b6059b8aa0f7-kubelet-dir\") pod \"aws-ebs-csi-driver-node-v9gbz\" (UID: \"21e63632-a824-43b0-bf94-b6059b8aa0f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" Apr 17 16:19:45.263952 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262389 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5r4m6\" (UniqueName: \"kubernetes.io/projected/30245442-5a33-4b64-a9be-b62b496e3e7b-kube-api-access-5r4m6\") pod \"network-check-target-xnc8v\" (UID: \"30245442-5a33-4b64-a9be-b62b496e3e7b\") " 
pod="openshift-network-diagnostics/network-check-target-xnc8v" Apr 17 16:19:45.263952 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262416 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-var-lib-openvswitch\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.263952 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262418 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-host-kubelet\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.263952 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262241 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/21e63632-a824-43b0-bf94-b6059b8aa0f7-device-dir\") pod \"aws-ebs-csi-driver-node-v9gbz\" (UID: \"21e63632-a824-43b0-bf94-b6059b8aa0f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" Apr 17 16:19:45.263952 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262451 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-host-run-ovn-kubernetes\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.264713 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262448 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-host-kubelet\") pod \"ovnkube-node-nhjqx\" (UID: 
\"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.264713 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262484 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-host-run-ovn-kubernetes\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.264713 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262517 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.264713 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262531 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/21e63632-a824-43b0-bf94-b6059b8aa0f7-sys-fs\") pod \"aws-ebs-csi-driver-node-v9gbz\" (UID: \"21e63632-a824-43b0-bf94-b6059b8aa0f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" Apr 17 16:19:45.264713 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262541 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-cni-binary-copy\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.264713 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262549 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/3e371a5a-2d19-4c74-8b51-d4ac6484410c-ovnkube-config\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.264713 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262583 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef613c01-eb3a-451b-b2f2-9eee9ab808cc-host\") pod \"node-ca-fwdvx\" (UID: \"ef613c01-eb3a-451b-b2f2-9eee9ab808cc\") " pod="openshift-image-registry/node-ca-fwdvx" Apr 17 16:19:45.264713 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262635 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-host-slash\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.264713 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262683 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b0030b4f-f856-49ec-87a6-eca6a00291ad-hosts-file\") pod \"node-resolver-s4nbk\" (UID: \"b0030b4f-f856-49ec-87a6-eca6a00291ad\") " pod="openshift-dns/node-resolver-s4nbk" Apr 17 16:19:45.264713 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262725 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3e371a5a-2d19-4c74-8b51-d4ac6484410c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.264713 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262763 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-multus-daemon-config\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.264713 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.262856 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b0030b4f-f856-49ec-87a6-eca6a00291ad-tmp-dir\") pod \"node-resolver-s4nbk\" (UID: \"b0030b4f-f856-49ec-87a6-eca6a00291ad\") " pod="openshift-dns/node-resolver-s4nbk" Apr 17 16:19:45.264713 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.263156 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ef613c01-eb3a-451b-b2f2-9eee9ab808cc-serviceca\") pod \"node-ca-fwdvx\" (UID: \"ef613c01-eb3a-451b-b2f2-9eee9ab808cc\") " pod="openshift-image-registry/node-ca-fwdvx" Apr 17 16:19:45.264713 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.263297 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3e371a5a-2d19-4c74-8b51-d4ac6484410c-ovnkube-config\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.264713 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.263319 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3e371a5a-2d19-4c74-8b51-d4ac6484410c-ovnkube-script-lib\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.264713 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.264592 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/3e371a5a-2d19-4c74-8b51-d4ac6484410c-ovn-node-metrics-cert\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.269589 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:45.269508 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:19:45.269589 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:45.269536 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:19:45.269589 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:45.269550 2572 projected.go:194] Error preparing data for projected volume kube-api-access-5r4m6 for pod openshift-network-diagnostics/network-check-target-xnc8v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:19:45.269828 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:45.269644 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30245442-5a33-4b64-a9be-b62b496e3e7b-kube-api-access-5r4m6 podName:30245442-5a33-4b64-a9be-b62b496e3e7b nodeName:}" failed. No retries permitted until 2026-04-17 16:19:45.769615145 +0000 UTC m=+3.110103445 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5r4m6" (UniqueName: "kubernetes.io/projected/30245442-5a33-4b64-a9be-b62b496e3e7b-kube-api-access-5r4m6") pod "network-check-target-xnc8v" (UID: "30245442-5a33-4b64-a9be-b62b496e3e7b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:19:45.271735 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.271706 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q9sk\" (UniqueName: \"kubernetes.io/projected/ef613c01-eb3a-451b-b2f2-9eee9ab808cc-kube-api-access-2q9sk\") pod \"node-ca-fwdvx\" (UID: \"ef613c01-eb3a-451b-b2f2-9eee9ab808cc\") " pod="openshift-image-registry/node-ca-fwdvx" Apr 17 16:19:45.271875 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.271850 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swkxh\" (UniqueName: \"kubernetes.io/projected/3e371a5a-2d19-4c74-8b51-d4ac6484410c-kube-api-access-swkxh\") pod \"ovnkube-node-nhjqx\" (UID: \"3e371a5a-2d19-4c74-8b51-d4ac6484410c\") " pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.272007 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.271921 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6drbb\" (UniqueName: \"kubernetes.io/projected/b0030b4f-f856-49ec-87a6-eca6a00291ad-kube-api-access-6drbb\") pod \"node-resolver-s4nbk\" (UID: \"b0030b4f-f856-49ec-87a6-eca6a00291ad\") " pod="openshift-dns/node-resolver-s4nbk" Apr 17 16:19:45.272133 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.272116 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5msp\" (UniqueName: \"kubernetes.io/projected/edcb65df-bdda-4e5d-acba-2ef0eb3d8f51-kube-api-access-z5msp\") pod \"multus-hgr6r\" (UID: \"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51\") " 
pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.272192 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.272120 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd76r\" (UniqueName: \"kubernetes.io/projected/21e63632-a824-43b0-bf94-b6059b8aa0f7-kube-api-access-wd76r\") pod \"aws-ebs-csi-driver-node-v9gbz\" (UID: \"21e63632-a824-43b0-bf94-b6059b8aa0f7\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" Apr 17 16:19:45.305453 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.305429 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 16:19:45.346116 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.346064 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" Apr 17 16:19:45.354706 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.354674 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vt8g2" Apr 17 16:19:45.364403 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.364378 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rswrv" Apr 17 16:19:45.371973 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.371950 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-v6lnz" Apr 17 16:19:45.378567 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.378550 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" Apr 17 16:19:45.386132 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.386116 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-fwdvx" Apr 17 16:19:45.394676 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.394648 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hgr6r" Apr 17 16:19:45.401337 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.401318 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:19:45.406891 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.406869 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-s4nbk" Apr 17 16:19:45.664596 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.664507 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs\") pod \"network-metrics-daemon-j89hr\" (UID: \"dbd283d5-ff0b-4c8f-b1be-15a75816e953\") " pod="openshift-multus/network-metrics-daemon-j89hr" Apr 17 16:19:45.664793 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:45.664613 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:19:45.664793 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:45.664682 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs podName:dbd283d5-ff0b-4c8f-b1be-15a75816e953 nodeName:}" failed. No retries permitted until 2026-04-17 16:19:46.664665656 +0000 UTC m=+4.005153713 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs") pod "network-metrics-daemon-j89hr" (UID: "dbd283d5-ff0b-4c8f-b1be-15a75816e953") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:19:45.866049 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:45.866015 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5r4m6\" (UniqueName: \"kubernetes.io/projected/30245442-5a33-4b64-a9be-b62b496e3e7b-kube-api-access-5r4m6\") pod \"network-check-target-xnc8v\" (UID: \"30245442-5a33-4b64-a9be-b62b496e3e7b\") " pod="openshift-network-diagnostics/network-check-target-xnc8v" Apr 17 16:19:45.866212 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:45.866194 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:19:45.866249 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:45.866216 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:19:45.866249 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:45.866225 2572 projected.go:194] Error preparing data for projected volume kube-api-access-5r4m6 for pod openshift-network-diagnostics/network-check-target-xnc8v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:19:45.866326 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:45.866278 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30245442-5a33-4b64-a9be-b62b496e3e7b-kube-api-access-5r4m6 podName:30245442-5a33-4b64-a9be-b62b496e3e7b nodeName:}" failed. 
No retries permitted until 2026-04-17 16:19:46.86626508 +0000 UTC m=+4.206753137 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-5r4m6" (UniqueName: "kubernetes.io/projected/30245442-5a33-4b64-a9be-b62b496e3e7b-kube-api-access-5r4m6") pod "network-check-target-xnc8v" (UID: "30245442-5a33-4b64-a9be-b62b496e3e7b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:19:45.892870 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:45.892840 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3d68729_0bbb_475c_8a72_38489e06e068.slice/crio-19e1b82084865bff154bd8cc85f608875c17e159b5d19033e81b65c3bc1c789a WatchSource:0}: Error finding container 19e1b82084865bff154bd8cc85f608875c17e159b5d19033e81b65c3bc1c789a: Status 404 returned error can't find the container with id 19e1b82084865bff154bd8cc85f608875c17e159b5d19033e81b65c3bc1c789a Apr 17 16:19:45.895661 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:45.895633 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8de8591f_0659_4b29_abd0_982ba1568fa2.slice/crio-d5ffab41dd67d6caf6ed6f9285d4faafc5f9d84632394a265fecc370d884a454 WatchSource:0}: Error finding container d5ffab41dd67d6caf6ed6f9285d4faafc5f9d84632394a265fecc370d884a454: Status 404 returned error can't find the container with id d5ffab41dd67d6caf6ed6f9285d4faafc5f9d84632394a265fecc370d884a454 Apr 17 16:19:45.900826 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:45.900602 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef613c01_eb3a_451b_b2f2_9eee9ab808cc.slice/crio-480c1e0d05659b4a2c544db4e0824824c286ef4d9cbd186b3cd8f78e80899005 WatchSource:0}: Error finding container 
480c1e0d05659b4a2c544db4e0824824c286ef4d9cbd186b3cd8f78e80899005: Status 404 returned error can't find the container with id 480c1e0d05659b4a2c544db4e0824824c286ef4d9cbd186b3cd8f78e80899005 Apr 17 16:19:45.901379 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:45.901034 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e371a5a_2d19_4c74_8b51_d4ac6484410c.slice/crio-a4f5e8bb32d32dac186a80e23991fdcbf02af440271b74bc04cc522d72462383 WatchSource:0}: Error finding container a4f5e8bb32d32dac186a80e23991fdcbf02af440271b74bc04cc522d72462383: Status 404 returned error can't find the container with id a4f5e8bb32d32dac186a80e23991fdcbf02af440271b74bc04cc522d72462383 Apr 17 16:19:45.904217 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:45.904038 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0030b4f_f856_49ec_87a6_eca6a00291ad.slice/crio-d01846453c263eba3887282896fdd94e053c5da52664d12fcef8b274ecf76604 WatchSource:0}: Error finding container d01846453c263eba3887282896fdd94e053c5da52664d12fcef8b274ecf76604: Status 404 returned error can't find the container with id d01846453c263eba3887282896fdd94e053c5da52664d12fcef8b274ecf76604 Apr 17 16:19:45.904391 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:45.904231 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7370709_dff9_4a26_85ce_0d05bcf27a57.slice/crio-09ca958c95f87ffdf1398c724361238b3fb06ecff596ccff9694f177c8cb9573 WatchSource:0}: Error finding container 09ca958c95f87ffdf1398c724361238b3fb06ecff596ccff9694f177c8cb9573: Status 404 returned error can't find the container with id 09ca958c95f87ffdf1398c724361238b3fb06ecff596ccff9694f177c8cb9573 Apr 17 16:19:45.905130 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:45.905113 2572 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21e63632_a824_43b0_bf94_b6059b8aa0f7.slice/crio-5462709f79bac2600d9e71e060e35f16c7f43529ea8f1c9f6f3fb58b8e1ab593 WatchSource:0}: Error finding container 5462709f79bac2600d9e71e060e35f16c7f43529ea8f1c9f6f3fb58b8e1ab593: Status 404 returned error can't find the container with id 5462709f79bac2600d9e71e060e35f16c7f43529ea8f1c9f6f3fb58b8e1ab593 Apr 17 16:19:45.906281 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:19:45.906251 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4d23535_e11c_4204_9246_5539245e51d9.slice/crio-b2597c8ae62f5f220e188f0b168197cd4c09f06f94f40f48a539b8bd2e6d6572 WatchSource:0}: Error finding container b2597c8ae62f5f220e188f0b168197cd4c09f06f94f40f48a539b8bd2e6d6572: Status 404 returned error can't find the container with id b2597c8ae62f5f220e188f0b168197cd4c09f06f94f40f48a539b8bd2e6d6572 Apr 17 16:19:46.076414 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:46.076220 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 16:14:44 +0000 UTC" deadline="2027-12-02 21:43:47.469674968 +0000 UTC" Apr 17 16:19:46.076414 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:46.076407 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14261h24m1.393271923s" Apr 17 16:19:46.173967 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:46.173857 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j89hr" Apr 17 16:19:46.174137 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:46.173991 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j89hr" podUID="dbd283d5-ff0b-4c8f-b1be-15a75816e953" Apr 17 16:19:46.182404 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:46.182368 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-179.ec2.internal" event={"ID":"bcd2509d09ecd17cbbfc28f51d1d45a5","Type":"ContainerStarted","Data":"3b37554f0f7cb7bd3cbbafa4318483a559c154d281fb372b1984f9f31b70e3dd"} Apr 17 16:19:46.183493 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:46.183443 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s4nbk" event={"ID":"b0030b4f-f856-49ec-87a6-eca6a00291ad","Type":"ContainerStarted","Data":"d01846453c263eba3887282896fdd94e053c5da52664d12fcef8b274ecf76604"} Apr 17 16:19:46.184589 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:46.184566 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" event={"ID":"3e371a5a-2d19-4c74-8b51-d4ac6484410c","Type":"ContainerStarted","Data":"a4f5e8bb32d32dac186a80e23991fdcbf02af440271b74bc04cc522d72462383"} Apr 17 16:19:46.185530 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:46.185511 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hgr6r" event={"ID":"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51","Type":"ContainerStarted","Data":"ed195ca3b88a06f5ce7aaa9d6185025f9d5fbe39c5bc14bc23b655d14c79ef28"} Apr 17 16:19:46.186493 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:46.186471 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vt8g2" event={"ID":"8de8591f-0659-4b29-abd0-982ba1568fa2","Type":"ContainerStarted","Data":"d5ffab41dd67d6caf6ed6f9285d4faafc5f9d84632394a265fecc370d884a454"} Apr 17 16:19:46.187438 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:46.187417 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" event={"ID":"b3d68729-0bbb-475c-8a72-38489e06e068","Type":"ContainerStarted","Data":"19e1b82084865bff154bd8cc85f608875c17e159b5d19033e81b65c3bc1c789a"} Apr 17 16:19:46.188348 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:46.188328 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-v6lnz" event={"ID":"a4d23535-e11c-4204-9246-5539245e51d9","Type":"ContainerStarted","Data":"b2597c8ae62f5f220e188f0b168197cd4c09f06f94f40f48a539b8bd2e6d6572"} Apr 17 16:19:46.189334 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:46.189301 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" event={"ID":"21e63632-a824-43b0-bf94-b6059b8aa0f7","Type":"ContainerStarted","Data":"5462709f79bac2600d9e71e060e35f16c7f43529ea8f1c9f6f3fb58b8e1ab593"} Apr 17 16:19:46.190407 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:46.190384 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rswrv" event={"ID":"b7370709-dff9-4a26-85ce-0d05bcf27a57","Type":"ContainerStarted","Data":"09ca958c95f87ffdf1398c724361238b3fb06ecff596ccff9694f177c8cb9573"} Apr 17 16:19:46.191860 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:46.191837 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fwdvx" event={"ID":"ef613c01-eb3a-451b-b2f2-9eee9ab808cc","Type":"ContainerStarted","Data":"480c1e0d05659b4a2c544db4e0824824c286ef4d9cbd186b3cd8f78e80899005"} Apr 17 16:19:46.195448 
ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:46.195414 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-179.ec2.internal" podStartSLOduration=2.19540318 podStartE2EDuration="2.19540318s" podCreationTimestamp="2026-04-17 16:19:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:19:46.195065026 +0000 UTC m=+3.535553104" watchObservedRunningTime="2026-04-17 16:19:46.19540318 +0000 UTC m=+3.535891259" Apr 17 16:19:46.672259 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:46.672227 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs\") pod \"network-metrics-daemon-j89hr\" (UID: \"dbd283d5-ff0b-4c8f-b1be-15a75816e953\") " pod="openshift-multus/network-metrics-daemon-j89hr" Apr 17 16:19:46.672419 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:46.672401 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:19:46.672483 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:46.672470 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs podName:dbd283d5-ff0b-4c8f-b1be-15a75816e953 nodeName:}" failed. No retries permitted until 2026-04-17 16:19:48.672449231 +0000 UTC m=+6.012937293 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs") pod "network-metrics-daemon-j89hr" (UID: "dbd283d5-ff0b-4c8f-b1be-15a75816e953") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:19:46.874054 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:46.873786 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5r4m6\" (UniqueName: \"kubernetes.io/projected/30245442-5a33-4b64-a9be-b62b496e3e7b-kube-api-access-5r4m6\") pod \"network-check-target-xnc8v\" (UID: \"30245442-5a33-4b64-a9be-b62b496e3e7b\") " pod="openshift-network-diagnostics/network-check-target-xnc8v" Apr 17 16:19:46.874054 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:46.873990 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:19:46.874054 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:46.874006 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:19:46.874054 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:46.874017 2572 projected.go:194] Error preparing data for projected volume kube-api-access-5r4m6 for pod openshift-network-diagnostics/network-check-target-xnc8v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:19:46.874379 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:46.874141 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30245442-5a33-4b64-a9be-b62b496e3e7b-kube-api-access-5r4m6 podName:30245442-5a33-4b64-a9be-b62b496e3e7b nodeName:}" failed. 
No retries permitted until 2026-04-17 16:19:48.874114012 +0000 UTC m=+6.214602069 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-5r4m6" (UniqueName: "kubernetes.io/projected/30245442-5a33-4b64-a9be-b62b496e3e7b-kube-api-access-5r4m6") pod "network-check-target-xnc8v" (UID: "30245442-5a33-4b64-a9be-b62b496e3e7b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:19:47.176282 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:47.176202 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnc8v" Apr 17 16:19:47.176704 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:47.176334 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnc8v" podUID="30245442-5a33-4b64-a9be-b62b496e3e7b" Apr 17 16:19:47.205272 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:47.205238 2572 generic.go:358] "Generic (PLEG): container finished" podID="79115418117c74aa519bb91d0beaf224" containerID="13ac495ab26c0e175659671b983725746a4d89abffa64b642395a2e512c0a162" exitCode=0 Apr 17 16:19:47.205424 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:47.205360 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-179.ec2.internal" event={"ID":"79115418117c74aa519bb91d0beaf224","Type":"ContainerDied","Data":"13ac495ab26c0e175659671b983725746a4d89abffa64b642395a2e512c0a162"} Apr 17 16:19:48.173613 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:48.173581 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j89hr" Apr 17 16:19:48.173784 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:48.173729 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j89hr" podUID="dbd283d5-ff0b-4c8f-b1be-15a75816e953" Apr 17 16:19:48.215614 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:48.215573 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-179.ec2.internal" event={"ID":"79115418117c74aa519bb91d0beaf224","Type":"ContainerStarted","Data":"e4a6abeffd1019a48d92aa116b3f7b1b7f6e2fb2ffad31bb9b9807f006fc27a0"} Apr 17 16:19:48.690711 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:48.690134 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs\") pod \"network-metrics-daemon-j89hr\" (UID: \"dbd283d5-ff0b-4c8f-b1be-15a75816e953\") " pod="openshift-multus/network-metrics-daemon-j89hr" Apr 17 16:19:48.690711 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:48.690307 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:19:48.690711 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:48.690371 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs podName:dbd283d5-ff0b-4c8f-b1be-15a75816e953 nodeName:}" failed. No retries permitted until 2026-04-17 16:19:52.690352621 +0000 UTC m=+10.030840680 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs") pod "network-metrics-daemon-j89hr" (UID: "dbd283d5-ff0b-4c8f-b1be-15a75816e953") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:19:48.892800 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:48.892188 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5r4m6\" (UniqueName: \"kubernetes.io/projected/30245442-5a33-4b64-a9be-b62b496e3e7b-kube-api-access-5r4m6\") pod \"network-check-target-xnc8v\" (UID: \"30245442-5a33-4b64-a9be-b62b496e3e7b\") " pod="openshift-network-diagnostics/network-check-target-xnc8v" Apr 17 16:19:48.892800 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:48.892343 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:19:48.892800 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:48.892363 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:19:48.892800 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:48.892377 2572 projected.go:194] Error preparing data for projected volume kube-api-access-5r4m6 for pod openshift-network-diagnostics/network-check-target-xnc8v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:19:48.892800 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:48.892434 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30245442-5a33-4b64-a9be-b62b496e3e7b-kube-api-access-5r4m6 podName:30245442-5a33-4b64-a9be-b62b496e3e7b nodeName:}" failed. 
No retries permitted until 2026-04-17 16:19:52.892415438 +0000 UTC m=+10.232903500 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-5r4m6" (UniqueName: "kubernetes.io/projected/30245442-5a33-4b64-a9be-b62b496e3e7b-kube-api-access-5r4m6") pod "network-check-target-xnc8v" (UID: "30245442-5a33-4b64-a9be-b62b496e3e7b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:19:49.175013 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:49.174364 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnc8v" Apr 17 16:19:49.175013 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:49.174553 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnc8v" podUID="30245442-5a33-4b64-a9be-b62b496e3e7b" Apr 17 16:19:50.174657 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:50.174613 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j89hr" Apr 17 16:19:50.175208 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:50.174763 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j89hr" podUID="dbd283d5-ff0b-4c8f-b1be-15a75816e953" Apr 17 16:19:51.173738 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:51.173699 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnc8v" Apr 17 16:19:51.173933 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:51.173828 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnc8v" podUID="30245442-5a33-4b64-a9be-b62b496e3e7b" Apr 17 16:19:52.174639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:52.174604 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j89hr" Apr 17 16:19:52.175107 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:52.174745 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j89hr" podUID="dbd283d5-ff0b-4c8f-b1be-15a75816e953" Apr 17 16:19:52.724670 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:52.724608 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs\") pod \"network-metrics-daemon-j89hr\" (UID: \"dbd283d5-ff0b-4c8f-b1be-15a75816e953\") " pod="openshift-multus/network-metrics-daemon-j89hr" Apr 17 16:19:52.724845 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:52.724731 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:19:52.724845 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:52.724810 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs podName:dbd283d5-ff0b-4c8f-b1be-15a75816e953 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:00.724790065 +0000 UTC m=+18.065278126 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs") pod "network-metrics-daemon-j89hr" (UID: "dbd283d5-ff0b-4c8f-b1be-15a75816e953") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:19:52.926339 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:52.926300 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5r4m6\" (UniqueName: \"kubernetes.io/projected/30245442-5a33-4b64-a9be-b62b496e3e7b-kube-api-access-5r4m6\") pod \"network-check-target-xnc8v\" (UID: \"30245442-5a33-4b64-a9be-b62b496e3e7b\") " pod="openshift-network-diagnostics/network-check-target-xnc8v" Apr 17 16:19:52.926508 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:52.926481 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:19:52.926508 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:52.926506 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:19:52.926633 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:52.926522 2572 projected.go:194] Error preparing data for projected volume kube-api-access-5r4m6 for pod openshift-network-diagnostics/network-check-target-xnc8v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:19:52.926633 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:52.926585 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30245442-5a33-4b64-a9be-b62b496e3e7b-kube-api-access-5r4m6 podName:30245442-5a33-4b64-a9be-b62b496e3e7b nodeName:}" failed. 
No retries permitted until 2026-04-17 16:20:00.926564639 +0000 UTC m=+18.267052702 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-5r4m6" (UniqueName: "kubernetes.io/projected/30245442-5a33-4b64-a9be-b62b496e3e7b-kube-api-access-5r4m6") pod "network-check-target-xnc8v" (UID: "30245442-5a33-4b64-a9be-b62b496e3e7b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:19:53.174947 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:53.174888 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnc8v" Apr 17 16:19:53.175416 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:53.174988 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnc8v" podUID="30245442-5a33-4b64-a9be-b62b496e3e7b" Apr 17 16:19:54.174094 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:54.174009 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j89hr" Apr 17 16:19:54.174298 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:54.174169 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j89hr" podUID="dbd283d5-ff0b-4c8f-b1be-15a75816e953" Apr 17 16:19:55.174091 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:55.174027 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnc8v" Apr 17 16:19:55.174493 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:55.174179 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnc8v" podUID="30245442-5a33-4b64-a9be-b62b496e3e7b" Apr 17 16:19:56.174055 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:56.174006 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j89hr" Apr 17 16:19:56.174263 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:56.174164 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j89hr" podUID="dbd283d5-ff0b-4c8f-b1be-15a75816e953" Apr 17 16:19:57.174139 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:57.174110 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnc8v" Apr 17 16:19:57.174331 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:57.174229 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnc8v" podUID="30245442-5a33-4b64-a9be-b62b496e3e7b" Apr 17 16:19:58.174085 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:58.174033 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j89hr" Apr 17 16:19:58.174290 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:58.174176 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j89hr" podUID="dbd283d5-ff0b-4c8f-b1be-15a75816e953" Apr 17 16:19:59.173903 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:19:59.173796 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnc8v" Apr 17 16:19:59.174358 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:19:59.173932 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xnc8v" podUID="30245442-5a33-4b64-a9be-b62b496e3e7b" Apr 17 16:20:00.174599 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:00.174560 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j89hr" Apr 17 16:20:00.175141 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:00.174701 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j89hr" podUID="dbd283d5-ff0b-4c8f-b1be-15a75816e953" Apr 17 16:20:00.783238 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:00.783192 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs\") pod \"network-metrics-daemon-j89hr\" (UID: \"dbd283d5-ff0b-4c8f-b1be-15a75816e953\") " pod="openshift-multus/network-metrics-daemon-j89hr" Apr 17 16:20:00.783442 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:00.783361 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:20:00.783506 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:00.783444 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs podName:dbd283d5-ff0b-4c8f-b1be-15a75816e953 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:16.783422765 +0000 UTC m=+34.123910822 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs") pod "network-metrics-daemon-j89hr" (UID: "dbd283d5-ff0b-4c8f-b1be-15a75816e953") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:20:00.984951 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:00.984908 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5r4m6\" (UniqueName: \"kubernetes.io/projected/30245442-5a33-4b64-a9be-b62b496e3e7b-kube-api-access-5r4m6\") pod \"network-check-target-xnc8v\" (UID: \"30245442-5a33-4b64-a9be-b62b496e3e7b\") " pod="openshift-network-diagnostics/network-check-target-xnc8v" Apr 17 16:20:00.985161 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:00.985139 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:20:00.985211 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:00.985170 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 16:20:00.985211 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:00.985187 2572 projected.go:194] Error preparing data for projected volume kube-api-access-5r4m6 for pod openshift-network-diagnostics/network-check-target-xnc8v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:20:00.985311 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:00.985254 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30245442-5a33-4b64-a9be-b62b496e3e7b-kube-api-access-5r4m6 podName:30245442-5a33-4b64-a9be-b62b496e3e7b nodeName:}" failed. 
No retries permitted until 2026-04-17 16:20:16.985237463 +0000 UTC m=+34.325725519 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-5r4m6" (UniqueName: "kubernetes.io/projected/30245442-5a33-4b64-a9be-b62b496e3e7b-kube-api-access-5r4m6") pod "network-check-target-xnc8v" (UID: "30245442-5a33-4b64-a9be-b62b496e3e7b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:20:01.173996 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:01.173965 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnc8v" Apr 17 16:20:01.174184 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:01.174108 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnc8v" podUID="30245442-5a33-4b64-a9be-b62b496e3e7b" Apr 17 16:20:02.174366 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:02.174326 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j89hr" Apr 17 16:20:02.174823 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:02.174481 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j89hr" podUID="dbd283d5-ff0b-4c8f-b1be-15a75816e953" Apr 17 16:20:03.175650 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:03.175619 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnc8v" Apr 17 16:20:03.176020 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:03.175744 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnc8v" podUID="30245442-5a33-4b64-a9be-b62b496e3e7b" Apr 17 16:20:03.246033 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:03.245999 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-v6lnz" event={"ID":"a4d23535-e11c-4204-9246-5539245e51d9","Type":"ContainerStarted","Data":"24df20122f957705d125d60991c1c81470536b5fa3f6520fbaab6e88ed7e3851"} Apr 17 16:20:04.174520 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:04.174222 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j89hr" Apr 17 16:20:04.174685 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:04.174604 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-j89hr" podUID="dbd283d5-ff0b-4c8f-b1be-15a75816e953" Apr 17 16:20:04.252173 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:04.252135 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-s4nbk" event={"ID":"b0030b4f-f856-49ec-87a6-eca6a00291ad","Type":"ContainerStarted","Data":"ca09778d058c08a74ddece2daf45b09a2665e95d7758c7ef9472fbc1dd09e747"} Apr 17 16:20:04.254911 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:04.254881 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" event={"ID":"3e371a5a-2d19-4c74-8b51-d4ac6484410c","Type":"ContainerStarted","Data":"f444f829ececb93860f5355f30b24357c598f15c4c310f904eea5633dbc1dcf9"} Apr 17 16:20:04.255024 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:04.254920 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" event={"ID":"3e371a5a-2d19-4c74-8b51-d4ac6484410c","Type":"ContainerStarted","Data":"b3964c4751acc51b0351c248cc607e8264a9397a7c14adc94a82ae26180443e7"} Apr 17 16:20:04.255024 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:04.254934 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" event={"ID":"3e371a5a-2d19-4c74-8b51-d4ac6484410c","Type":"ContainerStarted","Data":"5835fb4ff450d813df26755dbf167219bb5c16959c016b16331c94260483ee08"} Apr 17 16:20:04.255024 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:04.254946 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" event={"ID":"3e371a5a-2d19-4c74-8b51-d4ac6484410c","Type":"ContainerStarted","Data":"fe2f40303c0b0fde46b16b9b2944f01ef8c04ef8a169bb6dad38bc18f076158d"} Apr 17 16:20:04.255024 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:04.254959 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" 
event={"ID":"3e371a5a-2d19-4c74-8b51-d4ac6484410c","Type":"ContainerStarted","Data":"924f80971fe5c02328dea2edfb37ba80b95cd85f9a473d9bd9232e7da04a972b"}
Apr 17 16:20:04.255024 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:04.254968 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" event={"ID":"3e371a5a-2d19-4c74-8b51-d4ac6484410c","Type":"ContainerStarted","Data":"8e8d7eb0432ed4e39f4d6ae614149aec210f864f1d440f5fbf9187c2898bc0b8"}
Apr 17 16:20:04.256264 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:04.256240 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hgr6r" event={"ID":"edcb65df-bdda-4e5d-acba-2ef0eb3d8f51","Type":"ContainerStarted","Data":"795e880683df2ac88e1a96e2eaf4313d53b3e7d10e3c38f0bd763ef4c84ca332"}
Apr 17 16:20:04.257660 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:04.257625 2572 generic.go:358] "Generic (PLEG): container finished" podID="8de8591f-0659-4b29-abd0-982ba1568fa2" containerID="03c290f0fed26a5e736c939eadf862e7877c079c5eb062207f542c7220754b36" exitCode=0
Apr 17 16:20:04.257757 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:04.257654 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vt8g2" event={"ID":"8de8591f-0659-4b29-abd0-982ba1568fa2","Type":"ContainerDied","Data":"03c290f0fed26a5e736c939eadf862e7877c079c5eb062207f542c7220754b36"}
Apr 17 16:20:04.259264 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:04.259241 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" event={"ID":"b3d68729-0bbb-475c-8a72-38489e06e068","Type":"ContainerStarted","Data":"1860ef26f811788c143fbf0a014a9bd3279065f1659e347278b8afa6e4e8a6ac"}
Apr 17 16:20:04.260556 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:04.260534 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" event={"ID":"21e63632-a824-43b0-bf94-b6059b8aa0f7","Type":"ContainerStarted","Data":"c96f6950c861819c3816d16e804fdc6f70f9e9e47301a433a719f96dfc59fe60"}
Apr 17 16:20:04.261813 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:04.261787 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fwdvx" event={"ID":"ef613c01-eb3a-451b-b2f2-9eee9ab808cc","Type":"ContainerStarted","Data":"29f08bd77cd0cc27ae3f4e9db4ab2ab3b5f1b1d3ed5d7a53620522f2ece72258"}
Apr 17 16:20:04.271491 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:04.271448 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-s4nbk" podStartSLOduration=4.082053132 podStartE2EDuration="21.271434665s" podCreationTimestamp="2026-04-17 16:19:43 +0000 UTC" firstStartedPulling="2026-04-17 16:19:45.905354669 +0000 UTC m=+3.245842737" lastFinishedPulling="2026-04-17 16:20:03.094736206 +0000 UTC m=+20.435224270" observedRunningTime="2026-04-17 16:20:04.271343579 +0000 UTC m=+21.611831659" watchObservedRunningTime="2026-04-17 16:20:04.271434665 +0000 UTC m=+21.611922747"
Apr 17 16:20:04.271609 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:04.271585 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-179.ec2.internal" podStartSLOduration=20.271575821 podStartE2EDuration="20.271575821s" podCreationTimestamp="2026-04-17 16:19:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:19:48.230546884 +0000 UTC m=+5.571034964" watchObservedRunningTime="2026-04-17 16:20:04.271575821 +0000 UTC m=+21.612063901"
Apr 17 16:20:04.291635 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:04.291583 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hgr6r" podStartSLOduration=3.773679032 podStartE2EDuration="21.29156763s" podCreationTimestamp="2026-04-17 16:19:43 +0000 UTC" firstStartedPulling="2026-04-17 16:19:45.901064674 +0000 UTC m=+3.241552737" lastFinishedPulling="2026-04-17 16:20:03.418953278 +0000 UTC m=+20.759441335" observedRunningTime="2026-04-17 16:20:04.290844752 +0000 UTC m=+21.631332833" watchObservedRunningTime="2026-04-17 16:20:04.29156763 +0000 UTC m=+21.632055710"
Apr 17 16:20:04.306508 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:04.306467 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-gx8qb" podStartSLOduration=4.104476085 podStartE2EDuration="21.30645432s" podCreationTimestamp="2026-04-17 16:19:43 +0000 UTC" firstStartedPulling="2026-04-17 16:19:45.894803545 +0000 UTC m=+3.235291606" lastFinishedPulling="2026-04-17 16:20:03.096781768 +0000 UTC m=+20.437269841" observedRunningTime="2026-04-17 16:20:04.305906859 +0000 UTC m=+21.646394941" watchObservedRunningTime="2026-04-17 16:20:04.30645432 +0000 UTC m=+21.646942415"
Apr 17 16:20:04.342317 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:04.342275 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-v6lnz" podStartSLOduration=4.177718892 podStartE2EDuration="21.342261605s" podCreationTimestamp="2026-04-17 16:19:43 +0000 UTC" firstStartedPulling="2026-04-17 16:19:45.908119519 +0000 UTC m=+3.248607579" lastFinishedPulling="2026-04-17 16:20:03.072662232 +0000 UTC m=+20.413150292" observedRunningTime="2026-04-17 16:20:04.34167703 +0000 UTC m=+21.682165110" watchObservedRunningTime="2026-04-17 16:20:04.342261605 +0000 UTC m=+21.682749684"
Apr 17 16:20:04.357640 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:04.357584 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fwdvx" podStartSLOduration=4.251139043 podStartE2EDuration="21.357567127s" podCreationTimestamp="2026-04-17 16:19:43 +0000 UTC" firstStartedPulling="2026-04-17 16:19:45.9021555 +0000 UTC m=+3.242643557" lastFinishedPulling="2026-04-17 16:20:03.00858358 +0000 UTC m=+20.349071641" observedRunningTime="2026-04-17 16:20:04.357244254 +0000 UTC m=+21.697732333" watchObservedRunningTime="2026-04-17 16:20:04.357567127 +0000 UTC m=+21.698055208"
Apr 17 16:20:04.452919 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:04.452896 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 16:20:05.109458 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:05.109352 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T16:20:04.452917058Z","UUID":"101ce500-1d3d-4d67-ab97-c64fd726cd1f","Handler":null,"Name":"","Endpoint":""}
Apr 17 16:20:05.110890 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:05.110875 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 16:20:05.110890 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:05.110895 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 16:20:05.173875 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:05.173841 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnc8v"
Apr 17 16:20:05.174055 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:05.173955 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnc8v" podUID="30245442-5a33-4b64-a9be-b62b496e3e7b"
Apr 17 16:20:05.265517 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:05.265483 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" event={"ID":"21e63632-a824-43b0-bf94-b6059b8aa0f7","Type":"ContainerStarted","Data":"6675ce636216bbe0c0a103494b7bb396b3408e799aee7d530aedf50853cf1504"}
Apr 17 16:20:05.267041 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:05.267006 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rswrv" event={"ID":"b7370709-dff9-4a26-85ce-0d05bcf27a57","Type":"ContainerStarted","Data":"475b3de6ce54fed7475f8294ff47005bf27f14e82dec66a2465fb76698ce2d7f"}
Apr 17 16:20:05.284726 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:05.284666 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-rswrv" podStartSLOduration=5.088941404 podStartE2EDuration="22.284641773s" podCreationTimestamp="2026-04-17 16:19:43 +0000 UTC" firstStartedPulling="2026-04-17 16:19:45.906114922 +0000 UTC m=+3.246602980" lastFinishedPulling="2026-04-17 16:20:03.101815289 +0000 UTC m=+20.442303349" observedRunningTime="2026-04-17 16:20:05.284193959 +0000 UTC m=+22.624682085" watchObservedRunningTime="2026-04-17 16:20:05.284641773 +0000 UTC m=+22.625129859"
Apr 17 16:20:06.174198 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:06.174169 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j89hr"
Apr 17 16:20:06.174354 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:06.174289 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j89hr" podUID="dbd283d5-ff0b-4c8f-b1be-15a75816e953"
Apr 17 16:20:06.271653 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:06.271619 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" event={"ID":"21e63632-a824-43b0-bf94-b6059b8aa0f7","Type":"ContainerStarted","Data":"330f7a41d0a1e875d81e68c862b1603dea81ee475d4965f3221f0069894b27b8"}
Apr 17 16:20:06.275211 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:06.275182 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" event={"ID":"3e371a5a-2d19-4c74-8b51-d4ac6484410c","Type":"ContainerStarted","Data":"8225cdd1a682a1b0ac5005ebfd646c1177bd0b21cf5cb141dba419c5bfc38d6c"}
Apr 17 16:20:06.288916 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:06.288877 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-v9gbz" podStartSLOduration=3.768372312 podStartE2EDuration="23.288862803s" podCreationTimestamp="2026-04-17 16:19:43 +0000 UTC" firstStartedPulling="2026-04-17 16:19:45.907467399 +0000 UTC m=+3.247955455" lastFinishedPulling="2026-04-17 16:20:05.427957871 +0000 UTC m=+22.768445946" observedRunningTime="2026-04-17 16:20:06.287253869 +0000 UTC m=+23.627741957" watchObservedRunningTime="2026-04-17 16:20:06.288862803 +0000 UTC m=+23.629350882"
Apr 17 16:20:06.573140 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:06.573049 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-v6lnz"
Apr 17 16:20:06.573731 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:06.573714 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-v6lnz"
Apr 17 16:20:07.174391 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:07.174185 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnc8v"
Apr 17 16:20:07.174626 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:07.174481 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnc8v" podUID="30245442-5a33-4b64-a9be-b62b496e3e7b"
Apr 17 16:20:07.277567 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:07.277525 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-v6lnz"
Apr 17 16:20:07.278016 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:07.277857 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-v6lnz"
Apr 17 16:20:08.173725 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:08.173693 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j89hr"
Apr 17 16:20:08.173892 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:08.173825 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j89hr" podUID="dbd283d5-ff0b-4c8f-b1be-15a75816e953"
Apr 17 16:20:09.173871 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:09.173840 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnc8v"
Apr 17 16:20:09.174327 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:09.173941 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnc8v" podUID="30245442-5a33-4b64-a9be-b62b496e3e7b"
Apr 17 16:20:10.173903 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:10.173710 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j89hr"
Apr 17 16:20:10.174516 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:10.173977 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j89hr" podUID="dbd283d5-ff0b-4c8f-b1be-15a75816e953"
Apr 17 16:20:10.285760 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:10.285722 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" event={"ID":"3e371a5a-2d19-4c74-8b51-d4ac6484410c","Type":"ContainerStarted","Data":"7e1f6881eec808ae6c84d296fc826db27c8d2dcffa27c1b035b38a39882d3aa6"}
Apr 17 16:20:10.286023 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:10.285999 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx"
Apr 17 16:20:10.287345 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:10.287320 2572 generic.go:358] "Generic (PLEG): container finished" podID="8de8591f-0659-4b29-abd0-982ba1568fa2" containerID="c34ea1677a36b3ac61395f697705f997f76129d7c6ef57a0e38ac53a387da46c" exitCode=0
Apr 17 16:20:10.287449 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:10.287355 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vt8g2" event={"ID":"8de8591f-0659-4b29-abd0-982ba1568fa2","Type":"ContainerDied","Data":"c34ea1677a36b3ac61395f697705f997f76129d7c6ef57a0e38ac53a387da46c"}
Apr 17 16:20:10.300239 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:10.300215 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx"
Apr 17 16:20:10.313731 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:10.313690 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" podStartSLOduration=9.869147842 podStartE2EDuration="27.313677416s" podCreationTimestamp="2026-04-17 16:19:43 +0000 UTC" firstStartedPulling="2026-04-17 16:19:45.903770436 +0000 UTC m=+3.244258508" lastFinishedPulling="2026-04-17 16:20:03.348300008 +0000 UTC m=+20.688788082" observedRunningTime="2026-04-17 16:20:10.313356062 +0000 UTC m=+27.653844140" watchObservedRunningTime="2026-04-17 16:20:10.313677416 +0000 UTC m=+27.654165477"
Apr 17 16:20:11.173885 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:11.173858 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnc8v"
Apr 17 16:20:11.174009 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:11.173986 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnc8v" podUID="30245442-5a33-4b64-a9be-b62b496e3e7b"
Apr 17 16:20:11.290655 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:11.290623 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vt8g2" event={"ID":"8de8591f-0659-4b29-abd0-982ba1568fa2","Type":"ContainerStarted","Data":"82dabaaac8174b845c56e8c535507b9e8d561c3d4f4e79cc136cc6a0a31a06bd"}
Apr 17 16:20:11.291215 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:11.291189 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx"
Apr 17 16:20:11.291315 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:11.291221 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx"
Apr 17 16:20:11.304838 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:11.304814 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx"
Apr 17 16:20:11.326517 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:11.326490 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j89hr"]
Apr 17 16:20:11.326642 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:11.326631 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j89hr"
Apr 17 16:20:11.326735 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:11.326717 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j89hr" podUID="dbd283d5-ff0b-4c8f-b1be-15a75816e953"
Apr 17 16:20:11.329024 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:11.329001 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xnc8v"]
Apr 17 16:20:11.329146 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:11.329094 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnc8v"
Apr 17 16:20:11.329202 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:11.329163 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnc8v" podUID="30245442-5a33-4b64-a9be-b62b496e3e7b"
Apr 17 16:20:12.294651 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:12.294566 2572 generic.go:358] "Generic (PLEG): container finished" podID="8de8591f-0659-4b29-abd0-982ba1568fa2" containerID="82dabaaac8174b845c56e8c535507b9e8d561c3d4f4e79cc136cc6a0a31a06bd" exitCode=0
Apr 17 16:20:12.294651 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:12.294595 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vt8g2" event={"ID":"8de8591f-0659-4b29-abd0-982ba1568fa2","Type":"ContainerDied","Data":"82dabaaac8174b845c56e8c535507b9e8d561c3d4f4e79cc136cc6a0a31a06bd"}
Apr 17 16:20:13.175090 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:13.174892 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j89hr"
Apr 17 16:20:13.175261 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:13.174940 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnc8v"
Apr 17 16:20:13.175261 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:13.175192 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j89hr" podUID="dbd283d5-ff0b-4c8f-b1be-15a75816e953"
Apr 17 16:20:13.175261 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:13.175233 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnc8v" podUID="30245442-5a33-4b64-a9be-b62b496e3e7b"
Apr 17 16:20:13.297688 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:13.297602 2572 generic.go:358] "Generic (PLEG): container finished" podID="8de8591f-0659-4b29-abd0-982ba1568fa2" containerID="c8996174d3061d8b7b1d684adb33056eaba1bd90765dd0a548349fc809bc3c69" exitCode=0
Apr 17 16:20:13.298181 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:13.297684 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vt8g2" event={"ID":"8de8591f-0659-4b29-abd0-982ba1568fa2","Type":"ContainerDied","Data":"c8996174d3061d8b7b1d684adb33056eaba1bd90765dd0a548349fc809bc3c69"}
Apr 17 16:20:14.657175 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:14.657143 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-dvwcd"]
Apr 17 16:20:14.659820 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:14.659804 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dvwcd"
Apr 17 16:20:14.659885 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:14.659865 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dvwcd" podUID="a4442d85-8b8a-48bf-b06b-5d49262f2b07"
Apr 17 16:20:14.669646 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:14.667510 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dvwcd"]
Apr 17 16:20:14.794800 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:14.794766 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a4442d85-8b8a-48bf-b06b-5d49262f2b07-original-pull-secret\") pod \"global-pull-secret-syncer-dvwcd\" (UID: \"a4442d85-8b8a-48bf-b06b-5d49262f2b07\") " pod="kube-system/global-pull-secret-syncer-dvwcd"
Apr 17 16:20:14.794956 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:14.794849 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a4442d85-8b8a-48bf-b06b-5d49262f2b07-dbus\") pod \"global-pull-secret-syncer-dvwcd\" (UID: \"a4442d85-8b8a-48bf-b06b-5d49262f2b07\") " pod="kube-system/global-pull-secret-syncer-dvwcd"
Apr 17 16:20:14.794956 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:14.794877 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a4442d85-8b8a-48bf-b06b-5d49262f2b07-kubelet-config\") pod \"global-pull-secret-syncer-dvwcd\" (UID: \"a4442d85-8b8a-48bf-b06b-5d49262f2b07\") " pod="kube-system/global-pull-secret-syncer-dvwcd"
Apr 17 16:20:14.895276 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:14.895245 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a4442d85-8b8a-48bf-b06b-5d49262f2b07-dbus\") pod \"global-pull-secret-syncer-dvwcd\" (UID: \"a4442d85-8b8a-48bf-b06b-5d49262f2b07\") " pod="kube-system/global-pull-secret-syncer-dvwcd"
Apr 17 16:20:14.895434 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:14.895298 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a4442d85-8b8a-48bf-b06b-5d49262f2b07-kubelet-config\") pod \"global-pull-secret-syncer-dvwcd\" (UID: \"a4442d85-8b8a-48bf-b06b-5d49262f2b07\") " pod="kube-system/global-pull-secret-syncer-dvwcd"
Apr 17 16:20:14.895434 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:14.895328 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a4442d85-8b8a-48bf-b06b-5d49262f2b07-original-pull-secret\") pod \"global-pull-secret-syncer-dvwcd\" (UID: \"a4442d85-8b8a-48bf-b06b-5d49262f2b07\") " pod="kube-system/global-pull-secret-syncer-dvwcd"
Apr 17 16:20:14.895513 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:14.895452 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a4442d85-8b8a-48bf-b06b-5d49262f2b07-kubelet-config\") pod \"global-pull-secret-syncer-dvwcd\" (UID: \"a4442d85-8b8a-48bf-b06b-5d49262f2b07\") " pod="kube-system/global-pull-secret-syncer-dvwcd"
Apr 17 16:20:14.895513 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:14.895493 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 16:20:14.895593 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:14.895509 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a4442d85-8b8a-48bf-b06b-5d49262f2b07-dbus\") pod \"global-pull-secret-syncer-dvwcd\" (UID: \"a4442d85-8b8a-48bf-b06b-5d49262f2b07\") " pod="kube-system/global-pull-secret-syncer-dvwcd"
Apr 17 16:20:14.895593 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:14.895550 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4442d85-8b8a-48bf-b06b-5d49262f2b07-original-pull-secret podName:a4442d85-8b8a-48bf-b06b-5d49262f2b07 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:15.395533027 +0000 UTC m=+32.736021086 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a4442d85-8b8a-48bf-b06b-5d49262f2b07-original-pull-secret") pod "global-pull-secret-syncer-dvwcd" (UID: "a4442d85-8b8a-48bf-b06b-5d49262f2b07") : object "kube-system"/"original-pull-secret" not registered
Apr 17 16:20:15.173579 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:15.173539 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnc8v"
Apr 17 16:20:15.173579 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:15.173573 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j89hr"
Apr 17 16:20:15.173804 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:15.173668 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xnc8v" podUID="30245442-5a33-4b64-a9be-b62b496e3e7b"
Apr 17 16:20:15.173855 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:15.173796 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-j89hr" podUID="dbd283d5-ff0b-4c8f-b1be-15a75816e953"
Apr 17 16:20:15.301541 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:15.301507 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dvwcd"
Apr 17 16:20:15.301695 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:15.301646 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dvwcd" podUID="a4442d85-8b8a-48bf-b06b-5d49262f2b07"
Apr 17 16:20:15.399502 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:15.399462 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a4442d85-8b8a-48bf-b06b-5d49262f2b07-original-pull-secret\") pod \"global-pull-secret-syncer-dvwcd\" (UID: \"a4442d85-8b8a-48bf-b06b-5d49262f2b07\") " pod="kube-system/global-pull-secret-syncer-dvwcd"
Apr 17 16:20:15.399677 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:15.399573 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 16:20:15.399677 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:15.399651 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4442d85-8b8a-48bf-b06b-5d49262f2b07-original-pull-secret podName:a4442d85-8b8a-48bf-b06b-5d49262f2b07 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:16.399632476 +0000 UTC m=+33.740120533 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a4442d85-8b8a-48bf-b06b-5d49262f2b07-original-pull-secret") pod "global-pull-secret-syncer-dvwcd" (UID: "a4442d85-8b8a-48bf-b06b-5d49262f2b07") : object "kube-system"/"original-pull-secret" not registered
Apr 17 16:20:16.035431 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.035397 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-179.ec2.internal" event="NodeReady"
Apr 17 16:20:16.035967 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.035558 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 17 16:20:16.079929 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.079873 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-v5rbd"]
Apr 17 16:20:16.110682 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.110603 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9sgrn"]
Apr 17 16:20:16.110832 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.110775 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-v5rbd"
Apr 17 16:20:16.113824 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.113798 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 17 16:20:16.113964 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.113830 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vgznp\""
Apr 17 16:20:16.113964 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.113806 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 17 16:20:16.124655 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.124624 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9sgrn"]
Apr 17 16:20:16.124766 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.124665 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-v5rbd"]
Apr 17 16:20:16.124766 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.124760 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9sgrn"
Apr 17 16:20:16.127950 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.127924 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 17 16:20:16.128062 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.127976 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9rhb9\""
Apr 17 16:20:16.128237 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.128217 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 17 16:20:16.128237 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.128218 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 17 16:20:16.205403 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.205373 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8gjr\" (UniqueName: \"kubernetes.io/projected/609a9cbf-301f-406b-a26d-13ae069e0a70-kube-api-access-g8gjr\") pod \"dns-default-v5rbd\" (UID: \"609a9cbf-301f-406b-a26d-13ae069e0a70\") " pod="openshift-dns/dns-default-v5rbd"
Apr 17 16:20:16.205403 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.205413 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/609a9cbf-301f-406b-a26d-13ae069e0a70-config-volume\") pod \"dns-default-v5rbd\" (UID: \"609a9cbf-301f-406b-a26d-13ae069e0a70\") " pod="openshift-dns/dns-default-v5rbd"
Apr 17 16:20:16.205654 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.205449 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/609a9cbf-301f-406b-a26d-13ae069e0a70-tmp-dir\") pod \"dns-default-v5rbd\" (UID: \"609a9cbf-301f-406b-a26d-13ae069e0a70\") " pod="openshift-dns/dns-default-v5rbd"
Apr 17 16:20:16.205654 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.205567 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls\") pod \"dns-default-v5rbd\" (UID: \"609a9cbf-301f-406b-a26d-13ae069e0a70\") " pod="openshift-dns/dns-default-v5rbd"
Apr 17 16:20:16.205654 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.205597 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9269t\" (UniqueName: \"kubernetes.io/projected/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-kube-api-access-9269t\") pod \"ingress-canary-9sgrn\" (UID: \"5caf5aa7-4606-4fa1-8754-cab1cd67eac0\") " pod="openshift-ingress-canary/ingress-canary-9sgrn"
Apr 17 16:20:16.205654 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.205629 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert\") pod \"ingress-canary-9sgrn\" (UID: \"5caf5aa7-4606-4fa1-8754-cab1cd67eac0\") " pod="openshift-ingress-canary/ingress-canary-9sgrn"
Apr 17 16:20:16.306948 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.306909 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls\") pod \"dns-default-v5rbd\" (UID: \"609a9cbf-301f-406b-a26d-13ae069e0a70\") " pod="openshift-dns/dns-default-v5rbd"
Apr 17 16:20:16.307301 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.306956 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9269t\" (UniqueName: \"kubernetes.io/projected/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-kube-api-access-9269t\") pod \"ingress-canary-9sgrn\" (UID: \"5caf5aa7-4606-4fa1-8754-cab1cd67eac0\") " pod="openshift-ingress-canary/ingress-canary-9sgrn"
Apr 17 16:20:16.307301 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.306991 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert\") pod \"ingress-canary-9sgrn\" (UID: \"5caf5aa7-4606-4fa1-8754-cab1cd67eac0\") " pod="openshift-ingress-canary/ingress-canary-9sgrn"
Apr 17 16:20:16.307301 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.307020 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g8gjr\" (UniqueName: \"kubernetes.io/projected/609a9cbf-301f-406b-a26d-13ae069e0a70-kube-api-access-g8gjr\") pod \"dns-default-v5rbd\" (UID: \"609a9cbf-301f-406b-a26d-13ae069e0a70\") " pod="openshift-dns/dns-default-v5rbd"
Apr 17 16:20:16.307301 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.307050 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/609a9cbf-301f-406b-a26d-13ae069e0a70-config-volume\") pod \"dns-default-v5rbd\" (UID: \"609a9cbf-301f-406b-a26d-13ae069e0a70\") " pod="openshift-dns/dns-default-v5rbd"
Apr 17 16:20:16.307301 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:16.307057 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 16:20:16.307301 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.307097 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/609a9cbf-301f-406b-a26d-13ae069e0a70-tmp-dir\") pod \"dns-default-v5rbd\" (UID: \"609a9cbf-301f-406b-a26d-13ae069e0a70\") "
pod="openshift-dns/dns-default-v5rbd" Apr 17 16:20:16.307301 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:16.307139 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls podName:609a9cbf-301f-406b-a26d-13ae069e0a70 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:16.80711473 +0000 UTC m=+34.147602801 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls") pod "dns-default-v5rbd" (UID: "609a9cbf-301f-406b-a26d-13ae069e0a70") : secret "dns-default-metrics-tls" not found Apr 17 16:20:16.307301 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:16.307290 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:20:16.307618 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:16.307344 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert podName:5caf5aa7-4606-4fa1-8754-cab1cd67eac0 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:16.807327574 +0000 UTC m=+34.147815643 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert") pod "ingress-canary-9sgrn" (UID: "5caf5aa7-4606-4fa1-8754-cab1cd67eac0") : secret "canary-serving-cert" not found Apr 17 16:20:16.307618 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.307432 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/609a9cbf-301f-406b-a26d-13ae069e0a70-tmp-dir\") pod \"dns-default-v5rbd\" (UID: \"609a9cbf-301f-406b-a26d-13ae069e0a70\") " pod="openshift-dns/dns-default-v5rbd" Apr 17 16:20:16.307618 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.307590 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/609a9cbf-301f-406b-a26d-13ae069e0a70-config-volume\") pod \"dns-default-v5rbd\" (UID: \"609a9cbf-301f-406b-a26d-13ae069e0a70\") " pod="openshift-dns/dns-default-v5rbd" Apr 17 16:20:16.317884 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.317861 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8gjr\" (UniqueName: \"kubernetes.io/projected/609a9cbf-301f-406b-a26d-13ae069e0a70-kube-api-access-g8gjr\") pod \"dns-default-v5rbd\" (UID: \"609a9cbf-301f-406b-a26d-13ae069e0a70\") " pod="openshift-dns/dns-default-v5rbd" Apr 17 16:20:16.318008 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.317917 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9269t\" (UniqueName: \"kubernetes.io/projected/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-kube-api-access-9269t\") pod \"ingress-canary-9sgrn\" (UID: \"5caf5aa7-4606-4fa1-8754-cab1cd67eac0\") " pod="openshift-ingress-canary/ingress-canary-9sgrn" Apr 17 16:20:16.408514 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.408459 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" 
(UniqueName: \"kubernetes.io/secret/a4442d85-8b8a-48bf-b06b-5d49262f2b07-original-pull-secret\") pod \"global-pull-secret-syncer-dvwcd\" (UID: \"a4442d85-8b8a-48bf-b06b-5d49262f2b07\") " pod="kube-system/global-pull-secret-syncer-dvwcd" Apr 17 16:20:16.408709 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:16.408622 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 16:20:16.408709 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:16.408702 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a4442d85-8b8a-48bf-b06b-5d49262f2b07-original-pull-secret podName:a4442d85-8b8a-48bf-b06b-5d49262f2b07 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:18.408684295 +0000 UTC m=+35.749172351 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a4442d85-8b8a-48bf-b06b-5d49262f2b07-original-pull-secret") pod "global-pull-secret-syncer-dvwcd" (UID: "a4442d85-8b8a-48bf-b06b-5d49262f2b07") : object "kube-system"/"original-pull-secret" not registered Apr 17 16:20:16.811329 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.811250 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs\") pod \"network-metrics-daemon-j89hr\" (UID: \"dbd283d5-ff0b-4c8f-b1be-15a75816e953\") " pod="openshift-multus/network-metrics-daemon-j89hr" Apr 17 16:20:16.811329 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:16.811285 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls\") pod \"dns-default-v5rbd\" (UID: \"609a9cbf-301f-406b-a26d-13ae069e0a70\") " pod="openshift-dns/dns-default-v5rbd" Apr 17 16:20:16.811329 ip-10-0-132-179 
kubenswrapper[2572]: I0417 16:20:16.811308 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert\") pod \"ingress-canary-9sgrn\" (UID: \"5caf5aa7-4606-4fa1-8754-cab1cd67eac0\") " pod="openshift-ingress-canary/ingress-canary-9sgrn" Apr 17 16:20:16.811711 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:16.811414 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:20:16.811711 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:16.811417 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:20:16.811711 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:16.811484 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls podName:609a9cbf-301f-406b-a26d-13ae069e0a70 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:17.811462979 +0000 UTC m=+35.151951037 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls") pod "dns-default-v5rbd" (UID: "609a9cbf-301f-406b-a26d-13ae069e0a70") : secret "dns-default-metrics-tls" not found Apr 17 16:20:16.811711 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:16.811486 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:20:16.811711 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:16.811503 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert podName:5caf5aa7-4606-4fa1-8754-cab1cd67eac0 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:20:17.811493842 +0000 UTC m=+35.151981899 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert") pod "ingress-canary-9sgrn" (UID: "5caf5aa7-4606-4fa1-8754-cab1cd67eac0") : secret "canary-serving-cert" not found Apr 17 16:20:16.811711 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:16.811553 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs podName:dbd283d5-ff0b-4c8f-b1be-15a75816e953 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:48.81153526 +0000 UTC m=+66.152023322 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs") pod "network-metrics-daemon-j89hr" (UID: "dbd283d5-ff0b-4c8f-b1be-15a75816e953") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 16:20:17.012730 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:17.012699 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5r4m6\" (UniqueName: \"kubernetes.io/projected/30245442-5a33-4b64-a9be-b62b496e3e7b-kube-api-access-5r4m6\") pod \"network-check-target-xnc8v\" (UID: \"30245442-5a33-4b64-a9be-b62b496e3e7b\") " pod="openshift-network-diagnostics/network-check-target-xnc8v" Apr 17 16:20:17.012881 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:17.012832 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 16:20:17.012881 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:17.012846 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 
16:20:17.012881 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:17.012854 2572 projected.go:194] Error preparing data for projected volume kube-api-access-5r4m6 for pod openshift-network-diagnostics/network-check-target-xnc8v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:20:17.013000 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:17.012904 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30245442-5a33-4b64-a9be-b62b496e3e7b-kube-api-access-5r4m6 podName:30245442-5a33-4b64-a9be-b62b496e3e7b nodeName:}" failed. No retries permitted until 2026-04-17 16:20:49.012890465 +0000 UTC m=+66.353378522 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-5r4m6" (UniqueName: "kubernetes.io/projected/30245442-5a33-4b64-a9be-b62b496e3e7b-kube-api-access-5r4m6") pod "network-check-target-xnc8v" (UID: "30245442-5a33-4b64-a9be-b62b496e3e7b") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 16:20:17.176951 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:17.176927 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dvwcd" Apr 17 16:20:17.176951 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:17.176950 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnc8v" Apr 17 16:20:17.177633 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:17.176934 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-j89hr" Apr 17 16:20:17.179881 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:17.179857 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 16:20:17.180027 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:17.179878 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 16:20:17.180027 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:17.179895 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-59zbx\"" Apr 17 16:20:17.183910 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:17.181696 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 16:20:17.183910 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:17.181838 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 16:20:17.183910 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:17.181876 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qr2mj\"" Apr 17 16:20:17.818061 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:17.818005 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls\") pod \"dns-default-v5rbd\" (UID: \"609a9cbf-301f-406b-a26d-13ae069e0a70\") " pod="openshift-dns/dns-default-v5rbd" Apr 17 16:20:17.818266 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:17.818086 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert\") pod \"ingress-canary-9sgrn\" (UID: \"5caf5aa7-4606-4fa1-8754-cab1cd67eac0\") " pod="openshift-ingress-canary/ingress-canary-9sgrn" Apr 17 16:20:17.818266 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:17.818182 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:20:17.818266 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:17.818232 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:20:17.818266 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:17.818257 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls podName:609a9cbf-301f-406b-a26d-13ae069e0a70 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:19.818236733 +0000 UTC m=+37.158724817 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls") pod "dns-default-v5rbd" (UID: "609a9cbf-301f-406b-a26d-13ae069e0a70") : secret "dns-default-metrics-tls" not found Apr 17 16:20:17.818474 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:17.818278 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert podName:5caf5aa7-4606-4fa1-8754-cab1cd67eac0 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:19.818266589 +0000 UTC m=+37.158754651 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert") pod "ingress-canary-9sgrn" (UID: "5caf5aa7-4606-4fa1-8754-cab1cd67eac0") : secret "canary-serving-cert" not found Apr 17 16:20:18.422086 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:18.422045 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a4442d85-8b8a-48bf-b06b-5d49262f2b07-original-pull-secret\") pod \"global-pull-secret-syncer-dvwcd\" (UID: \"a4442d85-8b8a-48bf-b06b-5d49262f2b07\") " pod="kube-system/global-pull-secret-syncer-dvwcd" Apr 17 16:20:18.424642 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:18.424605 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a4442d85-8b8a-48bf-b06b-5d49262f2b07-original-pull-secret\") pod \"global-pull-secret-syncer-dvwcd\" (UID: \"a4442d85-8b8a-48bf-b06b-5d49262f2b07\") " pod="kube-system/global-pull-secret-syncer-dvwcd" Apr 17 16:20:18.689510 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:18.689420 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dvwcd" Apr 17 16:20:19.275151 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:19.274929 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dvwcd"] Apr 17 16:20:19.300844 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:20:19.300810 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4442d85_8b8a_48bf_b06b_5d49262f2b07.slice/crio-309d825cc0bf53b0cdc1f3b55909de6e391f5a6c386c701ed7aa67deb013914c WatchSource:0}: Error finding container 309d825cc0bf53b0cdc1f3b55909de6e391f5a6c386c701ed7aa67deb013914c: Status 404 returned error can't find the container with id 309d825cc0bf53b0cdc1f3b55909de6e391f5a6c386c701ed7aa67deb013914c Apr 17 16:20:19.309376 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:19.309344 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dvwcd" event={"ID":"a4442d85-8b8a-48bf-b06b-5d49262f2b07","Type":"ContainerStarted","Data":"309d825cc0bf53b0cdc1f3b55909de6e391f5a6c386c701ed7aa67deb013914c"} Apr 17 16:20:19.833514 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:19.833426 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls\") pod \"dns-default-v5rbd\" (UID: \"609a9cbf-301f-406b-a26d-13ae069e0a70\") " pod="openshift-dns/dns-default-v5rbd" Apr 17 16:20:19.834013 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:19.833576 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:20:19.834013 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:19.833584 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert\") 
pod \"ingress-canary-9sgrn\" (UID: \"5caf5aa7-4606-4fa1-8754-cab1cd67eac0\") " pod="openshift-ingress-canary/ingress-canary-9sgrn" Apr 17 16:20:19.834013 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:19.833644 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls podName:609a9cbf-301f-406b-a26d-13ae069e0a70 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:23.833624663 +0000 UTC m=+41.174112726 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls") pod "dns-default-v5rbd" (UID: "609a9cbf-301f-406b-a26d-13ae069e0a70") : secret "dns-default-metrics-tls" not found Apr 17 16:20:19.834013 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:19.833694 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:20:19.834013 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:19.833757 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert podName:5caf5aa7-4606-4fa1-8754-cab1cd67eac0 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:23.833740425 +0000 UTC m=+41.174228489 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert") pod "ingress-canary-9sgrn" (UID: "5caf5aa7-4606-4fa1-8754-cab1cd67eac0") : secret "canary-serving-cert" not found Apr 17 16:20:20.313595 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:20.313562 2572 generic.go:358] "Generic (PLEG): container finished" podID="8de8591f-0659-4b29-abd0-982ba1568fa2" containerID="85a0604b1be6531fb9f65934e2db2f027d223a9203e807624f9d606e4386e210" exitCode=0 Apr 17 16:20:20.313757 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:20.313615 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vt8g2" event={"ID":"8de8591f-0659-4b29-abd0-982ba1568fa2","Type":"ContainerDied","Data":"85a0604b1be6531fb9f65934e2db2f027d223a9203e807624f9d606e4386e210"} Apr 17 16:20:21.318868 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:21.318830 2572 generic.go:358] "Generic (PLEG): container finished" podID="8de8591f-0659-4b29-abd0-982ba1568fa2" containerID="dd2a8dc290a4260942b7485fc9e898632d5b351b7dfd872065530e048f6803fa" exitCode=0 Apr 17 16:20:21.319309 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:21.318886 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vt8g2" event={"ID":"8de8591f-0659-4b29-abd0-982ba1568fa2","Type":"ContainerDied","Data":"dd2a8dc290a4260942b7485fc9e898632d5b351b7dfd872065530e048f6803fa"} Apr 17 16:20:22.326530 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:22.326315 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vt8g2" event={"ID":"8de8591f-0659-4b29-abd0-982ba1568fa2","Type":"ContainerStarted","Data":"60639036e5fce2de14947c8db7fa52782c7870e40ed2aab846a374371c36640a"} Apr 17 16:20:22.349619 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:22.349562 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-additional-cni-plugins-vt8g2" podStartSLOduration=5.920079086 podStartE2EDuration="39.349545141s" podCreationTimestamp="2026-04-17 16:19:43 +0000 UTC" firstStartedPulling="2026-04-17 16:19:45.899566307 +0000 UTC m=+3.240054367" lastFinishedPulling="2026-04-17 16:20:19.329032364 +0000 UTC m=+36.669520422" observedRunningTime="2026-04-17 16:20:22.347503213 +0000 UTC m=+39.687991302" watchObservedRunningTime="2026-04-17 16:20:22.349545141 +0000 UTC m=+39.690033213" Apr 17 16:20:23.861571 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:23.861529 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls\") pod \"dns-default-v5rbd\" (UID: \"609a9cbf-301f-406b-a26d-13ae069e0a70\") " pod="openshift-dns/dns-default-v5rbd" Apr 17 16:20:23.861571 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:23.861575 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert\") pod \"ingress-canary-9sgrn\" (UID: \"5caf5aa7-4606-4fa1-8754-cab1cd67eac0\") " pod="openshift-ingress-canary/ingress-canary-9sgrn" Apr 17 16:20:23.862053 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:23.861682 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:20:23.862053 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:23.861704 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:20:23.862053 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:23.861732 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert podName:5caf5aa7-4606-4fa1-8754-cab1cd67eac0 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:20:31.861719387 +0000 UTC m=+49.202207444 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert") pod "ingress-canary-9sgrn" (UID: "5caf5aa7-4606-4fa1-8754-cab1cd67eac0") : secret "canary-serving-cert" not found Apr 17 16:20:23.862053 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:23.861775 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls podName:609a9cbf-301f-406b-a26d-13ae069e0a70 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:31.861756824 +0000 UTC m=+49.202244881 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls") pod "dns-default-v5rbd" (UID: "609a9cbf-301f-406b-a26d-13ae069e0a70") : secret "dns-default-metrics-tls" not found Apr 17 16:20:24.331161 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:24.331127 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dvwcd" event={"ID":"a4442d85-8b8a-48bf-b06b-5d49262f2b07","Type":"ContainerStarted","Data":"a0eae73e0edf4f19cacf12088d9936c9767bf61a29f4d8c6f10433c2ce21a19b"} Apr 17 16:20:31.920338 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:31.920290 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls\") pod \"dns-default-v5rbd\" (UID: \"609a9cbf-301f-406b-a26d-13ae069e0a70\") " pod="openshift-dns/dns-default-v5rbd" Apr 17 16:20:31.920338 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:31.920343 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert\") pod 
\"ingress-canary-9sgrn\" (UID: \"5caf5aa7-4606-4fa1-8754-cab1cd67eac0\") " pod="openshift-ingress-canary/ingress-canary-9sgrn" Apr 17 16:20:31.920934 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:31.920455 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:20:31.920934 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:31.920472 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:20:31.920934 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:31.920521 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls podName:609a9cbf-301f-406b-a26d-13ae069e0a70 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:47.920505179 +0000 UTC m=+65.260993237 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls") pod "dns-default-v5rbd" (UID: "609a9cbf-301f-406b-a26d-13ae069e0a70") : secret "dns-default-metrics-tls" not found Apr 17 16:20:31.920934 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:31.920791 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert podName:5caf5aa7-4606-4fa1-8754-cab1cd67eac0 nodeName:}" failed. No retries permitted until 2026-04-17 16:20:47.920661035 +0000 UTC m=+65.261149092 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert") pod "ingress-canary-9sgrn" (UID: "5caf5aa7-4606-4fa1-8754-cab1cd67eac0") : secret "canary-serving-cert" not found Apr 17 16:20:43.310097 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:43.310055 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nhjqx" Apr 17 16:20:43.336355 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:43.336309 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-dvwcd" podStartSLOduration=25.421845015 podStartE2EDuration="29.336295331s" podCreationTimestamp="2026-04-17 16:20:14 +0000 UTC" firstStartedPulling="2026-04-17 16:20:19.306941314 +0000 UTC m=+36.647429372" lastFinishedPulling="2026-04-17 16:20:23.221391626 +0000 UTC m=+40.561879688" observedRunningTime="2026-04-17 16:20:24.347211734 +0000 UTC m=+41.687699814" watchObservedRunningTime="2026-04-17 16:20:43.336295331 +0000 UTC m=+60.676783409" Apr 17 16:20:47.925489 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:47.925447 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls\") pod \"dns-default-v5rbd\" (UID: \"609a9cbf-301f-406b-a26d-13ae069e0a70\") " pod="openshift-dns/dns-default-v5rbd" Apr 17 16:20:47.925896 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:47.925497 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert\") pod \"ingress-canary-9sgrn\" (UID: \"5caf5aa7-4606-4fa1-8754-cab1cd67eac0\") " pod="openshift-ingress-canary/ingress-canary-9sgrn" Apr 17 16:20:47.925896 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:47.925604 2572 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:20:47.925896 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:47.925664 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert podName:5caf5aa7-4606-4fa1-8754-cab1cd67eac0 nodeName:}" failed. No retries permitted until 2026-04-17 16:21:19.92564919 +0000 UTC m=+97.266137247 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert") pod "ingress-canary-9sgrn" (UID: "5caf5aa7-4606-4fa1-8754-cab1cd67eac0") : secret "canary-serving-cert" not found Apr 17 16:20:47.925896 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:47.925605 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:20:47.925896 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:47.925748 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls podName:609a9cbf-301f-406b-a26d-13ae069e0a70 nodeName:}" failed. No retries permitted until 2026-04-17 16:21:19.92573333 +0000 UTC m=+97.266221393 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls") pod "dns-default-v5rbd" (UID: "609a9cbf-301f-406b-a26d-13ae069e0a70") : secret "dns-default-metrics-tls" not found Apr 17 16:20:48.833660 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:48.833621 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs\") pod \"network-metrics-daemon-j89hr\" (UID: \"dbd283d5-ff0b-4c8f-b1be-15a75816e953\") " pod="openshift-multus/network-metrics-daemon-j89hr" Apr 17 16:20:48.836344 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:48.836325 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 16:20:48.844548 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:48.844530 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 16:20:48.844607 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:20:48.844593 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs podName:dbd283d5-ff0b-4c8f-b1be-15a75816e953 nodeName:}" failed. No retries permitted until 2026-04-17 16:21:52.844571188 +0000 UTC m=+130.185059257 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs") pod "network-metrics-daemon-j89hr" (UID: "dbd283d5-ff0b-4c8f-b1be-15a75816e953") : secret "metrics-daemon-secret" not found Apr 17 16:20:49.034967 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:49.034917 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5r4m6\" (UniqueName: \"kubernetes.io/projected/30245442-5a33-4b64-a9be-b62b496e3e7b-kube-api-access-5r4m6\") pod \"network-check-target-xnc8v\" (UID: \"30245442-5a33-4b64-a9be-b62b496e3e7b\") " pod="openshift-network-diagnostics/network-check-target-xnc8v" Apr 17 16:20:49.037714 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:49.037694 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 16:20:49.047809 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:49.047788 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 16:20:49.059659 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:49.059631 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r4m6\" (UniqueName: \"kubernetes.io/projected/30245442-5a33-4b64-a9be-b62b496e3e7b-kube-api-access-5r4m6\") pod \"network-check-target-xnc8v\" (UID: \"30245442-5a33-4b64-a9be-b62b496e3e7b\") " pod="openshift-network-diagnostics/network-check-target-xnc8v" Apr 17 16:20:49.304151 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:49.304113 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-59zbx\"" Apr 17 16:20:49.311788 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:49.311763 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xnc8v" Apr 17 16:20:49.424942 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:49.424913 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xnc8v"] Apr 17 16:20:49.429248 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:20:49.429222 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30245442_5a33_4b64_a9be_b62b496e3e7b.slice/crio-e634a1790f4e344f622ef268f2c50cab336d083a34014e6008d05b1ed37af42d WatchSource:0}: Error finding container e634a1790f4e344f622ef268f2c50cab336d083a34014e6008d05b1ed37af42d: Status 404 returned error can't find the container with id e634a1790f4e344f622ef268f2c50cab336d083a34014e6008d05b1ed37af42d Apr 17 16:20:50.382272 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:50.382235 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xnc8v" event={"ID":"30245442-5a33-4b64-a9be-b62b496e3e7b","Type":"ContainerStarted","Data":"e634a1790f4e344f622ef268f2c50cab336d083a34014e6008d05b1ed37af42d"} Apr 17 16:20:53.389912 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:53.389880 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xnc8v" event={"ID":"30245442-5a33-4b64-a9be-b62b496e3e7b","Type":"ContainerStarted","Data":"f81f9c580263a76fd9441f8c7ab95cb401de8ff395b50ddc153bc7343135ced4"} Apr 17 16:20:53.390317 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:20:53.390050 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-xnc8v" Apr 17 16:21:19.958424 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:19.958281 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls\") pod \"dns-default-v5rbd\" (UID: \"609a9cbf-301f-406b-a26d-13ae069e0a70\") " pod="openshift-dns/dns-default-v5rbd" Apr 17 16:21:19.958424 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:19.958337 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert\") pod \"ingress-canary-9sgrn\" (UID: \"5caf5aa7-4606-4fa1-8754-cab1cd67eac0\") " pod="openshift-ingress-canary/ingress-canary-9sgrn" Apr 17 16:21:19.958916 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:19.958445 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 16:21:19.958916 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:19.958494 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert podName:5caf5aa7-4606-4fa1-8754-cab1cd67eac0 nodeName:}" failed. No retries permitted until 2026-04-17 16:22:23.958480036 +0000 UTC m=+161.298968093 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert") pod "ingress-canary-9sgrn" (UID: "5caf5aa7-4606-4fa1-8754-cab1cd67eac0") : secret "canary-serving-cert" not found Apr 17 16:21:19.958916 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:19.958445 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 16:21:19.958916 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:19.958591 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls podName:609a9cbf-301f-406b-a26d-13ae069e0a70 nodeName:}" failed. 
No retries permitted until 2026-04-17 16:22:23.958575843 +0000 UTC m=+161.299063900 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls") pod "dns-default-v5rbd" (UID: "609a9cbf-301f-406b-a26d-13ae069e0a70") : secret "dns-default-metrics-tls" not found Apr 17 16:21:24.394504 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:24.394473 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xnc8v" Apr 17 16:21:24.410020 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:24.409975 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-xnc8v" podStartSLOduration=98.3387622 podStartE2EDuration="1m41.409959383s" podCreationTimestamp="2026-04-17 16:19:43 +0000 UTC" firstStartedPulling="2026-04-17 16:20:49.431025872 +0000 UTC m=+66.771513930" lastFinishedPulling="2026-04-17 16:20:52.502223053 +0000 UTC m=+69.842711113" observedRunningTime="2026-04-17 16:20:53.404579733 +0000 UTC m=+70.745067811" watchObservedRunningTime="2026-04-17 16:21:24.409959383 +0000 UTC m=+101.750447464" Apr 17 16:21:30.084065 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.084027 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-mqbqx"] Apr 17 16:21:30.088740 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.088717 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5948c7b7c8-rmrfc"] Apr 17 16:21:30.088897 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.088879 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-mqbqx" Apr 17 16:21:30.091781 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.091755 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5948c7b7c8-rmrfc" Apr 17 16:21:30.091891 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.091842 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-7m68n\"" Apr 17 16:21:30.092352 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.092333 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 17 16:21:30.092451 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.092338 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 16:21:30.092451 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.092375 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 16:21:30.092451 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.092392 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 17 16:21:30.096123 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.096101 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-mqbqx"] Apr 17 16:21:30.096465 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.096446 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 16:21:30.096637 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.096617 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 17 16:21:30.096710 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.096653 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 17 16:21:30.096710 
ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.096483 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 17 16:21:30.096810 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.096624 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 16:21:30.096810 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.096800 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-dfq4x\"" Apr 17 16:21:30.096909 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.096455 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 17 16:21:30.098955 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.098934 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 17 16:21:30.101543 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.101522 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5948c7b7c8-rmrfc"] Apr 17 16:21:30.230183 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.230148 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80fc0241-d0ad-42e2-9e15-932722a75ffa-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-mqbqx\" (UID: \"80fc0241-d0ad-42e2-9e15-932722a75ffa\") " pod="openshift-insights/insights-operator-585dfdc468-mqbqx" Apr 17 16:21:30.230183 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.230180 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80fc0241-d0ad-42e2-9e15-932722a75ffa-service-ca-bundle\") pod 
\"insights-operator-585dfdc468-mqbqx\" (UID: \"80fc0241-d0ad-42e2-9e15-932722a75ffa\") " pod="openshift-insights/insights-operator-585dfdc468-mqbqx" Apr 17 16:21:30.230378 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.230200 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsc8r\" (UniqueName: \"kubernetes.io/projected/b5767428-43d6-4cbe-9763-0731e126b82c-kube-api-access-hsc8r\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc" Apr 17 16:21:30.230378 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.230259 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-stats-auth\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc" Apr 17 16:21:30.230378 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.230296 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5767428-43d6-4cbe-9763-0731e126b82c-service-ca-bundle\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc" Apr 17 16:21:30.230378 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.230321 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-default-certificate\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc" Apr 17 16:21:30.230378 ip-10-0-132-179 kubenswrapper[2572]: I0417 
16:21:30.230359 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80fc0241-d0ad-42e2-9e15-932722a75ffa-serving-cert\") pod \"insights-operator-585dfdc468-mqbqx\" (UID: \"80fc0241-d0ad-42e2-9e15-932722a75ffa\") " pod="openshift-insights/insights-operator-585dfdc468-mqbqx" Apr 17 16:21:30.230524 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.230387 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/80fc0241-d0ad-42e2-9e15-932722a75ffa-snapshots\") pod \"insights-operator-585dfdc468-mqbqx\" (UID: \"80fc0241-d0ad-42e2-9e15-932722a75ffa\") " pod="openshift-insights/insights-operator-585dfdc468-mqbqx" Apr 17 16:21:30.230524 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.230421 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/80fc0241-d0ad-42e2-9e15-932722a75ffa-tmp\") pod \"insights-operator-585dfdc468-mqbqx\" (UID: \"80fc0241-d0ad-42e2-9e15-932722a75ffa\") " pod="openshift-insights/insights-operator-585dfdc468-mqbqx" Apr 17 16:21:30.230524 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.230438 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf7wn\" (UniqueName: \"kubernetes.io/projected/80fc0241-d0ad-42e2-9e15-932722a75ffa-kube-api-access-vf7wn\") pod \"insights-operator-585dfdc468-mqbqx\" (UID: \"80fc0241-d0ad-42e2-9e15-932722a75ffa\") " pod="openshift-insights/insights-operator-585dfdc468-mqbqx" Apr 17 16:21:30.230524 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.230455 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-metrics-certs\") pod 
\"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc" Apr 17 16:21:30.330918 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.330880 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/80fc0241-d0ad-42e2-9e15-932722a75ffa-snapshots\") pod \"insights-operator-585dfdc468-mqbqx\" (UID: \"80fc0241-d0ad-42e2-9e15-932722a75ffa\") " pod="openshift-insights/insights-operator-585dfdc468-mqbqx" Apr 17 16:21:30.331179 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.331001 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/80fc0241-d0ad-42e2-9e15-932722a75ffa-tmp\") pod \"insights-operator-585dfdc468-mqbqx\" (UID: \"80fc0241-d0ad-42e2-9e15-932722a75ffa\") " pod="openshift-insights/insights-operator-585dfdc468-mqbqx" Apr 17 16:21:30.331179 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.331038 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vf7wn\" (UniqueName: \"kubernetes.io/projected/80fc0241-d0ad-42e2-9e15-932722a75ffa-kube-api-access-vf7wn\") pod \"insights-operator-585dfdc468-mqbqx\" (UID: \"80fc0241-d0ad-42e2-9e15-932722a75ffa\") " pod="openshift-insights/insights-operator-585dfdc468-mqbqx" Apr 17 16:21:30.331179 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.331058 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-metrics-certs\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc" Apr 17 16:21:30.331179 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.331110 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80fc0241-d0ad-42e2-9e15-932722a75ffa-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-mqbqx\" (UID: \"80fc0241-d0ad-42e2-9e15-932722a75ffa\") " pod="openshift-insights/insights-operator-585dfdc468-mqbqx" Apr 17 16:21:30.331383 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:30.331213 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 16:21:30.331383 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.331281 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80fc0241-d0ad-42e2-9e15-932722a75ffa-service-ca-bundle\") pod \"insights-operator-585dfdc468-mqbqx\" (UID: \"80fc0241-d0ad-42e2-9e15-932722a75ffa\") " pod="openshift-insights/insights-operator-585dfdc468-mqbqx" Apr 17 16:21:30.331383 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:30.331295 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-metrics-certs podName:b5767428-43d6-4cbe-9763-0731e126b82c nodeName:}" failed. No retries permitted until 2026-04-17 16:21:30.831271035 +0000 UTC m=+108.171759109 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-metrics-certs") pod "router-default-5948c7b7c8-rmrfc" (UID: "b5767428-43d6-4cbe-9763-0731e126b82c") : secret "router-metrics-certs-default" not found Apr 17 16:21:30.331383 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.331341 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsc8r\" (UniqueName: \"kubernetes.io/projected/b5767428-43d6-4cbe-9763-0731e126b82c-kube-api-access-hsc8r\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc" Apr 17 16:21:30.331618 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.331458 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-stats-auth\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc" Apr 17 16:21:30.331618 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.331507 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5767428-43d6-4cbe-9763-0731e126b82c-service-ca-bundle\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc" Apr 17 16:21:30.331618 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.331551 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-default-certificate\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc" Apr 17 
16:21:30.331618 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.331592 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80fc0241-d0ad-42e2-9e15-932722a75ffa-serving-cert\") pod \"insights-operator-585dfdc468-mqbqx\" (UID: \"80fc0241-d0ad-42e2-9e15-932722a75ffa\") " pod="openshift-insights/insights-operator-585dfdc468-mqbqx" Apr 17 16:21:30.331618 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.331598 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/80fc0241-d0ad-42e2-9e15-932722a75ffa-tmp\") pod \"insights-operator-585dfdc468-mqbqx\" (UID: \"80fc0241-d0ad-42e2-9e15-932722a75ffa\") " pod="openshift-insights/insights-operator-585dfdc468-mqbqx" Apr 17 16:21:30.331944 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.331878 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80fc0241-d0ad-42e2-9e15-932722a75ffa-service-ca-bundle\") pod \"insights-operator-585dfdc468-mqbqx\" (UID: \"80fc0241-d0ad-42e2-9e15-932722a75ffa\") " pod="openshift-insights/insights-operator-585dfdc468-mqbqx" Apr 17 16:21:30.332195 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:30.332047 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b5767428-43d6-4cbe-9763-0731e126b82c-service-ca-bundle podName:b5767428-43d6-4cbe-9763-0731e126b82c nodeName:}" failed. No retries permitted until 2026-04-17 16:21:30.83202882 +0000 UTC m=+108.172516880 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b5767428-43d6-4cbe-9763-0731e126b82c-service-ca-bundle") pod "router-default-5948c7b7c8-rmrfc" (UID: "b5767428-43d6-4cbe-9763-0731e126b82c") : configmap references non-existent config key: service-ca.crt Apr 17 16:21:30.332295 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.332200 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/80fc0241-d0ad-42e2-9e15-932722a75ffa-snapshots\") pod \"insights-operator-585dfdc468-mqbqx\" (UID: \"80fc0241-d0ad-42e2-9e15-932722a75ffa\") " pod="openshift-insights/insights-operator-585dfdc468-mqbqx" Apr 17 16:21:30.332547 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.332528 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80fc0241-d0ad-42e2-9e15-932722a75ffa-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-mqbqx\" (UID: \"80fc0241-d0ad-42e2-9e15-932722a75ffa\") " pod="openshift-insights/insights-operator-585dfdc468-mqbqx" Apr 17 16:21:30.334137 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.334061 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80fc0241-d0ad-42e2-9e15-932722a75ffa-serving-cert\") pod \"insights-operator-585dfdc468-mqbqx\" (UID: \"80fc0241-d0ad-42e2-9e15-932722a75ffa\") " pod="openshift-insights/insights-operator-585dfdc468-mqbqx" Apr 17 16:21:30.334210 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.334153 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-default-certificate\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc" Apr 17 
16:21:30.334210 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.334159 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-stats-auth\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc" Apr 17 16:21:30.339987 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.339958 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsc8r\" (UniqueName: \"kubernetes.io/projected/b5767428-43d6-4cbe-9763-0731e126b82c-kube-api-access-hsc8r\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc" Apr 17 16:21:30.340452 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.340426 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf7wn\" (UniqueName: \"kubernetes.io/projected/80fc0241-d0ad-42e2-9e15-932722a75ffa-kube-api-access-vf7wn\") pod \"insights-operator-585dfdc468-mqbqx\" (UID: \"80fc0241-d0ad-42e2-9e15-932722a75ffa\") " pod="openshift-insights/insights-operator-585dfdc468-mqbqx" Apr 17 16:21:30.401474 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.401430 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-mqbqx" Apr 17 16:21:30.514881 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.514850 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-mqbqx"] Apr 17 16:21:30.518044 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:21:30.518021 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80fc0241_d0ad_42e2_9e15_932722a75ffa.slice/crio-11b7f18df8d28d82192b85d5c2aea2647383fe0be68ba55a0473ba296861efad WatchSource:0}: Error finding container 11b7f18df8d28d82192b85d5c2aea2647383fe0be68ba55a0473ba296861efad: Status 404 returned error can't find the container with id 11b7f18df8d28d82192b85d5c2aea2647383fe0be68ba55a0473ba296861efad Apr 17 16:21:30.834563 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.834506 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5767428-43d6-4cbe-9763-0731e126b82c-service-ca-bundle\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc" Apr 17 16:21:30.834757 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:30.834593 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-metrics-certs\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc" Apr 17 16:21:30.834757 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:30.834658 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b5767428-43d6-4cbe-9763-0731e126b82c-service-ca-bundle podName:b5767428-43d6-4cbe-9763-0731e126b82c nodeName:}" failed. 
No retries permitted until 2026-04-17 16:21:31.834639261 +0000 UTC m=+109.175127318 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b5767428-43d6-4cbe-9763-0731e126b82c-service-ca-bundle") pod "router-default-5948c7b7c8-rmrfc" (UID: "b5767428-43d6-4cbe-9763-0731e126b82c") : configmap references non-existent config key: service-ca.crt Apr 17 16:21:30.834757 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:30.834701 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 16:21:30.834757 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:30.834752 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-metrics-certs podName:b5767428-43d6-4cbe-9763-0731e126b82c nodeName:}" failed. No retries permitted until 2026-04-17 16:21:31.83474111 +0000 UTC m=+109.175229168 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-metrics-certs") pod "router-default-5948c7b7c8-rmrfc" (UID: "b5767428-43d6-4cbe-9763-0731e126b82c") : secret "router-metrics-certs-default" not found Apr 17 16:21:31.461607 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:31.461568 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-mqbqx" event={"ID":"80fc0241-d0ad-42e2-9e15-932722a75ffa","Type":"ContainerStarted","Data":"11b7f18df8d28d82192b85d5c2aea2647383fe0be68ba55a0473ba296861efad"} Apr 17 16:21:31.842052 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:31.841967 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5767428-43d6-4cbe-9763-0731e126b82c-service-ca-bundle\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc" Apr 17 16:21:31.842052 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:31.842038 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-metrics-certs\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc" Apr 17 16:21:31.842276 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:31.842184 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 16:21:31.842276 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:31.842197 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b5767428-43d6-4cbe-9763-0731e126b82c-service-ca-bundle podName:b5767428-43d6-4cbe-9763-0731e126b82c nodeName:}" failed. 
No retries permitted until 2026-04-17 16:21:33.84217336 +0000 UTC m=+111.182661422 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b5767428-43d6-4cbe-9763-0731e126b82c-service-ca-bundle") pod "router-default-5948c7b7c8-rmrfc" (UID: "b5767428-43d6-4cbe-9763-0731e126b82c") : configmap references non-existent config key: service-ca.crt Apr 17 16:21:31.842276 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:31.842249 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-metrics-certs podName:b5767428-43d6-4cbe-9763-0731e126b82c nodeName:}" failed. No retries permitted until 2026-04-17 16:21:33.842231926 +0000 UTC m=+111.182719992 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-metrics-certs") pod "router-default-5948c7b7c8-rmrfc" (UID: "b5767428-43d6-4cbe-9763-0731e126b82c") : secret "router-metrics-certs-default" not found Apr 17 16:21:33.466746 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:33.466655 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-mqbqx" event={"ID":"80fc0241-d0ad-42e2-9e15-932722a75ffa","Type":"ContainerStarted","Data":"d695306fbba55eac2488dd6b4c1f61df7728455b40f4423ae5fb67bb54e391f2"} Apr 17 16:21:33.483115 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:33.483049 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-mqbqx" podStartSLOduration=0.814078204 podStartE2EDuration="3.483036518s" podCreationTimestamp="2026-04-17 16:21:30 +0000 UTC" firstStartedPulling="2026-04-17 16:21:30.51973045 +0000 UTC m=+107.860218510" lastFinishedPulling="2026-04-17 16:21:33.188688766 +0000 UTC m=+110.529176824" observedRunningTime="2026-04-17 16:21:33.481947601 +0000 UTC 
m=+110.822435693" watchObservedRunningTime="2026-04-17 16:21:33.483036518 +0000 UTC m=+110.823524640" Apr 17 16:21:33.858593 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:33.858492 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5767428-43d6-4cbe-9763-0731e126b82c-service-ca-bundle\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc" Apr 17 16:21:33.858593 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:33.858561 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-metrics-certs\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc" Apr 17 16:21:33.858829 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:33.858643 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 16:21:33.858829 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:33.858691 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b5767428-43d6-4cbe-9763-0731e126b82c-service-ca-bundle podName:b5767428-43d6-4cbe-9763-0731e126b82c nodeName:}" failed. No retries permitted until 2026-04-17 16:21:37.858669761 +0000 UTC m=+115.199157823 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b5767428-43d6-4cbe-9763-0731e126b82c-service-ca-bundle") pod "router-default-5948c7b7c8-rmrfc" (UID: "b5767428-43d6-4cbe-9763-0731e126b82c") : configmap references non-existent config key: service-ca.crt Apr 17 16:21:33.858829 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:33.858725 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-metrics-certs podName:b5767428-43d6-4cbe-9763-0731e126b82c nodeName:}" failed. No retries permitted until 2026-04-17 16:21:37.858715596 +0000 UTC m=+115.199203654 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-metrics-certs") pod "router-default-5948c7b7c8-rmrfc" (UID: "b5767428-43d6-4cbe-9763-0731e126b82c") : secret "router-metrics-certs-default" not found Apr 17 16:21:36.717193 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:36.717164 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-s4nbk_b0030b4f-f856-49ec-87a6-eca6a00291ad/dns-node-resolver/0.log" Apr 17 16:21:37.068587 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:37.068506 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ljhb6"] Apr 17 16:21:37.071551 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:37.071535 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" Apr 17 16:21:37.074331 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:37.074312 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-qzrs8\"" Apr 17 16:21:37.074445 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:37.074333 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:21:37.075553 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:37.075524 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 17 16:21:37.075553 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:37.075543 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 17 16:21:37.075736 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:37.075582 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 17 16:21:37.080694 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:37.080675 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 17 16:21:37.081003 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:37.080983 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ljhb6"] Apr 17 16:21:37.182280 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:37.182252 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72bbd866-8c40-48e3-9eb3-b34ae76679de-trusted-ca\") pod \"console-operator-9d4b6777b-ljhb6\" (UID: \"72bbd866-8c40-48e3-9eb3-b34ae76679de\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" Apr 17 16:21:37.182471 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:37.182314 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72bbd866-8c40-48e3-9eb3-b34ae76679de-serving-cert\") pod \"console-operator-9d4b6777b-ljhb6\" (UID: \"72bbd866-8c40-48e3-9eb3-b34ae76679de\") " pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" Apr 17 16:21:37.182471 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:37.182378 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6bjh\" (UniqueName: \"kubernetes.io/projected/72bbd866-8c40-48e3-9eb3-b34ae76679de-kube-api-access-m6bjh\") pod \"console-operator-9d4b6777b-ljhb6\" (UID: \"72bbd866-8c40-48e3-9eb3-b34ae76679de\") " pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" Apr 17 16:21:37.182471 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:37.182439 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72bbd866-8c40-48e3-9eb3-b34ae76679de-config\") pod \"console-operator-9d4b6777b-ljhb6\" (UID: \"72bbd866-8c40-48e3-9eb3-b34ae76679de\") " pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" Apr 17 16:21:37.282773 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:37.282742 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72bbd866-8c40-48e3-9eb3-b34ae76679de-trusted-ca\") pod \"console-operator-9d4b6777b-ljhb6\" (UID: \"72bbd866-8c40-48e3-9eb3-b34ae76679de\") " pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" Apr 17 16:21:37.282909 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:37.282826 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72bbd866-8c40-48e3-9eb3-b34ae76679de-serving-cert\") pod \"console-operator-9d4b6777b-ljhb6\" (UID: \"72bbd866-8c40-48e3-9eb3-b34ae76679de\") " pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" Apr 17 16:21:37.282909 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:37.282855 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6bjh\" (UniqueName: \"kubernetes.io/projected/72bbd866-8c40-48e3-9eb3-b34ae76679de-kube-api-access-m6bjh\") pod \"console-operator-9d4b6777b-ljhb6\" (UID: \"72bbd866-8c40-48e3-9eb3-b34ae76679de\") " pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" Apr 17 16:21:37.282909 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:37.282897 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72bbd866-8c40-48e3-9eb3-b34ae76679de-config\") pod \"console-operator-9d4b6777b-ljhb6\" (UID: \"72bbd866-8c40-48e3-9eb3-b34ae76679de\") " pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" Apr 17 16:21:37.283556 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:37.283529 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72bbd866-8c40-48e3-9eb3-b34ae76679de-config\") pod \"console-operator-9d4b6777b-ljhb6\" (UID: \"72bbd866-8c40-48e3-9eb3-b34ae76679de\") " pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" Apr 17 16:21:37.283672 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:37.283558 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/72bbd866-8c40-48e3-9eb3-b34ae76679de-trusted-ca\") pod \"console-operator-9d4b6777b-ljhb6\" (UID: \"72bbd866-8c40-48e3-9eb3-b34ae76679de\") " pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" Apr 17 16:21:37.285281 ip-10-0-132-179 
kubenswrapper[2572]: I0417 16:21:37.285264 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72bbd866-8c40-48e3-9eb3-b34ae76679de-serving-cert\") pod \"console-operator-9d4b6777b-ljhb6\" (UID: \"72bbd866-8c40-48e3-9eb3-b34ae76679de\") " pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" Apr 17 16:21:37.291226 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:37.291199 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6bjh\" (UniqueName: \"kubernetes.io/projected/72bbd866-8c40-48e3-9eb3-b34ae76679de-kube-api-access-m6bjh\") pod \"console-operator-9d4b6777b-ljhb6\" (UID: \"72bbd866-8c40-48e3-9eb3-b34ae76679de\") " pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" Apr 17 16:21:37.381493 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:37.381464 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" Apr 17 16:21:37.494439 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:37.494376 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ljhb6"] Apr 17 16:21:37.497190 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:21:37.497163 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72bbd866_8c40_48e3_9eb3_b34ae76679de.slice/crio-a525bc1720c79ba5348398444d869a365c9ece6d4535add977d4d158c17eb947 WatchSource:0}: Error finding container a525bc1720c79ba5348398444d869a365c9ece6d4535add977d4d158c17eb947: Status 404 returned error can't find the container with id a525bc1720c79ba5348398444d869a365c9ece6d4535add977d4d158c17eb947 Apr 17 16:21:37.516829 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:37.516810 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_node-ca-fwdvx_ef613c01-eb3a-451b-b2f2-9eee9ab808cc/node-ca/0.log" Apr 17 16:21:37.886186 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:37.886156 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-metrics-certs\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc" Apr 17 16:21:37.886621 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:37.886231 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5767428-43d6-4cbe-9763-0731e126b82c-service-ca-bundle\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc" Apr 17 16:21:37.886621 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:37.886311 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 17 16:21:37.886621 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:37.886380 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b5767428-43d6-4cbe-9763-0731e126b82c-service-ca-bundle podName:b5767428-43d6-4cbe-9763-0731e126b82c nodeName:}" failed. No retries permitted until 2026-04-17 16:21:45.886359434 +0000 UTC m=+123.226847494 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b5767428-43d6-4cbe-9763-0731e126b82c-service-ca-bundle") pod "router-default-5948c7b7c8-rmrfc" (UID: "b5767428-43d6-4cbe-9763-0731e126b82c") : configmap references non-existent config key: service-ca.crt Apr 17 16:21:37.886621 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:37.886408 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-metrics-certs podName:b5767428-43d6-4cbe-9763-0731e126b82c nodeName:}" failed. No retries permitted until 2026-04-17 16:21:45.886398183 +0000 UTC m=+123.226886243 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-metrics-certs") pod "router-default-5948c7b7c8-rmrfc" (UID: "b5767428-43d6-4cbe-9763-0731e126b82c") : secret "router-metrics-certs-default" not found Apr 17 16:21:38.478492 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:38.478452 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" event={"ID":"72bbd866-8c40-48e3-9eb3-b34ae76679de","Type":"ContainerStarted","Data":"a525bc1720c79ba5348398444d869a365c9ece6d4535add977d4d158c17eb947"} Apr 17 16:21:40.112671 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:40.112591 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z4bpg"] Apr 17 16:21:40.115475 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:40.115457 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z4bpg" Apr 17 16:21:40.120003 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:40.119978 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 17 16:21:40.120143 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:40.119982 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 17 16:21:40.120143 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:40.119979 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-8d7fh\"" Apr 17 16:21:40.120688 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:40.120670 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:21:40.120790 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:40.120760 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 17 16:21:40.125759 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:40.125737 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z4bpg"] Apr 17 16:21:40.206846 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:40.206810 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4eb06b52-b2db-4c75-8034-44b127e20319-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-z4bpg\" (UID: \"4eb06b52-b2db-4c75-8034-44b127e20319\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z4bpg" Apr 17 16:21:40.206994 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:40.206935 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4eb06b52-b2db-4c75-8034-44b127e20319-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-z4bpg\" (UID: \"4eb06b52-b2db-4c75-8034-44b127e20319\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z4bpg" Apr 17 16:21:40.206994 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:40.206981 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j62jq\" (UniqueName: \"kubernetes.io/projected/4eb06b52-b2db-4c75-8034-44b127e20319-kube-api-access-j62jq\") pod \"kube-storage-version-migrator-operator-6769c5d45-z4bpg\" (UID: \"4eb06b52-b2db-4c75-8034-44b127e20319\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z4bpg" Apr 17 16:21:40.308025 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:40.307993 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4eb06b52-b2db-4c75-8034-44b127e20319-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-z4bpg\" (UID: \"4eb06b52-b2db-4c75-8034-44b127e20319\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z4bpg" Apr 17 16:21:40.308221 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:40.308091 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4eb06b52-b2db-4c75-8034-44b127e20319-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-z4bpg\" (UID: 
\"4eb06b52-b2db-4c75-8034-44b127e20319\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z4bpg" Apr 17 16:21:40.308221 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:40.308120 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j62jq\" (UniqueName: \"kubernetes.io/projected/4eb06b52-b2db-4c75-8034-44b127e20319-kube-api-access-j62jq\") pod \"kube-storage-version-migrator-operator-6769c5d45-z4bpg\" (UID: \"4eb06b52-b2db-4c75-8034-44b127e20319\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z4bpg" Apr 17 16:21:40.308606 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:40.308573 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4eb06b52-b2db-4c75-8034-44b127e20319-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-z4bpg\" (UID: \"4eb06b52-b2db-4c75-8034-44b127e20319\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z4bpg" Apr 17 16:21:40.310209 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:40.310187 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4eb06b52-b2db-4c75-8034-44b127e20319-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-z4bpg\" (UID: \"4eb06b52-b2db-4c75-8034-44b127e20319\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z4bpg" Apr 17 16:21:40.315664 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:40.315645 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j62jq\" (UniqueName: \"kubernetes.io/projected/4eb06b52-b2db-4c75-8034-44b127e20319-kube-api-access-j62jq\") pod \"kube-storage-version-migrator-operator-6769c5d45-z4bpg\" (UID: 
\"4eb06b52-b2db-4c75-8034-44b127e20319\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z4bpg" Apr 17 16:21:40.424002 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:40.423971 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z4bpg" Apr 17 16:21:40.485096 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:40.485057 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ljhb6_72bbd866-8c40-48e3-9eb3-b34ae76679de/console-operator/0.log" Apr 17 16:21:40.485211 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:40.485107 2572 generic.go:358] "Generic (PLEG): container finished" podID="72bbd866-8c40-48e3-9eb3-b34ae76679de" containerID="78d269b986661b31238f5c4a25f295792882f069271d16c046da8e25b55874b2" exitCode=255 Apr 17 16:21:40.485211 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:40.485162 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" event={"ID":"72bbd866-8c40-48e3-9eb3-b34ae76679de","Type":"ContainerDied","Data":"78d269b986661b31238f5c4a25f295792882f069271d16c046da8e25b55874b2"} Apr 17 16:21:40.485470 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:40.485457 2572 scope.go:117] "RemoveContainer" containerID="78d269b986661b31238f5c4a25f295792882f069271d16c046da8e25b55874b2" Apr 17 16:21:40.541131 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:40.541109 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z4bpg"] Apr 17 16:21:40.544145 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:21:40.544121 2572 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4eb06b52_b2db_4c75_8034_44b127e20319.slice/crio-0e7fb728863d0fa8b1d3279e13129b8b373ee061b297b0db600dc2d2cbd96838 WatchSource:0}: Error finding container 0e7fb728863d0fa8b1d3279e13129b8b373ee061b297b0db600dc2d2cbd96838: Status 404 returned error can't find the container with id 0e7fb728863d0fa8b1d3279e13129b8b373ee061b297b0db600dc2d2cbd96838 Apr 17 16:21:41.488377 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:41.488345 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ljhb6_72bbd866-8c40-48e3-9eb3-b34ae76679de/console-operator/1.log" Apr 17 16:21:41.488841 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:41.488783 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ljhb6_72bbd866-8c40-48e3-9eb3-b34ae76679de/console-operator/0.log" Apr 17 16:21:41.488841 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:41.488828 2572 generic.go:358] "Generic (PLEG): container finished" podID="72bbd866-8c40-48e3-9eb3-b34ae76679de" containerID="6f1ba4a581006a74c495f4cfbd08519244edd03ccafaa0a418f41bb416bcf744" exitCode=255 Apr 17 16:21:41.488947 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:41.488865 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" event={"ID":"72bbd866-8c40-48e3-9eb3-b34ae76679de","Type":"ContainerDied","Data":"6f1ba4a581006a74c495f4cfbd08519244edd03ccafaa0a418f41bb416bcf744"} Apr 17 16:21:41.488947 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:41.488916 2572 scope.go:117] "RemoveContainer" containerID="78d269b986661b31238f5c4a25f295792882f069271d16c046da8e25b55874b2" Apr 17 16:21:41.489268 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:41.489241 2572 scope.go:117] "RemoveContainer" containerID="6f1ba4a581006a74c495f4cfbd08519244edd03ccafaa0a418f41bb416bcf744" Apr 17 16:21:41.489503 
ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:41.489472 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ljhb6_openshift-console-operator(72bbd866-8c40-48e3-9eb3-b34ae76679de)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" podUID="72bbd866-8c40-48e3-9eb3-b34ae76679de" Apr 17 16:21:41.489957 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:41.489930 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z4bpg" event={"ID":"4eb06b52-b2db-4c75-8034-44b127e20319","Type":"ContainerStarted","Data":"0e7fb728863d0fa8b1d3279e13129b8b373ee061b297b0db600dc2d2cbd96838"} Apr 17 16:21:42.493349 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:42.493309 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ljhb6_72bbd866-8c40-48e3-9eb3-b34ae76679de/console-operator/1.log" Apr 17 16:21:42.493770 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:42.493734 2572 scope.go:117] "RemoveContainer" containerID="6f1ba4a581006a74c495f4cfbd08519244edd03ccafaa0a418f41bb416bcf744" Apr 17 16:21:42.493958 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:42.493938 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ljhb6_openshift-console-operator(72bbd866-8c40-48e3-9eb3-b34ae76679de)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" podUID="72bbd866-8c40-48e3-9eb3-b34ae76679de" Apr 17 16:21:43.497119 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:43.497068 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z4bpg" event={"ID":"4eb06b52-b2db-4c75-8034-44b127e20319","Type":"ContainerStarted","Data":"1503f2a9eda6cb07a8fe5d71d68ac5b11167948f15604d45dd29bfa13d4412d2"} Apr 17 16:21:43.512802 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:43.512754 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z4bpg" podStartSLOduration=1.296756268 podStartE2EDuration="3.512735606s" podCreationTimestamp="2026-04-17 16:21:40 +0000 UTC" firstStartedPulling="2026-04-17 16:21:40.546066383 +0000 UTC m=+117.886554440" lastFinishedPulling="2026-04-17 16:21:42.76204572 +0000 UTC m=+120.102533778" observedRunningTime="2026-04-17 16:21:43.512537676 +0000 UTC m=+120.853025779" watchObservedRunningTime="2026-04-17 16:21:43.512735606 +0000 UTC m=+120.853223685" Apr 17 16:21:44.757666 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:44.757629 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6dmj7"] Apr 17 16:21:44.760698 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:44.760681 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6dmj7"
Apr 17 16:21:44.763212 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:44.763185 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-pvp9z\""
Apr 17 16:21:44.767051 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:44.767027 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6dmj7"]
Apr 17 16:21:44.846660 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:44.846606 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s97xj\" (UniqueName: \"kubernetes.io/projected/54b7a2f6-44b6-4d58-9229-937aa94ad687-kube-api-access-s97xj\") pod \"network-check-source-8894fc9bd-6dmj7\" (UID: \"54b7a2f6-44b6-4d58-9229-937aa94ad687\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6dmj7"
Apr 17 16:21:44.947924 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:44.947870 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s97xj\" (UniqueName: \"kubernetes.io/projected/54b7a2f6-44b6-4d58-9229-937aa94ad687-kube-api-access-s97xj\") pod \"network-check-source-8894fc9bd-6dmj7\" (UID: \"54b7a2f6-44b6-4d58-9229-937aa94ad687\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6dmj7"
Apr 17 16:21:44.955611 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:44.955582 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s97xj\" (UniqueName: \"kubernetes.io/projected/54b7a2f6-44b6-4d58-9229-937aa94ad687-kube-api-access-s97xj\") pod \"network-check-source-8894fc9bd-6dmj7\" (UID: \"54b7a2f6-44b6-4d58-9229-937aa94ad687\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6dmj7"
Apr 17 16:21:45.070168 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:45.070048 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6dmj7"
Apr 17 16:21:45.203442 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:45.203413 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6dmj7"]
Apr 17 16:21:45.206515 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:21:45.206487 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54b7a2f6_44b6_4d58_9229_937aa94ad687.slice/crio-78a71821969da86a75bb2a06d2ceed8301544e0600991807a001880517b42b47 WatchSource:0}: Error finding container 78a71821969da86a75bb2a06d2ceed8301544e0600991807a001880517b42b47: Status 404 returned error can't find the container with id 78a71821969da86a75bb2a06d2ceed8301544e0600991807a001880517b42b47
Apr 17 16:21:45.503259 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:45.503217 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6dmj7" event={"ID":"54b7a2f6-44b6-4d58-9229-937aa94ad687","Type":"ContainerStarted","Data":"41c7db03c1193ce3921f6b8a7139e4aa5b755e80ccde16d076f1fd9966fc9d32"}
Apr 17 16:21:45.503259 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:45.503264 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6dmj7" event={"ID":"54b7a2f6-44b6-4d58-9229-937aa94ad687","Type":"ContainerStarted","Data":"78a71821969da86a75bb2a06d2ceed8301544e0600991807a001880517b42b47"}
Apr 17 16:21:45.519604 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:45.519549 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6dmj7" podStartSLOduration=1.5195336560000001 podStartE2EDuration="1.519533656s" podCreationTimestamp="2026-04-17 16:21:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:21:45.51903267 +0000 UTC m=+122.859520745" watchObservedRunningTime="2026-04-17 16:21:45.519533656 +0000 UTC m=+122.860021735"
Apr 17 16:21:45.956407 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:45.956371 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5767428-43d6-4cbe-9763-0731e126b82c-service-ca-bundle\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc"
Apr 17 16:21:45.956816 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:45.956439 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-metrics-certs\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc"
Apr 17 16:21:45.956816 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:45.956549 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 17 16:21:45.956816 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:45.956582 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b5767428-43d6-4cbe-9763-0731e126b82c-service-ca-bundle podName:b5767428-43d6-4cbe-9763-0731e126b82c nodeName:}" failed. No retries permitted until 2026-04-17 16:22:01.956555654 +0000 UTC m=+139.297043715 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b5767428-43d6-4cbe-9763-0731e126b82c-service-ca-bundle") pod "router-default-5948c7b7c8-rmrfc" (UID: "b5767428-43d6-4cbe-9763-0731e126b82c") : configmap references non-existent config key: service-ca.crt
Apr 17 16:21:45.956816 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:45.956610 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-metrics-certs podName:b5767428-43d6-4cbe-9763-0731e126b82c nodeName:}" failed. No retries permitted until 2026-04-17 16:22:01.956599945 +0000 UTC m=+139.297088035 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-metrics-certs") pod "router-default-5948c7b7c8-rmrfc" (UID: "b5767428-43d6-4cbe-9763-0731e126b82c") : secret "router-metrics-certs-default" not found
Apr 17 16:21:47.382539 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:47.382487 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6"
Apr 17 16:21:47.382539 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:47.382547 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6"
Apr 17 16:21:47.383038 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:47.382983 2572 scope.go:117] "RemoveContainer" containerID="6f1ba4a581006a74c495f4cfbd08519244edd03ccafaa0a418f41bb416bcf744"
Apr 17 16:21:47.383245 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:47.383221 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ljhb6_openshift-console-operator(72bbd866-8c40-48e3-9eb3-b34ae76679de)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" podUID="72bbd866-8c40-48e3-9eb3-b34ae76679de"
Apr 17 16:21:52.914007 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:21:52.913964 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs\") pod \"network-metrics-daemon-j89hr\" (UID: \"dbd283d5-ff0b-4c8f-b1be-15a75816e953\") " pod="openshift-multus/network-metrics-daemon-j89hr"
Apr 17 16:21:52.914461 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:52.914169 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 17 16:21:52.914461 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:21:52.914255 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs podName:dbd283d5-ff0b-4c8f-b1be-15a75816e953 nodeName:}" failed. No retries permitted until 2026-04-17 16:23:54.914237989 +0000 UTC m=+252.254726045 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs") pod "network-metrics-daemon-j89hr" (UID: "dbd283d5-ff0b-4c8f-b1be-15a75816e953") : secret "metrics-daemon-secret" not found
Apr 17 16:22:00.174491 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:00.174445 2572 scope.go:117] "RemoveContainer" containerID="6f1ba4a581006a74c495f4cfbd08519244edd03ccafaa0a418f41bb416bcf744"
Apr 17 16:22:00.540791 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:00.540709 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ljhb6_72bbd866-8c40-48e3-9eb3-b34ae76679de/console-operator/2.log"
Apr 17 16:22:00.541103 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:00.541068 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ljhb6_72bbd866-8c40-48e3-9eb3-b34ae76679de/console-operator/1.log"
Apr 17 16:22:00.541156 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:00.541120 2572 generic.go:358] "Generic (PLEG): container finished" podID="72bbd866-8c40-48e3-9eb3-b34ae76679de" containerID="6479cd547cda94dce0f4c95f541f33571657e7676d7e70469fe7ad79cf67fc6b" exitCode=255
Apr 17 16:22:00.541199 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:00.541185 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" event={"ID":"72bbd866-8c40-48e3-9eb3-b34ae76679de","Type":"ContainerDied","Data":"6479cd547cda94dce0f4c95f541f33571657e7676d7e70469fe7ad79cf67fc6b"}
Apr 17 16:22:00.541234 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:00.541220 2572 scope.go:117] "RemoveContainer" containerID="6f1ba4a581006a74c495f4cfbd08519244edd03ccafaa0a418f41bb416bcf744"
Apr 17 16:22:00.541542 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:00.541525 2572 scope.go:117] "RemoveContainer" containerID="6479cd547cda94dce0f4c95f541f33571657e7676d7e70469fe7ad79cf67fc6b"
Apr 17 16:22:00.541723 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:22:00.541701 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-ljhb6_openshift-console-operator(72bbd866-8c40-48e3-9eb3-b34ae76679de)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" podUID="72bbd866-8c40-48e3-9eb3-b34ae76679de"
Apr 17 16:22:01.544592 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:01.544567 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ljhb6_72bbd866-8c40-48e3-9eb3-b34ae76679de/console-operator/2.log"
Apr 17 16:22:01.982094 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:01.982041 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-metrics-certs\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc"
Apr 17 16:22:01.982299 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:01.982155 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5767428-43d6-4cbe-9763-0731e126b82c-service-ca-bundle\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc"
Apr 17 16:22:01.982725 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:01.982701 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5767428-43d6-4cbe-9763-0731e126b82c-service-ca-bundle\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc"
Apr 17 16:22:01.984318 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:01.984300 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b5767428-43d6-4cbe-9763-0731e126b82c-metrics-certs\") pod \"router-default-5948c7b7c8-rmrfc\" (UID: \"b5767428-43d6-4cbe-9763-0731e126b82c\") " pod="openshift-ingress/router-default-5948c7b7c8-rmrfc"
Apr 17 16:22:02.208741 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:02.208704 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-dfq4x\""
Apr 17 16:22:02.217057 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:02.217034 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5948c7b7c8-rmrfc"
Apr 17 16:22:02.331131 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:02.331096 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5948c7b7c8-rmrfc"]
Apr 17 16:22:02.334035 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:22:02.334006 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5767428_43d6_4cbe_9763_0731e126b82c.slice/crio-febcd18527bac84af4c2ee701df148f732945485c74894f6c2addc488cb90dd3 WatchSource:0}: Error finding container febcd18527bac84af4c2ee701df148f732945485c74894f6c2addc488cb90dd3: Status 404 returned error can't find the container with id febcd18527bac84af4c2ee701df148f732945485c74894f6c2addc488cb90dd3
Apr 17 16:22:02.548360 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:02.548279 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5948c7b7c8-rmrfc" event={"ID":"b5767428-43d6-4cbe-9763-0731e126b82c","Type":"ContainerStarted","Data":"dfb7fcdb1885ba4d06f29394d572df6e58e28273dedf84239a70843cf22491f0"}
Apr 17 16:22:02.548360 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:02.548317 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5948c7b7c8-rmrfc" event={"ID":"b5767428-43d6-4cbe-9763-0731e126b82c","Type":"ContainerStarted","Data":"febcd18527bac84af4c2ee701df148f732945485c74894f6c2addc488cb90dd3"}
Apr 17 16:22:02.566317 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:02.566266 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5948c7b7c8-rmrfc" podStartSLOduration=32.566250285 podStartE2EDuration="32.566250285s" podCreationTimestamp="2026-04-17 16:21:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:22:02.56489117 +0000 UTC m=+139.905379254" watchObservedRunningTime="2026-04-17 16:22:02.566250285 +0000 UTC m=+139.906738363"
Apr 17 16:22:03.217754 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:03.217713 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5948c7b7c8-rmrfc"
Apr 17 16:22:03.220321 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:03.220298 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5948c7b7c8-rmrfc"
Apr 17 16:22:03.552794 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:03.552721 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5948c7b7c8-rmrfc"
Apr 17 16:22:03.553957 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:03.553933 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5948c7b7c8-rmrfc"
Apr 17 16:22:05.961803 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:05.961769 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-r7w7c"]
Apr 17 16:22:05.965105 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:05.965067 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-r7w7c"
Apr 17 16:22:05.967692 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:05.967667 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 17 16:22:05.968715 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:05.968695 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 17 16:22:05.968794 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:05.968722 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-mn874\""
Apr 17 16:22:05.974756 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:05.974714 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-r7w7c"]
Apr 17 16:22:06.011067 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.011034 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/96267ecb-07cc-48af-88b4-f6e710234cfb-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-r7w7c\" (UID: \"96267ecb-07cc-48af-88b4-f6e710234cfb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-r7w7c"
Apr 17 16:22:06.011229 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.011184 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/96267ecb-07cc-48af-88b4-f6e710234cfb-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-r7w7c\" (UID: \"96267ecb-07cc-48af-88b4-f6e710234cfb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-r7w7c"
Apr 17 16:22:06.055499 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.055472 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-59b97fb566-dw8gb"]
Apr 17 16:22:06.058561 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.058544 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-59b97fb566-dw8gb"
Apr 17 16:22:06.061496 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.061471 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 17 16:22:06.061628 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.061608 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-5mbwd\""
Apr 17 16:22:06.061674 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.061658 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 17 16:22:06.061724 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.061711 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 17 16:22:06.067890 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.067872 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 17 16:22:06.070508 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.070488 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-n6gcq"]
Apr 17 16:22:06.073751 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.073731 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-n6gcq"
Apr 17 16:22:06.073977 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.073952 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-59b97fb566-dw8gb"]
Apr 17 16:22:06.076274 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.076251 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 17 16:22:06.076368 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.076260 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 17 16:22:06.076368 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.076261 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8zcsn\""
Apr 17 16:22:06.086359 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.086331 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-n6gcq"]
Apr 17 16:22:06.111925 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.111899 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1779d4f2-fa2d-48b1-8e85-3b6c94c91d30-data-volume\") pod \"insights-runtime-extractor-n6gcq\" (UID: \"1779d4f2-fa2d-48b1-8e85-3b6c94c91d30\") " pod="openshift-insights/insights-runtime-extractor-n6gcq"
Apr 17 16:22:06.112091 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.111928 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1779d4f2-fa2d-48b1-8e85-3b6c94c91d30-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-n6gcq\" (UID: \"1779d4f2-fa2d-48b1-8e85-3b6c94c91d30\") " pod="openshift-insights/insights-runtime-extractor-n6gcq"
Apr 17 16:22:06.112091 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.111951 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6eb7a238-0086-416f-a60f-1d6af3198eb9-installation-pull-secrets\") pod \"image-registry-59b97fb566-dw8gb\" (UID: \"6eb7a238-0086-416f-a60f-1d6af3198eb9\") " pod="openshift-image-registry/image-registry-59b97fb566-dw8gb"
Apr 17 16:22:06.112091 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.111970 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6eb7a238-0086-416f-a60f-1d6af3198eb9-registry-certificates\") pod \"image-registry-59b97fb566-dw8gb\" (UID: \"6eb7a238-0086-416f-a60f-1d6af3198eb9\") " pod="openshift-image-registry/image-registry-59b97fb566-dw8gb"
Apr 17 16:22:06.112091 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.111988 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1779d4f2-fa2d-48b1-8e85-3b6c94c91d30-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n6gcq\" (UID: \"1779d4f2-fa2d-48b1-8e85-3b6c94c91d30\") " pod="openshift-insights/insights-runtime-extractor-n6gcq"
Apr 17 16:22:06.112091 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.112011 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/96267ecb-07cc-48af-88b4-f6e710234cfb-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-r7w7c\" (UID: \"96267ecb-07cc-48af-88b4-f6e710234cfb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-r7w7c"
Apr 17 16:22:06.112091 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.112032 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6eb7a238-0086-416f-a60f-1d6af3198eb9-bound-sa-token\") pod \"image-registry-59b97fb566-dw8gb\" (UID: \"6eb7a238-0086-416f-a60f-1d6af3198eb9\") " pod="openshift-image-registry/image-registry-59b97fb566-dw8gb"
Apr 17 16:22:06.112091 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.112050 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6eb7a238-0086-416f-a60f-1d6af3198eb9-image-registry-private-configuration\") pod \"image-registry-59b97fb566-dw8gb\" (UID: \"6eb7a238-0086-416f-a60f-1d6af3198eb9\") " pod="openshift-image-registry/image-registry-59b97fb566-dw8gb"
Apr 17 16:22:06.112371 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.112128 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6eb7a238-0086-416f-a60f-1d6af3198eb9-registry-tls\") pod \"image-registry-59b97fb566-dw8gb\" (UID: \"6eb7a238-0086-416f-a60f-1d6af3198eb9\") " pod="openshift-image-registry/image-registry-59b97fb566-dw8gb"
Apr 17 16:22:06.112371 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.112147 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6eb7a238-0086-416f-a60f-1d6af3198eb9-trusted-ca\") pod \"image-registry-59b97fb566-dw8gb\" (UID: \"6eb7a238-0086-416f-a60f-1d6af3198eb9\") " pod="openshift-image-registry/image-registry-59b97fb566-dw8gb"
Apr 17 16:22:06.112371 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.112163 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/96267ecb-07cc-48af-88b4-f6e710234cfb-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-r7w7c\" (UID: \"96267ecb-07cc-48af-88b4-f6e710234cfb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-r7w7c"
Apr 17 16:22:06.112371 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.112179 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6eb7a238-0086-416f-a60f-1d6af3198eb9-ca-trust-extracted\") pod \"image-registry-59b97fb566-dw8gb\" (UID: \"6eb7a238-0086-416f-a60f-1d6af3198eb9\") " pod="openshift-image-registry/image-registry-59b97fb566-dw8gb"
Apr 17 16:22:06.112371 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.112201 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1779d4f2-fa2d-48b1-8e85-3b6c94c91d30-crio-socket\") pod \"insights-runtime-extractor-n6gcq\" (UID: \"1779d4f2-fa2d-48b1-8e85-3b6c94c91d30\") " pod="openshift-insights/insights-runtime-extractor-n6gcq"
Apr 17 16:22:06.112371 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.112240 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psnvj\" (UniqueName: \"kubernetes.io/projected/1779d4f2-fa2d-48b1-8e85-3b6c94c91d30-kube-api-access-psnvj\") pod \"insights-runtime-extractor-n6gcq\" (UID: \"1779d4f2-fa2d-48b1-8e85-3b6c94c91d30\") " pod="openshift-insights/insights-runtime-extractor-n6gcq"
Apr 17 16:22:06.112371 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.112285 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79dxx\" (UniqueName: \"kubernetes.io/projected/6eb7a238-0086-416f-a60f-1d6af3198eb9-kube-api-access-79dxx\") pod \"image-registry-59b97fb566-dw8gb\" (UID: \"6eb7a238-0086-416f-a60f-1d6af3198eb9\") " pod="openshift-image-registry/image-registry-59b97fb566-dw8gb"
Apr 17 16:22:06.112634 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.112615 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/96267ecb-07cc-48af-88b4-f6e710234cfb-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-r7w7c\" (UID: \"96267ecb-07cc-48af-88b4-f6e710234cfb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-r7w7c"
Apr 17 16:22:06.114542 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.114523 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/96267ecb-07cc-48af-88b4-f6e710234cfb-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-r7w7c\" (UID: \"96267ecb-07cc-48af-88b4-f6e710234cfb\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-r7w7c"
Apr 17 16:22:06.212710 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.212614 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6eb7a238-0086-416f-a60f-1d6af3198eb9-registry-tls\") pod \"image-registry-59b97fb566-dw8gb\" (UID: \"6eb7a238-0086-416f-a60f-1d6af3198eb9\") " pod="openshift-image-registry/image-registry-59b97fb566-dw8gb"
Apr 17 16:22:06.212710 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.212656 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6eb7a238-0086-416f-a60f-1d6af3198eb9-trusted-ca\") pod \"image-registry-59b97fb566-dw8gb\" (UID: \"6eb7a238-0086-416f-a60f-1d6af3198eb9\") " pod="openshift-image-registry/image-registry-59b97fb566-dw8gb"
Apr 17 16:22:06.212710 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.212675 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6eb7a238-0086-416f-a60f-1d6af3198eb9-ca-trust-extracted\") pod \"image-registry-59b97fb566-dw8gb\" (UID: \"6eb7a238-0086-416f-a60f-1d6af3198eb9\") " pod="openshift-image-registry/image-registry-59b97fb566-dw8gb"
Apr 17 16:22:06.212710 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.212706 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1779d4f2-fa2d-48b1-8e85-3b6c94c91d30-crio-socket\") pod \"insights-runtime-extractor-n6gcq\" (UID: \"1779d4f2-fa2d-48b1-8e85-3b6c94c91d30\") " pod="openshift-insights/insights-runtime-extractor-n6gcq"
Apr 17 16:22:06.213027 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.212738 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-psnvj\" (UniqueName: \"kubernetes.io/projected/1779d4f2-fa2d-48b1-8e85-3b6c94c91d30-kube-api-access-psnvj\") pod \"insights-runtime-extractor-n6gcq\" (UID: \"1779d4f2-fa2d-48b1-8e85-3b6c94c91d30\") " pod="openshift-insights/insights-runtime-extractor-n6gcq"
Apr 17 16:22:06.213027 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.212764 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79dxx\" (UniqueName: \"kubernetes.io/projected/6eb7a238-0086-416f-a60f-1d6af3198eb9-kube-api-access-79dxx\") pod \"image-registry-59b97fb566-dw8gb\" (UID: \"6eb7a238-0086-416f-a60f-1d6af3198eb9\") " pod="openshift-image-registry/image-registry-59b97fb566-dw8gb"
Apr 17 16:22:06.213027 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.212827 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1779d4f2-fa2d-48b1-8e85-3b6c94c91d30-data-volume\") pod \"insights-runtime-extractor-n6gcq\" (UID: \"1779d4f2-fa2d-48b1-8e85-3b6c94c91d30\") " pod="openshift-insights/insights-runtime-extractor-n6gcq"
Apr 17 16:22:06.213027 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.212849 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1779d4f2-fa2d-48b1-8e85-3b6c94c91d30-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-n6gcq\" (UID: \"1779d4f2-fa2d-48b1-8e85-3b6c94c91d30\") " pod="openshift-insights/insights-runtime-extractor-n6gcq"
Apr 17 16:22:06.213027 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.212848 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/1779d4f2-fa2d-48b1-8e85-3b6c94c91d30-crio-socket\") pod \"insights-runtime-extractor-n6gcq\" (UID: \"1779d4f2-fa2d-48b1-8e85-3b6c94c91d30\") " pod="openshift-insights/insights-runtime-extractor-n6gcq"
Apr 17 16:22:06.213027 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.212909 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6eb7a238-0086-416f-a60f-1d6af3198eb9-installation-pull-secrets\") pod \"image-registry-59b97fb566-dw8gb\" (UID: \"6eb7a238-0086-416f-a60f-1d6af3198eb9\") " pod="openshift-image-registry/image-registry-59b97fb566-dw8gb"
Apr 17 16:22:06.213027 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.212947 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6eb7a238-0086-416f-a60f-1d6af3198eb9-registry-certificates\") pod \"image-registry-59b97fb566-dw8gb\" (UID: \"6eb7a238-0086-416f-a60f-1d6af3198eb9\") " pod="openshift-image-registry/image-registry-59b97fb566-dw8gb"
Apr 17 16:22:06.213027 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.212998 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1779d4f2-fa2d-48b1-8e85-3b6c94c91d30-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n6gcq\" (UID: \"1779d4f2-fa2d-48b1-8e85-3b6c94c91d30\") " pod="openshift-insights/insights-runtime-extractor-n6gcq"
Apr 17 16:22:06.213446 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.213066 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6eb7a238-0086-416f-a60f-1d6af3198eb9-bound-sa-token\") pod \"image-registry-59b97fb566-dw8gb\" (UID: \"6eb7a238-0086-416f-a60f-1d6af3198eb9\") " pod="openshift-image-registry/image-registry-59b97fb566-dw8gb"
Apr 17 16:22:06.213446 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.213127 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6eb7a238-0086-416f-a60f-1d6af3198eb9-image-registry-private-configuration\") pod \"image-registry-59b97fb566-dw8gb\" (UID: \"6eb7a238-0086-416f-a60f-1d6af3198eb9\") " pod="openshift-image-registry/image-registry-59b97fb566-dw8gb"
Apr 17 16:22:06.213446 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.213181 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6eb7a238-0086-416f-a60f-1d6af3198eb9-ca-trust-extracted\") pod \"image-registry-59b97fb566-dw8gb\" (UID: \"6eb7a238-0086-416f-a60f-1d6af3198eb9\") " pod="openshift-image-registry/image-registry-59b97fb566-dw8gb"
Apr 17 16:22:06.213660 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.213465 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/1779d4f2-fa2d-48b1-8e85-3b6c94c91d30-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-n6gcq\" (UID: \"1779d4f2-fa2d-48b1-8e85-3b6c94c91d30\") " pod="openshift-insights/insights-runtime-extractor-n6gcq"
Apr 17 16:22:06.213793 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.213769 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/1779d4f2-fa2d-48b1-8e85-3b6c94c91d30-data-volume\") pod \"insights-runtime-extractor-n6gcq\" (UID: \"1779d4f2-fa2d-48b1-8e85-3b6c94c91d30\") " pod="openshift-insights/insights-runtime-extractor-n6gcq"
Apr 17 16:22:06.213880 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.213795 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6eb7a238-0086-416f-a60f-1d6af3198eb9-trusted-ca\") pod \"image-registry-59b97fb566-dw8gb\" (UID: \"6eb7a238-0086-416f-a60f-1d6af3198eb9\") " pod="openshift-image-registry/image-registry-59b97fb566-dw8gb"
Apr 17 16:22:06.213941 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.213881 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6eb7a238-0086-416f-a60f-1d6af3198eb9-registry-certificates\") pod \"image-registry-59b97fb566-dw8gb\" (UID: \"6eb7a238-0086-416f-a60f-1d6af3198eb9\") " pod="openshift-image-registry/image-registry-59b97fb566-dw8gb"
Apr 17 16:22:06.215512 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.215481 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/1779d4f2-fa2d-48b1-8e85-3b6c94c91d30-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-n6gcq\" (UID: \"1779d4f2-fa2d-48b1-8e85-3b6c94c91d30\") " pod="openshift-insights/insights-runtime-extractor-n6gcq"
Apr 17 16:22:06.215639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.215486 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6eb7a238-0086-416f-a60f-1d6af3198eb9-installation-pull-secrets\") pod \"image-registry-59b97fb566-dw8gb\" (UID: \"6eb7a238-0086-416f-a60f-1d6af3198eb9\") " pod="openshift-image-registry/image-registry-59b97fb566-dw8gb"
Apr 17 16:22:06.215639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.215541 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6eb7a238-0086-416f-a60f-1d6af3198eb9-registry-tls\") pod \"image-registry-59b97fb566-dw8gb\" (UID: \"6eb7a238-0086-416f-a60f-1d6af3198eb9\") " pod="openshift-image-registry/image-registry-59b97fb566-dw8gb"
Apr 17 16:22:06.215761 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.215714 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6eb7a238-0086-416f-a60f-1d6af3198eb9-image-registry-private-configuration\") pod \"image-registry-59b97fb566-dw8gb\" (UID: \"6eb7a238-0086-416f-a60f-1d6af3198eb9\") " pod="openshift-image-registry/image-registry-59b97fb566-dw8gb"
Apr 17 16:22:06.223700 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.223679 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6eb7a238-0086-416f-a60f-1d6af3198eb9-bound-sa-token\") pod \"image-registry-59b97fb566-dw8gb\" (UID: \"6eb7a238-0086-416f-a60f-1d6af3198eb9\") " pod="openshift-image-registry/image-registry-59b97fb566-dw8gb"
Apr 17 16:22:06.223887 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.223870 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psnvj\" (UniqueName: \"kubernetes.io/projected/1779d4f2-fa2d-48b1-8e85-3b6c94c91d30-kube-api-access-psnvj\") pod \"insights-runtime-extractor-n6gcq\" (UID: \"1779d4f2-fa2d-48b1-8e85-3b6c94c91d30\") " pod="openshift-insights/insights-runtime-extractor-n6gcq"
Apr 17 16:22:06.223951
ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.223900 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79dxx\" (UniqueName: \"kubernetes.io/projected/6eb7a238-0086-416f-a60f-1d6af3198eb9-kube-api-access-79dxx\") pod \"image-registry-59b97fb566-dw8gb\" (UID: \"6eb7a238-0086-416f-a60f-1d6af3198eb9\") " pod="openshift-image-registry/image-registry-59b97fb566-dw8gb" Apr 17 16:22:06.273536 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.273503 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-r7w7c" Apr 17 16:22:06.368515 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.368480 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-59b97fb566-dw8gb" Apr 17 16:22:06.383367 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.383345 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-n6gcq" Apr 17 16:22:06.386726 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.386598 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-r7w7c"] Apr 17 16:22:06.389406 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:22:06.389376 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96267ecb_07cc_48af_88b4_f6e710234cfb.slice/crio-42ed88098a03394bac24bebb7f6810a0cb52ae62dd36841513776b3c6481b307 WatchSource:0}: Error finding container 42ed88098a03394bac24bebb7f6810a0cb52ae62dd36841513776b3c6481b307: Status 404 returned error can't find the container with id 42ed88098a03394bac24bebb7f6810a0cb52ae62dd36841513776b3c6481b307 Apr 17 16:22:06.519724 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.519693 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-59b97fb566-dw8gb"] Apr 17 16:22:06.522560 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:22:06.522529 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eb7a238_0086_416f_a60f_1d6af3198eb9.slice/crio-691618c6f76c6f719f415b12f165fc758b3634677eea3547d47c67b6ae0d8c21 WatchSource:0}: Error finding container 691618c6f76c6f719f415b12f165fc758b3634677eea3547d47c67b6ae0d8c21: Status 404 returned error can't find the container with id 691618c6f76c6f719f415b12f165fc758b3634677eea3547d47c67b6ae0d8c21 Apr 17 16:22:06.534987 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.534963 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-n6gcq"] Apr 17 16:22:06.536817 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:22:06.536782 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1779d4f2_fa2d_48b1_8e85_3b6c94c91d30.slice/crio-f53171c79b35c7b062574bab9c3265d9af6674ade8f58c2d33654c60cfa24c06 WatchSource:0}: Error finding container f53171c79b35c7b062574bab9c3265d9af6674ade8f58c2d33654c60cfa24c06: Status 404 returned error can't find the container with id f53171c79b35c7b062574bab9c3265d9af6674ade8f58c2d33654c60cfa24c06 Apr 17 16:22:06.560666 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.560636 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-59b97fb566-dw8gb" event={"ID":"6eb7a238-0086-416f-a60f-1d6af3198eb9","Type":"ContainerStarted","Data":"691618c6f76c6f719f415b12f165fc758b3634677eea3547d47c67b6ae0d8c21"} Apr 17 16:22:06.561767 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.561745 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n6gcq" 
event={"ID":"1779d4f2-fa2d-48b1-8e85-3b6c94c91d30","Type":"ContainerStarted","Data":"f53171c79b35c7b062574bab9c3265d9af6674ade8f58c2d33654c60cfa24c06"} Apr 17 16:22:06.562918 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:06.562892 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-r7w7c" event={"ID":"96267ecb-07cc-48af-88b4-f6e710234cfb","Type":"ContainerStarted","Data":"42ed88098a03394bac24bebb7f6810a0cb52ae62dd36841513776b3c6481b307"} Apr 17 16:22:07.381940 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:07.381893 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" Apr 17 16:22:07.381940 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:07.381943 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" Apr 17 16:22:07.382428 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:07.382393 2572 scope.go:117] "RemoveContainer" containerID="6479cd547cda94dce0f4c95f541f33571657e7676d7e70469fe7ad79cf67fc6b" Apr 17 16:22:07.382668 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:22:07.382643 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-ljhb6_openshift-console-operator(72bbd866-8c40-48e3-9eb3-b34ae76679de)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" podUID="72bbd866-8c40-48e3-9eb3-b34ae76679de" Apr 17 16:22:07.567574 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:07.567535 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-59b97fb566-dw8gb" 
event={"ID":"6eb7a238-0086-416f-a60f-1d6af3198eb9","Type":"ContainerStarted","Data":"838f853ad0a7418a6ade677b3696c1012da0cd08de2524adf8592c0a309f2327"} Apr 17 16:22:07.567747 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:07.567624 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-59b97fb566-dw8gb" Apr 17 16:22:07.568950 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:07.568914 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n6gcq" event={"ID":"1779d4f2-fa2d-48b1-8e85-3b6c94c91d30","Type":"ContainerStarted","Data":"1b2618922a57170cbe520bc76588edd91d9abfe180f8e31637d24f3049e60af3"} Apr 17 16:22:07.586191 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:07.586147 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-59b97fb566-dw8gb" podStartSLOduration=1.586135082 podStartE2EDuration="1.586135082s" podCreationTimestamp="2026-04-17 16:22:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:22:07.585537419 +0000 UTC m=+144.926025503" watchObservedRunningTime="2026-04-17 16:22:07.586135082 +0000 UTC m=+144.926623160" Apr 17 16:22:08.573134 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:08.573094 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n6gcq" event={"ID":"1779d4f2-fa2d-48b1-8e85-3b6c94c91d30","Type":"ContainerStarted","Data":"7441efe82a6c92dcaadd82451642dc6c7880a07ec024169d54b7d25fe481f5bb"} Apr 17 16:22:08.574533 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:08.574500 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-r7w7c" 
event={"ID":"96267ecb-07cc-48af-88b4-f6e710234cfb","Type":"ContainerStarted","Data":"6a11f175339017204d80c3b4059934216c27144a255f4fafa61d49e2af4ee766"} Apr 17 16:22:08.592786 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:08.592724 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-r7w7c" podStartSLOduration=2.130368346 podStartE2EDuration="3.592704765s" podCreationTimestamp="2026-04-17 16:22:05 +0000 UTC" firstStartedPulling="2026-04-17 16:22:06.391352024 +0000 UTC m=+143.731840086" lastFinishedPulling="2026-04-17 16:22:07.853688446 +0000 UTC m=+145.194176505" observedRunningTime="2026-04-17 16:22:08.591503486 +0000 UTC m=+145.931991576" watchObservedRunningTime="2026-04-17 16:22:08.592704765 +0000 UTC m=+145.933192847" Apr 17 16:22:09.578599 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:09.578512 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-n6gcq" event={"ID":"1779d4f2-fa2d-48b1-8e85-3b6c94c91d30","Type":"ContainerStarted","Data":"e5a27d538c787c6c5e6ad9941e5a3c66750645b788b78e94308804a819dfcb28"} Apr 17 16:22:09.602912 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:09.600847 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-n6gcq" podStartSLOduration=0.902461512 podStartE2EDuration="3.600829312s" podCreationTimestamp="2026-04-17 16:22:06 +0000 UTC" firstStartedPulling="2026-04-17 16:22:06.583120467 +0000 UTC m=+143.923608529" lastFinishedPulling="2026-04-17 16:22:09.281488267 +0000 UTC m=+146.621976329" observedRunningTime="2026-04-17 16:22:09.600350966 +0000 UTC m=+146.940839048" watchObservedRunningTime="2026-04-17 16:22:09.600829312 +0000 UTC m=+146.941317392" Apr 17 16:22:14.903249 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.903210 2572 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-2s6bc"] Apr 17 16:22:14.906013 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.905989 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2s6bc" Apr 17 16:22:14.908745 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.908721 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 17 16:22:14.908852 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.908749 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 16:22:14.908852 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.908749 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-f5d2k\"" Apr 17 16:22:14.908852 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.908801 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 17 16:22:14.909051 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.909036 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 16:22:14.909944 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.909927 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 16:22:14.916372 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.916353 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-72l89"] Apr 17 16:22:14.918389 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.918368 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:14.918908 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.918890 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-mjp46"] Apr 17 16:22:14.924106 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.922487 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 16:22:14.924106 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.922657 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 16:22:14.924106 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.922951 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 16:22:14.924106 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.923248 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-z7htb\"" Apr 17 16:22:14.924918 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.924897 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-2s6bc"] Apr 17 16:22:14.925042 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.925031 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46" Apr 17 16:22:14.927379 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.927352 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-c85mp\"" Apr 17 16:22:14.928062 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.928043 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 17 16:22:14.928155 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.928063 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 17 16:22:14.928349 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.928305 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 17 16:22:14.932732 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.932707 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-mjp46"] Apr 17 16:22:14.984241 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.984210 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1ca17119-f140-4bd9-9cc4-59cb1122e37e-node-exporter-accelerators-collector-config\") pod \"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:14.984241 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.984244 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-654kl\" (UniqueName: \"kubernetes.io/projected/bd957af0-a264-4665-92db-2be4172c6ef3-kube-api-access-654kl\") pod 
\"openshift-state-metrics-9d44df66c-2s6bc\" (UID: \"bd957af0-a264-4665-92db-2be4172c6ef3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2s6bc" Apr 17 16:22:14.984461 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.984295 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1a8eccc3-52d0-4b42-af0a-5a5338b67200-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-mjp46\" (UID: \"1a8eccc3-52d0-4b42-af0a-5a5338b67200\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46" Apr 17 16:22:14.984461 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.984312 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd957af0-a264-4665-92db-2be4172c6ef3-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-2s6bc\" (UID: \"bd957af0-a264-4665-92db-2be4172c6ef3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2s6bc" Apr 17 16:22:14.984461 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.984344 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1ca17119-f140-4bd9-9cc4-59cb1122e37e-node-exporter-textfile\") pod \"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:14.984461 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.984432 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1ca17119-f140-4bd9-9cc4-59cb1122e37e-root\") pod \"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 
16:22:14.984598 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.984463 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1ca17119-f140-4bd9-9cc4-59cb1122e37e-sys\") pod \"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:14.984598 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.984483 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1ca17119-f140-4bd9-9cc4-59cb1122e37e-node-exporter-wtmp\") pod \"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:14.984598 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.984506 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bd957af0-a264-4665-92db-2be4172c6ef3-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-2s6bc\" (UID: \"bd957af0-a264-4665-92db-2be4172c6ef3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2s6bc" Apr 17 16:22:14.984598 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.984533 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a8eccc3-52d0-4b42-af0a-5a5338b67200-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-mjp46\" (UID: \"1a8eccc3-52d0-4b42-af0a-5a5338b67200\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46" Apr 17 16:22:14.984598 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.984550 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1a8eccc3-52d0-4b42-af0a-5a5338b67200-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-mjp46\" (UID: \"1a8eccc3-52d0-4b42-af0a-5a5338b67200\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46" Apr 17 16:22:14.984598 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.984567 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a8eccc3-52d0-4b42-af0a-5a5338b67200-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mjp46\" (UID: \"1a8eccc3-52d0-4b42-af0a-5a5338b67200\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46" Apr 17 16:22:14.984778 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.984601 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1a8eccc3-52d0-4b42-af0a-5a5338b67200-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-mjp46\" (UID: \"1a8eccc3-52d0-4b42-af0a-5a5338b67200\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46" Apr 17 16:22:14.984778 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.984647 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ca17119-f140-4bd9-9cc4-59cb1122e37e-metrics-client-ca\") pod \"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:14.984778 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.984679 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47xsw\" (UniqueName: \"kubernetes.io/projected/1ca17119-f140-4bd9-9cc4-59cb1122e37e-kube-api-access-47xsw\") pod 
\"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:14.984778 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.984715 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v67w2\" (UniqueName: \"kubernetes.io/projected/1a8eccc3-52d0-4b42-af0a-5a5338b67200-kube-api-access-v67w2\") pod \"kube-state-metrics-69db897b98-mjp46\" (UID: \"1a8eccc3-52d0-4b42-af0a-5a5338b67200\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46" Apr 17 16:22:14.984778 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.984737 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bd957af0-a264-4665-92db-2be4172c6ef3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2s6bc\" (UID: \"bd957af0-a264-4665-92db-2be4172c6ef3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2s6bc" Apr 17 16:22:14.984778 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.984761 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1ca17119-f140-4bd9-9cc4-59cb1122e37e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:14.984958 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:14.984782 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1ca17119-f140-4bd9-9cc4-59cb1122e37e-node-exporter-tls\") pod \"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:15.085368 ip-10-0-132-179 
kubenswrapper[2572]: I0417 16:22:15.085330 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1ca17119-f140-4bd9-9cc4-59cb1122e37e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:15.085368 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.085369 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1ca17119-f140-4bd9-9cc4-59cb1122e37e-node-exporter-tls\") pod \"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:15.085619 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.085388 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1ca17119-f140-4bd9-9cc4-59cb1122e37e-node-exporter-accelerators-collector-config\") pod \"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:15.085619 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.085410 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-654kl\" (UniqueName: \"kubernetes.io/projected/bd957af0-a264-4665-92db-2be4172c6ef3-kube-api-access-654kl\") pod \"openshift-state-metrics-9d44df66c-2s6bc\" (UID: \"bd957af0-a264-4665-92db-2be4172c6ef3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2s6bc" Apr 17 16:22:15.085619 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.085454 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: 
\"kubernetes.io/configmap/1a8eccc3-52d0-4b42-af0a-5a5338b67200-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-mjp46\" (UID: \"1a8eccc3-52d0-4b42-af0a-5a5338b67200\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46" Apr 17 16:22:15.085619 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.085473 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd957af0-a264-4665-92db-2be4172c6ef3-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-2s6bc\" (UID: \"bd957af0-a264-4665-92db-2be4172c6ef3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2s6bc" Apr 17 16:22:15.085619 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:22:15.085497 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 16:22:15.085619 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.085519 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1ca17119-f140-4bd9-9cc4-59cb1122e37e-node-exporter-textfile\") pod \"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:15.085619 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.085553 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1ca17119-f140-4bd9-9cc4-59cb1122e37e-root\") pod \"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:15.085619 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:22:15.085581 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ca17119-f140-4bd9-9cc4-59cb1122e37e-node-exporter-tls podName:1ca17119-f140-4bd9-9cc4-59cb1122e37e nodeName:}" 
failed. No retries permitted until 2026-04-17 16:22:15.585558646 +0000 UTC m=+152.926046730 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/1ca17119-f140-4bd9-9cc4-59cb1122e37e-node-exporter-tls") pod "node-exporter-72l89" (UID: "1ca17119-f140-4bd9-9cc4-59cb1122e37e") : secret "node-exporter-tls" not found Apr 17 16:22:15.085619 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.085598 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1ca17119-f140-4bd9-9cc4-59cb1122e37e-root\") pod \"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:15.085619 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.085617 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1ca17119-f140-4bd9-9cc4-59cb1122e37e-sys\") pod \"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:15.086147 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.085646 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1ca17119-f140-4bd9-9cc4-59cb1122e37e-node-exporter-wtmp\") pod \"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:15.086147 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.085674 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bd957af0-a264-4665-92db-2be4172c6ef3-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-2s6bc\" (UID: \"bd957af0-a264-4665-92db-2be4172c6ef3\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2s6bc" Apr 17 16:22:15.086147 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.085704 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a8eccc3-52d0-4b42-af0a-5a5338b67200-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-mjp46\" (UID: \"1a8eccc3-52d0-4b42-af0a-5a5338b67200\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46" Apr 17 16:22:15.086147 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.085735 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1a8eccc3-52d0-4b42-af0a-5a5338b67200-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-mjp46\" (UID: \"1a8eccc3-52d0-4b42-af0a-5a5338b67200\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46" Apr 17 16:22:15.086147 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.085764 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a8eccc3-52d0-4b42-af0a-5a5338b67200-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mjp46\" (UID: \"1a8eccc3-52d0-4b42-af0a-5a5338b67200\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46" Apr 17 16:22:15.086147 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.085804 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1a8eccc3-52d0-4b42-af0a-5a5338b67200-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-mjp46\" (UID: \"1a8eccc3-52d0-4b42-af0a-5a5338b67200\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46" Apr 17 16:22:15.086147 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.085853 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ca17119-f140-4bd9-9cc4-59cb1122e37e-metrics-client-ca\") pod \"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:15.086147 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.085864 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1ca17119-f140-4bd9-9cc4-59cb1122e37e-node-exporter-wtmp\") pod \"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:15.086147 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.085904 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47xsw\" (UniqueName: \"kubernetes.io/projected/1ca17119-f140-4bd9-9cc4-59cb1122e37e-kube-api-access-47xsw\") pod \"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:15.086147 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.085960 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v67w2\" (UniqueName: \"kubernetes.io/projected/1a8eccc3-52d0-4b42-af0a-5a5338b67200-kube-api-access-v67w2\") pod \"kube-state-metrics-69db897b98-mjp46\" (UID: \"1a8eccc3-52d0-4b42-af0a-5a5338b67200\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46" Apr 17 16:22:15.086147 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.086017 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bd957af0-a264-4665-92db-2be4172c6ef3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2s6bc\" (UID: \"bd957af0-a264-4665-92db-2be4172c6ef3\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2s6bc" Apr 17 16:22:15.086735 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.086190 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1a8eccc3-52d0-4b42-af0a-5a5338b67200-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-mjp46\" (UID: \"1a8eccc3-52d0-4b42-af0a-5a5338b67200\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46" Apr 17 16:22:15.086735 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:22:15.085986 2572 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 17 16:22:15.086735 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:22:15.086291 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a8eccc3-52d0-4b42-af0a-5a5338b67200-kube-state-metrics-tls podName:1a8eccc3-52d0-4b42-af0a-5a5338b67200 nodeName:}" failed. No retries permitted until 2026-04-17 16:22:15.586272178 +0000 UTC m=+152.926760239 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/1a8eccc3-52d0-4b42-af0a-5a5338b67200-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-mjp46" (UID: "1a8eccc3-52d0-4b42-af0a-5a5338b67200") : secret "kube-state-metrics-tls" not found Apr 17 16:22:15.086735 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.086291 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1a8eccc3-52d0-4b42-af0a-5a5338b67200-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-mjp46\" (UID: \"1a8eccc3-52d0-4b42-af0a-5a5338b67200\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46" Apr 17 16:22:15.086735 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.086439 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd957af0-a264-4665-92db-2be4172c6ef3-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-2s6bc\" (UID: \"bd957af0-a264-4665-92db-2be4172c6ef3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2s6bc" Apr 17 16:22:15.086735 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.086486 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1ca17119-f140-4bd9-9cc4-59cb1122e37e-node-exporter-accelerators-collector-config\") pod \"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:15.086735 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.086496 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1ca17119-f140-4bd9-9cc4-59cb1122e37e-sys\") pod \"node-exporter-72l89\" (UID: 
\"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:15.086735 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.086585 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ca17119-f140-4bd9-9cc4-59cb1122e37e-metrics-client-ca\") pod \"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:15.087030 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.086791 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1ca17119-f140-4bd9-9cc4-59cb1122e37e-node-exporter-textfile\") pod \"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:15.087363 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.087342 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1a8eccc3-52d0-4b42-af0a-5a5338b67200-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-mjp46\" (UID: \"1a8eccc3-52d0-4b42-af0a-5a5338b67200\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46" Apr 17 16:22:15.088488 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.088465 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1a8eccc3-52d0-4b42-af0a-5a5338b67200-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-mjp46\" (UID: \"1a8eccc3-52d0-4b42-af0a-5a5338b67200\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46" Apr 17 16:22:15.088595 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.088488 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1ca17119-f140-4bd9-9cc4-59cb1122e37e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:15.088873 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.088853 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/bd957af0-a264-4665-92db-2be4172c6ef3-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-2s6bc\" (UID: \"bd957af0-a264-4665-92db-2be4172c6ef3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2s6bc" Apr 17 16:22:15.098503 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.098473 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bd957af0-a264-4665-92db-2be4172c6ef3-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-2s6bc\" (UID: \"bd957af0-a264-4665-92db-2be4172c6ef3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2s6bc" Apr 17 16:22:15.098740 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.098721 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v67w2\" (UniqueName: \"kubernetes.io/projected/1a8eccc3-52d0-4b42-af0a-5a5338b67200-kube-api-access-v67w2\") pod \"kube-state-metrics-69db897b98-mjp46\" (UID: \"1a8eccc3-52d0-4b42-af0a-5a5338b67200\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46" Apr 17 16:22:15.098878 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.098854 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47xsw\" (UniqueName: \"kubernetes.io/projected/1ca17119-f140-4bd9-9cc4-59cb1122e37e-kube-api-access-47xsw\") pod \"node-exporter-72l89\" (UID: 
\"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:15.100155 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.100138 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-654kl\" (UniqueName: \"kubernetes.io/projected/bd957af0-a264-4665-92db-2be4172c6ef3-kube-api-access-654kl\") pod \"openshift-state-metrics-9d44df66c-2s6bc\" (UID: \"bd957af0-a264-4665-92db-2be4172c6ef3\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2s6bc" Apr 17 16:22:15.215533 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.215437 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2s6bc" Apr 17 16:22:15.331230 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.331200 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-2s6bc"] Apr 17 16:22:15.334281 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:22:15.334253 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd957af0_a264_4665_92db_2be4172c6ef3.slice/crio-ed6c514e48743265f6d73ce111d258664bc89091373bbaf4d491f8a06f918ef7 WatchSource:0}: Error finding container ed6c514e48743265f6d73ce111d258664bc89091373bbaf4d491f8a06f918ef7: Status 404 returned error can't find the container with id ed6c514e48743265f6d73ce111d258664bc89091373bbaf4d491f8a06f918ef7 Apr 17 16:22:15.590492 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.590399 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1ca17119-f140-4bd9-9cc4-59cb1122e37e-node-exporter-tls\") pod \"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:15.590492 ip-10-0-132-179 kubenswrapper[2572]: I0417 
16:22:15.590470 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a8eccc3-52d0-4b42-af0a-5a5338b67200-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mjp46\" (UID: \"1a8eccc3-52d0-4b42-af0a-5a5338b67200\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46" Apr 17 16:22:15.590720 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:22:15.590560 2572 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 17 16:22:15.590720 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:22:15.590614 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a8eccc3-52d0-4b42-af0a-5a5338b67200-kube-state-metrics-tls podName:1a8eccc3-52d0-4b42-af0a-5a5338b67200 nodeName:}" failed. No retries permitted until 2026-04-17 16:22:16.590600416 +0000 UTC m=+153.931088473 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/1a8eccc3-52d0-4b42-af0a-5a5338b67200-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-mjp46" (UID: "1a8eccc3-52d0-4b42-af0a-5a5338b67200") : secret "kube-state-metrics-tls" not found Apr 17 16:22:15.592846 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.592819 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1ca17119-f140-4bd9-9cc4-59cb1122e37e-node-exporter-tls\") pod \"node-exporter-72l89\" (UID: \"1ca17119-f140-4bd9-9cc4-59cb1122e37e\") " pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:15.594302 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.594279 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2s6bc" 
event={"ID":"bd957af0-a264-4665-92db-2be4172c6ef3","Type":"ContainerStarted","Data":"a52c48bb808d4bf408501f6f71ba2e7e268d5fd5788b0584876ef98981c5c9ea"} Apr 17 16:22:15.594382 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.594311 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2s6bc" event={"ID":"bd957af0-a264-4665-92db-2be4172c6ef3","Type":"ContainerStarted","Data":"009a051e7928023027cbf0c9bf8cf168b11e4b41ec1987c25445bc89932753fc"} Apr 17 16:22:15.594382 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.594320 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2s6bc" event={"ID":"bd957af0-a264-4665-92db-2be4172c6ef3","Type":"ContainerStarted","Data":"ed6c514e48743265f6d73ce111d258664bc89091373bbaf4d491f8a06f918ef7"} Apr 17 16:22:15.831495 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.831448 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-72l89" Apr 17 16:22:15.840794 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:22:15.840744 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ca17119_f140_4bd9_9cc4_59cb1122e37e.slice/crio-720bda37713fcff00bbc9b3a1e788aef5a5116590d33afc82346e88b41b53a1f WatchSource:0}: Error finding container 720bda37713fcff00bbc9b3a1e788aef5a5116590d33afc82346e88b41b53a1f: Status 404 returned error can't find the container with id 720bda37713fcff00bbc9b3a1e788aef5a5116590d33afc82346e88b41b53a1f Apr 17 16:22:15.982367 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.982331 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:22:15.985455 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.985433 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:22:15.987867 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.987843 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-28gbc\"" Apr 17 16:22:15.988035 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.988012 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 16:22:15.988175 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.988157 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 16:22:15.988272 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.988255 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 16:22:15.988368 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.988354 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 16:22:15.988496 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.988477 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 16:22:15.988570 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.988507 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 16:22:15.988620 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.988568 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 16:22:15.988620 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.988610 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 16:22:15.989501 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.989487 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 16:22:15.997153 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:15.997133 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:22:16.095565 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.095493 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dbl7\" (UniqueName: \"kubernetes.io/projected/cdf5600c-d37f-4857-8721-0ef7948500a5-kube-api-access-6dbl7\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:22:16.095565 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.095538 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:22:16.095751 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.095640 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:22:16.095751 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.095669 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdf5600c-d37f-4857-8721-0ef7948500a5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:22:16.095751 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.095697 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:22:16.095901 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.095748 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cdf5600c-d37f-4857-8721-0ef7948500a5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:22:16.095901 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.095840 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cdf5600c-d37f-4857-8721-0ef7948500a5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:22:16.095901 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.095891 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-web-config\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:22:16.096093 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.095929 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:22:16.096093 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.095954 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-config-volume\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:22:16.096093 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.095983 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:22:16.096093 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.096044 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cdf5600c-d37f-4857-8721-0ef7948500a5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:22:16.096295 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.096098 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cdf5600c-d37f-4857-8721-0ef7948500a5-config-out\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:22:16.197255 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.197214 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:22:16.197425 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.197260 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdf5600c-d37f-4857-8721-0ef7948500a5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:22:16.197425 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.197292 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:22:16.197425 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.197317 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cdf5600c-d37f-4857-8721-0ef7948500a5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:22:16.197425 ip-10-0-132-179 
kubenswrapper[2572]: I0417 16:22:16.197375 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cdf5600c-d37f-4857-8721-0ef7948500a5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:22:16.197583 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:22:16.197425 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cdf5600c-d37f-4857-8721-0ef7948500a5-alertmanager-trusted-ca-bundle podName:cdf5600c-d37f-4857-8721-0ef7948500a5 nodeName:}" failed. No retries permitted until 2026-04-17 16:22:16.69740558 +0000 UTC m=+154.037893643 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/cdf5600c-d37f-4857-8721-0ef7948500a5-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "cdf5600c-d37f-4857-8721-0ef7948500a5") : configmap references non-existent config key: ca-bundle.crt Apr 17 16:22:16.197583 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.197453 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-web-config\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:22:16.197583 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.197503 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:22:16.197583 ip-10-0-132-179 kubenswrapper[2572]: 
I0417 16:22:16.197528 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-config-volume\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:22:16.197583 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.197551 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:22:16.197853 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.197592 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cdf5600c-d37f-4857-8721-0ef7948500a5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:22:16.197853 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.197616 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cdf5600c-d37f-4857-8721-0ef7948500a5-config-out\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:22:16.197853 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.197756 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cdf5600c-d37f-4857-8721-0ef7948500a5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:22:16.198645 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.198252 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dbl7\" (UniqueName: \"kubernetes.io/projected/cdf5600c-d37f-4857-8721-0ef7948500a5-kube-api-access-6dbl7\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:22:16.198645 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.198300 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:22:16.198879 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.198854 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cdf5600c-d37f-4857-8721-0ef7948500a5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:22:16.200273 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.200248 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:22:16.200481 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.200442 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cdf5600c-d37f-4857-8721-0ef7948500a5-config-out\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:22:16.200823 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.200798 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:22:16.201338 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.201272 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:22:16.201819 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.201777 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cdf5600c-d37f-4857-8721-0ef7948500a5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:22:16.201919 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.201897 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-web-config\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:22:16.201984 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.201901 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:22:16.201984 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.201965 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-config-volume\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:22:16.202159 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.202062 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:22:16.208354 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.208329 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dbl7\" (UniqueName: \"kubernetes.io/projected/cdf5600c-d37f-4857-8721-0ef7948500a5-kube-api-access-6dbl7\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:22:16.598429 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.598387 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-72l89" event={"ID":"1ca17119-f140-4bd9-9cc4-59cb1122e37e","Type":"ContainerStarted","Data":"720bda37713fcff00bbc9b3a1e788aef5a5116590d33afc82346e88b41b53a1f"}
Apr 17 16:22:16.603913 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.603881 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a8eccc3-52d0-4b42-af0a-5a5338b67200-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mjp46\" (UID: \"1a8eccc3-52d0-4b42-af0a-5a5338b67200\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46"
Apr 17 16:22:16.606914 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.606888 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1a8eccc3-52d0-4b42-af0a-5a5338b67200-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-mjp46\" (UID: \"1a8eccc3-52d0-4b42-af0a-5a5338b67200\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46"
Apr 17 16:22:16.704906 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.704878 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdf5600c-d37f-4857-8721-0ef7948500a5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:22:16.705772 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.705748 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdf5600c-d37f-4857-8721-0ef7948500a5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:22:16.737229 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.737198 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46"
Apr 17 16:22:16.860422 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.860393 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-mjp46"]
Apr 17 16:22:16.888886 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:22:16.888859 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a8eccc3_52d0_4b42_af0a_5a5338b67200.slice/crio-7f029c88c3310882887d4da9789b81e8b0b707881c7a8dafd0fd99553c5323ec WatchSource:0}: Error finding container 7f029c88c3310882887d4da9789b81e8b0b707881c7a8dafd0fd99553c5323ec: Status 404 returned error can't find the container with id 7f029c88c3310882887d4da9789b81e8b0b707881c7a8dafd0fd99553c5323ec
Apr 17 16:22:16.895581 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:16.895557 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 16:22:17.016027 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:17.016003 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 16:22:17.018542 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:22:17.018516 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdf5600c_d37f_4857_8721_0ef7948500a5.slice/crio-13737f7fbe0ead320f454f5c14ab903eaba5fd9c6896e6ee83f40591470ef5fd WatchSource:0}: Error finding container 13737f7fbe0ead320f454f5c14ab903eaba5fd9c6896e6ee83f40591470ef5fd: Status 404 returned error can't find the container with id 13737f7fbe0ead320f454f5c14ab903eaba5fd9c6896e6ee83f40591470ef5fd
Apr 17 16:22:17.602854 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:17.602820 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2s6bc" event={"ID":"bd957af0-a264-4665-92db-2be4172c6ef3","Type":"ContainerStarted","Data":"27b318f5cd770c67acbe5682a273e9c4fe94eeac6cc39da8e86ca990dd3cf0c6"}
Apr 17 16:22:17.604217 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:17.604191 2572 generic.go:358] "Generic (PLEG): container finished" podID="1ca17119-f140-4bd9-9cc4-59cb1122e37e" containerID="e6df35590a3c3959a53925486b9749018be8edd22a824e1857a0a157e3372400" exitCode=0
Apr 17 16:22:17.604345 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:17.604252 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-72l89" event={"ID":"1ca17119-f140-4bd9-9cc4-59cb1122e37e","Type":"ContainerDied","Data":"e6df35590a3c3959a53925486b9749018be8edd22a824e1857a0a157e3372400"}
Apr 17 16:22:17.605254 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:17.605228 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cdf5600c-d37f-4857-8721-0ef7948500a5","Type":"ContainerStarted","Data":"13737f7fbe0ead320f454f5c14ab903eaba5fd9c6896e6ee83f40591470ef5fd"}
Apr 17 16:22:17.606175 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:17.606148 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46" event={"ID":"1a8eccc3-52d0-4b42-af0a-5a5338b67200","Type":"ContainerStarted","Data":"7f029c88c3310882887d4da9789b81e8b0b707881c7a8dafd0fd99553c5323ec"}
Apr 17 16:22:17.621017 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:17.620959 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-2s6bc" podStartSLOduration=2.40572492 podStartE2EDuration="3.6209443s" podCreationTimestamp="2026-04-17 16:22:14 +0000 UTC" firstStartedPulling="2026-04-17 16:22:15.435149323 +0000 UTC m=+152.775637380" lastFinishedPulling="2026-04-17 16:22:16.650368699 +0000 UTC m=+153.990856760" observedRunningTime="2026-04-17 16:22:17.618451378 +0000 UTC m=+154.958939459" watchObservedRunningTime="2026-04-17 16:22:17.6209443 +0000 UTC m=+154.961432383"
Apr 17 16:22:18.610347 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:18.610320 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-72l89" event={"ID":"1ca17119-f140-4bd9-9cc4-59cb1122e37e","Type":"ContainerStarted","Data":"0eaa52dd878086bcc031c821371f51cce603e84d055e08994d48d2de179ceb7f"}
Apr 17 16:22:18.610652 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:18.610356 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-72l89" event={"ID":"1ca17119-f140-4bd9-9cc4-59cb1122e37e","Type":"ContainerStarted","Data":"2a50961825cf26b6f497b552e8c7ff784ef751467370a01280211ed86d7b37cf"}
Apr 17 16:22:18.630639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:18.630571 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-72l89" podStartSLOduration=3.821343174 podStartE2EDuration="4.630551985s" podCreationTimestamp="2026-04-17 16:22:14 +0000 UTC" firstStartedPulling="2026-04-17 16:22:15.842332766 +0000 UTC m=+153.182820824" lastFinishedPulling="2026-04-17 16:22:16.651541575 +0000 UTC m=+153.992029635" observedRunningTime="2026-04-17 16:22:18.62922222 +0000 UTC m=+155.969710299" watchObservedRunningTime="2026-04-17 16:22:18.630551985 +0000 UTC m=+155.971040063"
Apr 17 16:22:19.121853 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:22:19.121752 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-v5rbd" podUID="609a9cbf-301f-406b-a26d-13ae069e0a70"
Apr 17 16:22:19.135124 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:22:19.135054 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-9sgrn" podUID="5caf5aa7-4606-4fa1-8754-cab1cd67eac0"
Apr 17 16:22:19.616760 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:19.616726 2572 generic.go:358] "Generic (PLEG): container finished" podID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerID="73629cd773fd2c8b2e11501a343e43d392202c7a57fb3f8a8a374cc719076c6c" exitCode=0
Apr 17 16:22:19.617174 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:19.616818 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cdf5600c-d37f-4857-8721-0ef7948500a5","Type":"ContainerDied","Data":"73629cd773fd2c8b2e11501a343e43d392202c7a57fb3f8a8a374cc719076c6c"}
Apr 17 16:22:19.618701 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:19.618678 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46" event={"ID":"1a8eccc3-52d0-4b42-af0a-5a5338b67200","Type":"ContainerStarted","Data":"64c72699b922893e7083db60d38fa4b8a2e21d14d8bd27fae22793b41635b841"}
Apr 17 16:22:19.618781 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:19.618711 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46" event={"ID":"1a8eccc3-52d0-4b42-af0a-5a5338b67200","Type":"ContainerStarted","Data":"abb389bafc5350f16618f08b4c07bc99321101e808b8fc51a6063ac29e70520a"}
Apr 17 16:22:19.618781 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:19.618728 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46" event={"ID":"1a8eccc3-52d0-4b42-af0a-5a5338b67200","Type":"ContainerStarted","Data":"7cc2ff28cb916ba75d0335d325490443d1acaf75743de877d3deb7e20ff20d45"}
Apr 17 16:22:19.618976 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:19.618960 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-v5rbd"
Apr 17 16:22:19.658937 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:19.658889 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-mjp46" podStartSLOduration=3.963529821 podStartE2EDuration="5.658876125s" podCreationTimestamp="2026-04-17 16:22:14 +0000 UTC" firstStartedPulling="2026-04-17 16:22:16.890699272 +0000 UTC m=+154.231187328" lastFinishedPulling="2026-04-17 16:22:18.586045575 +0000 UTC m=+155.926533632" observedRunningTime="2026-04-17 16:22:19.657686716 +0000 UTC m=+156.998174794" watchObservedRunningTime="2026-04-17 16:22:19.658876125 +0000 UTC m=+156.999364205"
Apr 17 16:22:19.703859 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:19.703832 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-zxnm6"]
Apr 17 16:22:19.705903 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:19.705881 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zxnm6"
Apr 17 16:22:19.708360 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:19.708339 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 17 16:22:19.708463 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:19.708348 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-nlh79\""
Apr 17 16:22:19.714588 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:19.714566 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-zxnm6"]
Apr 17 16:22:19.833932 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:19.833892 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b39db528-5360-480e-a54c-fb959515df7c-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-zxnm6\" (UID: \"b39db528-5360-480e-a54c-fb959515df7c\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zxnm6"
Apr 17 16:22:19.935271 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:19.935232 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b39db528-5360-480e-a54c-fb959515df7c-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-zxnm6\" (UID: \"b39db528-5360-480e-a54c-fb959515df7c\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zxnm6"
Apr 17 16:22:19.937624 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:19.937601 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b39db528-5360-480e-a54c-fb959515df7c-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-zxnm6\" (UID: \"b39db528-5360-480e-a54c-fb959515df7c\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zxnm6"
Apr 17 16:22:20.014609 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:20.014573 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zxnm6"
Apr 17 16:22:20.134102 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:20.134063 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-zxnm6"]
Apr 17 16:22:20.136512 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:22:20.136484 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb39db528_5360_480e_a54c_fb959515df7c.slice/crio-0a3f03b21de629411189769e7a6bbdbbd1c270c83a5caa96960e65943d8e3a5f WatchSource:0}: Error finding container 0a3f03b21de629411189769e7a6bbdbbd1c270c83a5caa96960e65943d8e3a5f: Status 404 returned error can't find the container with id 0a3f03b21de629411189769e7a6bbdbbd1c270c83a5caa96960e65943d8e3a5f
Apr 17 16:22:20.174782 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:20.174755 2572 scope.go:117] "RemoveContainer" containerID="6479cd547cda94dce0f4c95f541f33571657e7676d7e70469fe7ad79cf67fc6b"
Apr 17 16:22:20.174945 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:22:20.174929 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-ljhb6_openshift-console-operator(72bbd866-8c40-48e3-9eb3-b34ae76679de)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" podUID="72bbd866-8c40-48e3-9eb3-b34ae76679de"
Apr 17 16:22:20.196353 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:22:20.196282 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-j89hr" podUID="dbd283d5-ff0b-4c8f-b1be-15a75816e953"
Apr 17 16:22:20.623704 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:20.623668 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zxnm6" event={"ID":"b39db528-5360-480e-a54c-fb959515df7c","Type":"ContainerStarted","Data":"0a3f03b21de629411189769e7a6bbdbbd1c270c83a5caa96960e65943d8e3a5f"}
Apr 17 16:22:21.164383 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.164333 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 16:22:21.167129 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.167104 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.172274 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.172250 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 17 16:22:21.172848 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.172821 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 17 16:22:21.173810 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.173787 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 17 16:22:21.174135 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.174107 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 17 16:22:21.174290 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.174269 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 17 16:22:21.174395 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.174135 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 17 16:22:21.174395 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.174220 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 17 16:22:21.174395 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.174141 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 17 16:22:21.175639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.175619 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 17 16:22:21.175794 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.175762 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-fmvcn\""
Apr 17 16:22:21.175866 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.175796 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 17 16:22:21.175998 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.175977 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 17 16:22:21.176153 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.176136 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-6sd4bp4t45j9\""
Apr 17 16:22:21.176303 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.176285 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 17 16:22:21.176534 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.176512 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 17 16:22:21.181974 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.181953 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 16:22:21.250210 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.250185 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.250329 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.250227 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.250329 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.250259 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-web-config\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.250438 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.250328 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-config-out\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.250485 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.250456 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.250538 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.250501 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.250593 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.250570 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.250652 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.250625 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.250696 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.250673 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.250741 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.250695 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.250741 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.250712 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.250825 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.250743 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.250872 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.250828 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.250922 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.250868 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-config\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.250922 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.250896 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.251016 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.250956 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbr22\" (UniqueName: \"kubernetes.io/projected/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-kube-api-access-lbr22\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.251016 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.250996 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.251100 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.251041 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.352866 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.352569 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.352866 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.352623 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.352866 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.352654 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-web-config\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.352866 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.352683 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-config-out\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.352866 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.352736 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.352866 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.352762 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.352866 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.352791 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.352866 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.352820 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.352866 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.352853 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.354153 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.353622 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.354153 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.353669 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.354153 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.353711 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.354153 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.353787 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.354153 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.353822 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-config\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 16:22:21.354153 ip-10-0-132-179 kubenswrapper[2572]: I0417 
16:22:21.353848 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:22:21.354153 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.353895 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbr22\" (UniqueName: \"kubernetes.io/projected/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-kube-api-access-lbr22\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:22:21.354153 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.353944 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:22:21.354153 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.353983 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:22:21.354658 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.354273 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: 
\"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:22:21.354658 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.354551 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:22:21.355599 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.355479 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:22:21.355728 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.355669 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:22:21.356945 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.356662 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:22:21.357641 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.357549 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-config\") 
pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:22:21.358031 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.357946 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-config-out\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:22:21.358619 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.358574 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:22:21.359923 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.359186 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:22:21.359923 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.359589 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:22:21.359923 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.359600 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:22:21.359923 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.359737 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:22:21.360135 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.359946 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:22:21.360204 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.360181 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:22:21.360678 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.360636 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:22:21.360864 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.360835 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:22:21.361171 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.361145 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-web-config\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:22:21.362808 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.362787 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbr22\" (UniqueName: \"kubernetes.io/projected/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-kube-api-access-lbr22\") pod \"prometheus-k8s-0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:22:21.483654 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.483560 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:22:21.628803 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.628768 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cdf5600c-d37f-4857-8721-0ef7948500a5","Type":"ContainerStarted","Data":"308d6c6de71489d70e4b7591d15cdb62144b52260668abda6d6a3d2418ca4a85"} Apr 17 16:22:21.814837 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:21.814769 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:22:21.818975 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:22:21.818945 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae9f1dc0_9ec1_476d_abd7_fa62aa50d9b0.slice/crio-cff87dbe80dde579c7866e826449b7fc58ca723924f0d8607cf9451e1b706b7c WatchSource:0}: Error finding container cff87dbe80dde579c7866e826449b7fc58ca723924f0d8607cf9451e1b706b7c: Status 404 returned error can't find the container with id cff87dbe80dde579c7866e826449b7fc58ca723924f0d8607cf9451e1b706b7c Apr 17 16:22:22.633243 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:22.633114 2572 generic.go:358] "Generic (PLEG): container finished" podID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerID="9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80" exitCode=0 Apr 17 16:22:22.633694 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:22.633209 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0","Type":"ContainerDied","Data":"9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80"} Apr 17 16:22:22.633694 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:22.633314 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0","Type":"ContainerStarted","Data":"cff87dbe80dde579c7866e826449b7fc58ca723924f0d8607cf9451e1b706b7c"} Apr 17 16:22:22.634968 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:22.634851 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zxnm6" event={"ID":"b39db528-5360-480e-a54c-fb959515df7c","Type":"ContainerStarted","Data":"0e31e44cd8b19cb66f9ba61b74cfb08c3ce8886ece4c3c2f406bc98b9867fc64"} Apr 17 16:22:22.635042 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:22.635013 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zxnm6" Apr 17 16:22:22.640297 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:22.640272 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cdf5600c-d37f-4857-8721-0ef7948500a5","Type":"ContainerStarted","Data":"45742f725df3ec7ff1032227e7bdaac196b1ad10e9db0574bc82a71e33351fd0"} Apr 17 16:22:22.640403 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:22.640305 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cdf5600c-d37f-4857-8721-0ef7948500a5","Type":"ContainerStarted","Data":"22261344e6d47d1ea4377ff7bf1e50a613a64d9aa77d745531071783bbbb7499"} Apr 17 16:22:22.640403 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:22.640319 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cdf5600c-d37f-4857-8721-0ef7948500a5","Type":"ContainerStarted","Data":"878dc06d4824b091e3491cdd9be099cf2696f799116bb0bd451186a9ddb650fa"} Apr 17 16:22:22.640403 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:22.640331 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"cdf5600c-d37f-4857-8721-0ef7948500a5","Type":"ContainerStarted","Data":"0690e827b290d8a46821e608aefa4f2f9fb494b12c05eed27531ebc6c2f6fed2"} Apr 17 16:22:22.640807 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:22.640777 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zxnm6" Apr 17 16:22:23.646117 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:23.646062 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cdf5600c-d37f-4857-8721-0ef7948500a5","Type":"ContainerStarted","Data":"6b2352b4ab3170a20f15a685fdf3d17025242cb9742216800f7839262ad69d6f"} Apr 17 16:22:23.672731 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:23.672649 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.9044092040000002 podStartE2EDuration="8.672610713s" podCreationTimestamp="2026-04-17 16:22:15 +0000 UTC" firstStartedPulling="2026-04-17 16:22:17.020346685 +0000 UTC m=+154.360834741" lastFinishedPulling="2026-04-17 16:22:22.788548192 +0000 UTC m=+160.129036250" observedRunningTime="2026-04-17 16:22:23.672097077 +0000 UTC m=+161.012585155" watchObservedRunningTime="2026-04-17 16:22:23.672610713 +0000 UTC m=+161.013098790" Apr 17 16:22:23.673430 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:23.673395 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zxnm6" podStartSLOduration=3.123150342 podStartE2EDuration="4.673384614s" podCreationTimestamp="2026-04-17 16:22:19 +0000 UTC" firstStartedPulling="2026-04-17 16:22:20.138356027 +0000 UTC m=+157.478844085" lastFinishedPulling="2026-04-17 16:22:21.688590283 +0000 UTC m=+159.029078357" observedRunningTime="2026-04-17 16:22:22.676827851 +0000 UTC m=+160.017315931" watchObservedRunningTime="2026-04-17 16:22:23.673384614 +0000 UTC 
m=+161.013872690" Apr 17 16:22:23.978286 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:23.978192 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert\") pod \"ingress-canary-9sgrn\" (UID: \"5caf5aa7-4606-4fa1-8754-cab1cd67eac0\") " pod="openshift-ingress-canary/ingress-canary-9sgrn" Apr 17 16:22:23.978444 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:23.978321 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls\") pod \"dns-default-v5rbd\" (UID: \"609a9cbf-301f-406b-a26d-13ae069e0a70\") " pod="openshift-dns/dns-default-v5rbd" Apr 17 16:22:23.981160 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:23.980901 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/609a9cbf-301f-406b-a26d-13ae069e0a70-metrics-tls\") pod \"dns-default-v5rbd\" (UID: \"609a9cbf-301f-406b-a26d-13ae069e0a70\") " pod="openshift-dns/dns-default-v5rbd" Apr 17 16:22:23.981160 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:23.981119 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5caf5aa7-4606-4fa1-8754-cab1cd67eac0-cert\") pod \"ingress-canary-9sgrn\" (UID: \"5caf5aa7-4606-4fa1-8754-cab1cd67eac0\") " pod="openshift-ingress-canary/ingress-canary-9sgrn" Apr 17 16:22:24.122926 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:24.122892 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vgznp\"" Apr 17 16:22:24.131333 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:24.131296 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-v5rbd" Apr 17 16:22:24.295623 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:24.295433 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-v5rbd"] Apr 17 16:22:24.298656 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:22:24.298624 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod609a9cbf_301f_406b_a26d_13ae069e0a70.slice/crio-1ed92f9968dff660fe7ada409301093f2adce4c416f328ad4fadcba89d76b040 WatchSource:0}: Error finding container 1ed92f9968dff660fe7ada409301093f2adce4c416f328ad4fadcba89d76b040: Status 404 returned error can't find the container with id 1ed92f9968dff660fe7ada409301093f2adce4c416f328ad4fadcba89d76b040 Apr 17 16:22:24.650291 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:24.650250 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v5rbd" event={"ID":"609a9cbf-301f-406b-a26d-13ae069e0a70","Type":"ContainerStarted","Data":"1ed92f9968dff660fe7ada409301093f2adce4c416f328ad4fadcba89d76b040"} Apr 17 16:22:26.658497 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:26.658459 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v5rbd" event={"ID":"609a9cbf-301f-406b-a26d-13ae069e0a70","Type":"ContainerStarted","Data":"6522221c40adae03c7d10fdca0f499af44b414091199157223853ee120cd898b"} Apr 17 16:22:26.658497 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:26.658500 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-v5rbd" event={"ID":"609a9cbf-301f-406b-a26d-13ae069e0a70","Type":"ContainerStarted","Data":"7b84ae64f85d74cd5df3c51ccec8a0721e7497fcda24fc3c85973d605d0a3265"} Apr 17 16:22:26.658990 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:26.658519 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-v5rbd" Apr 17 16:22:26.660278 
ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:26.660256 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0","Type":"ContainerStarted","Data":"896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06"} Apr 17 16:22:26.660362 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:26.660285 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0","Type":"ContainerStarted","Data":"916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887"} Apr 17 16:22:26.674952 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:26.674904 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-v5rbd" podStartSLOduration=128.686149132 podStartE2EDuration="2m10.6748906s" podCreationTimestamp="2026-04-17 16:20:16 +0000 UTC" firstStartedPulling="2026-04-17 16:22:24.301011141 +0000 UTC m=+161.641499203" lastFinishedPulling="2026-04-17 16:22:26.2897526 +0000 UTC m=+163.630240671" observedRunningTime="2026-04-17 16:22:26.674224945 +0000 UTC m=+164.014713024" watchObservedRunningTime="2026-04-17 16:22:26.6748906 +0000 UTC m=+164.015378669" Apr 17 16:22:28.578754 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:28.578679 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-59b97fb566-dw8gb" Apr 17 16:22:28.669328 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:28.669291 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0","Type":"ContainerStarted","Data":"9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f"} Apr 17 16:22:28.669328 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:28.669331 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0","Type":"ContainerStarted","Data":"dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553"} Apr 17 16:22:28.669514 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:28.669341 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0","Type":"ContainerStarted","Data":"9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4"} Apr 17 16:22:28.669514 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:28.669350 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0","Type":"ContainerStarted","Data":"da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1"} Apr 17 16:22:28.697573 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:28.697521 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.193214833 podStartE2EDuration="7.697504747s" podCreationTimestamp="2026-04-17 16:22:21 +0000 UTC" firstStartedPulling="2026-04-17 16:22:22.634572142 +0000 UTC m=+159.975060200" lastFinishedPulling="2026-04-17 16:22:28.138862052 +0000 UTC m=+165.479350114" observedRunningTime="2026-04-17 16:22:28.694842855 +0000 UTC m=+166.035330946" watchObservedRunningTime="2026-04-17 16:22:28.697504747 +0000 UTC m=+166.037992825" Apr 17 16:22:31.484565 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:31.484531 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:22:32.174565 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:32.174531 2572 scope.go:117] "RemoveContainer" containerID="6479cd547cda94dce0f4c95f541f33571657e7676d7e70469fe7ad79cf67fc6b" Apr 17 16:22:32.687317 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:32.687288 2572 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ljhb6_72bbd866-8c40-48e3-9eb3-b34ae76679de/console-operator/2.log" Apr 17 16:22:32.687893 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:32.687426 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" event={"ID":"72bbd866-8c40-48e3-9eb3-b34ae76679de","Type":"ContainerStarted","Data":"88edd673be00617cabe1b92ff983eb9e3dc7c06a54e1f8d76b07cd1640f901b8"} Apr 17 16:22:32.687893 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:32.687815 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" Apr 17 16:22:32.693128 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:32.693104 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" Apr 17 16:22:32.707918 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:32.706714 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-ljhb6" podStartSLOduration=53.488412381 podStartE2EDuration="55.706697257s" podCreationTimestamp="2026-04-17 16:21:37 +0000 UTC" firstStartedPulling="2026-04-17 16:21:37.499397237 +0000 UTC m=+114.839885294" lastFinishedPulling="2026-04-17 16:21:39.717682113 +0000 UTC m=+117.058170170" observedRunningTime="2026-04-17 16:22:32.706380325 +0000 UTC m=+170.046868417" watchObservedRunningTime="2026-04-17 16:22:32.706697257 +0000 UTC m=+170.047185336" Apr 17 16:22:33.175542 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:33.175511 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j89hr" Apr 17 16:22:34.174490 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:34.174445 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9sgrn" Apr 17 16:22:34.177744 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:34.177726 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9rhb9\"" Apr 17 16:22:34.185060 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:34.185037 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9sgrn" Apr 17 16:22:34.303145 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:34.303124 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9sgrn"] Apr 17 16:22:34.305416 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:22:34.305392 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5caf5aa7_4606_4fa1_8754_cab1cd67eac0.slice/crio-4193f82632e19f2fb820e14345fa42dcf2112fbc2b0525b09d0818a3ce4c38ab WatchSource:0}: Error finding container 4193f82632e19f2fb820e14345fa42dcf2112fbc2b0525b09d0818a3ce4c38ab: Status 404 returned error can't find the container with id 4193f82632e19f2fb820e14345fa42dcf2112fbc2b0525b09d0818a3ce4c38ab Apr 17 16:22:34.695734 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:34.695676 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9sgrn" event={"ID":"5caf5aa7-4606-4fa1-8754-cab1cd67eac0","Type":"ContainerStarted","Data":"4193f82632e19f2fb820e14345fa42dcf2112fbc2b0525b09d0818a3ce4c38ab"} Apr 17 16:22:36.666412 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:36.666383 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-v5rbd" Apr 17 16:22:36.703670 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:36.703636 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9sgrn" 
event={"ID":"5caf5aa7-4606-4fa1-8754-cab1cd67eac0","Type":"ContainerStarted","Data":"f7a2203d5b50206021cc6f74d24325a646bfbe1c722399c22ae91f1758187295"} Apr 17 16:22:36.718713 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:36.718668 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9sgrn" podStartSLOduration=139.221744611 podStartE2EDuration="2m20.718654621s" podCreationTimestamp="2026-04-17 16:20:16 +0000 UTC" firstStartedPulling="2026-04-17 16:22:34.307324506 +0000 UTC m=+171.647812563" lastFinishedPulling="2026-04-17 16:22:35.804234512 +0000 UTC m=+173.144722573" observedRunningTime="2026-04-17 16:22:36.718183274 +0000 UTC m=+174.058671355" watchObservedRunningTime="2026-04-17 16:22:36.718654621 +0000 UTC m=+174.059142699" Apr 17 16:22:58.774046 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:58.774012 2572 generic.go:358] "Generic (PLEG): container finished" podID="80fc0241-d0ad-42e2-9e15-932722a75ffa" containerID="d695306fbba55eac2488dd6b4c1f61df7728455b40f4423ae5fb67bb54e391f2" exitCode=0 Apr 17 16:22:58.774443 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:58.774061 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-mqbqx" event={"ID":"80fc0241-d0ad-42e2-9e15-932722a75ffa","Type":"ContainerDied","Data":"d695306fbba55eac2488dd6b4c1f61df7728455b40f4423ae5fb67bb54e391f2"} Apr 17 16:22:58.774443 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:58.774403 2572 scope.go:117] "RemoveContainer" containerID="d695306fbba55eac2488dd6b4c1f61df7728455b40f4423ae5fb67bb54e391f2" Apr 17 16:22:59.403296 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:59.403258 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5948c7b7c8-rmrfc_b5767428-43d6-4cbe-9763-0731e126b82c/router/0.log" Apr 17 16:22:59.413337 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:59.413308 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-9sgrn_5caf5aa7-4606-4fa1-8754-cab1cd67eac0/serve-healthcheck-canary/0.log" Apr 17 16:22:59.778496 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:22:59.778413 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-mqbqx" event={"ID":"80fc0241-d0ad-42e2-9e15-932722a75ffa","Type":"ContainerStarted","Data":"e05d77d8b11af516abb675eb71f7b41126686aa721a25ec82d5a0f267b02240f"} Apr 17 16:23:13.820876 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:13.820784 2572 generic.go:358] "Generic (PLEG): container finished" podID="4eb06b52-b2db-4c75-8034-44b127e20319" containerID="1503f2a9eda6cb07a8fe5d71d68ac5b11167948f15604d45dd29bfa13d4412d2" exitCode=0 Apr 17 16:23:13.820876 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:13.820864 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z4bpg" event={"ID":"4eb06b52-b2db-4c75-8034-44b127e20319","Type":"ContainerDied","Data":"1503f2a9eda6cb07a8fe5d71d68ac5b11167948f15604d45dd29bfa13d4412d2"} Apr 17 16:23:13.821322 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:13.821195 2572 scope.go:117] "RemoveContainer" containerID="1503f2a9eda6cb07a8fe5d71d68ac5b11167948f15604d45dd29bfa13d4412d2" Apr 17 16:23:14.824991 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:14.824957 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-z4bpg" event={"ID":"4eb06b52-b2db-4c75-8034-44b127e20319","Type":"ContainerStarted","Data":"80c444da9e4e34e70fac192695d1cb15cbb97ec2ba0d79cda0f933ee193d9e5d"} Apr 17 16:23:21.483983 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:21.483945 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:21.500172 ip-10-0-132-179 
kubenswrapper[2572]: I0417 16:23:21.500063 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:21.862204 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:21.862066 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:35.217831 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:35.217795 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:23:35.218275 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:35.218229 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="alertmanager" containerID="cri-o://308d6c6de71489d70e4b7591d15cdb62144b52260668abda6d6a3d2418ca4a85" gracePeriod=120 Apr 17 16:23:35.218331 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:35.218278 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="kube-rbac-proxy-metric" containerID="cri-o://45742f725df3ec7ff1032227e7bdaac196b1ad10e9db0574bc82a71e33351fd0" gracePeriod=120 Apr 17 16:23:35.218394 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:35.218329 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="prom-label-proxy" containerID="cri-o://6b2352b4ab3170a20f15a685fdf3d17025242cb9742216800f7839262ad69d6f" gracePeriod=120 Apr 17 16:23:35.218394 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:35.218305 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="kube-rbac-proxy-web" 
containerID="cri-o://878dc06d4824b091e3491cdd9be099cf2696f799116bb0bd451186a9ddb650fa" gracePeriod=120 Apr 17 16:23:35.218394 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:35.218366 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="config-reloader" containerID="cri-o://0690e827b290d8a46821e608aefa4f2f9fb494b12c05eed27531ebc6c2f6fed2" gracePeriod=120 Apr 17 16:23:35.218521 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:35.218360 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="kube-rbac-proxy" containerID="cri-o://22261344e6d47d1ea4377ff7bf1e50a613a64d9aa77d745531071783bbbb7499" gracePeriod=120 Apr 17 16:23:35.891514 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:35.891483 2572 generic.go:358] "Generic (PLEG): container finished" podID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerID="6b2352b4ab3170a20f15a685fdf3d17025242cb9742216800f7839262ad69d6f" exitCode=0 Apr 17 16:23:35.891514 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:35.891508 2572 generic.go:358] "Generic (PLEG): container finished" podID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerID="22261344e6d47d1ea4377ff7bf1e50a613a64d9aa77d745531071783bbbb7499" exitCode=0 Apr 17 16:23:35.891514 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:35.891514 2572 generic.go:358] "Generic (PLEG): container finished" podID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerID="0690e827b290d8a46821e608aefa4f2f9fb494b12c05eed27531ebc6c2f6fed2" exitCode=0 Apr 17 16:23:35.891514 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:35.891520 2572 generic.go:358] "Generic (PLEG): container finished" podID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerID="308d6c6de71489d70e4b7591d15cdb62144b52260668abda6d6a3d2418ca4a85" exitCode=0 Apr 17 16:23:35.891781 
ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:35.891566 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cdf5600c-d37f-4857-8721-0ef7948500a5","Type":"ContainerDied","Data":"6b2352b4ab3170a20f15a685fdf3d17025242cb9742216800f7839262ad69d6f"} Apr 17 16:23:35.891781 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:35.891600 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cdf5600c-d37f-4857-8721-0ef7948500a5","Type":"ContainerDied","Data":"22261344e6d47d1ea4377ff7bf1e50a613a64d9aa77d745531071783bbbb7499"} Apr 17 16:23:35.891781 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:35.891610 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cdf5600c-d37f-4857-8721-0ef7948500a5","Type":"ContainerDied","Data":"0690e827b290d8a46821e608aefa4f2f9fb494b12c05eed27531ebc6c2f6fed2"} Apr 17 16:23:35.891781 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:35.891619 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cdf5600c-d37f-4857-8721-0ef7948500a5","Type":"ContainerDied","Data":"308d6c6de71489d70e4b7591d15cdb62144b52260668abda6d6a3d2418ca4a85"} Apr 17 16:23:36.457350 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.457325 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:36.533900 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.533869 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-config-volume\") pod \"cdf5600c-d37f-4857-8721-0ef7948500a5\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " Apr 17 16:23:36.534101 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.533915 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-web-config\") pod \"cdf5600c-d37f-4857-8721-0ef7948500a5\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " Apr 17 16:23:36.534101 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.533956 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"cdf5600c-d37f-4857-8721-0ef7948500a5\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " Apr 17 16:23:36.534101 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.533984 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cdf5600c-d37f-4857-8721-0ef7948500a5-alertmanager-main-db\") pod \"cdf5600c-d37f-4857-8721-0ef7948500a5\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " Apr 17 16:23:36.534101 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.534016 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-secret-alertmanager-kube-rbac-proxy-web\") pod \"cdf5600c-d37f-4857-8721-0ef7948500a5\" (UID: 
\"cdf5600c-d37f-4857-8721-0ef7948500a5\") " Apr 17 16:23:36.534101 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.534058 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cdf5600c-d37f-4857-8721-0ef7948500a5-config-out\") pod \"cdf5600c-d37f-4857-8721-0ef7948500a5\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " Apr 17 16:23:36.534384 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.534112 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdf5600c-d37f-4857-8721-0ef7948500a5-alertmanager-trusted-ca-bundle\") pod \"cdf5600c-d37f-4857-8721-0ef7948500a5\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " Apr 17 16:23:36.534384 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.534142 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-secret-alertmanager-kube-rbac-proxy\") pod \"cdf5600c-d37f-4857-8721-0ef7948500a5\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " Apr 17 16:23:36.534384 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.534178 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-secret-alertmanager-main-tls\") pod \"cdf5600c-d37f-4857-8721-0ef7948500a5\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " Apr 17 16:23:36.534384 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.534210 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cdf5600c-d37f-4857-8721-0ef7948500a5-tls-assets\") pod \"cdf5600c-d37f-4857-8721-0ef7948500a5\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " Apr 17 
16:23:36.534384 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.534250 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cdf5600c-d37f-4857-8721-0ef7948500a5-metrics-client-ca\") pod \"cdf5600c-d37f-4857-8721-0ef7948500a5\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " Apr 17 16:23:36.534637 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.534610 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdf5600c-d37f-4857-8721-0ef7948500a5-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "cdf5600c-d37f-4857-8721-0ef7948500a5" (UID: "cdf5600c-d37f-4857-8721-0ef7948500a5"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:23:36.534637 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.534626 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdf5600c-d37f-4857-8721-0ef7948500a5-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "cdf5600c-d37f-4857-8721-0ef7948500a5" (UID: "cdf5600c-d37f-4857-8721-0ef7948500a5"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:23:36.534740 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.534685 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dbl7\" (UniqueName: \"kubernetes.io/projected/cdf5600c-d37f-4857-8721-0ef7948500a5-kube-api-access-6dbl7\") pod \"cdf5600c-d37f-4857-8721-0ef7948500a5\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " Apr 17 16:23:36.534740 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.534726 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-cluster-tls-config\") pod \"cdf5600c-d37f-4857-8721-0ef7948500a5\" (UID: \"cdf5600c-d37f-4857-8721-0ef7948500a5\") " Apr 17 16:23:36.535120 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.535046 2572 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdf5600c-d37f-4857-8721-0ef7948500a5-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:36.535120 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.535093 2572 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cdf5600c-d37f-4857-8721-0ef7948500a5-metrics-client-ca\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:36.535451 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.535410 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdf5600c-d37f-4857-8721-0ef7948500a5-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "cdf5600c-d37f-4857-8721-0ef7948500a5" (UID: "cdf5600c-d37f-4857-8721-0ef7948500a5"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:23:36.536782 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.536753 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "cdf5600c-d37f-4857-8721-0ef7948500a5" (UID: "cdf5600c-d37f-4857-8721-0ef7948500a5"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:23:36.537271 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.537244 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "cdf5600c-d37f-4857-8721-0ef7948500a5" (UID: "cdf5600c-d37f-4857-8721-0ef7948500a5"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:23:36.537357 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.537269 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "cdf5600c-d37f-4857-8721-0ef7948500a5" (UID: "cdf5600c-d37f-4857-8721-0ef7948500a5"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:23:36.537357 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.537316 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-config-volume" (OuterVolumeSpecName: "config-volume") pod "cdf5600c-d37f-4857-8721-0ef7948500a5" (UID: "cdf5600c-d37f-4857-8721-0ef7948500a5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:23:36.537850 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.537820 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdf5600c-d37f-4857-8721-0ef7948500a5-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "cdf5600c-d37f-4857-8721-0ef7948500a5" (UID: "cdf5600c-d37f-4857-8721-0ef7948500a5"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:23:36.538029 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.538011 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdf5600c-d37f-4857-8721-0ef7948500a5-config-out" (OuterVolumeSpecName: "config-out") pod "cdf5600c-d37f-4857-8721-0ef7948500a5" (UID: "cdf5600c-d37f-4857-8721-0ef7948500a5"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:23:36.538382 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.538357 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdf5600c-d37f-4857-8721-0ef7948500a5-kube-api-access-6dbl7" (OuterVolumeSpecName: "kube-api-access-6dbl7") pod "cdf5600c-d37f-4857-8721-0ef7948500a5" (UID: "cdf5600c-d37f-4857-8721-0ef7948500a5"). InnerVolumeSpecName "kube-api-access-6dbl7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:23:36.538536 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.538516 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "cdf5600c-d37f-4857-8721-0ef7948500a5" (UID: "cdf5600c-d37f-4857-8721-0ef7948500a5"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:23:36.542036 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.541978 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "cdf5600c-d37f-4857-8721-0ef7948500a5" (UID: "cdf5600c-d37f-4857-8721-0ef7948500a5"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:23:36.548762 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.548740 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-web-config" (OuterVolumeSpecName: "web-config") pod "cdf5600c-d37f-4857-8721-0ef7948500a5" (UID: "cdf5600c-d37f-4857-8721-0ef7948500a5"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:23:36.635665 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.635633 2572 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-cluster-tls-config\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:36.635665 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.635663 2572 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-config-volume\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:36.635846 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.635673 2572 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-web-config\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:36.635846 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.635682 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:36.635846 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.635692 2572 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/cdf5600c-d37f-4857-8721-0ef7948500a5-alertmanager-main-db\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:36.635846 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.635701 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-secret-alertmanager-kube-rbac-proxy-web\") on node 
\"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:36.635846 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.635710 2572 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cdf5600c-d37f-4857-8721-0ef7948500a5-config-out\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:36.635846 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.635718 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:36.635846 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.635729 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/cdf5600c-d37f-4857-8721-0ef7948500a5-secret-alertmanager-main-tls\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:36.635846 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.635737 2572 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cdf5600c-d37f-4857-8721-0ef7948500a5-tls-assets\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:36.635846 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.635746 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6dbl7\" (UniqueName: \"kubernetes.io/projected/cdf5600c-d37f-4857-8721-0ef7948500a5-kube-api-access-6dbl7\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:36.897418 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.897330 2572 generic.go:358] "Generic (PLEG): container finished" podID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerID="45742f725df3ec7ff1032227e7bdaac196b1ad10e9db0574bc82a71e33351fd0" exitCode=0 Apr 17 16:23:36.897418 ip-10-0-132-179 
kubenswrapper[2572]: I0417 16:23:36.897357 2572 generic.go:358] "Generic (PLEG): container finished" podID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerID="878dc06d4824b091e3491cdd9be099cf2696f799116bb0bd451186a9ddb650fa" exitCode=0 Apr 17 16:23:36.897418 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.897401 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cdf5600c-d37f-4857-8721-0ef7948500a5","Type":"ContainerDied","Data":"45742f725df3ec7ff1032227e7bdaac196b1ad10e9db0574bc82a71e33351fd0"} Apr 17 16:23:36.897622 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.897443 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cdf5600c-d37f-4857-8721-0ef7948500a5","Type":"ContainerDied","Data":"878dc06d4824b091e3491cdd9be099cf2696f799116bb0bd451186a9ddb650fa"} Apr 17 16:23:36.897622 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.897448 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:36.897622 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.897454 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"cdf5600c-d37f-4857-8721-0ef7948500a5","Type":"ContainerDied","Data":"13737f7fbe0ead320f454f5c14ab903eaba5fd9c6896e6ee83f40591470ef5fd"} Apr 17 16:23:36.897622 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.897469 2572 scope.go:117] "RemoveContainer" containerID="6b2352b4ab3170a20f15a685fdf3d17025242cb9742216800f7839262ad69d6f" Apr 17 16:23:36.905814 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.905798 2572 scope.go:117] "RemoveContainer" containerID="45742f725df3ec7ff1032227e7bdaac196b1ad10e9db0574bc82a71e33351fd0" Apr 17 16:23:36.912524 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.912505 2572 scope.go:117] "RemoveContainer" containerID="22261344e6d47d1ea4377ff7bf1e50a613a64d9aa77d745531071783bbbb7499" Apr 17 16:23:36.918844 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.918827 2572 scope.go:117] "RemoveContainer" containerID="878dc06d4824b091e3491cdd9be099cf2696f799116bb0bd451186a9ddb650fa" Apr 17 16:23:36.921652 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.921628 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:23:36.924217 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.924197 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:23:36.926572 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.926559 2572 scope.go:117] "RemoveContainer" containerID="0690e827b290d8a46821e608aefa4f2f9fb494b12c05eed27531ebc6c2f6fed2" Apr 17 16:23:36.932875 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.932858 2572 scope.go:117] "RemoveContainer" containerID="308d6c6de71489d70e4b7591d15cdb62144b52260668abda6d6a3d2418ca4a85" Apr 17 16:23:36.939588 
ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.939566 2572 scope.go:117] "RemoveContainer" containerID="73629cd773fd2c8b2e11501a343e43d392202c7a57fb3f8a8a374cc719076c6c" Apr 17 16:23:36.946722 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.946690 2572 scope.go:117] "RemoveContainer" containerID="6b2352b4ab3170a20f15a685fdf3d17025242cb9742216800f7839262ad69d6f" Apr 17 16:23:36.947003 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:23:36.946978 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b2352b4ab3170a20f15a685fdf3d17025242cb9742216800f7839262ad69d6f\": container with ID starting with 6b2352b4ab3170a20f15a685fdf3d17025242cb9742216800f7839262ad69d6f not found: ID does not exist" containerID="6b2352b4ab3170a20f15a685fdf3d17025242cb9742216800f7839262ad69d6f" Apr 17 16:23:36.947092 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947037 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b2352b4ab3170a20f15a685fdf3d17025242cb9742216800f7839262ad69d6f"} err="failed to get container status \"6b2352b4ab3170a20f15a685fdf3d17025242cb9742216800f7839262ad69d6f\": rpc error: code = NotFound desc = could not find container \"6b2352b4ab3170a20f15a685fdf3d17025242cb9742216800f7839262ad69d6f\": container with ID starting with 6b2352b4ab3170a20f15a685fdf3d17025242cb9742216800f7839262ad69d6f not found: ID does not exist" Apr 17 16:23:36.947149 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947097 2572 scope.go:117] "RemoveContainer" containerID="45742f725df3ec7ff1032227e7bdaac196b1ad10e9db0574bc82a71e33351fd0" Apr 17 16:23:36.947149 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947102 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:23:36.947365 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:23:36.947345 2572 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"45742f725df3ec7ff1032227e7bdaac196b1ad10e9db0574bc82a71e33351fd0\": container with ID starting with 45742f725df3ec7ff1032227e7bdaac196b1ad10e9db0574bc82a71e33351fd0 not found: ID does not exist" containerID="45742f725df3ec7ff1032227e7bdaac196b1ad10e9db0574bc82a71e33351fd0" Apr 17 16:23:36.947401 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947372 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45742f725df3ec7ff1032227e7bdaac196b1ad10e9db0574bc82a71e33351fd0"} err="failed to get container status \"45742f725df3ec7ff1032227e7bdaac196b1ad10e9db0574bc82a71e33351fd0\": rpc error: code = NotFound desc = could not find container \"45742f725df3ec7ff1032227e7bdaac196b1ad10e9db0574bc82a71e33351fd0\": container with ID starting with 45742f725df3ec7ff1032227e7bdaac196b1ad10e9db0574bc82a71e33351fd0 not found: ID does not exist" Apr 17 16:23:36.947401 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947388 2572 scope.go:117] "RemoveContainer" containerID="22261344e6d47d1ea4377ff7bf1e50a613a64d9aa77d745531071783bbbb7499" Apr 17 16:23:36.947492 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947415 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="alertmanager" Apr 17 16:23:36.947492 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947428 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="alertmanager" Apr 17 16:23:36.947492 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947440 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="prom-label-proxy" Apr 17 16:23:36.947492 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947446 2572 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="prom-label-proxy" Apr 17 16:23:36.947492 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947457 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="init-config-reloader" Apr 17 16:23:36.947492 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947463 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="init-config-reloader" Apr 17 16:23:36.947492 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947471 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="kube-rbac-proxy-web" Apr 17 16:23:36.947492 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947477 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="kube-rbac-proxy-web" Apr 17 16:23:36.947492 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947483 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="kube-rbac-proxy-metric" Apr 17 16:23:36.947492 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947488 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="kube-rbac-proxy-metric" Apr 17 16:23:36.947492 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947495 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="config-reloader" Apr 17 16:23:36.947881 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947500 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="config-reloader" Apr 17 16:23:36.947881 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947513 2572 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="kube-rbac-proxy" Apr 17 16:23:36.947881 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947518 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="kube-rbac-proxy" Apr 17 16:23:36.947881 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947574 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="prom-label-proxy" Apr 17 16:23:36.947881 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947586 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="alertmanager" Apr 17 16:23:36.947881 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947595 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="config-reloader" Apr 17 16:23:36.947881 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947605 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="kube-rbac-proxy-metric" Apr 17 16:23:36.947881 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947616 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="kube-rbac-proxy-web" Apr 17 16:23:36.947881 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:23:36.947610 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22261344e6d47d1ea4377ff7bf1e50a613a64d9aa77d745531071783bbbb7499\": container with ID starting with 22261344e6d47d1ea4377ff7bf1e50a613a64d9aa77d745531071783bbbb7499 not found: ID does not exist" containerID="22261344e6d47d1ea4377ff7bf1e50a613a64d9aa77d745531071783bbbb7499" Apr 17 16:23:36.947881 
ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947641 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22261344e6d47d1ea4377ff7bf1e50a613a64d9aa77d745531071783bbbb7499"} err="failed to get container status \"22261344e6d47d1ea4377ff7bf1e50a613a64d9aa77d745531071783bbbb7499\": rpc error: code = NotFound desc = could not find container \"22261344e6d47d1ea4377ff7bf1e50a613a64d9aa77d745531071783bbbb7499\": container with ID starting with 22261344e6d47d1ea4377ff7bf1e50a613a64d9aa77d745531071783bbbb7499 not found: ID does not exist" Apr 17 16:23:36.947881 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947659 2572 scope.go:117] "RemoveContainer" containerID="878dc06d4824b091e3491cdd9be099cf2696f799116bb0bd451186a9ddb650fa" Apr 17 16:23:36.947881 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947625 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" containerName="kube-rbac-proxy" Apr 17 16:23:36.947881 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:23:36.947860 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"878dc06d4824b091e3491cdd9be099cf2696f799116bb0bd451186a9ddb650fa\": container with ID starting with 878dc06d4824b091e3491cdd9be099cf2696f799116bb0bd451186a9ddb650fa not found: ID does not exist" containerID="878dc06d4824b091e3491cdd9be099cf2696f799116bb0bd451186a9ddb650fa" Apr 17 16:23:36.948437 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947885 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878dc06d4824b091e3491cdd9be099cf2696f799116bb0bd451186a9ddb650fa"} err="failed to get container status \"878dc06d4824b091e3491cdd9be099cf2696f799116bb0bd451186a9ddb650fa\": rpc error: code = NotFound desc = could not find container \"878dc06d4824b091e3491cdd9be099cf2696f799116bb0bd451186a9ddb650fa\": container with ID 
starting with 878dc06d4824b091e3491cdd9be099cf2696f799116bb0bd451186a9ddb650fa not found: ID does not exist" Apr 17 16:23:36.948437 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.947902 2572 scope.go:117] "RemoveContainer" containerID="0690e827b290d8a46821e608aefa4f2f9fb494b12c05eed27531ebc6c2f6fed2" Apr 17 16:23:36.948437 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:23:36.948163 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0690e827b290d8a46821e608aefa4f2f9fb494b12c05eed27531ebc6c2f6fed2\": container with ID starting with 0690e827b290d8a46821e608aefa4f2f9fb494b12c05eed27531ebc6c2f6fed2 not found: ID does not exist" containerID="0690e827b290d8a46821e608aefa4f2f9fb494b12c05eed27531ebc6c2f6fed2" Apr 17 16:23:36.948437 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.948189 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0690e827b290d8a46821e608aefa4f2f9fb494b12c05eed27531ebc6c2f6fed2"} err="failed to get container status \"0690e827b290d8a46821e608aefa4f2f9fb494b12c05eed27531ebc6c2f6fed2\": rpc error: code = NotFound desc = could not find container \"0690e827b290d8a46821e608aefa4f2f9fb494b12c05eed27531ebc6c2f6fed2\": container with ID starting with 0690e827b290d8a46821e608aefa4f2f9fb494b12c05eed27531ebc6c2f6fed2 not found: ID does not exist" Apr 17 16:23:36.948437 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.948209 2572 scope.go:117] "RemoveContainer" containerID="308d6c6de71489d70e4b7591d15cdb62144b52260668abda6d6a3d2418ca4a85" Apr 17 16:23:36.948659 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:23:36.948498 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"308d6c6de71489d70e4b7591d15cdb62144b52260668abda6d6a3d2418ca4a85\": container with ID starting with 308d6c6de71489d70e4b7591d15cdb62144b52260668abda6d6a3d2418ca4a85 not found: ID does not 
exist" containerID="308d6c6de71489d70e4b7591d15cdb62144b52260668abda6d6a3d2418ca4a85" Apr 17 16:23:36.948659 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.948527 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"308d6c6de71489d70e4b7591d15cdb62144b52260668abda6d6a3d2418ca4a85"} err="failed to get container status \"308d6c6de71489d70e4b7591d15cdb62144b52260668abda6d6a3d2418ca4a85\": rpc error: code = NotFound desc = could not find container \"308d6c6de71489d70e4b7591d15cdb62144b52260668abda6d6a3d2418ca4a85\": container with ID starting with 308d6c6de71489d70e4b7591d15cdb62144b52260668abda6d6a3d2418ca4a85 not found: ID does not exist" Apr 17 16:23:36.948659 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.948548 2572 scope.go:117] "RemoveContainer" containerID="73629cd773fd2c8b2e11501a343e43d392202c7a57fb3f8a8a374cc719076c6c" Apr 17 16:23:36.948792 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:23:36.948775 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73629cd773fd2c8b2e11501a343e43d392202c7a57fb3f8a8a374cc719076c6c\": container with ID starting with 73629cd773fd2c8b2e11501a343e43d392202c7a57fb3f8a8a374cc719076c6c not found: ID does not exist" containerID="73629cd773fd2c8b2e11501a343e43d392202c7a57fb3f8a8a374cc719076c6c" Apr 17 16:23:36.948832 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.948798 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73629cd773fd2c8b2e11501a343e43d392202c7a57fb3f8a8a374cc719076c6c"} err="failed to get container status \"73629cd773fd2c8b2e11501a343e43d392202c7a57fb3f8a8a374cc719076c6c\": rpc error: code = NotFound desc = could not find container \"73629cd773fd2c8b2e11501a343e43d392202c7a57fb3f8a8a374cc719076c6c\": container with ID starting with 73629cd773fd2c8b2e11501a343e43d392202c7a57fb3f8a8a374cc719076c6c not found: ID does not exist" Apr 17 
16:23:36.948832 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.948812 2572 scope.go:117] "RemoveContainer" containerID="6b2352b4ab3170a20f15a685fdf3d17025242cb9742216800f7839262ad69d6f" Apr 17 16:23:36.949017 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.948999 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b2352b4ab3170a20f15a685fdf3d17025242cb9742216800f7839262ad69d6f"} err="failed to get container status \"6b2352b4ab3170a20f15a685fdf3d17025242cb9742216800f7839262ad69d6f\": rpc error: code = NotFound desc = could not find container \"6b2352b4ab3170a20f15a685fdf3d17025242cb9742216800f7839262ad69d6f\": container with ID starting with 6b2352b4ab3170a20f15a685fdf3d17025242cb9742216800f7839262ad69d6f not found: ID does not exist" Apr 17 16:23:36.949093 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.949019 2572 scope.go:117] "RemoveContainer" containerID="45742f725df3ec7ff1032227e7bdaac196b1ad10e9db0574bc82a71e33351fd0" Apr 17 16:23:36.949313 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.949292 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45742f725df3ec7ff1032227e7bdaac196b1ad10e9db0574bc82a71e33351fd0"} err="failed to get container status \"45742f725df3ec7ff1032227e7bdaac196b1ad10e9db0574bc82a71e33351fd0\": rpc error: code = NotFound desc = could not find container \"45742f725df3ec7ff1032227e7bdaac196b1ad10e9db0574bc82a71e33351fd0\": container with ID starting with 45742f725df3ec7ff1032227e7bdaac196b1ad10e9db0574bc82a71e33351fd0 not found: ID does not exist" Apr 17 16:23:36.949357 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.949313 2572 scope.go:117] "RemoveContainer" containerID="22261344e6d47d1ea4377ff7bf1e50a613a64d9aa77d745531071783bbbb7499" Apr 17 16:23:36.949526 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.949511 2572 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"22261344e6d47d1ea4377ff7bf1e50a613a64d9aa77d745531071783bbbb7499"} err="failed to get container status \"22261344e6d47d1ea4377ff7bf1e50a613a64d9aa77d745531071783bbbb7499\": rpc error: code = NotFound desc = could not find container \"22261344e6d47d1ea4377ff7bf1e50a613a64d9aa77d745531071783bbbb7499\": container with ID starting with 22261344e6d47d1ea4377ff7bf1e50a613a64d9aa77d745531071783bbbb7499 not found: ID does not exist" Apr 17 16:23:36.949565 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.949526 2572 scope.go:117] "RemoveContainer" containerID="878dc06d4824b091e3491cdd9be099cf2696f799116bb0bd451186a9ddb650fa" Apr 17 16:23:36.949803 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.949717 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878dc06d4824b091e3491cdd9be099cf2696f799116bb0bd451186a9ddb650fa"} err="failed to get container status \"878dc06d4824b091e3491cdd9be099cf2696f799116bb0bd451186a9ddb650fa\": rpc error: code = NotFound desc = could not find container \"878dc06d4824b091e3491cdd9be099cf2696f799116bb0bd451186a9ddb650fa\": container with ID starting with 878dc06d4824b091e3491cdd9be099cf2696f799116bb0bd451186a9ddb650fa not found: ID does not exist" Apr 17 16:23:36.949803 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.949740 2572 scope.go:117] "RemoveContainer" containerID="0690e827b290d8a46821e608aefa4f2f9fb494b12c05eed27531ebc6c2f6fed2" Apr 17 16:23:36.949981 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.949961 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0690e827b290d8a46821e608aefa4f2f9fb494b12c05eed27531ebc6c2f6fed2"} err="failed to get container status \"0690e827b290d8a46821e608aefa4f2f9fb494b12c05eed27531ebc6c2f6fed2\": rpc error: code = NotFound desc = could not find container \"0690e827b290d8a46821e608aefa4f2f9fb494b12c05eed27531ebc6c2f6fed2\": container with ID starting with 
0690e827b290d8a46821e608aefa4f2f9fb494b12c05eed27531ebc6c2f6fed2 not found: ID does not exist" Apr 17 16:23:36.950025 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.949982 2572 scope.go:117] "RemoveContainer" containerID="308d6c6de71489d70e4b7591d15cdb62144b52260668abda6d6a3d2418ca4a85" Apr 17 16:23:36.950244 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.950224 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"308d6c6de71489d70e4b7591d15cdb62144b52260668abda6d6a3d2418ca4a85"} err="failed to get container status \"308d6c6de71489d70e4b7591d15cdb62144b52260668abda6d6a3d2418ca4a85\": rpc error: code = NotFound desc = could not find container \"308d6c6de71489d70e4b7591d15cdb62144b52260668abda6d6a3d2418ca4a85\": container with ID starting with 308d6c6de71489d70e4b7591d15cdb62144b52260668abda6d6a3d2418ca4a85 not found: ID does not exist" Apr 17 16:23:36.950306 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.950245 2572 scope.go:117] "RemoveContainer" containerID="73629cd773fd2c8b2e11501a343e43d392202c7a57fb3f8a8a374cc719076c6c" Apr 17 16:23:36.950454 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.950407 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73629cd773fd2c8b2e11501a343e43d392202c7a57fb3f8a8a374cc719076c6c"} err="failed to get container status \"73629cd773fd2c8b2e11501a343e43d392202c7a57fb3f8a8a374cc719076c6c\": rpc error: code = NotFound desc = could not find container \"73629cd773fd2c8b2e11501a343e43d392202c7a57fb3f8a8a374cc719076c6c\": container with ID starting with 73629cd773fd2c8b2e11501a343e43d392202c7a57fb3f8a8a374cc719076c6c not found: ID does not exist" Apr 17 16:23:36.951372 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.951357 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:36.954128 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.954096 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 16:23:36.954128 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.954095 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-28gbc\"" Apr 17 16:23:36.954310 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.954172 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 16:23:36.954310 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.954203 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 16:23:36.954310 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.954236 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 16:23:36.954310 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.954274 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 16:23:36.954548 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.954533 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 16:23:36.954614 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.954537 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 16:23:36.954871 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.954855 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 16:23:36.958970 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.958948 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 16:23:36.962167 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:36.962139 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:23:37.040412 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.040369 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3fe07d-1b95-427a-9bf4-df826918a7ec-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.040614 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.040419 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e3fe07d-1b95-427a-9bf4-df826918a7ec-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.040614 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.040445 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7e3fe07d-1b95-427a-9bf4-df826918a7ec-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.040614 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.040478 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" 
(UniqueName: \"kubernetes.io/empty-dir/7e3fe07d-1b95-427a-9bf4-df826918a7ec-config-out\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.040614 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.040506 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7e3fe07d-1b95-427a-9bf4-df826918a7ec-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.040614 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.040528 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7e3fe07d-1b95-427a-9bf4-df826918a7ec-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.040614 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.040596 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7e3fe07d-1b95-427a-9bf4-df826918a7ec-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.040844 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.040636 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7e3fe07d-1b95-427a-9bf4-df826918a7ec-config-volume\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.040844 
ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.040654 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7e3fe07d-1b95-427a-9bf4-df826918a7ec-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.040844 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.040690 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7e3fe07d-1b95-427a-9bf4-df826918a7ec-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.040844 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.040716 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7e3fe07d-1b95-427a-9bf4-df826918a7ec-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.040844 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.040771 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7e3fe07d-1b95-427a-9bf4-df826918a7ec-web-config\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.040844 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.040799 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv7zg\" (UniqueName: 
\"kubernetes.io/projected/7e3fe07d-1b95-427a-9bf4-df826918a7ec-kube-api-access-dv7zg\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.141918 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.141874 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7e3fe07d-1b95-427a-9bf4-df826918a7ec-config-out\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.141918 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.141916 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7e3fe07d-1b95-427a-9bf4-df826918a7ec-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.142185 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.141933 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7e3fe07d-1b95-427a-9bf4-df826918a7ec-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.142185 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.141952 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7e3fe07d-1b95-427a-9bf4-df826918a7ec-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.142294 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.142241 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7e3fe07d-1b95-427a-9bf4-df826918a7ec-config-volume\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.142294 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.142277 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7e3fe07d-1b95-427a-9bf4-df826918a7ec-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.142397 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.142329 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7e3fe07d-1b95-427a-9bf4-df826918a7ec-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.142397 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.142371 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7e3fe07d-1b95-427a-9bf4-df826918a7ec-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.142397 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.142378 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7e3fe07d-1b95-427a-9bf4-df826918a7ec-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 
16:23:37.142611 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.142425 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7e3fe07d-1b95-427a-9bf4-df826918a7ec-web-config\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.142611 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.142470 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dv7zg\" (UniqueName: \"kubernetes.io/projected/7e3fe07d-1b95-427a-9bf4-df826918a7ec-kube-api-access-dv7zg\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.142611 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.142512 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3fe07d-1b95-427a-9bf4-df826918a7ec-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.142611 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.142543 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e3fe07d-1b95-427a-9bf4-df826918a7ec-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.142611 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.142573 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7e3fe07d-1b95-427a-9bf4-df826918a7ec-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.144935 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.144835 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7e3fe07d-1b95-427a-9bf4-df826918a7ec-config-out\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.145247 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.145053 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7e3fe07d-1b95-427a-9bf4-df826918a7ec-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.145247 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.145099 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7e3fe07d-1b95-427a-9bf4-df826918a7ec-config-volume\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.145247 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.145120 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7e3fe07d-1b95-427a-9bf4-df826918a7ec-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.145247 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.145102 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7e3fe07d-1b95-427a-9bf4-df826918a7ec-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: 
\"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.145247 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.145232 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7e3fe07d-1b95-427a-9bf4-df826918a7ec-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.145547 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.145272 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7e3fe07d-1b95-427a-9bf4-df826918a7ec-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.145547 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.145488 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7e3fe07d-1b95-427a-9bf4-df826918a7ec-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.145661 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.145644 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e3fe07d-1b95-427a-9bf4-df826918a7ec-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.145791 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.145773 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/7e3fe07d-1b95-427a-9bf4-df826918a7ec-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.146928 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.146911 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7e3fe07d-1b95-427a-9bf4-df826918a7ec-web-config\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.153489 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.153434 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv7zg\" (UniqueName: \"kubernetes.io/projected/7e3fe07d-1b95-427a-9bf4-df826918a7ec-kube-api-access-dv7zg\") pod \"alertmanager-main-0\" (UID: \"7e3fe07d-1b95-427a-9bf4-df826918a7ec\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.177945 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.177919 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdf5600c-d37f-4857-8721-0ef7948500a5" path="/var/lib/kubelet/pods/cdf5600c-d37f-4857-8721-0ef7948500a5/volumes" Apr 17 16:23:37.260796 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.260770 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 16:23:37.387727 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.387696 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 16:23:37.390993 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:23:37.390966 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e3fe07d_1b95_427a_9bf4_df826918a7ec.slice/crio-a2bd6d284fef9d8e4bdfc24c55f201e41f6d04e5c707e938e54d05930e82b844 WatchSource:0}: Error finding container a2bd6d284fef9d8e4bdfc24c55f201e41f6d04e5c707e938e54d05930e82b844: Status 404 returned error can't find the container with id a2bd6d284fef9d8e4bdfc24c55f201e41f6d04e5c707e938e54d05930e82b844 Apr 17 16:23:37.902716 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.902678 2572 generic.go:358] "Generic (PLEG): container finished" podID="7e3fe07d-1b95-427a-9bf4-df826918a7ec" containerID="76d836e78b8182e761aff1e7ff6a35a55b33c3eafe30642db6fffb10bad36b05" exitCode=0 Apr 17 16:23:37.903091 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.902737 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7e3fe07d-1b95-427a-9bf4-df826918a7ec","Type":"ContainerDied","Data":"76d836e78b8182e761aff1e7ff6a35a55b33c3eafe30642db6fffb10bad36b05"} Apr 17 16:23:37.903091 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:37.902760 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7e3fe07d-1b95-427a-9bf4-df826918a7ec","Type":"ContainerStarted","Data":"a2bd6d284fef9d8e4bdfc24c55f201e41f6d04e5c707e938e54d05930e82b844"} Apr 17 16:23:38.908878 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:38.908845 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"7e3fe07d-1b95-427a-9bf4-df826918a7ec","Type":"ContainerStarted","Data":"3dd82456f944ce8a09bbf74c80006e7a7694f0ef044331fb29785c0bed3a2f65"} Apr 17 16:23:38.908878 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:38.908882 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7e3fe07d-1b95-427a-9bf4-df826918a7ec","Type":"ContainerStarted","Data":"b3a608b7a16db91aeb7998a9542c79c7d8c486886171a8328df1f0662011ef4a"} Apr 17 16:23:38.909316 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:38.908891 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7e3fe07d-1b95-427a-9bf4-df826918a7ec","Type":"ContainerStarted","Data":"4474903bbaebcd0b3fdd7c077de1df168d24a7e1bd5ebe3fb0bbf82f75513257"} Apr 17 16:23:38.909316 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:38.908899 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7e3fe07d-1b95-427a-9bf4-df826918a7ec","Type":"ContainerStarted","Data":"65e684df9cc3f6e10bf3f986c2c9c55eb1a6d4e356f4be2715f3598461aed8ac"} Apr 17 16:23:38.909316 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:38.908907 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7e3fe07d-1b95-427a-9bf4-df826918a7ec","Type":"ContainerStarted","Data":"0ed74baa224c16bca067efd8558e878bd52e84bf7eb8f2c226ffc95b5a72ec6e"} Apr 17 16:23:38.909316 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:38.908915 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7e3fe07d-1b95-427a-9bf4-df826918a7ec","Type":"ContainerStarted","Data":"f80b3395281230806bbe9df4d6a444ae759a1e529f59e3d6770af1c0c565fe7b"} Apr 17 16:23:38.934896 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:38.934845 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.9348310570000002 podStartE2EDuration="2.934831057s" podCreationTimestamp="2026-04-17 16:23:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:23:38.932541677 +0000 UTC m=+236.273029766" watchObservedRunningTime="2026-04-17 16:23:38.934831057 +0000 UTC m=+236.275319173" Apr 17 16:23:39.416636 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.416605 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:23:39.417049 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.417020 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="prometheus" containerID="cri-o://916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887" gracePeriod=600 Apr 17 16:23:39.417169 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.417042 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="thanos-sidecar" containerID="cri-o://da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1" gracePeriod=600 Apr 17 16:23:39.417169 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.417031 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="kube-rbac-proxy" containerID="cri-o://dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553" gracePeriod=600 Apr 17 16:23:39.417169 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.417065 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" 
podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="kube-rbac-proxy-web" containerID="cri-o://9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4" gracePeriod=600 Apr 17 16:23:39.417169 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.417113 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="kube-rbac-proxy-thanos" containerID="cri-o://9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f" gracePeriod=600 Apr 17 16:23:39.417169 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.417126 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="config-reloader" containerID="cri-o://896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06" gracePeriod=600 Apr 17 16:23:39.653997 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.653974 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:39.766116 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.765998 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " Apr 17 16:23:39.766116 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.766039 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-configmap-kubelet-serving-ca-bundle\") pod \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " Apr 17 16:23:39.766116 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.766056 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-metrics-client-certs\") pod \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " Apr 17 16:23:39.766116 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.766099 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbr22\" (UniqueName: \"kubernetes.io/projected/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-kube-api-access-lbr22\") pod \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " Apr 17 16:23:39.766440 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.766127 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-thanos-prometheus-http-client-file\") pod \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " Apr 17 16:23:39.766440 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.766293 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-prometheus-trusted-ca-bundle\") pod \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " Apr 17 16:23:39.766440 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.766339 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-prometheus-k8s-rulefiles-0\") pod \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " Apr 17 16:23:39.766596 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.766549 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-prometheus-k8s-tls\") pod \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " Apr 17 16:23:39.766647 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.766595 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-grpc-tls\") pod \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " Apr 17 16:23:39.766647 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.766540 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" (UID: "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:23:39.766647 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.766628 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-prometheus-k8s-db\") pod \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " Apr 17 16:23:39.766787 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.766676 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-tls-assets\") pod \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " Apr 17 16:23:39.766787 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.766708 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-configmap-metrics-client-ca\") pod \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " Apr 17 16:23:39.766787 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.766743 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " Apr 17 16:23:39.766787 ip-10-0-132-179 kubenswrapper[2572]: I0417 
16:23:39.766783 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-web-config\") pod \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " Apr 17 16:23:39.766974 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.766824 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-kube-rbac-proxy\") pod \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " Apr 17 16:23:39.766974 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.766872 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-config\") pod \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " Apr 17 16:23:39.766974 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.766912 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-config-out\") pod \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " Apr 17 16:23:39.766974 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.766948 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-configmap-serving-certs-ca-bundle\") pod \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\" (UID: \"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0\") " Apr 17 16:23:39.767290 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.767267 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:39.767595 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.767525 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" (UID: "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:23:39.768382 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.767762 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" (UID: "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:23:39.768382 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.768123 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" (UID: "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:23:39.768747 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.768708 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" (UID: "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 16:23:39.769331 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.768881 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" (UID: "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:23:39.769422 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.769355 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" (UID: "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:23:39.770199 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.770147 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" (UID: "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0"). 
InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:23:39.770304 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.770287 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" (UID: "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:23:39.770516 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.770459 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" (UID: "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:23:39.770516 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.770481 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-kube-api-access-lbr22" (OuterVolumeSpecName: "kube-api-access-lbr22") pod "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" (UID: "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0"). InnerVolumeSpecName "kube-api-access-lbr22". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:23:39.770703 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.770683 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-config-out" (OuterVolumeSpecName: "config-out") pod "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" (UID: "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 16:23:39.770703 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.770684 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" (UID: "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:23:39.771001 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.770984 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" (UID: "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:23:39.771223 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.771209 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-config" (OuterVolumeSpecName: "config") pod "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" (UID: "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:23:39.771356 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.771340 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" (UID: "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:23:39.771910 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.771895 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" (UID: "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:23:39.780459 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.780433 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-web-config" (OuterVolumeSpecName: "web-config") pod "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" (UID: "ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 16:23:39.868216 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.868171 2572 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-config-out\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:39.868216 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.868211 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:39.868216 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.868221 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-prometheus-k8s-kube-rbac-proxy-web\") on node 
\"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:39.868216 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.868232 2572 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-metrics-client-certs\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:39.868474 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.868242 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lbr22\" (UniqueName: \"kubernetes.io/projected/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-kube-api-access-lbr22\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:39.868474 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.868251 2572 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-thanos-prometheus-http-client-file\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:39.868474 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.868262 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-prometheus-trusted-ca-bundle\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:39.868474 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.868272 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:39.868474 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.868281 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-prometheus-k8s-tls\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:39.868474 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.868290 2572 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-grpc-tls\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:39.868474 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.868298 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-prometheus-k8s-db\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:39.868474 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.868307 2572 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-tls-assets\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:39.868474 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.868316 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-configmap-metrics-client-ca\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:39.868474 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.868326 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:39.868474 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.868335 2572 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-web-config\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:39.868474 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.868343 2572 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-secret-kube-rbac-proxy\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:39.868474 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.868353 2572 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0-config\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:23:39.918802 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.918770 2572 generic.go:358] "Generic (PLEG): container finished" podID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerID="9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f" exitCode=0 Apr 17 16:23:39.918802 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.918797 2572 generic.go:358] "Generic (PLEG): container finished" podID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerID="dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553" exitCode=0 Apr 17 16:23:39.918802 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.918803 2572 generic.go:358] "Generic (PLEG): container finished" podID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerID="9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4" exitCode=0 Apr 17 16:23:39.918802 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.918811 2572 generic.go:358] "Generic (PLEG): container finished" podID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerID="da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1" exitCode=0 Apr 17 16:23:39.919306 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.918817 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerID="896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06" exitCode=0 Apr 17 16:23:39.919306 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.918809 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0","Type":"ContainerDied","Data":"9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f"} Apr 17 16:23:39.919306 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.918849 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0","Type":"ContainerDied","Data":"dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553"} Apr 17 16:23:39.919306 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.918860 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0","Type":"ContainerDied","Data":"9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4"} Apr 17 16:23:39.919306 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.918869 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0","Type":"ContainerDied","Data":"da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1"} Apr 17 16:23:39.919306 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.918878 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0","Type":"ContainerDied","Data":"896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06"} Apr 17 16:23:39.919306 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.918887 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0","Type":"ContainerDied","Data":"916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887"} Apr 17 16:23:39.919306 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.918822 2572 generic.go:358] "Generic (PLEG): container finished" podID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerID="916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887" exitCode=0 Apr 17 16:23:39.919306 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.918903 2572 scope.go:117] "RemoveContainer" containerID="9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f" Apr 17 16:23:39.919306 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.918907 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:39.919306 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.918974 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0","Type":"ContainerDied","Data":"cff87dbe80dde579c7866e826449b7fc58ca723924f0d8607cf9451e1b706b7c"} Apr 17 16:23:39.926548 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.926527 2572 scope.go:117] "RemoveContainer" containerID="dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553" Apr 17 16:23:39.933522 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.933501 2572 scope.go:117] "RemoveContainer" containerID="9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4" Apr 17 16:23:39.939707 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.939688 2572 scope.go:117] "RemoveContainer" containerID="da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1" Apr 17 16:23:39.942369 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.942347 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:23:39.945459 ip-10-0-132-179 kubenswrapper[2572]: 
I0417 16:23:39.945439 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:23:39.949448 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.949426 2572 scope.go:117] "RemoveContainer" containerID="896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06" Apr 17 16:23:39.955810 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.955792 2572 scope.go:117] "RemoveContainer" containerID="916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887" Apr 17 16:23:39.962516 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.962502 2572 scope.go:117] "RemoveContainer" containerID="9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80" Apr 17 16:23:39.967928 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.967904 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:23:39.968288 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.968269 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="init-config-reloader" Apr 17 16:23:39.968288 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.968289 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="init-config-reloader" Apr 17 16:23:39.968405 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.968307 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="thanos-sidecar" Apr 17 16:23:39.968405 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.968317 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="thanos-sidecar" Apr 17 16:23:39.968405 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.968361 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="prometheus" Apr 17 16:23:39.968405 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.968368 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="prometheus" Apr 17 16:23:39.968405 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.968380 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="kube-rbac-proxy-thanos" Apr 17 16:23:39.968405 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.968385 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="kube-rbac-proxy-thanos" Apr 17 16:23:39.968405 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.968392 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="kube-rbac-proxy" Apr 17 16:23:39.968405 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.968397 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="kube-rbac-proxy" Apr 17 16:23:39.968405 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.968403 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="kube-rbac-proxy-web" Apr 17 16:23:39.968405 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.968408 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="kube-rbac-proxy-web" Apr 17 16:23:39.968873 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.968415 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="config-reloader" Apr 17 16:23:39.968873 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.968420 2572 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="config-reloader" Apr 17 16:23:39.968873 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.968475 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="config-reloader" Apr 17 16:23:39.968873 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.968484 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="prometheus" Apr 17 16:23:39.968873 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.968495 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="kube-rbac-proxy-web" Apr 17 16:23:39.968873 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.968503 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="kube-rbac-proxy-thanos" Apr 17 16:23:39.968873 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.968512 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="kube-rbac-proxy" Apr 17 16:23:39.968873 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.968519 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" containerName="thanos-sidecar" Apr 17 16:23:39.969767 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.969754 2572 scope.go:117] "RemoveContainer" containerID="9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f" Apr 17 16:23:39.970000 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:23:39.969983 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f\": container with ID starting with 
9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f not found: ID does not exist" containerID="9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f" Apr 17 16:23:39.970054 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.970009 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f"} err="failed to get container status \"9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f\": rpc error: code = NotFound desc = could not find container \"9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f\": container with ID starting with 9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f not found: ID does not exist" Apr 17 16:23:39.970054 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.970025 2572 scope.go:117] "RemoveContainer" containerID="dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553" Apr 17 16:23:39.970316 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:23:39.970294 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553\": container with ID starting with dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553 not found: ID does not exist" containerID="dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553" Apr 17 16:23:39.970359 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.970327 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553"} err="failed to get container status \"dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553\": rpc error: code = NotFound desc = could not find container \"dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553\": container with ID starting with 
dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553 not found: ID does not exist" Apr 17 16:23:39.970359 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.970346 2572 scope.go:117] "RemoveContainer" containerID="9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4" Apr 17 16:23:39.970581 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:23:39.970563 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4\": container with ID starting with 9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4 not found: ID does not exist" containerID="9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4" Apr 17 16:23:39.970642 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.970590 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4"} err="failed to get container status \"9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4\": rpc error: code = NotFound desc = could not find container \"9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4\": container with ID starting with 9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4 not found: ID does not exist" Apr 17 16:23:39.970642 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.970612 2572 scope.go:117] "RemoveContainer" containerID="da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1" Apr 17 16:23:39.970880 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:23:39.970862 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1\": container with ID starting with da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1 not found: ID does not exist" 
containerID="da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1" Apr 17 16:23:39.970949 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.970884 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1"} err="failed to get container status \"da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1\": rpc error: code = NotFound desc = could not find container \"da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1\": container with ID starting with da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1 not found: ID does not exist" Apr 17 16:23:39.970949 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.970897 2572 scope.go:117] "RemoveContainer" containerID="896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06" Apr 17 16:23:39.971124 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:23:39.971105 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06\": container with ID starting with 896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06 not found: ID does not exist" containerID="896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06" Apr 17 16:23:39.971169 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.971129 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06"} err="failed to get container status \"896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06\": rpc error: code = NotFound desc = could not find container \"896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06\": container with ID starting with 896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06 not found: ID does not exist" Apr 17 
16:23:39.971169 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.971145 2572 scope.go:117] "RemoveContainer" containerID="916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887" Apr 17 16:23:39.971385 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:23:39.971371 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887\": container with ID starting with 916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887 not found: ID does not exist" containerID="916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887" Apr 17 16:23:39.971421 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.971389 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887"} err="failed to get container status \"916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887\": rpc error: code = NotFound desc = could not find container \"916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887\": container with ID starting with 916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887 not found: ID does not exist" Apr 17 16:23:39.971421 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.971401 2572 scope.go:117] "RemoveContainer" containerID="9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80" Apr 17 16:23:39.971601 ip-10-0-132-179 kubenswrapper[2572]: E0417 16:23:39.971585 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80\": container with ID starting with 9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80 not found: ID does not exist" containerID="9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80" Apr 17 16:23:39.971633 
ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.971606 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80"} err="failed to get container status \"9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80\": rpc error: code = NotFound desc = could not find container \"9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80\": container with ID starting with 9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80 not found: ID does not exist" Apr 17 16:23:39.971633 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.971622 2572 scope.go:117] "RemoveContainer" containerID="9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f" Apr 17 16:23:39.971872 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.971851 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f"} err="failed to get container status \"9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f\": rpc error: code = NotFound desc = could not find container \"9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f\": container with ID starting with 9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f not found: ID does not exist" Apr 17 16:23:39.971949 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.971873 2572 scope.go:117] "RemoveContainer" containerID="dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553" Apr 17 16:23:39.972108 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.972091 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553"} err="failed to get container status \"dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553\": rpc error: code = NotFound desc = could not 
find container \"dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553\": container with ID starting with dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553 not found: ID does not exist" Apr 17 16:23:39.972155 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.972113 2572 scope.go:117] "RemoveContainer" containerID="9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4" Apr 17 16:23:39.972155 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.972096 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:39.972398 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.972373 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4"} err="failed to get container status \"9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4\": rpc error: code = NotFound desc = could not find container \"9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4\": container with ID starting with 9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4 not found: ID does not exist" Apr 17 16:23:39.972471 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.972400 2572 scope.go:117] "RemoveContainer" containerID="da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1" Apr 17 16:23:39.972734 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.972712 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1"} err="failed to get container status \"da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1\": rpc error: code = NotFound desc = could not find container \"da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1\": container with ID starting with 
da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1 not found: ID does not exist" Apr 17 16:23:39.972798 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.972736 2572 scope.go:117] "RemoveContainer" containerID="896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06" Apr 17 16:23:39.972971 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.972951 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06"} err="failed to get container status \"896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06\": rpc error: code = NotFound desc = could not find container \"896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06\": container with ID starting with 896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06 not found: ID does not exist" Apr 17 16:23:39.973015 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.972974 2572 scope.go:117] "RemoveContainer" containerID="916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887" Apr 17 16:23:39.973238 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.973214 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887"} err="failed to get container status \"916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887\": rpc error: code = NotFound desc = could not find container \"916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887\": container with ID starting with 916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887 not found: ID does not exist" Apr 17 16:23:39.973277 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.973242 2572 scope.go:117] "RemoveContainer" containerID="9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80" Apr 17 16:23:39.973482 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.973465 2572 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80"} err="failed to get container status \"9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80\": rpc error: code = NotFound desc = could not find container \"9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80\": container with ID starting with 9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80 not found: ID does not exist" Apr 17 16:23:39.973522 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.973483 2572 scope.go:117] "RemoveContainer" containerID="9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f" Apr 17 16:23:39.973700 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.973679 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f"} err="failed to get container status \"9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f\": rpc error: code = NotFound desc = could not find container \"9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f\": container with ID starting with 9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f not found: ID does not exist" Apr 17 16:23:39.973740 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.973702 2572 scope.go:117] "RemoveContainer" containerID="dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553" Apr 17 16:23:39.973931 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.973909 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553"} err="failed to get container status \"dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553\": rpc error: code = NotFound desc = could not find container 
\"dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553\": container with ID starting with dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553 not found: ID does not exist" Apr 17 16:23:39.973970 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.973933 2572 scope.go:117] "RemoveContainer" containerID="9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4" Apr 17 16:23:39.974212 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.974188 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4"} err="failed to get container status \"9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4\": rpc error: code = NotFound desc = could not find container \"9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4\": container with ID starting with 9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4 not found: ID does not exist" Apr 17 16:23:39.974268 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.974214 2572 scope.go:117] "RemoveContainer" containerID="da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1" Apr 17 16:23:39.974427 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.974410 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1"} err="failed to get container status \"da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1\": rpc error: code = NotFound desc = could not find container \"da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1\": container with ID starting with da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1 not found: ID does not exist" Apr 17 16:23:39.974427 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.974426 2572 scope.go:117] "RemoveContainer" 
containerID="896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06" Apr 17 16:23:39.974666 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.974640 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06"} err="failed to get container status \"896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06\": rpc error: code = NotFound desc = could not find container \"896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06\": container with ID starting with 896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06 not found: ID does not exist" Apr 17 16:23:39.974746 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.974669 2572 scope.go:117] "RemoveContainer" containerID="916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887" Apr 17 16:23:39.974936 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.974917 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887"} err="failed to get container status \"916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887\": rpc error: code = NotFound desc = could not find container \"916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887\": container with ID starting with 916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887 not found: ID does not exist" Apr 17 16:23:39.974997 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.974936 2572 scope.go:117] "RemoveContainer" containerID="9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80" Apr 17 16:23:39.974997 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.974984 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 16:23:39.975203 ip-10-0-132-179 kubenswrapper[2572]: I0417 
16:23:39.975184 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80"} err="failed to get container status \"9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80\": rpc error: code = NotFound desc = could not find container \"9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80\": container with ID starting with 9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80 not found: ID does not exist" Apr 17 16:23:39.975203 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.975203 2572 scope.go:117] "RemoveContainer" containerID="9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f" Apr 17 16:23:39.975420 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.975323 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 16:23:39.975420 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.975335 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 16:23:39.975420 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.975335 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 16:23:39.975420 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.975327 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 16:23:39.975420 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.975327 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-fmvcn\"" Apr 17 16:23:39.975660 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.975513 2572 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f"} err="failed to get container status \"9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f\": rpc error: code = NotFound desc = could not find container \"9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f\": container with ID starting with 9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f not found: ID does not exist" Apr 17 16:23:39.975660 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.975535 2572 scope.go:117] "RemoveContainer" containerID="dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553" Apr 17 16:23:39.975660 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.975647 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 16:23:39.975807 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.975649 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 16:23:39.975807 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.975685 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 16:23:39.975807 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.975754 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 16:23:39.975915 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.975860 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553"} err="failed to get container status \"dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553\": rpc error: code = NotFound desc = could not find container 
\"dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553\": container with ID starting with dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553 not found: ID does not exist" Apr 17 16:23:39.975915 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.975882 2572 scope.go:117] "RemoveContainer" containerID="9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4" Apr 17 16:23:39.975915 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.975886 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-6sd4bp4t45j9\"" Apr 17 16:23:39.976144 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.976124 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4"} err="failed to get container status \"9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4\": rpc error: code = NotFound desc = could not find container \"9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4\": container with ID starting with 9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4 not found: ID does not exist" Apr 17 16:23:39.976144 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.976143 2572 scope.go:117] "RemoveContainer" containerID="da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1" Apr 17 16:23:39.976429 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.976394 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1"} err="failed to get container status \"da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1\": rpc error: code = NotFound desc = could not find container \"da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1\": container with ID starting with 
da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1 not found: ID does not exist" Apr 17 16:23:39.976429 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.976421 2572 scope.go:117] "RemoveContainer" containerID="896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06" Apr 17 16:23:39.976533 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.976406 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 16:23:39.976533 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.976409 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 16:23:39.976731 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.976713 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06"} err="failed to get container status \"896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06\": rpc error: code = NotFound desc = could not find container \"896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06\": container with ID starting with 896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06 not found: ID does not exist" Apr 17 16:23:39.976767 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.976733 2572 scope.go:117] "RemoveContainer" containerID="916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887" Apr 17 16:23:39.976994 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.976964 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887"} err="failed to get container status \"916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887\": rpc error: code = NotFound desc = could not find container \"916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887\": 
container with ID starting with 916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887 not found: ID does not exist" Apr 17 16:23:39.977060 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.976994 2572 scope.go:117] "RemoveContainer" containerID="9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80" Apr 17 16:23:39.977301 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.977277 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80"} err="failed to get container status \"9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80\": rpc error: code = NotFound desc = could not find container \"9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80\": container with ID starting with 9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80 not found: ID does not exist" Apr 17 16:23:39.977301 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.977301 2572 scope.go:117] "RemoveContainer" containerID="9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f" Apr 17 16:23:39.977592 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.977554 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f"} err="failed to get container status \"9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f\": rpc error: code = NotFound desc = could not find container \"9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f\": container with ID starting with 9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f not found: ID does not exist" Apr 17 16:23:39.977665 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.977593 2572 scope.go:117] "RemoveContainer" containerID="dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553" Apr 17 16:23:39.977827 ip-10-0-132-179 
kubenswrapper[2572]: I0417 16:23:39.977810 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553"} err="failed to get container status \"dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553\": rpc error: code = NotFound desc = could not find container \"dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553\": container with ID starting with dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553 not found: ID does not exist" Apr 17 16:23:39.977897 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.977827 2572 scope.go:117] "RemoveContainer" containerID="9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4" Apr 17 16:23:39.978043 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.978018 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4"} err="failed to get container status \"9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4\": rpc error: code = NotFound desc = could not find container \"9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4\": container with ID starting with 9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4 not found: ID does not exist" Apr 17 16:23:39.978149 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.978041 2572 scope.go:117] "RemoveContainer" containerID="da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1" Apr 17 16:23:39.978306 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.978286 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1"} err="failed to get container status \"da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1\": rpc error: code = NotFound desc = could not find container 
\"da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1\": container with ID starting with da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1 not found: ID does not exist" Apr 17 16:23:39.978356 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.978307 2572 scope.go:117] "RemoveContainer" containerID="896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06" Apr 17 16:23:39.978624 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.978600 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06"} err="failed to get container status \"896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06\": rpc error: code = NotFound desc = could not find container \"896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06\": container with ID starting with 896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06 not found: ID does not exist" Apr 17 16:23:39.978752 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.978626 2572 scope.go:117] "RemoveContainer" containerID="916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887" Apr 17 16:23:39.978949 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.978920 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887"} err="failed to get container status \"916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887\": rpc error: code = NotFound desc = could not find container \"916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887\": container with ID starting with 916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887 not found: ID does not exist" Apr 17 16:23:39.978949 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.978943 2572 scope.go:117] "RemoveContainer" 
containerID="9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80" Apr 17 16:23:39.979330 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.979310 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80"} err="failed to get container status \"9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80\": rpc error: code = NotFound desc = could not find container \"9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80\": container with ID starting with 9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80 not found: ID does not exist" Apr 17 16:23:39.979330 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.979330 2572 scope.go:117] "RemoveContainer" containerID="9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f" Apr 17 16:23:39.979487 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.979392 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 16:23:39.979622 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.979604 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f"} err="failed to get container status \"9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f\": rpc error: code = NotFound desc = could not find container \"9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f\": container with ID starting with 9d957669d10cf2c46da76f67e05dc3e6ced9b271baf6ca825657496fa663085f not found: ID does not exist" Apr 17 16:23:39.979705 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.979623 2572 scope.go:117] "RemoveContainer" containerID="dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553" Apr 17 16:23:39.979840 ip-10-0-132-179 kubenswrapper[2572]: I0417 
16:23:39.979825 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553"} err="failed to get container status \"dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553\": rpc error: code = NotFound desc = could not find container \"dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553\": container with ID starting with dc82d020bee7cd2e0031804f6c7430ac3d652f8cebf7fd2b2224d4aed89b5553 not found: ID does not exist" Apr 17 16:23:39.979902 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.979841 2572 scope.go:117] "RemoveContainer" containerID="9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4" Apr 17 16:23:39.980048 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.980023 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4"} err="failed to get container status \"9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4\": rpc error: code = NotFound desc = could not find container \"9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4\": container with ID starting with 9be0ee5c9a6c4620b1d687c300fb84fd3b0474e8fc0610a8ab2b8d4f5e1b08c4 not found: ID does not exist" Apr 17 16:23:39.980048 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.980048 2572 scope.go:117] "RemoveContainer" containerID="da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1" Apr 17 16:23:39.980547 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.980467 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1"} err="failed to get container status \"da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1\": rpc error: code = NotFound desc = could not find container 
\"da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1\": container with ID starting with da84cf9f75e6ad8da85da920cb65a31a0d833b8d0854bfdf4decea9c0f056ef1 not found: ID does not exist" Apr 17 16:23:39.980547 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.980493 2572 scope.go:117] "RemoveContainer" containerID="896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06" Apr 17 16:23:39.980767 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.980746 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06"} err="failed to get container status \"896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06\": rpc error: code = NotFound desc = could not find container \"896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06\": container with ID starting with 896cc7b85079b35f0f249852a6682fc452a619838988aedd6f2b36e3105e9d06 not found: ID does not exist" Apr 17 16:23:39.980840 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.980769 2572 scope.go:117] "RemoveContainer" containerID="916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887" Apr 17 16:23:39.980996 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.980979 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887"} err="failed to get container status \"916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887\": rpc error: code = NotFound desc = could not find container \"916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887\": container with ID starting with 916157435ba41ba2ac5bcb7c4016dab3b06997c73cdbe3464d119a0679fd9887 not found: ID does not exist" Apr 17 16:23:39.981036 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.980997 2572 scope.go:117] "RemoveContainer" 
containerID="9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80" Apr 17 16:23:39.981224 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.981209 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 16:23:39.981277 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.981226 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80"} err="failed to get container status \"9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80\": rpc error: code = NotFound desc = could not find container \"9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80\": container with ID starting with 9d8feaba9d49b7cd0a3ec64ccef2a5c7c387dac12a8fedcff0077862218f8d80 not found: ID does not exist" Apr 17 16:23:39.985224 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:39.985201 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:23:40.070122 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.070006 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f52106fd-1924-474a-99d3-a7af72e7a3ce-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.070122 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.070043 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.070122 
ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.070068 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.070122 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.070103 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f52106fd-1924-474a-99d3-a7af72e7a3ce-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.070122 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.070121 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f52106fd-1924-474a-99d3-a7af72e7a3ce-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.070411 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.070136 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.070411 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.070163 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rsgw\" (UniqueName: 
\"kubernetes.io/projected/f52106fd-1924-474a-99d3-a7af72e7a3ce-kube-api-access-2rsgw\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.070411 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.070225 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.070411 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.070251 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f52106fd-1924-474a-99d3-a7af72e7a3ce-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.070411 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.070280 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.070411 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.070307 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.070411 
ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.070325 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-config\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.070411 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.070339 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f52106fd-1924-474a-99d3-a7af72e7a3ce-config-out\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.070411 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.070357 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f52106fd-1924-474a-99d3-a7af72e7a3ce-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.070411 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.070382 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f52106fd-1924-474a-99d3-a7af72e7a3ce-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.070715 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.070416 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-web-config\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.070715 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.070449 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f52106fd-1924-474a-99d3-a7af72e7a3ce-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.070715 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.070492 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.171704 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.171670 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f52106fd-1924-474a-99d3-a7af72e7a3ce-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.171859 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.171720 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.171859 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.171760 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/f52106fd-1924-474a-99d3-a7af72e7a3ce-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.171859 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.171778 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.171859 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.171798 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.171859 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.171816 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f52106fd-1924-474a-99d3-a7af72e7a3ce-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.171859 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.171835 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f52106fd-1924-474a-99d3-a7af72e7a3ce-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.171859 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.171852 
2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.172280 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.171878 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2rsgw\" (UniqueName: \"kubernetes.io/projected/f52106fd-1924-474a-99d3-a7af72e7a3ce-kube-api-access-2rsgw\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.172652 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.172622 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f52106fd-1924-474a-99d3-a7af72e7a3ce-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.172758 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.172699 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f52106fd-1924-474a-99d3-a7af72e7a3ce-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.172813 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.172754 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.172813 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.172789 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f52106fd-1924-474a-99d3-a7af72e7a3ce-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.172915 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.172842 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.172915 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.172876 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.172915 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.172902 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-config\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.173065 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.172925 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f52106fd-1924-474a-99d3-a7af72e7a3ce-config-out\") pod \"prometheus-k8s-0\" (UID: 
\"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.173065 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.172950 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f52106fd-1924-474a-99d3-a7af72e7a3ce-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.173065 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.172978 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f52106fd-1924-474a-99d3-a7af72e7a3ce-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.173065 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.173025 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-web-config\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.174783 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.174664 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.175111 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.174956 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f52106fd-1924-474a-99d3-a7af72e7a3ce-tls-assets\") pod 
\"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.175111 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.174984 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.175111 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.175093 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.175402 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.175381 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.175871 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.175602 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f52106fd-1924-474a-99d3-a7af72e7a3ce-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.175953 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.175914 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/f52106fd-1924-474a-99d3-a7af72e7a3ce-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.176094 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.176056 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f52106fd-1924-474a-99d3-a7af72e7a3ce-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.176177 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.176156 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.176334 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.176309 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-web-config\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.176700 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.176677 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f52106fd-1924-474a-99d3-a7af72e7a3ce-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.177143 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.177118 2572 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-config\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.177620 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.177601 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f52106fd-1924-474a-99d3-a7af72e7a3ce-config-out\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.177845 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.177825 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.178227 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.178211 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f52106fd-1924-474a-99d3-a7af72e7a3ce-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.179788 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.179768 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rsgw\" (UniqueName: \"kubernetes.io/projected/f52106fd-1924-474a-99d3-a7af72e7a3ce-kube-api-access-2rsgw\") pod \"prometheus-k8s-0\" (UID: \"f52106fd-1924-474a-99d3-a7af72e7a3ce\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.282928 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.282881 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:40.429976 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.429944 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 16:23:40.434031 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:23:40.434003 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf52106fd_1924_474a_99d3_a7af72e7a3ce.slice/crio-6b3822f2671039c6e05c68d6e5214f8919c899a27a8b60002bed91d698f47394 WatchSource:0}: Error finding container 6b3822f2671039c6e05c68d6e5214f8919c899a27a8b60002bed91d698f47394: Status 404 returned error can't find the container with id 6b3822f2671039c6e05c68d6e5214f8919c899a27a8b60002bed91d698f47394 Apr 17 16:23:40.923696 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.923660 2572 generic.go:358] "Generic (PLEG): container finished" podID="f52106fd-1924-474a-99d3-a7af72e7a3ce" containerID="db747fa2b41fce9b0be02e2ff40b3421b768f4d5f9bd877b1564a9215071ca3f" exitCode=0 Apr 17 16:23:40.924212 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.923754 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f52106fd-1924-474a-99d3-a7af72e7a3ce","Type":"ContainerDied","Data":"db747fa2b41fce9b0be02e2ff40b3421b768f4d5f9bd877b1564a9215071ca3f"} Apr 17 16:23:40.924212 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:40.923797 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f52106fd-1924-474a-99d3-a7af72e7a3ce","Type":"ContainerStarted","Data":"6b3822f2671039c6e05c68d6e5214f8919c899a27a8b60002bed91d698f47394"} Apr 17 16:23:41.186532 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:41.186495 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0" 
path="/var/lib/kubelet/pods/ae9f1dc0-9ec1-476d-abd7-fa62aa50d9b0/volumes" Apr 17 16:23:41.930780 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:41.930746 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f52106fd-1924-474a-99d3-a7af72e7a3ce","Type":"ContainerStarted","Data":"06c88e6b2c6b72a275d8dffc804b853cb124643944baa8686039ee2acaf1cb45"} Apr 17 16:23:41.930780 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:41.930783 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f52106fd-1924-474a-99d3-a7af72e7a3ce","Type":"ContainerStarted","Data":"7b98c38231555da020f96e592539f244052c174addcb8577ce22448f4cd455a7"} Apr 17 16:23:41.931243 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:41.930793 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f52106fd-1924-474a-99d3-a7af72e7a3ce","Type":"ContainerStarted","Data":"2bfd67316d524c113412362243715270e42bc948b3e1165bc0699b8de99bf905"} Apr 17 16:23:41.931243 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:41.930803 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f52106fd-1924-474a-99d3-a7af72e7a3ce","Type":"ContainerStarted","Data":"469b126418ea609500e03a279e4f68ee7f4b62c5390b8da5005a80e9c77beae9"} Apr 17 16:23:41.931243 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:41.930811 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f52106fd-1924-474a-99d3-a7af72e7a3ce","Type":"ContainerStarted","Data":"fa4af9a0dd6ee7e295dc9a27ab4ee9e4d99dda274fbd64832c6a02840da455e4"} Apr 17 16:23:41.931243 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:41.930819 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"f52106fd-1924-474a-99d3-a7af72e7a3ce","Type":"ContainerStarted","Data":"8797ccac6961b269e8aad71539d26be72c4b9ccfb97044350604a8c4fe189555"} Apr 17 16:23:41.959501 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:41.959456 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.959442992 podStartE2EDuration="2.959442992s" podCreationTimestamp="2026-04-17 16:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:23:41.958171643 +0000 UTC m=+239.298659707" watchObservedRunningTime="2026-04-17 16:23:41.959442992 +0000 UTC m=+239.299931070" Apr 17 16:23:45.283822 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:45.283787 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:23:55.002824 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:55.002787 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs\") pod \"network-metrics-daemon-j89hr\" (UID: \"dbd283d5-ff0b-4c8f-b1be-15a75816e953\") " pod="openshift-multus/network-metrics-daemon-j89hr" Apr 17 16:23:55.005093 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:55.005051 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dbd283d5-ff0b-4c8f-b1be-15a75816e953-metrics-certs\") pod \"network-metrics-daemon-j89hr\" (UID: \"dbd283d5-ff0b-4c8f-b1be-15a75816e953\") " pod="openshift-multus/network-metrics-daemon-j89hr" Apr 17 16:23:55.079052 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:55.079014 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qr2mj\"" Apr 17 16:23:55.087163 
ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:55.087140 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-j89hr" Apr 17 16:23:55.219639 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:55.219614 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-j89hr"] Apr 17 16:23:55.222339 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:23:55.222309 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbd283d5_ff0b_4c8f_b1be_15a75816e953.slice/crio-e7451570d61bf77774dd4a27f6e1dfea887edebf1c7665a751060f2a1bc5f1b2 WatchSource:0}: Error finding container e7451570d61bf77774dd4a27f6e1dfea887edebf1c7665a751060f2a1bc5f1b2: Status 404 returned error can't find the container with id e7451570d61bf77774dd4a27f6e1dfea887edebf1c7665a751060f2a1bc5f1b2 Apr 17 16:23:55.981241 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:55.981195 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j89hr" event={"ID":"dbd283d5-ff0b-4c8f-b1be-15a75816e953","Type":"ContainerStarted","Data":"e7451570d61bf77774dd4a27f6e1dfea887edebf1c7665a751060f2a1bc5f1b2"} Apr 17 16:23:56.988985 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:56.988944 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j89hr" event={"ID":"dbd283d5-ff0b-4c8f-b1be-15a75816e953","Type":"ContainerStarted","Data":"296613f97dba79ae83f8b210bab4ca44ba9e218e8095328bf00e3fc7dd557c40"} Apr 17 16:23:56.988985 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:23:56.988990 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-j89hr" event={"ID":"dbd283d5-ff0b-4c8f-b1be-15a75816e953","Type":"ContainerStarted","Data":"7cb8c6d0d142f10c2e47dd0223d26ffc1448bba9aa5cf3eb8e407eaced3a55b6"} Apr 17 16:23:57.005969 ip-10-0-132-179 
kubenswrapper[2572]: I0417 16:23:57.005910 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-j89hr" podStartSLOduration=252.958184638 podStartE2EDuration="4m14.005891339s" podCreationTimestamp="2026-04-17 16:19:43 +0000 UTC" firstStartedPulling="2026-04-17 16:23:55.224406592 +0000 UTC m=+252.564894664" lastFinishedPulling="2026-04-17 16:23:56.272113308 +0000 UTC m=+253.612601365" observedRunningTime="2026-04-17 16:23:57.003630877 +0000 UTC m=+254.344118956" watchObservedRunningTime="2026-04-17 16:23:57.005891339 +0000 UTC m=+254.346379424" Apr 17 16:24:40.283245 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:24:40.283206 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:24:40.298520 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:24:40.298496 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:24:41.137256 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:24:41.137225 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 16:24:43.059650 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:24:43.059624 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ljhb6_72bbd866-8c40-48e3-9eb3-b34ae76679de/console-operator/2.log" Apr 17 16:24:43.060116 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:24:43.059916 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ljhb6_72bbd866-8c40-48e3-9eb3-b34ae76679de/console-operator/2.log" Apr 17 16:24:43.069962 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:24:43.069939 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 16:25:42.096155 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:42.096047 2572 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-7bf4f445d7-w9z9l"] Apr 17 16:25:42.099376 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:42.099354 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-w9z9l" Apr 17 16:25:42.103229 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:42.103200 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 17 16:25:42.103433 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:42.103414 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-gzmb9\"" Apr 17 16:25:42.103556 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:42.103436 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 16:25:42.103556 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:42.103517 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 17 16:25:42.103668 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:42.103652 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 17 16:25:42.103724 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:42.103713 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 16:25:42.108897 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:42.108875 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7bf4f445d7-w9z9l"] Apr 17 16:25:42.229697 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:42.229663 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fd7f7eed-6e93-49b8-8304-e883b3d51258-metrics-cert\") pod \"lws-controller-manager-7bf4f445d7-w9z9l\" (UID: \"fd7f7eed-6e93-49b8-8304-e883b3d51258\") " pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-w9z9l" Apr 17 16:25:42.229697 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:42.229702 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fd7f7eed-6e93-49b8-8304-e883b3d51258-manager-config\") pod \"lws-controller-manager-7bf4f445d7-w9z9l\" (UID: \"fd7f7eed-6e93-49b8-8304-e883b3d51258\") " pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-w9z9l" Apr 17 16:25:42.229955 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:42.229819 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd7f7eed-6e93-49b8-8304-e883b3d51258-cert\") pod \"lws-controller-manager-7bf4f445d7-w9z9l\" (UID: \"fd7f7eed-6e93-49b8-8304-e883b3d51258\") " pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-w9z9l" Apr 17 16:25:42.229955 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:42.229868 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47tx7\" (UniqueName: \"kubernetes.io/projected/fd7f7eed-6e93-49b8-8304-e883b3d51258-kube-api-access-47tx7\") pod \"lws-controller-manager-7bf4f445d7-w9z9l\" (UID: \"fd7f7eed-6e93-49b8-8304-e883b3d51258\") " pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-w9z9l" Apr 17 16:25:42.331163 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:42.331128 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd7f7eed-6e93-49b8-8304-e883b3d51258-cert\") pod \"lws-controller-manager-7bf4f445d7-w9z9l\" (UID: 
\"fd7f7eed-6e93-49b8-8304-e883b3d51258\") " pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-w9z9l" Apr 17 16:25:42.331163 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:42.331167 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47tx7\" (UniqueName: \"kubernetes.io/projected/fd7f7eed-6e93-49b8-8304-e883b3d51258-kube-api-access-47tx7\") pod \"lws-controller-manager-7bf4f445d7-w9z9l\" (UID: \"fd7f7eed-6e93-49b8-8304-e883b3d51258\") " pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-w9z9l" Apr 17 16:25:42.331366 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:42.331208 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fd7f7eed-6e93-49b8-8304-e883b3d51258-metrics-cert\") pod \"lws-controller-manager-7bf4f445d7-w9z9l\" (UID: \"fd7f7eed-6e93-49b8-8304-e883b3d51258\") " pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-w9z9l" Apr 17 16:25:42.331366 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:42.331231 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fd7f7eed-6e93-49b8-8304-e883b3d51258-manager-config\") pod \"lws-controller-manager-7bf4f445d7-w9z9l\" (UID: \"fd7f7eed-6e93-49b8-8304-e883b3d51258\") " pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-w9z9l" Apr 17 16:25:42.331838 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:42.331816 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/fd7f7eed-6e93-49b8-8304-e883b3d51258-manager-config\") pod \"lws-controller-manager-7bf4f445d7-w9z9l\" (UID: \"fd7f7eed-6e93-49b8-8304-e883b3d51258\") " pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-w9z9l" Apr 17 16:25:42.333677 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:42.333652 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd7f7eed-6e93-49b8-8304-e883b3d51258-cert\") pod \"lws-controller-manager-7bf4f445d7-w9z9l\" (UID: \"fd7f7eed-6e93-49b8-8304-e883b3d51258\") " pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-w9z9l" Apr 17 16:25:42.333677 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:42.333675 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/fd7f7eed-6e93-49b8-8304-e883b3d51258-metrics-cert\") pod \"lws-controller-manager-7bf4f445d7-w9z9l\" (UID: \"fd7f7eed-6e93-49b8-8304-e883b3d51258\") " pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-w9z9l" Apr 17 16:25:42.345603 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:42.345577 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47tx7\" (UniqueName: \"kubernetes.io/projected/fd7f7eed-6e93-49b8-8304-e883b3d51258-kube-api-access-47tx7\") pod \"lws-controller-manager-7bf4f445d7-w9z9l\" (UID: \"fd7f7eed-6e93-49b8-8304-e883b3d51258\") " pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-w9z9l" Apr 17 16:25:42.409307 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:42.409268 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-w9z9l" Apr 17 16:25:42.528028 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:42.527998 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7bf4f445d7-w9z9l"] Apr 17 16:25:42.533473 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:25:42.533048 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd7f7eed_6e93_49b8_8304_e883b3d51258.slice/crio-087cdf48f6786ee98d6359dfd2b6b02b1e1f62887d35abb6e14f980454e9584d WatchSource:0}: Error finding container 087cdf48f6786ee98d6359dfd2b6b02b1e1f62887d35abb6e14f980454e9584d: Status 404 returned error can't find the container with id 087cdf48f6786ee98d6359dfd2b6b02b1e1f62887d35abb6e14f980454e9584d Apr 17 16:25:42.534997 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:42.534978 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:25:43.312951 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:43.312900 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-w9z9l" event={"ID":"fd7f7eed-6e93-49b8-8304-e883b3d51258","Type":"ContainerStarted","Data":"087cdf48f6786ee98d6359dfd2b6b02b1e1f62887d35abb6e14f980454e9584d"} Apr 17 16:25:45.320268 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:45.320185 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-w9z9l" event={"ID":"fd7f7eed-6e93-49b8-8304-e883b3d51258","Type":"ContainerStarted","Data":"925d51654bbdb484088016f7b59c1df6cfee95a96e5256e1c2d14f7a1e3632ac"} Apr 17 16:25:45.320641 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:45.320415 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-w9z9l" Apr 17 
16:25:45.338650 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:45.338601 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-w9z9l" podStartSLOduration=0.828734197 podStartE2EDuration="3.338587488s" podCreationTimestamp="2026-04-17 16:25:42 +0000 UTC" firstStartedPulling="2026-04-17 16:25:42.535164157 +0000 UTC m=+359.875652213" lastFinishedPulling="2026-04-17 16:25:45.045017444 +0000 UTC m=+362.385505504" observedRunningTime="2026-04-17 16:25:45.337394803 +0000 UTC m=+362.677882884" watchObservedRunningTime="2026-04-17 16:25:45.338587488 +0000 UTC m=+362.679075566" Apr 17 16:25:56.325337 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:25:56.325301 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-7bf4f445d7-w9z9l" Apr 17 16:26:11.686214 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:11.686170 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-54994d49cf-jz6l5"] Apr 17 16:26:11.690138 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:11.690104 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-jz6l5" Apr 17 16:26:11.693791 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:11.693762 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-jlzxq\"" Apr 17 16:26:11.694051 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:11.693956 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 16:26:11.694265 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:11.694202 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 16:26:11.694349 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:11.694110 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 16:26:11.698901 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:11.698883 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 16:26:11.703531 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:11.703499 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-54994d49cf-jz6l5"] Apr 17 16:26:11.779167 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:11.779130 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca5c3e8c-559a-4015-8f20-7a94b4500850-webhook-cert\") pod \"opendatahub-operator-controller-manager-54994d49cf-jz6l5\" (UID: \"ca5c3e8c-559a-4015-8f20-7a94b4500850\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-jz6l5" Apr 17 16:26:11.779167 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:11.779170 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca5c3e8c-559a-4015-8f20-7a94b4500850-apiservice-cert\") pod \"opendatahub-operator-controller-manager-54994d49cf-jz6l5\" (UID: \"ca5c3e8c-559a-4015-8f20-7a94b4500850\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-jz6l5" Apr 17 16:26:11.779425 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:11.779324 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f4fg\" (UniqueName: \"kubernetes.io/projected/ca5c3e8c-559a-4015-8f20-7a94b4500850-kube-api-access-9f4fg\") pod \"opendatahub-operator-controller-manager-54994d49cf-jz6l5\" (UID: \"ca5c3e8c-559a-4015-8f20-7a94b4500850\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-jz6l5" Apr 17 16:26:11.880192 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:11.880153 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca5c3e8c-559a-4015-8f20-7a94b4500850-webhook-cert\") pod \"opendatahub-operator-controller-manager-54994d49cf-jz6l5\" (UID: \"ca5c3e8c-559a-4015-8f20-7a94b4500850\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-jz6l5" Apr 17 16:26:11.880192 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:11.880195 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca5c3e8c-559a-4015-8f20-7a94b4500850-apiservice-cert\") pod \"opendatahub-operator-controller-manager-54994d49cf-jz6l5\" (UID: \"ca5c3e8c-559a-4015-8f20-7a94b4500850\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-jz6l5" Apr 17 16:26:11.880400 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:11.880271 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9f4fg\" (UniqueName: \"kubernetes.io/projected/ca5c3e8c-559a-4015-8f20-7a94b4500850-kube-api-access-9f4fg\") pod \"opendatahub-operator-controller-manager-54994d49cf-jz6l5\" (UID: \"ca5c3e8c-559a-4015-8f20-7a94b4500850\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-jz6l5" Apr 17 16:26:11.882670 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:11.882642 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca5c3e8c-559a-4015-8f20-7a94b4500850-webhook-cert\") pod \"opendatahub-operator-controller-manager-54994d49cf-jz6l5\" (UID: \"ca5c3e8c-559a-4015-8f20-7a94b4500850\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-jz6l5" Apr 17 16:26:11.882784 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:11.882651 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca5c3e8c-559a-4015-8f20-7a94b4500850-apiservice-cert\") pod \"opendatahub-operator-controller-manager-54994d49cf-jz6l5\" (UID: \"ca5c3e8c-559a-4015-8f20-7a94b4500850\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-jz6l5" Apr 17 16:26:11.889462 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:11.889434 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f4fg\" (UniqueName: \"kubernetes.io/projected/ca5c3e8c-559a-4015-8f20-7a94b4500850-kube-api-access-9f4fg\") pod \"opendatahub-operator-controller-manager-54994d49cf-jz6l5\" (UID: \"ca5c3e8c-559a-4015-8f20-7a94b4500850\") " pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-jz6l5" Apr 17 16:26:12.003688 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:12.003601 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-jz6l5" Apr 17 16:26:12.125499 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:12.125469 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-54994d49cf-jz6l5"] Apr 17 16:26:12.128270 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:26:12.128241 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca5c3e8c_559a_4015_8f20_7a94b4500850.slice/crio-55f20d976c31fdd19cab07c842220eb84c375c676f533ff2876689d1da3829ce WatchSource:0}: Error finding container 55f20d976c31fdd19cab07c842220eb84c375c676f533ff2876689d1da3829ce: Status 404 returned error can't find the container with id 55f20d976c31fdd19cab07c842220eb84c375c676f533ff2876689d1da3829ce Apr 17 16:26:12.396519 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:12.396477 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-jz6l5" event={"ID":"ca5c3e8c-559a-4015-8f20-7a94b4500850","Type":"ContainerStarted","Data":"55f20d976c31fdd19cab07c842220eb84c375c676f533ff2876689d1da3829ce"} Apr 17 16:26:15.408281 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:15.408246 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-jz6l5" event={"ID":"ca5c3e8c-559a-4015-8f20-7a94b4500850","Type":"ContainerStarted","Data":"a7c265d05a6f824cc82ee4a3efe7250599537b17e214419a02c1b6940f5c20a6"} Apr 17 16:26:15.408674 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:15.408388 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-jz6l5" Apr 17 16:26:15.434004 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:15.433952 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-jz6l5" podStartSLOduration=1.899130628 podStartE2EDuration="4.433939368s" podCreationTimestamp="2026-04-17 16:26:11 +0000 UTC" firstStartedPulling="2026-04-17 16:26:12.130056376 +0000 UTC m=+389.470544433" lastFinishedPulling="2026-04-17 16:26:14.664865114 +0000 UTC m=+392.005353173" observedRunningTime="2026-04-17 16:26:15.431620515 +0000 UTC m=+392.772108595" watchObservedRunningTime="2026-04-17 16:26:15.433939368 +0000 UTC m=+392.774427448" Apr 17 16:26:26.413738 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:26.413705 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-54994d49cf-jz6l5" Apr 17 16:26:29.611614 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:29.611579 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-667bf5bb7-xs847"] Apr 17 16:26:29.615284 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:29.615259 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-667bf5bb7-xs847" Apr 17 16:26:29.618144 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:29.618124 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 17 16:26:29.619296 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:29.619276 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-c5cck\"" Apr 17 16:26:29.619409 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:29.619286 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 17 16:26:29.623657 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:29.623634 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-667bf5bb7-xs847"] Apr 17 16:26:29.733919 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:29.733880 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2c42c1d3-83ba-4fe7-89d0-e86749bbaab6-tmp\") pod \"kube-auth-proxy-667bf5bb7-xs847\" (UID: \"2c42c1d3-83ba-4fe7-89d0-e86749bbaab6\") " pod="openshift-ingress/kube-auth-proxy-667bf5bb7-xs847" Apr 17 16:26:29.733919 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:29.733924 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c42c1d3-83ba-4fe7-89d0-e86749bbaab6-tls-certs\") pod \"kube-auth-proxy-667bf5bb7-xs847\" (UID: \"2c42c1d3-83ba-4fe7-89d0-e86749bbaab6\") " pod="openshift-ingress/kube-auth-proxy-667bf5bb7-xs847" Apr 17 16:26:29.734161 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:29.734013 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbw8r\" (UniqueName: 
\"kubernetes.io/projected/2c42c1d3-83ba-4fe7-89d0-e86749bbaab6-kube-api-access-wbw8r\") pod \"kube-auth-proxy-667bf5bb7-xs847\" (UID: \"2c42c1d3-83ba-4fe7-89d0-e86749bbaab6\") " pod="openshift-ingress/kube-auth-proxy-667bf5bb7-xs847" Apr 17 16:26:29.835396 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:29.835364 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbw8r\" (UniqueName: \"kubernetes.io/projected/2c42c1d3-83ba-4fe7-89d0-e86749bbaab6-kube-api-access-wbw8r\") pod \"kube-auth-proxy-667bf5bb7-xs847\" (UID: \"2c42c1d3-83ba-4fe7-89d0-e86749bbaab6\") " pod="openshift-ingress/kube-auth-proxy-667bf5bb7-xs847" Apr 17 16:26:29.835566 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:29.835414 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2c42c1d3-83ba-4fe7-89d0-e86749bbaab6-tmp\") pod \"kube-auth-proxy-667bf5bb7-xs847\" (UID: \"2c42c1d3-83ba-4fe7-89d0-e86749bbaab6\") " pod="openshift-ingress/kube-auth-proxy-667bf5bb7-xs847" Apr 17 16:26:29.835566 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:29.835448 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c42c1d3-83ba-4fe7-89d0-e86749bbaab6-tls-certs\") pod \"kube-auth-proxy-667bf5bb7-xs847\" (UID: \"2c42c1d3-83ba-4fe7-89d0-e86749bbaab6\") " pod="openshift-ingress/kube-auth-proxy-667bf5bb7-xs847" Apr 17 16:26:29.837922 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:29.837896 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2c42c1d3-83ba-4fe7-89d0-e86749bbaab6-tmp\") pod \"kube-auth-proxy-667bf5bb7-xs847\" (UID: \"2c42c1d3-83ba-4fe7-89d0-e86749bbaab6\") " pod="openshift-ingress/kube-auth-proxy-667bf5bb7-xs847" Apr 17 16:26:29.838082 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:29.838057 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2c42c1d3-83ba-4fe7-89d0-e86749bbaab6-tls-certs\") pod \"kube-auth-proxy-667bf5bb7-xs847\" (UID: \"2c42c1d3-83ba-4fe7-89d0-e86749bbaab6\") " pod="openshift-ingress/kube-auth-proxy-667bf5bb7-xs847" Apr 17 16:26:29.847817 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:29.847787 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbw8r\" (UniqueName: \"kubernetes.io/projected/2c42c1d3-83ba-4fe7-89d0-e86749bbaab6-kube-api-access-wbw8r\") pod \"kube-auth-proxy-667bf5bb7-xs847\" (UID: \"2c42c1d3-83ba-4fe7-89d0-e86749bbaab6\") " pod="openshift-ingress/kube-auth-proxy-667bf5bb7-xs847" Apr 17 16:26:29.926599 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:29.926567 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-667bf5bb7-xs847" Apr 17 16:26:30.046448 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:30.046419 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-667bf5bb7-xs847"] Apr 17 16:26:30.049180 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:26:30.049151 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c42c1d3_83ba_4fe7_89d0_e86749bbaab6.slice/crio-1621400a48d4945b359ccdc25ae380dd460673878f59ebe3048d7c8d444fc4d1 WatchSource:0}: Error finding container 1621400a48d4945b359ccdc25ae380dd460673878f59ebe3048d7c8d444fc4d1: Status 404 returned error can't find the container with id 1621400a48d4945b359ccdc25ae380dd460673878f59ebe3048d7c8d444fc4d1 Apr 17 16:26:30.456771 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:30.456740 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-667bf5bb7-xs847" 
event={"ID":"2c42c1d3-83ba-4fe7-89d0-e86749bbaab6","Type":"ContainerStarted","Data":"1621400a48d4945b359ccdc25ae380dd460673878f59ebe3048d7c8d444fc4d1"} Apr 17 16:26:33.469512 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:33.469477 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-667bf5bb7-xs847" event={"ID":"2c42c1d3-83ba-4fe7-89d0-e86749bbaab6","Type":"ContainerStarted","Data":"2a293fa2bc53fe527d4479c11b9f52aa17282612f7954bf1862ca2a9bda611de"} Apr 17 16:26:33.485676 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:26:33.485635 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-667bf5bb7-xs847" podStartSLOduration=1.140347308 podStartE2EDuration="4.485621982s" podCreationTimestamp="2026-04-17 16:26:29 +0000 UTC" firstStartedPulling="2026-04-17 16:26:30.050860411 +0000 UTC m=+407.391348471" lastFinishedPulling="2026-04-17 16:26:33.396135085 +0000 UTC m=+410.736623145" observedRunningTime="2026-04-17 16:26:33.484572903 +0000 UTC m=+410.825060982" watchObservedRunningTime="2026-04-17 16:26:33.485621982 +0000 UTC m=+410.826110061" Apr 17 16:28:11.561239 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:11.561204 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b9z8q"] Apr 17 16:28:11.564605 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:11.564587 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b9z8q" Apr 17 16:28:11.567655 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:11.567629 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 17 16:28:11.567841 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:11.567826 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 16:28:11.567980 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:11.567964 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 16:28:11.568954 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:11.568937 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 17 16:28:11.569009 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:11.568960 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-vpm4x\"" Apr 17 16:28:11.574048 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:11.574027 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b9z8q"] Apr 17 16:28:11.708744 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:11.708707 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7026a0c5-2913-4045-93bc-d4dbca24d13e-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-b9z8q\" (UID: \"7026a0c5-2913-4045-93bc-d4dbca24d13e\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b9z8q" Apr 17 16:28:11.708927 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:11.708758 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-95fm6\" (UniqueName: \"kubernetes.io/projected/7026a0c5-2913-4045-93bc-d4dbca24d13e-kube-api-access-95fm6\") pod \"kuadrant-console-plugin-6cb54b5c86-b9z8q\" (UID: \"7026a0c5-2913-4045-93bc-d4dbca24d13e\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b9z8q" Apr 17 16:28:11.708927 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:11.708852 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7026a0c5-2913-4045-93bc-d4dbca24d13e-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-b9z8q\" (UID: \"7026a0c5-2913-4045-93bc-d4dbca24d13e\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b9z8q" Apr 17 16:28:11.810297 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:11.810258 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7026a0c5-2913-4045-93bc-d4dbca24d13e-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-b9z8q\" (UID: \"7026a0c5-2913-4045-93bc-d4dbca24d13e\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b9z8q" Apr 17 16:28:11.810504 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:11.810314 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95fm6\" (UniqueName: \"kubernetes.io/projected/7026a0c5-2913-4045-93bc-d4dbca24d13e-kube-api-access-95fm6\") pod \"kuadrant-console-plugin-6cb54b5c86-b9z8q\" (UID: \"7026a0c5-2913-4045-93bc-d4dbca24d13e\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b9z8q" Apr 17 16:28:11.810504 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:11.810345 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7026a0c5-2913-4045-93bc-d4dbca24d13e-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-b9z8q\" (UID: \"7026a0c5-2913-4045-93bc-d4dbca24d13e\") 
" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b9z8q" Apr 17 16:28:11.810998 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:11.810977 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7026a0c5-2913-4045-93bc-d4dbca24d13e-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-b9z8q\" (UID: \"7026a0c5-2913-4045-93bc-d4dbca24d13e\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b9z8q" Apr 17 16:28:11.812731 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:11.812680 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7026a0c5-2913-4045-93bc-d4dbca24d13e-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-b9z8q\" (UID: \"7026a0c5-2913-4045-93bc-d4dbca24d13e\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b9z8q" Apr 17 16:28:11.821526 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:11.821500 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-95fm6\" (UniqueName: \"kubernetes.io/projected/7026a0c5-2913-4045-93bc-d4dbca24d13e-kube-api-access-95fm6\") pod \"kuadrant-console-plugin-6cb54b5c86-b9z8q\" (UID: \"7026a0c5-2913-4045-93bc-d4dbca24d13e\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b9z8q" Apr 17 16:28:11.884500 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:11.884462 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b9z8q" Apr 17 16:28:12.009280 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:12.009256 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b9z8q"] Apr 17 16:28:12.012142 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:28:12.012112 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7026a0c5_2913_4045_93bc_d4dbca24d13e.slice/crio-b024dec8087267fb7526baec2f98f31062c7995f8026475d56d04bfad6b5e8fe WatchSource:0}: Error finding container b024dec8087267fb7526baec2f98f31062c7995f8026475d56d04bfad6b5e8fe: Status 404 returned error can't find the container with id b024dec8087267fb7526baec2f98f31062c7995f8026475d56d04bfad6b5e8fe Apr 17 16:28:12.786742 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:12.786700 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b9z8q" event={"ID":"7026a0c5-2913-4045-93bc-d4dbca24d13e","Type":"ContainerStarted","Data":"b024dec8087267fb7526baec2f98f31062c7995f8026475d56d04bfad6b5e8fe"} Apr 17 16:28:37.877767 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:37.877727 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b9z8q" event={"ID":"7026a0c5-2913-4045-93bc-d4dbca24d13e","Type":"ContainerStarted","Data":"e7ada7a2cb5bb1f0bac21ff04c1edc119ee43d113747dfcd59c519bdc3b4cec5"} Apr 17 16:28:37.894467 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:37.894417 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-b9z8q" podStartSLOduration=1.742120447 podStartE2EDuration="26.894401583s" podCreationTimestamp="2026-04-17 16:28:11 +0000 UTC" firstStartedPulling="2026-04-17 16:28:12.01350171 +0000 UTC m=+509.353989774" 
lastFinishedPulling="2026-04-17 16:28:37.165782848 +0000 UTC m=+534.506270910" observedRunningTime="2026-04-17 16:28:37.893850836 +0000 UTC m=+535.234338917" watchObservedRunningTime="2026-04-17 16:28:37.894401583 +0000 UTC m=+535.234889662" Apr 17 16:28:55.416914 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:55.416878 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:28:55.560267 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:55.560226 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:28:55.560267 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:55.560258 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:28:55.560523 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:55.560402 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-2h256" Apr 17 16:28:55.563225 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:55.563198 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 17 16:28:55.722043 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:55.721958 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4b2fe46d-0f76-4e12-bd63-8a431921db1f-config-file\") pod \"limitador-limitador-78c99df468-2h256\" (UID: \"4b2fe46d-0f76-4e12-bd63-8a431921db1f\") " pod="kuadrant-system/limitador-limitador-78c99df468-2h256" Apr 17 16:28:55.722043 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:55.722021 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkjp7\" (UniqueName: 
\"kubernetes.io/projected/4b2fe46d-0f76-4e12-bd63-8a431921db1f-kube-api-access-pkjp7\") pod \"limitador-limitador-78c99df468-2h256\" (UID: \"4b2fe46d-0f76-4e12-bd63-8a431921db1f\") " pod="kuadrant-system/limitador-limitador-78c99df468-2h256" Apr 17 16:28:55.823468 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:55.823439 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4b2fe46d-0f76-4e12-bd63-8a431921db1f-config-file\") pod \"limitador-limitador-78c99df468-2h256\" (UID: \"4b2fe46d-0f76-4e12-bd63-8a431921db1f\") " pod="kuadrant-system/limitador-limitador-78c99df468-2h256" Apr 17 16:28:55.823666 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:55.823501 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pkjp7\" (UniqueName: \"kubernetes.io/projected/4b2fe46d-0f76-4e12-bd63-8a431921db1f-kube-api-access-pkjp7\") pod \"limitador-limitador-78c99df468-2h256\" (UID: \"4b2fe46d-0f76-4e12-bd63-8a431921db1f\") " pod="kuadrant-system/limitador-limitador-78c99df468-2h256" Apr 17 16:28:55.824240 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:55.824215 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/4b2fe46d-0f76-4e12-bd63-8a431921db1f-config-file\") pod \"limitador-limitador-78c99df468-2h256\" (UID: \"4b2fe46d-0f76-4e12-bd63-8a431921db1f\") " pod="kuadrant-system/limitador-limitador-78c99df468-2h256" Apr 17 16:28:55.832890 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:55.832856 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkjp7\" (UniqueName: \"kubernetes.io/projected/4b2fe46d-0f76-4e12-bd63-8a431921db1f-kube-api-access-pkjp7\") pod \"limitador-limitador-78c99df468-2h256\" (UID: \"4b2fe46d-0f76-4e12-bd63-8a431921db1f\") " pod="kuadrant-system/limitador-limitador-78c99df468-2h256" Apr 17 16:28:55.870340 
ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:55.870312 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-2h256" Apr 17 16:28:55.994134 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:55.993944 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:28:55.996774 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:28:55.996744 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b2fe46d_0f76_4e12_bd63_8a431921db1f.slice/crio-5eb6e1e6ddb644592564b938d8064de39b8c7dd2d6034b8bbb49c10a528e29ae WatchSource:0}: Error finding container 5eb6e1e6ddb644592564b938d8064de39b8c7dd2d6034b8bbb49c10a528e29ae: Status 404 returned error can't find the container with id 5eb6e1e6ddb644592564b938d8064de39b8c7dd2d6034b8bbb49c10a528e29ae Apr 17 16:28:56.000439 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:56.000417 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-lpw45"] Apr 17 16:28:56.027435 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:56.027407 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-lpw45"] Apr 17 16:28:56.027569 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:56.027517 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-lpw45" Apr 17 16:28:56.030346 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:56.030325 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-q4stg\"" Apr 17 16:28:56.126740 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:56.126703 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96dfq\" (UniqueName: \"kubernetes.io/projected/b0283bdd-fa44-4075-820b-15aa2a131736-kube-api-access-96dfq\") pod \"authorino-7498df8756-lpw45\" (UID: \"b0283bdd-fa44-4075-820b-15aa2a131736\") " pod="kuadrant-system/authorino-7498df8756-lpw45" Apr 17 16:28:56.227663 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:56.227620 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96dfq\" (UniqueName: \"kubernetes.io/projected/b0283bdd-fa44-4075-820b-15aa2a131736-kube-api-access-96dfq\") pod \"authorino-7498df8756-lpw45\" (UID: \"b0283bdd-fa44-4075-820b-15aa2a131736\") " pod="kuadrant-system/authorino-7498df8756-lpw45" Apr 17 16:28:56.235628 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:56.235593 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96dfq\" (UniqueName: \"kubernetes.io/projected/b0283bdd-fa44-4075-820b-15aa2a131736-kube-api-access-96dfq\") pod \"authorino-7498df8756-lpw45\" (UID: \"b0283bdd-fa44-4075-820b-15aa2a131736\") " pod="kuadrant-system/authorino-7498df8756-lpw45" Apr 17 16:28:56.336491 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:56.336404 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-lpw45" Apr 17 16:28:56.458625 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:56.458446 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-lpw45"] Apr 17 16:28:56.461036 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:28:56.461013 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0283bdd_fa44_4075_820b_15aa2a131736.slice/crio-e9742344dd3a24d1f1e76414d7fc0ead21a93d01ebc71eb263769c5849bb9b30 WatchSource:0}: Error finding container e9742344dd3a24d1f1e76414d7fc0ead21a93d01ebc71eb263769c5849bb9b30: Status 404 returned error can't find the container with id e9742344dd3a24d1f1e76414d7fc0ead21a93d01ebc71eb263769c5849bb9b30 Apr 17 16:28:56.940716 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:56.940681 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-lpw45" event={"ID":"b0283bdd-fa44-4075-820b-15aa2a131736","Type":"ContainerStarted","Data":"e9742344dd3a24d1f1e76414d7fc0ead21a93d01ebc71eb263769c5849bb9b30"} Apr 17 16:28:56.941964 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:28:56.941933 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-2h256" event={"ID":"4b2fe46d-0f76-4e12-bd63-8a431921db1f","Type":"ContainerStarted","Data":"5eb6e1e6ddb644592564b938d8064de39b8c7dd2d6034b8bbb49c10a528e29ae"} Apr 17 16:29:01.961716 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:29:01.961674 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-lpw45" event={"ID":"b0283bdd-fa44-4075-820b-15aa2a131736","Type":"ContainerStarted","Data":"27638e077cc31316b5ef5a60635226017f33e299cf4071b31c43f09898ab1ee3"} Apr 17 16:29:01.963068 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:29:01.963042 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/limitador-limitador-78c99df468-2h256" event={"ID":"4b2fe46d-0f76-4e12-bd63-8a431921db1f","Type":"ContainerStarted","Data":"51a573435065de35a1951d41248ad15b1eab7eb5b59103112ec6da321ccfca69"} Apr 17 16:29:01.963183 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:29:01.963170 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-2h256" Apr 17 16:29:01.976801 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:29:01.976755 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-lpw45" podStartSLOduration=1.642294237 podStartE2EDuration="6.976742176s" podCreationTimestamp="2026-04-17 16:28:55 +0000 UTC" firstStartedPulling="2026-04-17 16:28:56.462445686 +0000 UTC m=+553.802933743" lastFinishedPulling="2026-04-17 16:29:01.796893624 +0000 UTC m=+559.137381682" observedRunningTime="2026-04-17 16:29:01.97580103 +0000 UTC m=+559.316289110" watchObservedRunningTime="2026-04-17 16:29:01.976742176 +0000 UTC m=+559.317230249" Apr 17 16:29:01.992949 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:29:01.992899 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-2h256" podStartSLOduration=1.143437386 podStartE2EDuration="6.99288644s" podCreationTimestamp="2026-04-17 16:28:55 +0000 UTC" firstStartedPulling="2026-04-17 16:28:55.999012137 +0000 UTC m=+553.339500193" lastFinishedPulling="2026-04-17 16:29:01.848461189 +0000 UTC m=+559.188949247" observedRunningTime="2026-04-17 16:29:01.991239309 +0000 UTC m=+559.331727429" watchObservedRunningTime="2026-04-17 16:29:01.99288644 +0000 UTC m=+559.333374518" Apr 17 16:29:12.969190 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:29:12.969161 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-2h256" Apr 17 16:29:30.381589 ip-10-0-132-179 
kubenswrapper[2572]: I0417 16:29:30.381557 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-lpw45"] Apr 17 16:29:30.381978 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:29:30.381769 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-lpw45" podUID="b0283bdd-fa44-4075-820b-15aa2a131736" containerName="authorino" containerID="cri-o://27638e077cc31316b5ef5a60635226017f33e299cf4071b31c43f09898ab1ee3" gracePeriod=30 Apr 17 16:29:30.626058 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:29:30.626028 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-lpw45" Apr 17 16:29:30.735093 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:29:30.734983 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96dfq\" (UniqueName: \"kubernetes.io/projected/b0283bdd-fa44-4075-820b-15aa2a131736-kube-api-access-96dfq\") pod \"b0283bdd-fa44-4075-820b-15aa2a131736\" (UID: \"b0283bdd-fa44-4075-820b-15aa2a131736\") " Apr 17 16:29:30.737185 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:29:30.737150 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0283bdd-fa44-4075-820b-15aa2a131736-kube-api-access-96dfq" (OuterVolumeSpecName: "kube-api-access-96dfq") pod "b0283bdd-fa44-4075-820b-15aa2a131736" (UID: "b0283bdd-fa44-4075-820b-15aa2a131736"). InnerVolumeSpecName "kube-api-access-96dfq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 16:29:30.836489 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:29:30.836454 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-96dfq\" (UniqueName: \"kubernetes.io/projected/b0283bdd-fa44-4075-820b-15aa2a131736-kube-api-access-96dfq\") on node \"ip-10-0-132-179.ec2.internal\" DevicePath \"\"" Apr 17 16:29:31.061514 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:29:31.061431 2572 generic.go:358] "Generic (PLEG): container finished" podID="b0283bdd-fa44-4075-820b-15aa2a131736" containerID="27638e077cc31316b5ef5a60635226017f33e299cf4071b31c43f09898ab1ee3" exitCode=0 Apr 17 16:29:31.061514 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:29:31.061473 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-lpw45" Apr 17 16:29:31.061720 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:29:31.061517 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-lpw45" event={"ID":"b0283bdd-fa44-4075-820b-15aa2a131736","Type":"ContainerDied","Data":"27638e077cc31316b5ef5a60635226017f33e299cf4071b31c43f09898ab1ee3"} Apr 17 16:29:31.061720 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:29:31.061553 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-lpw45" event={"ID":"b0283bdd-fa44-4075-820b-15aa2a131736","Type":"ContainerDied","Data":"e9742344dd3a24d1f1e76414d7fc0ead21a93d01ebc71eb263769c5849bb9b30"} Apr 17 16:29:31.061720 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:29:31.061569 2572 scope.go:117] "RemoveContainer" containerID="27638e077cc31316b5ef5a60635226017f33e299cf4071b31c43f09898ab1ee3" Apr 17 16:29:31.069956 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:29:31.069936 2572 scope.go:117] "RemoveContainer" containerID="27638e077cc31316b5ef5a60635226017f33e299cf4071b31c43f09898ab1ee3" Apr 17 16:29:31.070241 ip-10-0-132-179 kubenswrapper[2572]: 
E0417 16:29:31.070221 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27638e077cc31316b5ef5a60635226017f33e299cf4071b31c43f09898ab1ee3\": container with ID starting with 27638e077cc31316b5ef5a60635226017f33e299cf4071b31c43f09898ab1ee3 not found: ID does not exist" containerID="27638e077cc31316b5ef5a60635226017f33e299cf4071b31c43f09898ab1ee3" Apr 17 16:29:31.070297 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:29:31.070251 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27638e077cc31316b5ef5a60635226017f33e299cf4071b31c43f09898ab1ee3"} err="failed to get container status \"27638e077cc31316b5ef5a60635226017f33e299cf4071b31c43f09898ab1ee3\": rpc error: code = NotFound desc = could not find container \"27638e077cc31316b5ef5a60635226017f33e299cf4071b31c43f09898ab1ee3\": container with ID starting with 27638e077cc31316b5ef5a60635226017f33e299cf4071b31c43f09898ab1ee3 not found: ID does not exist" Apr 17 16:29:31.084264 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:29:31.084234 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-lpw45"] Apr 17 16:29:31.088228 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:29:31.088204 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-lpw45"] Apr 17 16:29:31.179120 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:29:31.179091 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0283bdd-fa44-4075-820b-15aa2a131736" path="/var/lib/kubelet/pods/b0283bdd-fa44-4075-820b-15aa2a131736/volumes" Apr 17 16:29:37.321833 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:29:37.321799 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:29:43.086662 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:29:43.086638 2572 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ljhb6_72bbd866-8c40-48e3-9eb3-b34ae76679de/console-operator/2.log" Apr 17 16:29:43.088717 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:29:43.088695 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ljhb6_72bbd866-8c40-48e3-9eb3-b34ae76679de/console-operator/2.log" Apr 17 16:30:16.139109 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:30:16.139059 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:30:19.183899 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:30:19.183869 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:30:29.079395 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:30:29.079356 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:30:33.580279 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:30:33.580244 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:30:50.987216 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:30:50.987182 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:31:18.769650 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:31:18.769610 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-678cbf4fcb-sh6r5"] Apr 17 16:31:18.770014 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:31:18.769972 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0283bdd-fa44-4075-820b-15aa2a131736" containerName="authorino" Apr 17 16:31:18.770014 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:31:18.769986 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0283bdd-fa44-4075-820b-15aa2a131736" 
containerName="authorino" Apr 17 16:31:18.770128 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:31:18.770088 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0283bdd-fa44-4075-820b-15aa2a131736" containerName="authorino" Apr 17 16:31:18.771920 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:31:18.771905 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-678cbf4fcb-sh6r5" Apr 17 16:31:18.775722 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:31:18.775700 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-q4stg\"" Apr 17 16:31:18.775846 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:31:18.775704 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 17 16:31:18.779757 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:31:18.779290 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-678cbf4fcb-sh6r5"] Apr 17 16:31:18.895574 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:31:18.895540 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g7lf\" (UniqueName: \"kubernetes.io/projected/074070a8-5b28-42ed-8363-d799ab319a72-kube-api-access-7g7lf\") pod \"authorino-678cbf4fcb-sh6r5\" (UID: \"074070a8-5b28-42ed-8363-d799ab319a72\") " pod="kuadrant-system/authorino-678cbf4fcb-sh6r5" Apr 17 16:31:18.895748 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:31:18.895601 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/074070a8-5b28-42ed-8363-d799ab319a72-tls-cert\") pod \"authorino-678cbf4fcb-sh6r5\" (UID: \"074070a8-5b28-42ed-8363-d799ab319a72\") " pod="kuadrant-system/authorino-678cbf4fcb-sh6r5" Apr 17 16:31:18.996763 ip-10-0-132-179 kubenswrapper[2572]: I0417 
16:31:18.996730 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7g7lf\" (UniqueName: \"kubernetes.io/projected/074070a8-5b28-42ed-8363-d799ab319a72-kube-api-access-7g7lf\") pod \"authorino-678cbf4fcb-sh6r5\" (UID: \"074070a8-5b28-42ed-8363-d799ab319a72\") " pod="kuadrant-system/authorino-678cbf4fcb-sh6r5" Apr 17 16:31:18.996932 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:31:18.996789 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/074070a8-5b28-42ed-8363-d799ab319a72-tls-cert\") pod \"authorino-678cbf4fcb-sh6r5\" (UID: \"074070a8-5b28-42ed-8363-d799ab319a72\") " pod="kuadrant-system/authorino-678cbf4fcb-sh6r5" Apr 17 16:31:18.999248 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:31:18.999228 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/074070a8-5b28-42ed-8363-d799ab319a72-tls-cert\") pod \"authorino-678cbf4fcb-sh6r5\" (UID: \"074070a8-5b28-42ed-8363-d799ab319a72\") " pod="kuadrant-system/authorino-678cbf4fcb-sh6r5" Apr 17 16:31:19.004058 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:31:19.004036 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g7lf\" (UniqueName: \"kubernetes.io/projected/074070a8-5b28-42ed-8363-d799ab319a72-kube-api-access-7g7lf\") pod \"authorino-678cbf4fcb-sh6r5\" (UID: \"074070a8-5b28-42ed-8363-d799ab319a72\") " pod="kuadrant-system/authorino-678cbf4fcb-sh6r5" Apr 17 16:31:19.082252 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:31:19.082166 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-678cbf4fcb-sh6r5" Apr 17 16:31:19.205118 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:31:19.205094 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-678cbf4fcb-sh6r5"] Apr 17 16:31:19.207264 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:31:19.207230 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod074070a8_5b28_42ed_8363_d799ab319a72.slice/crio-53824882671c069228b243e1f7f069ded061056ae18c9d885b24b7551dd96cab WatchSource:0}: Error finding container 53824882671c069228b243e1f7f069ded061056ae18c9d885b24b7551dd96cab: Status 404 returned error can't find the container with id 53824882671c069228b243e1f7f069ded061056ae18c9d885b24b7551dd96cab Apr 17 16:31:19.208475 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:31:19.208454 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:31:19.418998 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:31:19.418961 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-678cbf4fcb-sh6r5" event={"ID":"074070a8-5b28-42ed-8363-d799ab319a72","Type":"ContainerStarted","Data":"53824882671c069228b243e1f7f069ded061056ae18c9d885b24b7551dd96cab"} Apr 17 16:31:20.425441 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:31:20.425403 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-678cbf4fcb-sh6r5" event={"ID":"074070a8-5b28-42ed-8363-d799ab319a72","Type":"ContainerStarted","Data":"f56627013d1cecd8badff9029bb06e8dcb43131fdff03e34dd62a8e220c702e7"} Apr 17 16:31:20.446752 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:31:20.446689 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-678cbf4fcb-sh6r5" podStartSLOduration=1.764796002 podStartE2EDuration="2.446673368s" podCreationTimestamp="2026-04-17 16:31:18 +0000 
UTC" firstStartedPulling="2026-04-17 16:31:19.208612155 +0000 UTC m=+696.549100212" lastFinishedPulling="2026-04-17 16:31:19.890489513 +0000 UTC m=+697.230977578" observedRunningTime="2026-04-17 16:31:20.44538264 +0000 UTC m=+697.785870720" watchObservedRunningTime="2026-04-17 16:31:20.446673368 +0000 UTC m=+697.787161447" Apr 17 16:31:36.287358 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:31:36.287270 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:31:44.178038 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:31:44.178003 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:32:14.781869 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:32:14.781828 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:32:31.173128 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:32:31.173093 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:33:08.685050 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:33:08.684956 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:33:26.072459 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:33:26.072420 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:33:39.381942 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:33:39.381905 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:33:56.075873 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:33:56.075829 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:34:43.112286 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:34:43.112206 2572 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ljhb6_72bbd866-8c40-48e3-9eb3-b34ae76679de/console-operator/2.log" Apr 17 16:34:43.115626 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:34:43.115602 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ljhb6_72bbd866-8c40-48e3-9eb3-b34ae76679de/console-operator/2.log" Apr 17 16:34:47.484284 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:34:47.484241 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:34:56.878918 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:34:56.878885 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:35:13.675789 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:35:13.675756 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:35:21.880576 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:35:21.880540 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:35:38.576227 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:35:38.576194 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:35:46.877059 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:35:46.877023 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:36:20.079327 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:36:20.079236 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:36:28.074362 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:36:28.074323 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:36:36.480184 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:36:36.480150 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:36:45.080205 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:36:45.080169 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:36:53.376492 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:36:53.376456 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:37:10.279855 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:37:10.279819 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:37:21.082927 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:37:21.082893 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:38:08.681711 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:38:08.681678 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:38:16.279678 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:38:16.279643 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:38:25.583883 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:38:25.583842 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:38:33.671994 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:38:33.671957 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:38:42.582564 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:38:42.582523 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:38:50.680523 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:38:50.680485 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:39:00.789490 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:39:00.789458 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:39:08.377801 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:39:08.377706 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:39:18.280202 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:39:18.280163 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:39:26.482721 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:39:26.482678 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:39:35.677606 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:39:35.677572 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:39:43.138860 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:39:43.138832 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ljhb6_72bbd866-8c40-48e3-9eb3-b34ae76679de/console-operator/2.log" Apr 17 16:39:43.141533 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:39:43.141506 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ljhb6_72bbd866-8c40-48e3-9eb3-b34ae76679de/console-operator/2.log" Apr 17 16:39:44.089059 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:39:44.089023 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:39:52.780689 
ip-10-0-132-179 kubenswrapper[2572]: I0417 16:39:52.780656 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:40:00.780201 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:40:00.780167 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:40:09.778980 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:40:09.778938 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:40:18.073245 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:40:18.073204 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:40:27.081762 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:40:27.081723 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:40:35.281643 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:40:35.281607 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:41:45.775296 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:41:45.775264 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:41:49.777339 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:41:49.777299 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:41:59.773058 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:41:59.773017 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:42:04.580049 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:42:04.579961 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:42:30.486407 ip-10-0-132-179 
kubenswrapper[2572]: I0417 16:42:30.486370 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:43:13.979562 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:43:13.979520 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:43:22.181935 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:43:22.181899 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:43:30.978916 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:43:30.978878 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:43:34.784129 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:43:34.784023 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:43:38.776414 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:43:38.776379 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:43:48.185951 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:43:48.185912 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:43:56.681669 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:43:56.681637 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:44:04.622272 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:44:04.622227 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:44:12.584530 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:44:12.584492 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:44:21.275701 ip-10-0-132-179 kubenswrapper[2572]: I0417 
16:44:21.275657 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:44:29.581173 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:44:29.581133 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:44:38.382909 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:44:38.382874 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:44:43.162546 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:44:43.162518 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ljhb6_72bbd866-8c40-48e3-9eb3-b34ae76679de/console-operator/2.log" Apr 17 16:44:43.165996 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:44:43.165972 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ljhb6_72bbd866-8c40-48e3-9eb3-b34ae76679de/console-operator/2.log" Apr 17 16:44:46.183463 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:44:46.183427 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:45:03.572704 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:45:03.572625 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:45:12.685098 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:45:12.685036 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:45:21.688500 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:45:21.688466 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:45:29.178978 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:45:29.178945 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:45:46.490319 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:45:46.490274 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:45:54.682054 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:45:54.682021 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:46:03.574962 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:46:03.574921 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:46:11.878345 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:46:11.878308 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:46:21.089104 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:46:21.089055 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:46:29.381865 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:46:29.381830 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:46:39.481488 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:46:39.481403 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:46:55.182714 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:46:55.182683 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:47:03.987689 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:47:03.987653 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:47:21.179456 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:47:21.179428 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:47:31.274936 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:47:31.274897 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:47:38.088445 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:47:38.088406 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:47:46.172944 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:47:46.172909 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:47:53.272540 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:47:53.272505 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:48:09.685757 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:48:09.685669 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:48:18.991209 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:48:18.991174 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:48:28.099373 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:48:28.099337 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:48:36.095525 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:48:36.095489 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:49:00.318608 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:00.318572 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:49:13.108383 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:13.108350 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-2h256"] Apr 17 16:49:14.252447 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:14.252405 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-678cbf4fcb-sh6r5_074070a8-5b28-42ed-8363-d799ab319a72/authorino/0.log" Apr 17 16:49:18.670563 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:18.670527 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-54994d49cf-jz6l5_ca5c3e8c-559a-4015-8f20-7a94b4500850/manager/0.log" Apr 17 16:49:20.103615 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:20.103562 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-678cbf4fcb-sh6r5_074070a8-5b28-42ed-8363-d799ab319a72/authorino/0.log" Apr 17 16:49:20.419380 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:20.419351 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-b9z8q_7026a0c5-2913-4045-93bc-d4dbca24d13e/kuadrant-console-plugin/0.log" Apr 17 16:49:20.771828 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:20.771748 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-2h256_4b2fe46d-0f76-4e12-bd63-8a431921db1f/limitador/0.log" Apr 17 16:49:21.532007 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:21.531973 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-667bf5bb7-xs847_2c42c1d3-83ba-4fe7-89d0-e86749bbaab6/kube-auth-proxy/0.log" Apr 17 16:49:21.754250 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:21.754172 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5948c7b7c8-rmrfc_b5767428-43d6-4cbe-9763-0731e126b82c/router/0.log" Apr 17 16:49:26.621323 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:26.621289 2572 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-98l5c/must-gather-wk6bp"] Apr 17 16:49:26.624873 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:26.624854 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-98l5c/must-gather-wk6bp" Apr 17 16:49:26.629283 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:26.629263 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-98l5c\"/\"default-dockercfg-j7zpj\"" Apr 17 16:49:26.629477 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:26.629460 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-98l5c\"/\"kube-root-ca.crt\"" Apr 17 16:49:26.629531 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:26.629460 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-98l5c\"/\"openshift-service-ca.crt\"" Apr 17 16:49:26.643217 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:26.643192 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-98l5c/must-gather-wk6bp"] Apr 17 16:49:26.714808 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:26.714774 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqgz5\" (UniqueName: \"kubernetes.io/projected/90369813-e438-4e7b-9f23-348b427ed82a-kube-api-access-hqgz5\") pod \"must-gather-wk6bp\" (UID: \"90369813-e438-4e7b-9f23-348b427ed82a\") " pod="openshift-must-gather-98l5c/must-gather-wk6bp" Apr 17 16:49:26.714967 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:26.714821 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/90369813-e438-4e7b-9f23-348b427ed82a-must-gather-output\") pod \"must-gather-wk6bp\" (UID: \"90369813-e438-4e7b-9f23-348b427ed82a\") " pod="openshift-must-gather-98l5c/must-gather-wk6bp" Apr 17 
16:49:26.815721 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:26.815677 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/90369813-e438-4e7b-9f23-348b427ed82a-must-gather-output\") pod \"must-gather-wk6bp\" (UID: \"90369813-e438-4e7b-9f23-348b427ed82a\") " pod="openshift-must-gather-98l5c/must-gather-wk6bp" Apr 17 16:49:26.815919 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:26.815794 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqgz5\" (UniqueName: \"kubernetes.io/projected/90369813-e438-4e7b-9f23-348b427ed82a-kube-api-access-hqgz5\") pod \"must-gather-wk6bp\" (UID: \"90369813-e438-4e7b-9f23-348b427ed82a\") " pod="openshift-must-gather-98l5c/must-gather-wk6bp" Apr 17 16:49:26.816120 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:26.816064 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/90369813-e438-4e7b-9f23-348b427ed82a-must-gather-output\") pod \"must-gather-wk6bp\" (UID: \"90369813-e438-4e7b-9f23-348b427ed82a\") " pod="openshift-must-gather-98l5c/must-gather-wk6bp" Apr 17 16:49:26.826724 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:26.826698 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqgz5\" (UniqueName: \"kubernetes.io/projected/90369813-e438-4e7b-9f23-348b427ed82a-kube-api-access-hqgz5\") pod \"must-gather-wk6bp\" (UID: \"90369813-e438-4e7b-9f23-348b427ed82a\") " pod="openshift-must-gather-98l5c/must-gather-wk6bp" Apr 17 16:49:26.933966 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:26.933931 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-98l5c/must-gather-wk6bp" Apr 17 16:49:27.058827 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:27.058799 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-98l5c/must-gather-wk6bp"] Apr 17 16:49:27.060871 ip-10-0-132-179 kubenswrapper[2572]: W0417 16:49:27.060826 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90369813_e438_4e7b_9f23_348b427ed82a.slice/crio-f0b25ffbc96bf08a19cf46053b86f6b4a0eec79ddf398b3078fad534ae494578 WatchSource:0}: Error finding container f0b25ffbc96bf08a19cf46053b86f6b4a0eec79ddf398b3078fad534ae494578: Status 404 returned error can't find the container with id f0b25ffbc96bf08a19cf46053b86f6b4a0eec79ddf398b3078fad534ae494578 Apr 17 16:49:27.062632 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:27.062613 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 16:49:27.111906 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:27.111870 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98l5c/must-gather-wk6bp" event={"ID":"90369813-e438-4e7b-9f23-348b427ed82a","Type":"ContainerStarted","Data":"f0b25ffbc96bf08a19cf46053b86f6b4a0eec79ddf398b3078fad534ae494578"} Apr 17 16:49:29.123661 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:29.122813 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98l5c/must-gather-wk6bp" event={"ID":"90369813-e438-4e7b-9f23-348b427ed82a","Type":"ContainerStarted","Data":"082db69f7f51b9a9018f2866b2aa36a5b17d913e38d1042ed849930b247c775a"} Apr 17 16:49:29.123661 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:29.122858 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98l5c/must-gather-wk6bp" 
event={"ID":"90369813-e438-4e7b-9f23-348b427ed82a","Type":"ContainerStarted","Data":"a132a1a05f9df58b26148a08f4f66346487313e843ba92cb071b92fc0b96d85f"} Apr 17 16:49:29.142002 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:29.141939 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-98l5c/must-gather-wk6bp" podStartSLOduration=2.10937663 podStartE2EDuration="3.141918447s" podCreationTimestamp="2026-04-17 16:49:26 +0000 UTC" firstStartedPulling="2026-04-17 16:49:27.062767692 +0000 UTC m=+1784.403255749" lastFinishedPulling="2026-04-17 16:49:28.095309506 +0000 UTC m=+1785.435797566" observedRunningTime="2026-04-17 16:49:29.14050901 +0000 UTC m=+1786.480997088" watchObservedRunningTime="2026-04-17 16:49:29.141918447 +0000 UTC m=+1786.482406527" Apr 17 16:49:29.733251 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:29.733219 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-dvwcd_a4442d85-8b8a-48bf-b06b-5d49262f2b07/global-pull-secret-syncer/0.log" Apr 17 16:49:29.872713 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:29.872679 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-v6lnz_a4d23535-e11c-4204-9246-5539245e51d9/konnectivity-agent/0.log" Apr 17 16:49:29.896299 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:29.896265 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-179.ec2.internal_bcd2509d09ecd17cbbfc28f51d1d45a5/haproxy/0.log" Apr 17 16:49:34.055596 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:34.055404 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-678cbf4fcb-sh6r5_074070a8-5b28-42ed-8363-d799ab319a72/authorino/0.log" Apr 17 16:49:34.131931 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:34.131893 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-b9z8q_7026a0c5-2913-4045-93bc-d4dbca24d13e/kuadrant-console-plugin/0.log" Apr 17 16:49:34.304309 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:34.304276 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-2h256_4b2fe46d-0f76-4e12-bd63-8a431921db1f/limitador/0.log" Apr 17 16:49:35.554533 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:35.554503 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7e3fe07d-1b95-427a-9bf4-df826918a7ec/alertmanager/0.log" Apr 17 16:49:35.574980 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:35.574953 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7e3fe07d-1b95-427a-9bf4-df826918a7ec/config-reloader/0.log" Apr 17 16:49:35.594154 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:35.594127 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7e3fe07d-1b95-427a-9bf4-df826918a7ec/kube-rbac-proxy-web/0.log" Apr 17 16:49:35.615690 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:35.615614 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7e3fe07d-1b95-427a-9bf4-df826918a7ec/kube-rbac-proxy/0.log" Apr 17 16:49:35.640985 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:35.640953 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7e3fe07d-1b95-427a-9bf4-df826918a7ec/kube-rbac-proxy-metric/0.log" Apr 17 16:49:35.665501 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:35.665473 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7e3fe07d-1b95-427a-9bf4-df826918a7ec/prom-label-proxy/0.log" Apr 17 16:49:35.697716 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:35.697683 2572 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7e3fe07d-1b95-427a-9bf4-df826918a7ec/init-config-reloader/0.log" Apr 17 16:49:35.772769 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:35.772676 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mjp46_1a8eccc3-52d0-4b42-af0a-5a5338b67200/kube-state-metrics/0.log" Apr 17 16:49:35.799888 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:35.799851 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mjp46_1a8eccc3-52d0-4b42-af0a-5a5338b67200/kube-rbac-proxy-main/0.log" Apr 17 16:49:35.824864 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:35.824833 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-mjp46_1a8eccc3-52d0-4b42-af0a-5a5338b67200/kube-rbac-proxy-self/0.log" Apr 17 16:49:35.911534 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:35.911496 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-zxnm6_b39db528-5360-480e-a54c-fb959515df7c/monitoring-plugin/0.log" Apr 17 16:49:35.944129 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:35.944097 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-72l89_1ca17119-f140-4bd9-9cc4-59cb1122e37e/node-exporter/0.log" Apr 17 16:49:35.963736 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:35.963615 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-72l89_1ca17119-f140-4bd9-9cc4-59cb1122e37e/kube-rbac-proxy/0.log" Apr 17 16:49:35.986615 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:35.986585 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-72l89_1ca17119-f140-4bd9-9cc4-59cb1122e37e/init-textfile/0.log" Apr 17 16:49:36.174825 ip-10-0-132-179 
kubenswrapper[2572]: I0417 16:49:36.174791 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2s6bc_bd957af0-a264-4665-92db-2be4172c6ef3/kube-rbac-proxy-main/0.log" Apr 17 16:49:36.200673 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:36.200636 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2s6bc_bd957af0-a264-4665-92db-2be4172c6ef3/kube-rbac-proxy-self/0.log" Apr 17 16:49:36.234212 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:36.234170 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-2s6bc_bd957af0-a264-4665-92db-2be4172c6ef3/openshift-state-metrics/0.log" Apr 17 16:49:36.273362 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:36.273329 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f52106fd-1924-474a-99d3-a7af72e7a3ce/prometheus/0.log" Apr 17 16:49:36.294857 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:36.294823 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f52106fd-1924-474a-99d3-a7af72e7a3ce/config-reloader/0.log" Apr 17 16:49:36.319837 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:36.319811 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f52106fd-1924-474a-99d3-a7af72e7a3ce/thanos-sidecar/0.log" Apr 17 16:49:36.339173 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:36.339133 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f52106fd-1924-474a-99d3-a7af72e7a3ce/kube-rbac-proxy-web/0.log" Apr 17 16:49:36.359451 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:36.359417 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f52106fd-1924-474a-99d3-a7af72e7a3ce/kube-rbac-proxy/0.log" 
Apr 17 16:49:36.379829 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:36.379798 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f52106fd-1924-474a-99d3-a7af72e7a3ce/kube-rbac-proxy-thanos/0.log" Apr 17 16:49:36.410707 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:36.410611 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f52106fd-1924-474a-99d3-a7af72e7a3ce/init-config-reloader/0.log" Apr 17 16:49:37.891432 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:37.891401 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-r7w7c_96267ecb-07cc-48af-88b4-f6e710234cfb/networking-console-plugin/0.log" Apr 17 16:49:38.484372 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:38.484330 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-98l5c/perf-node-gather-daemonset-cljnl"] Apr 17 16:49:38.490320 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:38.490288 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-cljnl" Apr 17 16:49:38.501250 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:38.501218 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-98l5c/perf-node-gather-daemonset-cljnl"] Apr 17 16:49:38.531537 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:38.531483 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ljhb6_72bbd866-8c40-48e3-9eb3-b34ae76679de/console-operator/2.log" Apr 17 16:49:38.537962 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:38.537932 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ljhb6_72bbd866-8c40-48e3-9eb3-b34ae76679de/console-operator/3.log" Apr 17 16:49:38.541710 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:38.541688 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/70346ba7-da03-4a12-b250-c99c2ebfc569-podres\") pod \"perf-node-gather-daemonset-cljnl\" (UID: \"70346ba7-da03-4a12-b250-c99c2ebfc569\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-cljnl" Apr 17 16:49:38.541710 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:38.541720 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/70346ba7-da03-4a12-b250-c99c2ebfc569-lib-modules\") pod \"perf-node-gather-daemonset-cljnl\" (UID: \"70346ba7-da03-4a12-b250-c99c2ebfc569\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-cljnl" Apr 17 16:49:38.541901 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:38.541744 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf2jk\" (UniqueName: 
\"kubernetes.io/projected/70346ba7-da03-4a12-b250-c99c2ebfc569-kube-api-access-qf2jk\") pod \"perf-node-gather-daemonset-cljnl\" (UID: \"70346ba7-da03-4a12-b250-c99c2ebfc569\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-cljnl" Apr 17 16:49:38.541901 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:38.541803 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/70346ba7-da03-4a12-b250-c99c2ebfc569-sys\") pod \"perf-node-gather-daemonset-cljnl\" (UID: \"70346ba7-da03-4a12-b250-c99c2ebfc569\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-cljnl" Apr 17 16:49:38.541901 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:38.541857 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/70346ba7-da03-4a12-b250-c99c2ebfc569-proc\") pod \"perf-node-gather-daemonset-cljnl\" (UID: \"70346ba7-da03-4a12-b250-c99c2ebfc569\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-cljnl" Apr 17 16:49:38.642653 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:38.642620 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/70346ba7-da03-4a12-b250-c99c2ebfc569-sys\") pod \"perf-node-gather-daemonset-cljnl\" (UID: \"70346ba7-da03-4a12-b250-c99c2ebfc569\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-cljnl" Apr 17 16:49:38.642844 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:38.642672 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/70346ba7-da03-4a12-b250-c99c2ebfc569-proc\") pod \"perf-node-gather-daemonset-cljnl\" (UID: \"70346ba7-da03-4a12-b250-c99c2ebfc569\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-cljnl" Apr 17 16:49:38.642844 ip-10-0-132-179 kubenswrapper[2572]: 
I0417 16:49:38.642734 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/70346ba7-da03-4a12-b250-c99c2ebfc569-sys\") pod \"perf-node-gather-daemonset-cljnl\" (UID: \"70346ba7-da03-4a12-b250-c99c2ebfc569\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-cljnl" Apr 17 16:49:38.642844 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:38.642765 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/70346ba7-da03-4a12-b250-c99c2ebfc569-proc\") pod \"perf-node-gather-daemonset-cljnl\" (UID: \"70346ba7-da03-4a12-b250-c99c2ebfc569\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-cljnl" Apr 17 16:49:38.642844 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:38.642766 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/70346ba7-da03-4a12-b250-c99c2ebfc569-podres\") pod \"perf-node-gather-daemonset-cljnl\" (UID: \"70346ba7-da03-4a12-b250-c99c2ebfc569\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-cljnl" Apr 17 16:49:38.642844 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:38.642826 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/70346ba7-da03-4a12-b250-c99c2ebfc569-lib-modules\") pod \"perf-node-gather-daemonset-cljnl\" (UID: \"70346ba7-da03-4a12-b250-c99c2ebfc569\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-cljnl" Apr 17 16:49:38.643064 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:38.642851 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qf2jk\" (UniqueName: \"kubernetes.io/projected/70346ba7-da03-4a12-b250-c99c2ebfc569-kube-api-access-qf2jk\") pod \"perf-node-gather-daemonset-cljnl\" (UID: \"70346ba7-da03-4a12-b250-c99c2ebfc569\") " 
pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-cljnl" Apr 17 16:49:38.643064 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:38.642932 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/70346ba7-da03-4a12-b250-c99c2ebfc569-podres\") pod \"perf-node-gather-daemonset-cljnl\" (UID: \"70346ba7-da03-4a12-b250-c99c2ebfc569\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-cljnl" Apr 17 16:49:38.643064 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:38.643009 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/70346ba7-da03-4a12-b250-c99c2ebfc569-lib-modules\") pod \"perf-node-gather-daemonset-cljnl\" (UID: \"70346ba7-da03-4a12-b250-c99c2ebfc569\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-cljnl" Apr 17 16:49:38.652055 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:38.652027 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf2jk\" (UniqueName: \"kubernetes.io/projected/70346ba7-da03-4a12-b250-c99c2ebfc569-kube-api-access-qf2jk\") pod \"perf-node-gather-daemonset-cljnl\" (UID: \"70346ba7-da03-4a12-b250-c99c2ebfc569\") " pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-cljnl" Apr 17 16:49:38.802643 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:38.802546 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-cljnl" Apr 17 16:49:38.964169 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:38.962802 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-98l5c/perf-node-gather-daemonset-cljnl"] Apr 17 16:49:39.169784 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:39.169739 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-cljnl" event={"ID":"70346ba7-da03-4a12-b250-c99c2ebfc569","Type":"ContainerStarted","Data":"0d77e2be1bfc23ebf174ac62c0f74bbe0ac24776e31e052d2d27e5ec446fabbd"} Apr 17 16:49:39.169978 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:39.169790 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-cljnl" event={"ID":"70346ba7-da03-4a12-b250-c99c2ebfc569","Type":"ContainerStarted","Data":"a8f4a88e2e6514ce44a7d9a5c5fd2be9f7494fd7fe40216bdfeab5a121820ae4"} Apr 17 16:49:39.170695 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:39.170666 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-cljnl" Apr 17 16:49:39.186685 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:39.186638 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-cljnl" podStartSLOduration=1.18662126 podStartE2EDuration="1.18662126s" podCreationTimestamp="2026-04-17 16:49:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 16:49:39.186366697 +0000 UTC m=+1796.526854777" watchObservedRunningTime="2026-04-17 16:49:39.18662126 +0000 UTC m=+1796.527109338" Apr 17 16:49:40.418636 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:40.418606 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-v5rbd_609a9cbf-301f-406b-a26d-13ae069e0a70/dns/0.log" Apr 17 16:49:40.438775 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:40.438746 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-v5rbd_609a9cbf-301f-406b-a26d-13ae069e0a70/kube-rbac-proxy/0.log" Apr 17 16:49:40.511140 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:40.511108 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-s4nbk_b0030b4f-f856-49ec-87a6-eca6a00291ad/dns-node-resolver/0.log" Apr 17 16:49:40.979031 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:40.979004 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-59b97fb566-dw8gb_6eb7a238-0086-416f-a60f-1d6af3198eb9/registry/0.log" Apr 17 16:49:41.044440 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:41.044403 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fwdvx_ef613c01-eb3a-451b-b2f2-9eee9ab808cc/node-ca/0.log" Apr 17 16:49:42.010671 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:42.010640 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-667bf5bb7-xs847_2c42c1d3-83ba-4fe7-89d0-e86749bbaab6/kube-auth-proxy/0.log" Apr 17 16:49:42.088180 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:42.088138 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5948c7b7c8-rmrfc_b5767428-43d6-4cbe-9763-0731e126b82c/router/0.log" Apr 17 16:49:42.591477 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:42.591445 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9sgrn_5caf5aa7-4606-4fa1-8754-cab1cd67eac0/serve-healthcheck-canary/0.log" Apr 17 16:49:43.043383 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:43.043327 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-mqbqx_80fc0241-d0ad-42e2-9e15-932722a75ffa/insights-operator/0.log" Apr 17 16:49:43.043922 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:43.043732 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-mqbqx_80fc0241-d0ad-42e2-9e15-932722a75ffa/insights-operator/1.log" Apr 17 16:49:43.197910 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:43.197878 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ljhb6_72bbd866-8c40-48e3-9eb3-b34ae76679de/console-operator/2.log" Apr 17 16:49:43.198632 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:43.198609 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ljhb6_72bbd866-8c40-48e3-9eb3-b34ae76679de/console-operator/2.log" Apr 17 16:49:43.205497 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:43.205473 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-n6gcq_1779d4f2-fa2d-48b1-8e85-3b6c94c91d30/kube-rbac-proxy/0.log" Apr 17 16:49:43.224533 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:43.224495 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-n6gcq_1779d4f2-fa2d-48b1-8e85-3b6c94c91d30/exporter/0.log" Apr 17 16:49:43.244343 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:43.244318 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-n6gcq_1779d4f2-fa2d-48b1-8e85-3b6c94c91d30/extractor/0.log" Apr 17 16:49:45.287506 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:45.287474 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-54994d49cf-jz6l5_ca5c3e8c-559a-4015-8f20-7a94b4500850/manager/0.log" Apr 17 16:49:46.187714 ip-10-0-132-179 
kubenswrapper[2572]: I0417 16:49:46.187689 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-98l5c/perf-node-gather-daemonset-cljnl" Apr 17 16:49:46.471345 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:46.471262 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-7bf4f445d7-w9z9l_fd7f7eed-6e93-49b8-8304-e883b3d51258/manager/0.log" Apr 17 16:49:51.374477 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:51.374444 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-z4bpg_4eb06b52-b2db-4c75-8034-44b127e20319/kube-storage-version-migrator-operator/1.log" Apr 17 16:49:51.376400 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:51.376370 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-z4bpg_4eb06b52-b2db-4c75-8034-44b127e20319/kube-storage-version-migrator-operator/0.log" Apr 17 16:49:52.686088 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:52.686055 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vt8g2_8de8591f-0659-4b29-abd0-982ba1568fa2/kube-multus-additional-cni-plugins/0.log" Apr 17 16:49:52.704888 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:52.704857 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vt8g2_8de8591f-0659-4b29-abd0-982ba1568fa2/egress-router-binary-copy/0.log" Apr 17 16:49:52.723007 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:52.722979 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vt8g2_8de8591f-0659-4b29-abd0-982ba1568fa2/cni-plugins/0.log" Apr 17 16:49:52.741463 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:52.741440 2572 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vt8g2_8de8591f-0659-4b29-abd0-982ba1568fa2/bond-cni-plugin/0.log" Apr 17 16:49:52.762170 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:52.762136 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vt8g2_8de8591f-0659-4b29-abd0-982ba1568fa2/routeoverride-cni/0.log" Apr 17 16:49:52.784141 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:52.784106 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vt8g2_8de8591f-0659-4b29-abd0-982ba1568fa2/whereabouts-cni-bincopy/0.log" Apr 17 16:49:52.804986 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:52.804963 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vt8g2_8de8591f-0659-4b29-abd0-982ba1568fa2/whereabouts-cni/0.log" Apr 17 16:49:52.838947 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:52.838920 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hgr6r_edcb65df-bdda-4e5d-acba-2ef0eb3d8f51/kube-multus/0.log" Apr 17 16:49:52.957378 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:52.957291 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-j89hr_dbd283d5-ff0b-4c8f-b1be-15a75816e953/network-metrics-daemon/0.log" Apr 17 16:49:52.976454 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:52.976427 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-j89hr_dbd283d5-ff0b-4c8f-b1be-15a75816e953/kube-rbac-proxy/0.log" Apr 17 16:49:54.332496 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:54.332469 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nhjqx_3e371a5a-2d19-4c74-8b51-d4ac6484410c/ovn-controller/0.log" Apr 17 16:49:54.367145 ip-10-0-132-179 
kubenswrapper[2572]: I0417 16:49:54.367114 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nhjqx_3e371a5a-2d19-4c74-8b51-d4ac6484410c/ovn-acl-logging/0.log" Apr 17 16:49:54.388257 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:54.388228 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nhjqx_3e371a5a-2d19-4c74-8b51-d4ac6484410c/kube-rbac-proxy-node/0.log" Apr 17 16:49:54.408483 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:54.408447 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nhjqx_3e371a5a-2d19-4c74-8b51-d4ac6484410c/kube-rbac-proxy-ovn-metrics/0.log" Apr 17 16:49:54.423466 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:54.423444 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nhjqx_3e371a5a-2d19-4c74-8b51-d4ac6484410c/northd/0.log" Apr 17 16:49:54.442602 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:54.442577 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nhjqx_3e371a5a-2d19-4c74-8b51-d4ac6484410c/nbdb/0.log" Apr 17 16:49:54.461092 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:54.461031 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nhjqx_3e371a5a-2d19-4c74-8b51-d4ac6484410c/sbdb/0.log" Apr 17 16:49:54.631215 ip-10-0-132-179 kubenswrapper[2572]: I0417 16:49:54.631178 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nhjqx_3e371a5a-2d19-4c74-8b51-d4ac6484410c/ovnkube-controller/0.log"