Apr 24 16:38:32.168382 ip-10-0-131-37 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 16:38:32.601339 ip-10-0-131-37 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:38:32.601339 ip-10-0-131-37 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 16:38:32.601339 ip-10-0-131-37 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:38:32.601339 ip-10-0-131-37 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 16:38:32.601339 ip-10-0-131-37 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:38:32.602222 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.602118 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 16:38:32.605857 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605802 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:38:32.605857 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605826 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:38:32.605857 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605830 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:38:32.605857 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605833 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:38:32.605857 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605836 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:38:32.605857 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605840 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:38:32.605857 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605843 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:38:32.605857 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605846 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:38:32.605857 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605849 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:38:32.605857 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605851 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:38:32.605857 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605854 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:38:32.605857 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605858 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:38:32.605857 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605861 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:38:32.605857 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605864 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:38:32.606244 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605868 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:38:32.606244 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605871 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:38:32.606244 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605876 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:38:32.606244 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605880 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:38:32.606244 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605883 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:38:32.606244 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605886 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:38:32.606244 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605889 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:38:32.606244 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605892 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:38:32.606244 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605895 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:38:32.606244 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605897 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:38:32.606244 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605900 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:38:32.606244 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605902 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:38:32.606244 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605905 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:38:32.606244 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605907 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:38:32.606244 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605910 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:38:32.606244 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605913 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:38:32.606244 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605915 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:38:32.606244 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605918 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:38:32.606244 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605920 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:38:32.606726 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605923 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:38:32.606726 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605925 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:38:32.606726 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605928 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:38:32.606726 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605930 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:38:32.606726 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605935 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:38:32.606726 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605938 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:38:32.606726 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605941 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:38:32.606726 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605944 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:38:32.606726 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605947 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:38:32.606726 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605951 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:38:32.606726 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605955 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:38:32.606726 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605958 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:38:32.606726 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605961 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:38:32.606726 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605966 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:38:32.606726 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605969 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:38:32.606726 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605973 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:38:32.606726 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605976 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:38:32.606726 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605978 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:38:32.606726 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605981 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:38:32.606726 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605984 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:38:32.607253 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605986 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:38:32.607253 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605988 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:38:32.607253 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605991 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:38:32.607253 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605993 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:38:32.607253 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605996 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:38:32.607253 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.605999 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:38:32.607253 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606002 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:38:32.607253 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606004 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:38:32.607253 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606007 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:38:32.607253 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606010 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:38:32.607253 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606012 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:38:32.607253 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606015 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:38:32.607253 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606017 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:38:32.607253 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606020 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:38:32.607253 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606023 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:38:32.607253 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606026 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:38:32.607253 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606030 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:38:32.607253 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606033 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:38:32.607253 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606036 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:38:32.607253 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606039 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:38:32.607922 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606041 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:38:32.607922 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606044 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:38:32.607922 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606046 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:38:32.607922 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606049 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:38:32.607922 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606052 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:38:32.607922 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606054 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:38:32.607922 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606057 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:38:32.607922 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606060 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:38:32.607922 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606063 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:38:32.607922 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606065 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:38:32.607922 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606068 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:38:32.607922 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606071 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:38:32.607922 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606073 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:38:32.607922 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606543 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:38:32.607922 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606549 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:38:32.607922 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606552 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:38:32.607922 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606555 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:38:32.607922 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606558 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:38:32.607922 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606560 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:38:32.608506 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606563 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:38:32.608506 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606566 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:38:32.608506 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606569 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:38:32.608506 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606572 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:38:32.608506 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606574 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:38:32.608506 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606577 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:38:32.608506 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606581 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:38:32.608506 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606584 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:38:32.608506 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606587 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:38:32.608506 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606590 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:38:32.608506 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606593 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:38:32.608506 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606596 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:38:32.608506 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606599 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:38:32.608506 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606602 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:38:32.608506 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606605 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:38:32.608506 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606607 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:38:32.608506 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606610 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:38:32.608506 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606612 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:38:32.608506 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606615 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:38:32.608506 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606617 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:38:32.609197 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606620 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:38:32.609197 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606622 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:38:32.609197 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606625 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:38:32.609197 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606629 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:38:32.609197 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606633 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:38:32.609197 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606636 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:38:32.609197 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606639 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:38:32.609197 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606643 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:38:32.609197 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606646 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:38:32.609197 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606649 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:38:32.609197 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606652 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:38:32.609197 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606654 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:38:32.609197 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606657 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:38:32.609197 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606660 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:38:32.609197 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606663 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:38:32.609197 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606666 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:38:32.609197 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606668 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:38:32.609197 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606671 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:38:32.609197 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606673 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:38:32.609776 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606677 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:38:32.609776 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606679 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:38:32.609776 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606682 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:38:32.609776 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606684 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:38:32.609776 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606687 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:38:32.609776 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606690 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:38:32.609776 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606692 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:38:32.609776 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606695 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:38:32.609776 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606697 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:38:32.609776 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606699 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:38:32.609776 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606702 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:38:32.609776 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606705 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:38:32.609776 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606707 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:38:32.609776 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606709 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:38:32.609776 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606712 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:38:32.609776 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606714 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:38:32.609776 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606717 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:38:32.609776 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606719 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:38:32.609776 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606722 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:38:32.610279 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606725 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:38:32.610279 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606727 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:38:32.610279 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606731 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:38:32.610279 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606735 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:38:32.610279 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606738 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:38:32.610279 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606741 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:38:32.610279 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606743 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:38:32.610279 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606747 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:38:32.610279 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606749 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:38:32.610279 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606753 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:38:32.610279 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606755 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:38:32.610279 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606758 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:38:32.610279 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606760 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:38:32.610279 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606762 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:38:32.610279 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606768 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:38:32.610279 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606772 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:38:32.610279 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606775 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:38:32.610279 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606778 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:38:32.610279 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606780 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:38:32.610279 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606783 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:38:32.610869 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606785 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:38:32.610869 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.606788 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:38:32.610869 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606872 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 16:38:32.610869 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606879 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 16:38:32.610869 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606887 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 16:38:32.610869 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606892 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 16:38:32.610869 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606896 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 16:38:32.610869 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606900 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 16:38:32.610869 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606904 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 16:38:32.610869 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606909 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 16:38:32.610869 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606913 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 16:38:32.610869 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606916 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 16:38:32.610869 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606920 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 16:38:32.610869 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606923 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 16:38:32.610869 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606926 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 16:38:32.610869 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606929 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 24 16:38:32.610869 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606932 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 16:38:32.610869 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606935 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 24 16:38:32.610869 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606938 2575 flags.go:64] FLAG: --cloud-config=""
Apr 24 16:38:32.610869 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606941 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 16:38:32.610869 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606944 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 16:38:32.610869 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606949 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 24 16:38:32.610869 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606951 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 16:38:32.610869 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606955 2575 flags.go:64] FLAG: --config-dir=""
Apr 24 16:38:32.611480 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606958 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 16:38:32.611480 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606961 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 16:38:32.611480 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606965 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 16:38:32.611480 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606969 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 16:38:32.611480 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606973 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 16:38:32.611480 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606976 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 16:38:32.611480 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606980 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 16:38:32.611480 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606983 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 16:38:32.611480 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606986 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 16:38:32.611480 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606989 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 16:38:32.611480 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606992 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 16:38:32.611480 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.606997 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 16:38:32.611480 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607000 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 16:38:32.611480 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607003 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 16:38:32.611480 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607005 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 16:38:32.611480 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607009 2575 flags.go:64] FLAG: --enable-server="true"
Apr 24 16:38:32.611480 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607012 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 16:38:32.611480 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607017 2575 flags.go:64] FLAG: --event-burst="100"
Apr 24 16:38:32.611480 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607020 2575 flags.go:64] FLAG: --event-qps="50"
Apr 24 16:38:32.611480 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607023 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 16:38:32.611480 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607027 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 16:38:32.611480 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607030 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 24 16:38:32.611480 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607034 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 16:38:32.611480 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607037 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 16:38:32.612072 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607040 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 16:38:32.612072 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607043 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 24 16:38:32.612072 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607046 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 16:38:32.612072 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607051 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 16:38:32.612072 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607054 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 16:38:32.612072 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607057 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 16:38:32.612072 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607060 2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 16:38:32.612072 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607063 2575 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 16:38:32.612072 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607066 2575 flags.go:64] FLAG: --feature-gates=""
Apr 24 16:38:32.612072 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607070 2575 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 16:38:32.612072 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607073 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 24 16:38:32.612072 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607077 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 24 16:38:32.612072 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607098 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 24 16:38:32.612072 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607102 2575 flags.go:64] FLAG: --healthz-port="10248"
Apr 24 16:38:32.612072 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607105 2575 flags.go:64] FLAG: --help="false"
Apr 24 16:38:32.612072 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607108 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-131-37.ec2.internal"
Apr 24 16:38:32.612072 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607111 2575 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 24 16:38:32.612072 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607115 2575 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 24 16:38:32.612072 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607118 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 
24 16:38:32.612072 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607121 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 16:38:32.612072 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607125 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 16:38:32.612072 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607128 2575 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 16:38:32.612072 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607131 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 16:38:32.612647 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607133 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 16:38:32.612647 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607136 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 16:38:32.612647 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607139 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 16:38:32.612647 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607142 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 16:38:32.612647 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607145 2575 flags.go:64] FLAG: --kube-reserved="" Apr 24 16:38:32.612647 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607148 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 16:38:32.612647 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607151 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 16:38:32.612647 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607154 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 16:38:32.612647 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607157 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 16:38:32.612647 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607160 2575 flags.go:64] FLAG: --lock-file="" Apr 24 16:38:32.612647 
ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607163 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 16:38:32.612647 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607166 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 16:38:32.612647 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607174 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 16:38:32.612647 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607180 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 16:38:32.612647 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607183 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 16:38:32.612647 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607185 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 16:38:32.612647 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607188 2575 flags.go:64] FLAG: --logging-format="text" Apr 24 16:38:32.612647 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607191 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 16:38:32.612647 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607195 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 16:38:32.612647 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607198 2575 flags.go:64] FLAG: --manifest-url="" Apr 24 16:38:32.612647 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607201 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 24 16:38:32.612647 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607205 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 16:38:32.612647 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607209 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 16:38:32.612647 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607213 2575 flags.go:64] FLAG: --max-pods="110" Apr 24 16:38:32.612647 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607216 2575 flags.go:64] FLAG: 
--maximum-dead-containers="-1" Apr 24 16:38:32.613269 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607220 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 16:38:32.613269 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607223 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 16:38:32.613269 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607226 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 16:38:32.613269 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607229 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 16:38:32.613269 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607232 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 16:38:32.613269 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607251 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 16:38:32.613269 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607260 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 16:38:32.613269 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607264 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 16:38:32.613269 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607267 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 16:38:32.613269 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607270 2575 flags.go:64] FLAG: --pod-cidr="" Apr 24 16:38:32.613269 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607273 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 16:38:32.613269 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607280 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 16:38:32.613269 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607283 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 16:38:32.613269 ip-10-0-131-37 kubenswrapper[2575]: 
I0424 16:38:32.607286 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 24 16:38:32.613269 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607289 2575 flags.go:64] FLAG: --port="10250" Apr 24 16:38:32.613269 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607292 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 16:38:32.613269 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607295 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0be717bc202f652e5" Apr 24 16:38:32.613269 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607298 2575 flags.go:64] FLAG: --qos-reserved="" Apr 24 16:38:32.613269 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607301 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 24 16:38:32.613269 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607306 2575 flags.go:64] FLAG: --register-node="true" Apr 24 16:38:32.613269 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607309 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 24 16:38:32.613269 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607312 2575 flags.go:64] FLAG: --register-with-taints="" Apr 24 16:38:32.613269 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607315 2575 flags.go:64] FLAG: --registry-burst="10" Apr 24 16:38:32.613269 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607318 2575 flags.go:64] FLAG: --registry-qps="5" Apr 24 16:38:32.613948 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607321 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 24 16:38:32.613948 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607324 2575 flags.go:64] FLAG: --reserved-memory="" Apr 24 16:38:32.613948 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607328 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 16:38:32.613948 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607331 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 16:38:32.613948 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607337 2575 
flags.go:64] FLAG: --rotate-certificates="false" Apr 24 16:38:32.613948 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607340 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 16:38:32.613948 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607349 2575 flags.go:64] FLAG: --runonce="false" Apr 24 16:38:32.613948 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607353 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 16:38:32.613948 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607356 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 16:38:32.613948 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607359 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 24 16:38:32.613948 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607362 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 16:38:32.613948 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607364 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 16:38:32.613948 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607368 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 16:38:32.613948 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607371 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 16:38:32.613948 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607374 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 16:38:32.613948 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607377 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 16:38:32.613948 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607380 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 16:38:32.613948 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607383 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 16:38:32.613948 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607386 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 
16:38:32.613948 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607389 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 16:38:32.613948 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607391 2575 flags.go:64] FLAG: --system-cgroups="" Apr 24 16:38:32.613948 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607394 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 16:38:32.613948 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607400 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 16:38:32.613948 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607403 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 24 16:38:32.613948 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607406 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 16:38:32.615010 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607410 2575 flags.go:64] FLAG: --tls-min-version="" Apr 24 16:38:32.615010 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607412 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 16:38:32.615010 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607416 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 16:38:32.615010 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607419 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 16:38:32.615010 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607422 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 16:38:32.615010 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607425 2575 flags.go:64] FLAG: --v="2" Apr 24 16:38:32.615010 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607429 2575 flags.go:64] FLAG: --version="false" Apr 24 16:38:32.615010 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607434 2575 flags.go:64] FLAG: --vmodule="" Apr 24 16:38:32.615010 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607438 2575 flags.go:64] FLAG: 
--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 16:38:32.615010 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607442 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 16:38:32.615010 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607552 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 16:38:32.615010 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607557 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 16:38:32.615010 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607560 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 16:38:32.615010 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607563 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 16:38:32.615010 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607572 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 16:38:32.615010 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607575 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 16:38:32.615010 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607578 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 16:38:32.615010 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607580 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 16:38:32.615010 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607584 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 16:38:32.615010 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607586 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 16:38:32.615010 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607589 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 16:38:32.615010 ip-10-0-131-37 
kubenswrapper[2575]: W0424 16:38:32.607591 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 16:38:32.615924 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607594 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 16:38:32.615924 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607597 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 16:38:32.615924 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607599 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 16:38:32.615924 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607602 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 16:38:32.615924 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607604 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 16:38:32.615924 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607607 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 16:38:32.615924 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607610 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 16:38:32.615924 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607613 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 16:38:32.615924 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607615 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 16:38:32.615924 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607618 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 16:38:32.615924 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607620 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 16:38:32.615924 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607623 2575 feature_gate.go:328] unrecognized feature gate: 
AWSServiceLBNetworkSecurityGroup Apr 24 16:38:32.615924 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607627 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 16:38:32.615924 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607629 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 16:38:32.615924 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607632 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 16:38:32.615924 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607634 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 16:38:32.615924 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607636 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 16:38:32.615924 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607639 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 16:38:32.615924 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607642 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 16:38:32.615924 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607644 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 16:38:32.616839 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607646 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 16:38:32.616839 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607650 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 16:38:32.616839 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607653 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 16:38:32.616839 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607656 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 16:38:32.616839 ip-10-0-131-37 
kubenswrapper[2575]: W0424 16:38:32.607658 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 16:38:32.616839 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607661 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 16:38:32.616839 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607664 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 16:38:32.616839 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607666 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 16:38:32.616839 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607669 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 16:38:32.616839 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607703 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 16:38:32.616839 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607728 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 16:38:32.616839 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607735 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 16:38:32.616839 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607740 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 16:38:32.616839 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607745 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 16:38:32.616839 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607749 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 16:38:32.616839 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607757 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 16:38:32.616839 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607761 2575 feature_gate.go:328] unrecognized feature gate: 
OpenShiftPodSecurityAdmission Apr 24 16:38:32.616839 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607766 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 16:38:32.616839 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607770 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 16:38:32.616839 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607774 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 16:38:32.617544 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607778 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 16:38:32.617544 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607783 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 24 16:38:32.617544 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607787 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 16:38:32.617544 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607791 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 16:38:32.617544 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607798 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 16:38:32.617544 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607806 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 16:38:32.617544 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607811 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 16:38:32.617544 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607816 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 16:38:32.617544 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607821 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 16:38:32.617544 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607825 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 16:38:32.617544 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607830 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 16:38:32.617544 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607834 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 16:38:32.617544 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607839 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 16:38:32.617544 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607843 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 16:38:32.617544 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607848 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 16:38:32.617544 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607855 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 16:38:32.617544 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607861 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 16:38:32.617544 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607866 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 16:38:32.617544 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607870 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 16:38:32.618071 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607875 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 16:38:32.618071 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607879 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 16:38:32.618071 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607883 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 16:38:32.618071 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607890 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 16:38:32.618071 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607895 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 16:38:32.618071 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607899 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 16:38:32.618071 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607904 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 16:38:32.618071 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607909 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 16:38:32.618071 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607913 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 16:38:32.618071 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607918 2575 feature_gate.go:328] unrecognized feature gate: 
DualReplica Apr 24 16:38:32.618071 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607922 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 16:38:32.618071 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607926 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 16:38:32.618071 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607930 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 16:38:32.618071 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607934 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 16:38:32.618071 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.607938 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 16:38:32.618071 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.607954 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 16:38:32.618612 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.615895 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 16:38:32.618612 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.615923 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 16:38:32.618612 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616003 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 16:38:32.618612 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616012 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform 
Apr 24 16:38:32.618612 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616017 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 16:38:32.618612 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616025 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 16:38:32.618612 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616032 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 16:38:32.618612 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616037 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 16:38:32.618612 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616042 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 16:38:32.618612 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616047 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 16:38:32.618612 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616052 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 16:38:32.618612 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616056 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 16:38:32.618612 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616060 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 16:38:32.618612 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616065 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 16:38:32.618612 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616068 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 16:38:32.618612 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616072 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 16:38:32.618612 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616077 2575 
feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 16:38:32.618612 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616082 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 16:38:32.618612 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616105 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 16:38:32.618612 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616119 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 16:38:32.619450 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616124 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 16:38:32.619450 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616129 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 24 16:38:32.619450 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616133 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 16:38:32.619450 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616138 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 16:38:32.619450 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616142 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 16:38:32.619450 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616146 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 16:38:32.619450 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616150 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 16:38:32.619450 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616156 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 16:38:32.619450 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616160 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 
16:38:32.619450 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616164 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 16:38:32.619450 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616169 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 16:38:32.619450 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616173 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 16:38:32.619450 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616177 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 16:38:32.619450 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616181 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 16:38:32.619450 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616185 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 16:38:32.619450 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616190 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 16:38:32.619450 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616194 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 16:38:32.619450 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616199 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 16:38:32.619450 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616203 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 16:38:32.619450 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616207 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 16:38:32.620481 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616211 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 16:38:32.620481 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616216 2575 feature_gate.go:328] unrecognized feature gate: 
BootImageSkewEnforcement Apr 24 16:38:32.620481 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616220 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 16:38:32.620481 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616224 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 16:38:32.620481 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616229 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 16:38:32.620481 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616233 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 16:38:32.620481 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616237 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 16:38:32.620481 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616241 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 16:38:32.620481 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616245 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 16:38:32.620481 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616250 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 16:38:32.620481 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616254 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 16:38:32.620481 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616258 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 16:38:32.620481 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616274 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 16:38:32.620481 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616279 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 16:38:32.620481 ip-10-0-131-37 
kubenswrapper[2575]: W0424 16:38:32.616284 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 16:38:32.620481 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616288 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 16:38:32.620481 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616292 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 16:38:32.620481 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616296 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 16:38:32.620481 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616300 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 16:38:32.621033 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616305 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 16:38:32.621033 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616309 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 16:38:32.621033 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616313 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 16:38:32.621033 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616317 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 16:38:32.621033 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616322 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 16:38:32.621033 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616327 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 16:38:32.621033 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616331 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 16:38:32.621033 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616336 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 
16:38:32.621033 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616340 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 16:38:32.621033 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616344 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 16:38:32.621033 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616349 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 16:38:32.621033 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616353 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 16:38:32.621033 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616357 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 16:38:32.621033 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616362 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 16:38:32.621033 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616366 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 16:38:32.621033 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616370 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 16:38:32.621033 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616374 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 16:38:32.621033 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616381 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 16:38:32.621033 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616386 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 16:38:32.621033 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616392 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 16:38:32.621649 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616396 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 16:38:32.621649 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616400 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 16:38:32.621649 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616404 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 16:38:32.621649 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616417 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 16:38:32.621649 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616422 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 16:38:32.621649 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616427 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 16:38:32.621649 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616439 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 16:38:32.621649 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616444 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 16:38:32.621649 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616448 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 16:38:32.621649 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.616457 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false 
MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 16:38:32.621649 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616654 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 16:38:32.621649 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616663 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 16:38:32.621649 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616669 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 16:38:32.621649 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616675 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 16:38:32.621649 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616680 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 16:38:32.622042 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616686 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 16:38:32.622042 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616691 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 16:38:32.622042 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616696 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 16:38:32.622042 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616701 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 16:38:32.622042 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616707 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 16:38:32.622042 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616712 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 16:38:32.622042 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616717 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 16:38:32.622042 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616721 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 16:38:32.622042 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616726 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 16:38:32.622042 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616730 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 16:38:32.622042 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616734 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 16:38:32.622042 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616739 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 16:38:32.622042 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616743 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 16:38:32.622042 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616747 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 16:38:32.622042 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616751 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 16:38:32.622042 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616755 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 16:38:32.622042 ip-10-0-131-37 kubenswrapper[2575]: W0424 
16:38:32.616759 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 16:38:32.622042 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616764 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 16:38:32.622042 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616768 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 16:38:32.622594 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616772 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 16:38:32.622594 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616776 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 16:38:32.622594 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616781 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 16:38:32.622594 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616786 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 16:38:32.622594 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616790 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 16:38:32.622594 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616802 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 16:38:32.622594 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616807 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 16:38:32.622594 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616812 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 16:38:32.622594 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616816 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 16:38:32.622594 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616820 2575 feature_gate.go:328] unrecognized feature gate: 
SetEIPForNLBIngressController Apr 24 16:38:32.622594 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616824 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 16:38:32.622594 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616828 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 16:38:32.622594 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616832 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 16:38:32.622594 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616835 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 16:38:32.622594 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616839 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 16:38:32.622594 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616843 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 16:38:32.622594 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616847 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 16:38:32.622594 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616851 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 16:38:32.622594 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616855 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 16:38:32.623103 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616859 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 16:38:32.623103 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616862 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 16:38:32.623103 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616866 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 16:38:32.623103 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616870 2575 
feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 16:38:32.623103 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616874 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 16:38:32.623103 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616879 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 16:38:32.623103 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616883 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 16:38:32.623103 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616888 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 16:38:32.623103 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616892 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 16:38:32.623103 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616896 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 16:38:32.623103 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616901 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 16:38:32.623103 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616905 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 16:38:32.623103 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616909 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 16:38:32.623103 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616913 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 16:38:32.623103 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616916 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 16:38:32.623103 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616921 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 16:38:32.623103 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616925 2575 
feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 16:38:32.623103 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616929 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 16:38:32.623103 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616933 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 16:38:32.623103 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616945 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 16:38:32.623605 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616950 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 16:38:32.623605 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616954 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 16:38:32.623605 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616958 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 16:38:32.623605 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616963 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 16:38:32.623605 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616967 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 16:38:32.623605 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616971 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 24 16:38:32.623605 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616975 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 16:38:32.623605 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616979 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 16:38:32.623605 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616983 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 16:38:32.623605 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616987 
2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 16:38:32.623605 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616991 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 16:38:32.623605 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.616996 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 16:38:32.623605 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.617000 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 16:38:32.623605 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.617004 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 16:38:32.623605 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.617008 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 16:38:32.623605 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.617012 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 16:38:32.623605 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.617016 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 16:38:32.623605 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.617021 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 16:38:32.623605 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.617025 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 16:38:32.623605 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.617029 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 16:38:32.624156 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.617033 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 16:38:32.624156 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.617038 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation 
Apr 24 16:38:32.624156 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:32.617042 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 16:38:32.624156 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.617050 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 16:38:32.624156 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.617928 2575 server.go:962] "Client rotation is on, will bootstrap in background" Apr 24 16:38:32.624156 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.620920 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 24 16:38:32.624156 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.622236 2575 server.go:1019] "Starting client certificate rotation" Apr 24 16:38:32.624156 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.622356 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 16:38:32.624156 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.622806 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 24 16:38:32.648193 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.648159 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 16:38:32.655844 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.655814 2575 dynamic_cafile_content.go:161] 
"Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 24 16:38:32.672764 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.672736 2575 log.go:25] "Validated CRI v1 runtime API" Apr 24 16:38:32.678620 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.678598 2575 log.go:25] "Validated CRI v1 image API" Apr 24 16:38:32.679893 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.679865 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 24 16:38:32.680671 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.680655 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 16:38:32.684210 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.684184 2575 fs.go:135] Filesystem UUIDs: map[0a62331c-8632-487b-9eea-5d17609d9108:/dev/nvme0n1p4 4b400d4b-60ef-4e89-b870-af8a213763d8:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 24 16:38:32.684269 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.684208 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 24 16:38:32.690928 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.690656 2575 manager.go:217] Machine: {Timestamp:2026-04-24 16:38:32.688651952 +0000 UTC m=+0.402719979 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098734 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} 
HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2dbb71c3f05737bb2c2a71470c91e9 SystemUUID:ec2dbb71-c3f0-5737-bb2c-2a71470c91e9 BootID:3a30a625-f549-4676-b39f-24b5d4be03a7 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:77:ab:96:4a:01 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:77:ab:96:4a:01 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:7e:13:bc:0f:f2:c0 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified 
Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 24 16:38:32.690928 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.690921 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 24 16:38:32.691053 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.691020 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 24 16:38:32.692159 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.692127 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 24 16:38:32.692315 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.692162 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-131-37.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 16:38:32.692366 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.692325 2575 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 16:38:32.692366 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.692332 2575 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 16:38:32.692366 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.692346 2575 
manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 16:38:32.693679 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.693668 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 16:38:32.695134 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.695121 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 24 16:38:32.695266 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.695256 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 16:38:32.697484 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.697473 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 24 16:38:32.697528 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.697495 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 16:38:32.697528 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.697511 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 16:38:32.697528 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.697521 2575 kubelet.go:397] "Adding apiserver pod source" Apr 24 16:38:32.697602 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.697530 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 16:38:32.699098 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.699077 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 16:38:32.699140 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.699125 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 16:38:32.703556 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.703533 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 16:38:32.705434 ip-10-0-131-37 
kubenswrapper[2575]: I0424 16:38:32.705415 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 16:38:32.706617 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.706605 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 16:38:32.706664 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.706622 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 16:38:32.706664 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.706628 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 16:38:32.706664 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.706635 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 16:38:32.706664 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.706640 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 16:38:32.706664 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.706646 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 16:38:32.706664 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.706652 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 16:38:32.706664 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.706657 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 16:38:32.706664 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.706664 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 16:38:32.706883 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.706671 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 16:38:32.706883 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.706680 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 16:38:32.706883 
ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.706689 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 16:38:32.708172 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.708159 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 16:38:32.708172 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.708172 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 16:38:32.711980 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.711964 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 16:38:32.712068 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.712016 2575 server.go:1295] "Started kubelet" Apr 24 16:38:32.712167 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.712132 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 16:38:32.712303 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.712238 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 16:38:32.712370 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.712324 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 16:38:32.713023 ip-10-0-131-37 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 16:38:32.713710 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.713678 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-37.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 16:38:32.714041 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:32.713979 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 16:38:32.714115 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.714050 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 16:38:32.714271 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:32.714248 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-37.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 16:38:32.714871 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.714854 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 24 16:38:32.717970 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:32.717132 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-37.ec2.internal.18a9586395c33a88 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-37.ec2.internal,UID:ip-10-0-131-37.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-131-37.ec2.internal,},FirstTimestamp:2026-04-24 16:38:32.711977608 +0000 UTC m=+0.426045638,LastTimestamp:2026-04-24 16:38:32.711977608 +0000 UTC m=+0.426045638,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-37.ec2.internal,}" Apr 24 16:38:32.719117 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.719080 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 16:38:32.719621 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.719601 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 16:38:32.720426 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.720398 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 16:38:32.720426 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.720419 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 16:38:32.720574 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.720443 2575 factory.go:55] Registering systemd factory Apr 24 16:38:32.720574 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.720508 2575 factory.go:223] Registration of the systemd container factory successfully Apr 24 16:38:32.720574 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.720514 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 16:38:32.720702 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.720579 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 24 16:38:32.720702 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.720587 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 24 16:38:32.720790 ip-10-0-131-37 
kubenswrapper[2575]: I0424 16:38:32.720764 2575 factory.go:153] Registering CRI-O factory Apr 24 16:38:32.720790 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.720778 2575 factory.go:223] Registration of the crio container factory successfully Apr 24 16:38:32.720881 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.720843 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 16:38:32.720881 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.720872 2575 factory.go:103] Registering Raw factory Apr 24 16:38:32.720972 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.720888 2575 manager.go:1196] Started watching for new ooms in manager Apr 24 16:38:32.720972 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:32.720941 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-37.ec2.internal\" not found" Apr 24 16:38:32.720972 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:32.720950 2575 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 16:38:32.721348 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.721332 2575 manager.go:319] Starting recovery of all containers Apr 24 16:38:32.724329 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:32.724294 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 16:38:32.725077 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:32.725050 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-131-37.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 16:38:32.731640 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.731607 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wkwvv" Apr 24 16:38:32.734557 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.734540 2575 manager.go:324] Recovery completed Apr 24 16:38:32.739000 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.738981 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:38:32.739759 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.739742 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wkwvv" Apr 24 16:38:32.741425 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.741408 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-37.ec2.internal" 
event="NodeHasSufficientMemory" Apr 24 16:38:32.741515 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.741437 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-37.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:38:32.741515 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.741448 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-37.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:38:32.741967 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.741953 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 16:38:32.742008 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.741968 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 16:38:32.742008 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.741984 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 24 16:38:32.743294 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:32.743219 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-131-37.ec2.internal.18a9586397848888 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-131-37.ec2.internal,UID:ip-10-0-131-37.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-131-37.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-131-37.ec2.internal,},FirstTimestamp:2026-04-24 16:38:32.74142324 +0000 UTC m=+0.455491268,LastTimestamp:2026-04-24 16:38:32.74142324 +0000 UTC m=+0.455491268,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-131-37.ec2.internal,}" Apr 24 16:38:32.744395 ip-10-0-131-37 
kubenswrapper[2575]: I0424 16:38:32.744381 2575 policy_none.go:49] "None policy: Start" Apr 24 16:38:32.744447 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.744398 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 16:38:32.744447 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.744409 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 24 16:38:32.782908 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.782889 2575 manager.go:341] "Starting Device Plugin manager" Apr 24 16:38:32.795961 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:32.783007 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 16:38:32.795961 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.783023 2575 server.go:85] "Starting device plugin registration server" Apr 24 16:38:32.795961 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.783370 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 16:38:32.795961 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.783387 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 16:38:32.795961 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.783477 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 16:38:32.795961 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.783542 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 16:38:32.795961 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.783550 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 16:38:32.795961 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:32.784581 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 24 16:38:32.795961 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:32.784626 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-37.ec2.internal\" not found" Apr 24 16:38:32.825234 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.825191 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 16:38:32.826494 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.826473 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 16:38:32.826618 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.826502 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 16:38:32.826618 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.826532 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 16:38:32.826618 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.826538 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 16:38:32.826618 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:32.826579 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 16:38:32.829417 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.829392 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:38:32.883745 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.883657 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:38:32.884816 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.884799 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-37.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:38:32.884889 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.884832 2575 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-37.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:38:32.884889 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.884844 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-37.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:38:32.884889 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.884869 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-37.ec2.internal" Apr 24 16:38:32.891486 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.891467 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-37.ec2.internal" Apr 24 16:38:32.891548 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:32.891495 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-37.ec2.internal\": node \"ip-10-0-131-37.ec2.internal\" not found" Apr 24 16:38:32.904437 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:32.904404 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-37.ec2.internal\" not found" Apr 24 16:38:32.927038 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.927011 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-37.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-37.ec2.internal"] Apr 24 16:38:32.927134 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.927103 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:38:32.930979 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.930961 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-37.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:38:32.931041 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.930992 2575 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-131-37.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:38:32.931041 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.931002 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-37.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:38:32.932318 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.932304 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:38:32.932485 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.932453 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-37.ec2.internal" Apr 24 16:38:32.932520 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.932502 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:38:32.934068 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.934054 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-37.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:38:32.934168 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.934066 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-37.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:38:32.934168 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.934080 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-37.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:38:32.934168 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.934108 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-37.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:38:32.934168 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.934109 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-37.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:38:32.934356 
ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.934124 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-37.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:38:32.935503 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.935485 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-37.ec2.internal" Apr 24 16:38:32.935592 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.935520 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:38:32.936580 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.936565 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-37.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:38:32.936671 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.936593 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-37.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:38:32.936671 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:32.936611 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-37.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:38:32.951772 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:32.951739 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-37.ec2.internal\" not found" node="ip-10-0-131-37.ec2.internal" Apr 24 16:38:32.956072 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:32.956052 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-37.ec2.internal\" not found" node="ip-10-0-131-37.ec2.internal" Apr 24 16:38:33.004921 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:33.004890 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-37.ec2.internal\" not found" Apr 
24 16:38:33.022109 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:33.022050 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d832024cbf852529dc33765f192190f2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-37.ec2.internal\" (UID: \"d832024cbf852529dc33765f192190f2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-37.ec2.internal" Apr 24 16:38:33.022262 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:33.022121 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d832024cbf852529dc33765f192190f2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-37.ec2.internal\" (UID: \"d832024cbf852529dc33765f192190f2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-37.ec2.internal" Apr 24 16:38:33.022262 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:33.022176 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/dcc90d88bf75fe88b8ea0db46e250029-config\") pod \"kube-apiserver-proxy-ip-10-0-131-37.ec2.internal\" (UID: \"dcc90d88bf75fe88b8ea0db46e250029\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-37.ec2.internal" Apr 24 16:38:33.105796 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:33.105765 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-37.ec2.internal\" not found" Apr 24 16:38:33.123226 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:33.123190 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d832024cbf852529dc33765f192190f2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-37.ec2.internal\" (UID: \"d832024cbf852529dc33765f192190f2\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-37.ec2.internal" Apr 24 16:38:33.123325 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:33.123237 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/dcc90d88bf75fe88b8ea0db46e250029-config\") pod \"kube-apiserver-proxy-ip-10-0-131-37.ec2.internal\" (UID: \"dcc90d88bf75fe88b8ea0db46e250029\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-37.ec2.internal" Apr 24 16:38:33.123325 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:33.123307 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d832024cbf852529dc33765f192190f2-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-37.ec2.internal\" (UID: \"d832024cbf852529dc33765f192190f2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-37.ec2.internal" Apr 24 16:38:33.123427 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:33.123377 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d832024cbf852529dc33765f192190f2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-37.ec2.internal\" (UID: \"d832024cbf852529dc33765f192190f2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-37.ec2.internal" Apr 24 16:38:33.123427 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:33.123416 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d832024cbf852529dc33765f192190f2-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-37.ec2.internal\" (UID: \"d832024cbf852529dc33765f192190f2\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-37.ec2.internal" Apr 24 16:38:33.123510 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:33.123454 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/dcc90d88bf75fe88b8ea0db46e250029-config\") pod \"kube-apiserver-proxy-ip-10-0-131-37.ec2.internal\" (UID: \"dcc90d88bf75fe88b8ea0db46e250029\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-37.ec2.internal" Apr 24 16:38:33.206691 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:33.206623 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-37.ec2.internal\" not found" Apr 24 16:38:33.254131 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:33.254082 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-37.ec2.internal" Apr 24 16:38:33.258004 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:33.257913 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-37.ec2.internal" Apr 24 16:38:33.306752 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:33.306716 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-37.ec2.internal\" not found" Apr 24 16:38:33.407182 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:33.407147 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-37.ec2.internal\" not found" Apr 24 16:38:33.507786 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:33.507722 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-37.ec2.internal\" not found" Apr 24 16:38:33.608249 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:33.608221 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-37.ec2.internal\" not found" Apr 24 16:38:33.621524 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:33.621490 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using 
new credentials" Apr 24 16:38:33.621675 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:33.621656 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 16:38:33.709321 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:33.709288 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-37.ec2.internal\" not found" Apr 24 16:38:33.719963 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:33.719935 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 16:38:33.732116 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:33.732079 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 16:38:33.741543 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:33.741517 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 16:33:32 +0000 UTC" deadline="2027-09-25 14:12:51.091699527 +0000 UTC" Apr 24 16:38:33.741543 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:33.741540 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12453h34m17.350162456s" Apr 24 16:38:33.755247 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:33.755226 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-bqpsx" Apr 24 16:38:33.762822 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:33.762768 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-bqpsx" Apr 24 
16:38:33.810182 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:33.810154 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-37.ec2.internal\" not found" Apr 24 16:38:33.826539 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:33.826517 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:38:33.834324 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:33.834305 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:38:33.875798 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:33.875764 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd832024cbf852529dc33765f192190f2.slice/crio-aa4e521b660e0c6bf51b6dede3287de7701178930b68fc63067e600e72075db2 WatchSource:0}: Error finding container aa4e521b660e0c6bf51b6dede3287de7701178930b68fc63067e600e72075db2: Status 404 returned error can't find the container with id aa4e521b660e0c6bf51b6dede3287de7701178930b68fc63067e600e72075db2 Apr 24 16:38:33.876144 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:33.876119 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcc90d88bf75fe88b8ea0db46e250029.slice/crio-1c778c9047f2281f69c9a5897ee4e09afbdb52cf4f0b79a0005ae40781be3b82 WatchSource:0}: Error finding container 1c778c9047f2281f69c9a5897ee4e09afbdb52cf4f0b79a0005ae40781be3b82: Status 404 returned error can't find the container with id 1c778c9047f2281f69c9a5897ee4e09afbdb52cf4f0b79a0005ae40781be3b82 Apr 24 16:38:33.881392 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:33.881374 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 16:38:33.910834 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:33.910803 2575 kubelet_node_status.go:515] 
"Error getting the current node from lister" err="node \"ip-10-0-131-37.ec2.internal\" not found" Apr 24 16:38:34.011322 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:34.011272 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-37.ec2.internal\" not found" Apr 24 16:38:34.111898 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:34.111830 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-37.ec2.internal\" not found" Apr 24 16:38:34.189106 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.189064 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:38:34.220373 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.220336 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-37.ec2.internal" Apr 24 16:38:34.233584 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.233554 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 16:38:34.234772 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.234754 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-37.ec2.internal" Apr 24 16:38:34.243075 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.243054 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 16:38:34.583348 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.583260 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:38:34.698055 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.698020 2575 apiserver.go:52] "Watching apiserver" Apr 
24 16:38:34.708786 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.708753 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 16:38:34.710653 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.710623 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-f9bsr","openshift-ovn-kubernetes/ovnkube-node-jt59q","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6","openshift-cluster-node-tuning-operator/tuned-8sqwm","openshift-image-registry/node-ca-psxtk","openshift-multus/multus-9hv6v","openshift-network-diagnostics/network-check-target-85nzt","openshift-network-operator/iptables-alerter-f5ds8","kube-system/konnectivity-agent-phgx4","kube-system/kube-apiserver-proxy-ip-10-0-131-37.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-37.ec2.internal","openshift-multus/multus-additional-cni-plugins-5hcrw"] Apr 24 16:38:34.713792 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.713766 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:38:34.713935 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:34.713857 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-85nzt" podUID="afaef099-a861-4606-97c5-485da57c818f" Apr 24 16:38:34.716003 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.715967 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:38:34.716141 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:34.716058 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f9bsr" podUID="10ab450a-933f-4b41-8316-09109770ac99" Apr 24 16:38:34.718498 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.718467 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.722113 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.720907 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" Apr 24 16:38:34.722113 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.721190 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 16:38:34.722113 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.721373 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 16:38:34.722113 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.721559 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 16:38:34.722113 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.721599 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dvkg8\"" Apr 24 16:38:34.722113 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.721700 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 16:38:34.722113 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.721912 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 16:38:34.722113 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.721896 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 16:38:34.723483 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.723142 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8sqwm" Apr 24 16:38:34.723637 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.723528 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 16:38:34.723637 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.723583 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-cgfwf\"" Apr 24 16:38:34.723637 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.723629 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 16:38:34.723912 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.723889 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 16:38:34.725309 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.725285 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 16:38:34.725309 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.725304 2575 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-tgjhv\"" Apr 24 16:38:34.725475 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.725386 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:38:34.725946 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.725580 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-psxtk" Apr 24 16:38:34.727641 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.727622 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 16:38:34.727871 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.727850 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 16:38:34.727871 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.727863 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 16:38:34.728014 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.727853 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-lqszl\"" Apr 24 16:38:34.728014 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.727936 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-9hv6v" Apr 24 16:38:34.732217 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.730285 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 16:38:34.732217 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.730434 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-p7chn\"" Apr 24 16:38:34.732217 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.730590 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 16:38:34.732217 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.731077 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 16:38:34.732217 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.731177 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 16:38:34.732528 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.732222 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-host-kubelet\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.732528 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.732289 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-etc-openvswitch\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.732528 ip-10-0-131-37 
kubenswrapper[2575]: I0424 16:38:34.732382 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-host-cni-netd\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.732528 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.732429 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c2b71d82-71f3-4cf6-95a9-73b3d509e492-ovnkube-script-lib\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.732528 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.732489 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7587d0da-cb0b-407d-b7b1-2f580492c29e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5n7s6\" (UID: \"7587d0da-cb0b-407d-b7b1-2f580492c29e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" Apr 24 16:38:34.732757 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.732533 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-724bk\" (UniqueName: \"kubernetes.io/projected/7587d0da-cb0b-407d-b7b1-2f580492c29e-kube-api-access-724bk\") pod \"aws-ebs-csi-driver-node-5n7s6\" (UID: \"7587d0da-cb0b-407d-b7b1-2f580492c29e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" Apr 24 16:38:34.732757 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.732605 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh54r\" (UniqueName: 
\"kubernetes.io/projected/afaef099-a861-4606-97c5-485da57c818f-kube-api-access-bh54r\") pod \"network-check-target-85nzt\" (UID: \"afaef099-a861-4606-97c5-485da57c818f\") " pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:38:34.732757 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.732648 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c2b71d82-71f3-4cf6-95a9-73b3d509e492-env-overrides\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.732757 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.732732 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtfrv\" (UniqueName: \"kubernetes.io/projected/c2b71d82-71f3-4cf6-95a9-73b3d509e492-kube-api-access-jtfrv\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.732945 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.732831 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-run-openvswitch\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.732945 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.732883 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-node-log\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.732945 ip-10-0-131-37 kubenswrapper[2575]: I0424 
16:38:34.732921 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c2b71d82-71f3-4cf6-95a9-73b3d509e492-ovnkube-config\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.733163 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.732988 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7587d0da-cb0b-407d-b7b1-2f580492c29e-sys-fs\") pod \"aws-ebs-csi-driver-node-5n7s6\" (UID: \"7587d0da-cb0b-407d-b7b1-2f580492c29e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" Apr 24 16:38:34.733163 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.733047 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhxk5\" (UniqueName: \"kubernetes.io/projected/10ab450a-933f-4b41-8316-09109770ac99-kube-api-access-mhxk5\") pod \"network-metrics-daemon-f9bsr\" (UID: \"10ab450a-933f-4b41-8316-09109770ac99\") " pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:38:34.733163 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.733076 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-systemd-units\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.733163 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.733129 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-host-run-netns\") pod \"ovnkube-node-jt59q\" (UID: 
\"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.733366 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.733206 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.733366 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.733261 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-run-systemd\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.733454 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.733404 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-run-ovn\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.733498 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.733467 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c2b71d82-71f3-4cf6-95a9-73b3d509e492-ovn-node-metrics-cert\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.733542 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.733501 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7587d0da-cb0b-407d-b7b1-2f580492c29e-device-dir\") pod \"aws-ebs-csi-driver-node-5n7s6\" (UID: \"7587d0da-cb0b-407d-b7b1-2f580492c29e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" Apr 24 16:38:34.733590 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.733544 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-host-slash\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.733590 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.733575 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-host-cni-bin\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.733716 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.733693 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-host-run-ovn-kubernetes\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.733784 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.733772 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7587d0da-cb0b-407d-b7b1-2f580492c29e-registration-dir\") pod \"aws-ebs-csi-driver-node-5n7s6\" (UID: \"7587d0da-cb0b-407d-b7b1-2f580492c29e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" Apr 24 
16:38:34.733840 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.733829 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7587d0da-cb0b-407d-b7b1-2f580492c29e-etc-selinux\") pod \"aws-ebs-csi-driver-node-5n7s6\" (UID: \"7587d0da-cb0b-407d-b7b1-2f580492c29e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" Apr 24 16:38:34.733939 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.733868 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-var-lib-openvswitch\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.733998 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.733985 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-log-socket\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.734052 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.734018 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7587d0da-cb0b-407d-b7b1-2f580492c29e-socket-dir\") pod \"aws-ebs-csi-driver-node-5n7s6\" (UID: \"7587d0da-cb0b-407d-b7b1-2f580492c29e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" Apr 24 16:38:34.734120 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.734047 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs\") pod \"network-metrics-daemon-f9bsr\" (UID: \"10ab450a-933f-4b41-8316-09109770ac99\") " pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:38:34.734458 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.734436 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-f5ds8" Apr 24 16:38:34.735000 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.734970 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-phgx4" Apr 24 16:38:34.737785 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.737764 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 16:38:34.737897 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.737764 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 16:38:34.737897 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.737861 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 16:38:34.738158 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.738003 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:38:34.738158 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.738141 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jgsvz\"" Apr 24 16:38:34.738669 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.738649 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 16:38:34.738754 ip-10-0-131-37 kubenswrapper[2575]: I0424 
16:38:34.738647 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8md7w\"" Apr 24 16:38:34.740103 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.740068 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5hcrw" Apr 24 16:38:34.743171 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.742979 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 16:38:34.743171 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.743075 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-gc979\"" Apr 24 16:38:34.743325 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.743238 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 16:38:34.764002 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.763968 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 16:33:33 +0000 UTC" deadline="2028-01-23 16:51:21.814997494 +0000 UTC" Apr 24 16:38:34.764002 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.763999 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15336h12m47.051001676s" Apr 24 16:38:34.821757 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.821727 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 16:38:34.831202 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.831143 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-37.ec2.internal" 
event={"ID":"dcc90d88bf75fe88b8ea0db46e250029","Type":"ContainerStarted","Data":"1c778c9047f2281f69c9a5897ee4e09afbdb52cf4f0b79a0005ae40781be3b82"} Apr 24 16:38:34.832234 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.832202 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-37.ec2.internal" event={"ID":"d832024cbf852529dc33765f192190f2","Type":"ContainerStarted","Data":"aa4e521b660e0c6bf51b6dede3287de7701178930b68fc63067e600e72075db2"} Apr 24 16:38:34.834695 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.834633 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-host-run-ovn-kubernetes\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.834695 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.834670 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-run\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm" Apr 24 16:38:34.834878 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.834702 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-var-lib-kubelet\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm" Apr 24 16:38:34.834878 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.834724 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-var-lib-openvswitch\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.834878 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.834754 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-host-run-ovn-kubernetes\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.834878 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.834768 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7fd4e710-ea9f-4927-b943-ca92fb5629da-iptables-alerter-script\") pod \"iptables-alerter-f5ds8\" (UID: \"7fd4e710-ea9f-4927-b943-ca92fb5629da\") " pod="openshift-network-operator/iptables-alerter-f5ds8" Apr 24 16:38:34.834878 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.834782 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-var-lib-openvswitch\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.834878 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.834818 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/34943398-acf4-440b-900d-999cb567a483-serviceca\") pod \"node-ca-psxtk\" (UID: \"34943398-acf4-440b-900d-999cb567a483\") " pod="openshift-image-registry/node-ca-psxtk" Apr 24 16:38:34.834878 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.834854 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp7lv\" (UniqueName: \"kubernetes.io/projected/34943398-acf4-440b-900d-999cb567a483-kube-api-access-qp7lv\") pod \"node-ca-psxtk\" (UID: \"34943398-acf4-440b-900d-999cb567a483\") " pod="openshift-image-registry/node-ca-psxtk" Apr 24 16:38:34.835223 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.834886 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/050c36c1-b0b2-434a-91f9-c53a02f67059-os-release\") pod \"multus-additional-cni-plugins-5hcrw\" (UID: \"050c36c1-b0b2-434a-91f9-c53a02f67059\") " pod="openshift-multus/multus-additional-cni-plugins-5hcrw" Apr 24 16:38:34.835223 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.834923 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/050c36c1-b0b2-434a-91f9-c53a02f67059-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5hcrw\" (UID: \"050c36c1-b0b2-434a-91f9-c53a02f67059\") " pod="openshift-multus/multus-additional-cni-plugins-5hcrw" Apr 24 16:38:34.835223 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.834949 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs\") pod \"network-metrics-daemon-f9bsr\" (UID: \"10ab450a-933f-4b41-8316-09109770ac99\") " pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:38:34.835223 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.834977 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-host-kubelet\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.835223 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.835001 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-724bk\" (UniqueName: \"kubernetes.io/projected/7587d0da-cb0b-407d-b7b1-2f580492c29e-kube-api-access-724bk\") pod \"aws-ebs-csi-driver-node-5n7s6\" (UID: \"7587d0da-cb0b-407d-b7b1-2f580492c29e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" Apr 24 16:38:34.835223 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.835030 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7fd4e710-ea9f-4927-b943-ca92fb5629da-host-slash\") pod \"iptables-alerter-f5ds8\" (UID: \"7fd4e710-ea9f-4927-b943-ca92fb5629da\") " pod="openshift-network-operator/iptables-alerter-f5ds8" Apr 24 16:38:34.835223 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.835057 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-etc-modprobe-d\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm" Apr 24 16:38:34.835223 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.835058 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-host-kubelet\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.835223 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.835084 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-etc-sysctl-conf\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm" Apr 24 16:38:34.835223 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.835132 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-etc-tuned\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm" Apr 24 16:38:34.835223 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.835182 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dg8h\" (UniqueName: \"kubernetes.io/projected/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-kube-api-access-7dg8h\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm" Apr 24 16:38:34.835223 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:34.835200 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:34.835223 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.835217 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gmcn\" (UniqueName: \"kubernetes.io/projected/7fd4e710-ea9f-4927-b943-ca92fb5629da-kube-api-access-4gmcn\") pod \"iptables-alerter-f5ds8\" (UID: \"7fd4e710-ea9f-4927-b943-ca92fb5629da\") " pod="openshift-network-operator/iptables-alerter-f5ds8" Apr 24 16:38:34.835750 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:34.835304 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs podName:10ab450a-933f-4b41-8316-09109770ac99 nodeName:}" 
failed. No retries permitted until 2026-04-24 16:38:35.335263052 +0000 UTC m=+3.049331066 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs") pod "network-metrics-daemon-f9bsr" (UID: "10ab450a-933f-4b41-8316-09109770ac99") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:34.835750 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.835343 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-host\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm" Apr 24 16:38:34.835750 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.835389 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-host-run-netns\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v" Apr 24 16:38:34.835750 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.835448 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-host-var-lib-cni-multus\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v" Apr 24 16:38:34.835750 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.835473 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-etc-kubernetes\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " 
pod="openshift-multus/multus-9hv6v" Apr 24 16:38:34.835750 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.835503 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/050c36c1-b0b2-434a-91f9-c53a02f67059-cni-binary-copy\") pod \"multus-additional-cni-plugins-5hcrw\" (UID: \"050c36c1-b0b2-434a-91f9-c53a02f67059\") " pod="openshift-multus/multus-additional-cni-plugins-5hcrw" Apr 24 16:38:34.835750 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.835543 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/050c36c1-b0b2-434a-91f9-c53a02f67059-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5hcrw\" (UID: \"050c36c1-b0b2-434a-91f9-c53a02f67059\") " pod="openshift-multus/multus-additional-cni-plugins-5hcrw" Apr 24 16:38:34.835750 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.835582 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-run-openvswitch\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.835750 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.835623 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-run-openvswitch\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.835750 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.835638 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/c2b71d82-71f3-4cf6-95a9-73b3d509e492-ovnkube-config\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.835750 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.835664 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7587d0da-cb0b-407d-b7b1-2f580492c29e-sys-fs\") pod \"aws-ebs-csi-driver-node-5n7s6\" (UID: \"7587d0da-cb0b-407d-b7b1-2f580492c29e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" Apr 24 16:38:34.836195 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.836148 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7587d0da-cb0b-407d-b7b1-2f580492c29e-sys-fs\") pod \"aws-ebs-csi-driver-node-5n7s6\" (UID: \"7587d0da-cb0b-407d-b7b1-2f580492c29e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" Apr 24 16:38:34.836749 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.836723 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-etc-sysctl-d\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm" Apr 24 16:38:34.836882 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.836778 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-system-cni-dir\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v" Apr 24 16:38:34.836882 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.836813 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-host-run-multus-certs\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v" Apr 24 16:38:34.837002 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.836853 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-systemd-units\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.837002 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.836928 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-host-run-netns\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.837002 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.836961 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.837002 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.836998 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-lib-modules\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm" Apr 24 16:38:34.837186 
ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837024 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/050c36c1-b0b2-434a-91f9-c53a02f67059-cnibin\") pod \"multus-additional-cni-plugins-5hcrw\" (UID: \"050c36c1-b0b2-434a-91f9-c53a02f67059\") " pod="openshift-multus/multus-additional-cni-plugins-5hcrw" Apr 24 16:38:34.837186 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837060 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-run-ovn\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.837186 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837113 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.837186 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837114 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c2b71d82-71f3-4cf6-95a9-73b3d509e492-ovn-node-metrics-cert\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.837186 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837160 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-run-ovn\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.837186 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837177 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7587d0da-cb0b-407d-b7b1-2f580492c29e-device-dir\") pod \"aws-ebs-csi-driver-node-5n7s6\" (UID: \"7587d0da-cb0b-407d-b7b1-2f580492c29e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" Apr 24 16:38:34.837431 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837214 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-systemd-units\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.837431 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837220 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-multus-cni-dir\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v" Apr 24 16:38:34.837431 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837265 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-host-run-netns\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.837431 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837286 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-cnibin\") pod \"multus-9hv6v\" (UID: 
\"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v" Apr 24 16:38:34.837431 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837316 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c2b71d82-71f3-4cf6-95a9-73b3d509e492-ovnkube-config\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.837431 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837326 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7587d0da-cb0b-407d-b7b1-2f580492c29e-device-dir\") pod \"aws-ebs-csi-driver-node-5n7s6\" (UID: \"7587d0da-cb0b-407d-b7b1-2f580492c29e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" Apr 24 16:38:34.837431 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837333 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0306168c-6c00-4a89-9e2b-fff3d030b0e2-cni-binary-copy\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v" Apr 24 16:38:34.837431 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837368 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-host-slash\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.837431 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837414 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-host-cni-bin\") pod \"ovnkube-node-jt59q\" (UID: 
\"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.837787 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837441 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-host-cni-bin\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.837787 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837462 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 16:38:34.837787 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837476 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-etc-sysconfig\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm" Apr 24 16:38:34.837787 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837514 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0306168c-6c00-4a89-9e2b-fff3d030b0e2-multus-daemon-config\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v" Apr 24 16:38:34.837787 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837549 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7587d0da-cb0b-407d-b7b1-2f580492c29e-registration-dir\") pod \"aws-ebs-csi-driver-node-5n7s6\" (UID: \"7587d0da-cb0b-407d-b7b1-2f580492c29e\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" Apr 24 16:38:34.837787 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837579 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-host-slash\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.837787 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837611 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7587d0da-cb0b-407d-b7b1-2f580492c29e-etc-selinux\") pod \"aws-ebs-csi-driver-node-5n7s6\" (UID: \"7587d0da-cb0b-407d-b7b1-2f580492c29e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" Apr 24 16:38:34.837787 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837646 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7587d0da-cb0b-407d-b7b1-2f580492c29e-registration-dir\") pod \"aws-ebs-csi-driver-node-5n7s6\" (UID: \"7587d0da-cb0b-407d-b7b1-2f580492c29e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" Apr 24 16:38:34.837787 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837689 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/52f93581-75a0-4ae1-92b6-3ce3e189cd48-agent-certs\") pod \"konnectivity-agent-phgx4\" (UID: \"52f93581-75a0-4ae1-92b6-3ce3e189cd48\") " pod="kube-system/konnectivity-agent-phgx4" Apr 24 16:38:34.837787 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837723 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-log-socket\") pod 
\"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.837787 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837779 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7587d0da-cb0b-407d-b7b1-2f580492c29e-etc-selinux\") pod \"aws-ebs-csi-driver-node-5n7s6\" (UID: \"7587d0da-cb0b-407d-b7b1-2f580492c29e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" Apr 24 16:38:34.838324 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837809 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7587d0da-cb0b-407d-b7b1-2f580492c29e-socket-dir\") pod \"aws-ebs-csi-driver-node-5n7s6\" (UID: \"7587d0da-cb0b-407d-b7b1-2f580492c29e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" Apr 24 16:38:34.838324 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837839 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-log-socket\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.838324 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837903 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34943398-acf4-440b-900d-999cb567a483-host\") pod \"node-ca-psxtk\" (UID: \"34943398-acf4-440b-900d-999cb567a483\") " pod="openshift-image-registry/node-ca-psxtk" Apr 24 16:38:34.838324 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.837935 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-etc-openvswitch\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.838324 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.838058 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7587d0da-cb0b-407d-b7b1-2f580492c29e-socket-dir\") pod \"aws-ebs-csi-driver-node-5n7s6\" (UID: \"7587d0da-cb0b-407d-b7b1-2f580492c29e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" Apr 24 16:38:34.838324 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.838124 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-host-cni-netd\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.838324 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.838160 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c2b71d82-71f3-4cf6-95a9-73b3d509e492-ovnkube-script-lib\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.838324 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.838244 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7587d0da-cb0b-407d-b7b1-2f580492c29e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5n7s6\" (UID: \"7587d0da-cb0b-407d-b7b1-2f580492c29e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" Apr 24 16:38:34.838324 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.838274 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-os-release\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v" Apr 24 16:38:34.838324 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.838306 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-multus-conf-dir\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v" Apr 24 16:38:34.838754 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.838365 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/050c36c1-b0b2-434a-91f9-c53a02f67059-system-cni-dir\") pod \"multus-additional-cni-plugins-5hcrw\" (UID: \"050c36c1-b0b2-434a-91f9-c53a02f67059\") " pod="openshift-multus/multus-additional-cni-plugins-5hcrw" Apr 24 16:38:34.838754 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.838421 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bh54r\" (UniqueName: \"kubernetes.io/projected/afaef099-a861-4606-97c5-485da57c818f-kube-api-access-bh54r\") pod \"network-check-target-85nzt\" (UID: \"afaef099-a861-4606-97c5-485da57c818f\") " pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:38:34.838754 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.838458 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c2b71d82-71f3-4cf6-95a9-73b3d509e492-env-overrides\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:34.838754 
ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.838492 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jtfrv\" (UniqueName: \"kubernetes.io/projected/c2b71d82-71f3-4cf6-95a9-73b3d509e492-kube-api-access-jtfrv\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q"
Apr 24 16:38:34.838754 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.838557 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-host-var-lib-cni-bin\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.838754 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.838606 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-node-log\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q"
Apr 24 16:38:34.838754 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.838640 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-etc-kubernetes\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm"
Apr 24 16:38:34.838754 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.838676 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-multus-socket-dir-parent\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.838754 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.838711 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-host-run-k8s-cni-cncf-io\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.839153 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.838832 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlzvd\" (UniqueName: \"kubernetes.io/projected/050c36c1-b0b2-434a-91f9-c53a02f67059-kube-api-access-tlzvd\") pod \"multus-additional-cni-plugins-5hcrw\" (UID: \"050c36c1-b0b2-434a-91f9-c53a02f67059\") " pod="openshift-multus/multus-additional-cni-plugins-5hcrw"
Apr 24 16:38:34.839153 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.838870 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhxk5\" (UniqueName: \"kubernetes.io/projected/10ab450a-933f-4b41-8316-09109770ac99-kube-api-access-mhxk5\") pod \"network-metrics-daemon-f9bsr\" (UID: \"10ab450a-933f-4b41-8316-09109770ac99\") " pod="openshift-multus/network-metrics-daemon-f9bsr"
Apr 24 16:38:34.839153 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.838903 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-tmp\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm"
Apr 24 16:38:34.839153 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.838934 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62b72\" (UniqueName: \"kubernetes.io/projected/0306168c-6c00-4a89-9e2b-fff3d030b0e2-kube-api-access-62b72\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.839153 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.838968 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-run-systemd\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q"
Apr 24 16:38:34.839153 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.838993 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-host-var-lib-kubelet\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.839153 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.839029 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/050c36c1-b0b2-434a-91f9-c53a02f67059-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5hcrw\" (UID: \"050c36c1-b0b2-434a-91f9-c53a02f67059\") " pod="openshift-multus/multus-additional-cni-plugins-5hcrw"
Apr 24 16:38:34.839153 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.839065 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/52f93581-75a0-4ae1-92b6-3ce3e189cd48-konnectivity-ca\") pod \"konnectivity-agent-phgx4\" (UID: \"52f93581-75a0-4ae1-92b6-3ce3e189cd48\") " pod="kube-system/konnectivity-agent-phgx4"
Apr 24 16:38:34.839153 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.839129 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-etc-systemd\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm"
Apr 24 16:38:34.839527 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.839163 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-sys\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm"
Apr 24 16:38:34.839527 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.839190 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-hostroot\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.839805 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.839782 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-etc-openvswitch\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q"
Apr 24 16:38:34.839904 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.839825 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c2b71d82-71f3-4cf6-95a9-73b3d509e492-ovnkube-script-lib\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q"
Apr 24 16:38:34.839904 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.839840 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-host-cni-netd\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q"
Apr 24 16:38:34.839904 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.839887 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7587d0da-cb0b-407d-b7b1-2f580492c29e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5n7s6\" (UID: \"7587d0da-cb0b-407d-b7b1-2f580492c29e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6"
Apr 24 16:38:34.840196 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.840170 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-node-log\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q"
Apr 24 16:38:34.840317 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.839913 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c2b71d82-71f3-4cf6-95a9-73b3d509e492-run-systemd\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q"
Apr 24 16:38:34.841037 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.841007 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c2b71d82-71f3-4cf6-95a9-73b3d509e492-env-overrides\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q"
Apr 24 16:38:34.841164 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.841045 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c2b71d82-71f3-4cf6-95a9-73b3d509e492-ovn-node-metrics-cert\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q"
Apr 24 16:38:34.847971 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:34.847933 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 16:38:34.847971 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:34.847964 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 16:38:34.847971 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:34.847981 2575 projected.go:194] Error preparing data for projected volume kube-api-access-bh54r for pod openshift-network-diagnostics/network-check-target-85nzt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:38:34.848273 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:34.848067 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/afaef099-a861-4606-97c5-485da57c818f-kube-api-access-bh54r podName:afaef099-a861-4606-97c5-485da57c818f nodeName:}" failed. No retries permitted until 2026-04-24 16:38:35.348050135 +0000 UTC m=+3.062118165 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-bh54r" (UniqueName: "kubernetes.io/projected/afaef099-a861-4606-97c5-485da57c818f-kube-api-access-bh54r") pod "network-check-target-85nzt" (UID: "afaef099-a861-4606-97c5-485da57c818f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:38:34.850453 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.850417 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-724bk\" (UniqueName: \"kubernetes.io/projected/7587d0da-cb0b-407d-b7b1-2f580492c29e-kube-api-access-724bk\") pod \"aws-ebs-csi-driver-node-5n7s6\" (UID: \"7587d0da-cb0b-407d-b7b1-2f580492c29e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6"
Apr 24 16:38:34.850897 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.850874 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtfrv\" (UniqueName: \"kubernetes.io/projected/c2b71d82-71f3-4cf6-95a9-73b3d509e492-kube-api-access-jtfrv\") pod \"ovnkube-node-jt59q\" (UID: \"c2b71d82-71f3-4cf6-95a9-73b3d509e492\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt59q"
Apr 24 16:38:34.851131 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.851113 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhxk5\" (UniqueName: \"kubernetes.io/projected/10ab450a-933f-4b41-8316-09109770ac99-kube-api-access-mhxk5\") pod \"network-metrics-daemon-f9bsr\" (UID: \"10ab450a-933f-4b41-8316-09109770ac99\") " pod="openshift-multus/network-metrics-daemon-f9bsr"
Apr 24 16:38:34.940146 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940104 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/050c36c1-b0b2-434a-91f9-c53a02f67059-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5hcrw\" (UID: \"050c36c1-b0b2-434a-91f9-c53a02f67059\") " pod="openshift-multus/multus-additional-cni-plugins-5hcrw"
Apr 24 16:38:34.940335 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940158 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-etc-sysctl-d\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm"
Apr 24 16:38:34.940335 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940182 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-system-cni-dir\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.940335 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940204 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-host-run-multus-certs\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.940335 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940226 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-lib-modules\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm"
Apr 24 16:38:34.940335 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940246 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/050c36c1-b0b2-434a-91f9-c53a02f67059-cnibin\") pod \"multus-additional-cni-plugins-5hcrw\" (UID: \"050c36c1-b0b2-434a-91f9-c53a02f67059\") " pod="openshift-multus/multus-additional-cni-plugins-5hcrw"
Apr 24 16:38:34.940335 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940299 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/050c36c1-b0b2-434a-91f9-c53a02f67059-cnibin\") pod \"multus-additional-cni-plugins-5hcrw\" (UID: \"050c36c1-b0b2-434a-91f9-c53a02f67059\") " pod="openshift-multus/multus-additional-cni-plugins-5hcrw"
Apr 24 16:38:34.940627 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940323 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-host-run-multus-certs\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.940627 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940343 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-system-cni-dir\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.940627 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940354 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-multus-cni-dir\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.940627 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940384 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-lib-modules\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm"
Apr 24 16:38:34.940627 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940402 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-cnibin\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.940627 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940409 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-multus-cni-dir\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.940627 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940442 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0306168c-6c00-4a89-9e2b-fff3d030b0e2-cni-binary-copy\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.940627 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940462 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-cnibin\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.940627 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940446 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-etc-sysctl-d\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm"
Apr 24 16:38:34.940627 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940475 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-etc-sysconfig\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm"
Apr 24 16:38:34.940627 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940500 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0306168c-6c00-4a89-9e2b-fff3d030b0e2-multus-daemon-config\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.940627 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940529 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/52f93581-75a0-4ae1-92b6-3ce3e189cd48-agent-certs\") pod \"konnectivity-agent-phgx4\" (UID: \"52f93581-75a0-4ae1-92b6-3ce3e189cd48\") " pod="kube-system/konnectivity-agent-phgx4"
Apr 24 16:38:34.940627 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940558 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34943398-acf4-440b-900d-999cb567a483-host\") pod \"node-ca-psxtk\" (UID: \"34943398-acf4-440b-900d-999cb567a483\") " pod="openshift-image-registry/node-ca-psxtk"
Apr 24 16:38:34.940627 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940561 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-etc-sysconfig\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm"
Apr 24 16:38:34.940627 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940586 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-os-release\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.940627 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940610 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-multus-conf-dir\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.940627 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940614 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34943398-acf4-440b-900d-999cb567a483-host\") pod \"node-ca-psxtk\" (UID: \"34943398-acf4-440b-900d-999cb567a483\") " pod="openshift-image-registry/node-ca-psxtk"
Apr 24 16:38:34.940627 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940635 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/050c36c1-b0b2-434a-91f9-c53a02f67059-system-cni-dir\") pod \"multus-additional-cni-plugins-5hcrw\" (UID: \"050c36c1-b0b2-434a-91f9-c53a02f67059\") " pod="openshift-multus/multus-additional-cni-plugins-5hcrw"
Apr 24 16:38:34.941462 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940674 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-host-var-lib-cni-bin\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.941462 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940698 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-etc-kubernetes\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm"
Apr 24 16:38:34.941462 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940712 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-multus-conf-dir\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.941462 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940723 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-multus-socket-dir-parent\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.941462 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940751 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-host-run-k8s-cni-cncf-io\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.941462 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940759 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-host-var-lib-cni-bin\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.941462 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940767 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/050c36c1-b0b2-434a-91f9-c53a02f67059-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-5hcrw\" (UID: \"050c36c1-b0b2-434a-91f9-c53a02f67059\") " pod="openshift-multus/multus-additional-cni-plugins-5hcrw"
Apr 24 16:38:34.941462 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940779 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlzvd\" (UniqueName: \"kubernetes.io/projected/050c36c1-b0b2-434a-91f9-c53a02f67059-kube-api-access-tlzvd\") pod \"multus-additional-cni-plugins-5hcrw\" (UID: \"050c36c1-b0b2-434a-91f9-c53a02f67059\") " pod="openshift-multus/multus-additional-cni-plugins-5hcrw"
Apr 24 16:38:34.941462 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940805 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-tmp\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm"
Apr 24 16:38:34.941462 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940830 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62b72\" (UniqueName: \"kubernetes.io/projected/0306168c-6c00-4a89-9e2b-fff3d030b0e2-kube-api-access-62b72\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.941462 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940841 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-host-run-k8s-cni-cncf-io\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.941462 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940850 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-multus-socket-dir-parent\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.941462 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940858 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-host-var-lib-kubelet\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.941462 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940676 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-os-release\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.941462 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940884 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/050c36c1-b0b2-434a-91f9-c53a02f67059-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5hcrw\" (UID: \"050c36c1-b0b2-434a-91f9-c53a02f67059\") " pod="openshift-multus/multus-additional-cni-plugins-5hcrw"
Apr 24 16:38:34.941462 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940910 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/52f93581-75a0-4ae1-92b6-3ce3e189cd48-konnectivity-ca\") pod \"konnectivity-agent-phgx4\" (UID: \"52f93581-75a0-4ae1-92b6-3ce3e189cd48\") " pod="kube-system/konnectivity-agent-phgx4"
Apr 24 16:38:34.941462 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940913 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-etc-kubernetes\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm"
Apr 24 16:38:34.942247 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940937 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-etc-systemd\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm"
Apr 24 16:38:34.942247 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940961 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-sys\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm"
Apr 24 16:38:34.942247 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940994 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-hostroot\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.942247 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.940804 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/050c36c1-b0b2-434a-91f9-c53a02f67059-system-cni-dir\") pod \"multus-additional-cni-plugins-5hcrw\" (UID: \"050c36c1-b0b2-434a-91f9-c53a02f67059\") " pod="openshift-multus/multus-additional-cni-plugins-5hcrw"
Apr 24 16:38:34.942247 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941059 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-run\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm"
Apr 24 16:38:34.942247 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941139 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-run\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm"
Apr 24 16:38:34.942247 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941188 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-host-var-lib-kubelet\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.942247 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941027 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0306168c-6c00-4a89-9e2b-fff3d030b0e2-cni-binary-copy\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.942247 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941245 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-var-lib-kubelet\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm"
Apr 24 16:38:34.942247 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941261 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-hostroot\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v"
Apr 24 16:38:34.942247 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941272 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-etc-systemd\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm"
Apr 24 16:38:34.942247 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7fd4e710-ea9f-4927-b943-ca92fb5629da-iptables-alerter-script\") pod \"iptables-alerter-f5ds8\" (UID: \"7fd4e710-ea9f-4927-b943-ca92fb5629da\") " pod="openshift-network-operator/iptables-alerter-f5ds8"
Apr 24 16:38:34.942247 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941306 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/34943398-acf4-440b-900d-999cb567a483-serviceca\") pod \"node-ca-psxtk\" (UID: \"34943398-acf4-440b-900d-999cb567a483\") " pod="openshift-image-registry/node-ca-psxtk"
Apr 24 16:38:34.942247 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941324 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-var-lib-kubelet\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm"
Apr 24 16:38:34.942247 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941331 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qp7lv\" (UniqueName: \"kubernetes.io/projected/34943398-acf4-440b-900d-999cb567a483-kube-api-access-qp7lv\") pod \"node-ca-psxtk\" (UID: \"34943398-acf4-440b-900d-999cb567a483\") " pod="openshift-image-registry/node-ca-psxtk"
Apr 24 16:38:34.942247 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941365 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/050c36c1-b0b2-434a-91f9-c53a02f67059-os-release\") pod \"multus-additional-cni-plugins-5hcrw\" (UID: \"050c36c1-b0b2-434a-91f9-c53a02f67059\") " pod="openshift-multus/multus-additional-cni-plugins-5hcrw"
Apr 24 16:38:34.942247 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941392 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/050c36c1-b0b2-434a-91f9-c53a02f67059-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5hcrw\" (UID: \"050c36c1-b0b2-434a-91f9-c53a02f67059\") " pod="openshift-multus/multus-additional-cni-plugins-5hcrw"
Apr 24 16:38:34.942247 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941413 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/050c36c1-b0b2-434a-91f9-c53a02f67059-tuning-conf-dir\") pod \"multus-additional-cni-plugins-5hcrw\" (UID: \"050c36c1-b0b2-434a-91f9-c53a02f67059\") " pod="openshift-multus/multus-additional-cni-plugins-5hcrw"
Apr 24 16:38:34.943138 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941450 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7fd4e710-ea9f-4927-b943-ca92fb5629da-host-slash\") pod \"iptables-alerter-f5ds8\" (UID: \"7fd4e710-ea9f-4927-b943-ca92fb5629da\") " pod="openshift-network-operator/iptables-alerter-f5ds8"
Apr 24 16:38:34.943138 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941477 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-etc-modprobe-d\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm"
Apr 24 16:38:34.943138 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941504 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-etc-sysctl-conf\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm"
Apr 24 16:38:34.943138 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941526 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-etc-tuned\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm"
Apr 24 16:38:34.943138 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941324 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-sys\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm"
Apr 24 16:38:34.943138 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941553 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dg8h\" (UniqueName: \"kubernetes.io/projected/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-kube-api-access-7dg8h\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm"
Apr 24 16:38:34.943138 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941581 2575
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4gmcn\" (UniqueName: \"kubernetes.io/projected/7fd4e710-ea9f-4927-b943-ca92fb5629da-kube-api-access-4gmcn\") pod \"iptables-alerter-f5ds8\" (UID: \"7fd4e710-ea9f-4927-b943-ca92fb5629da\") " pod="openshift-network-operator/iptables-alerter-f5ds8" Apr 24 16:38:34.943138 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941609 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/050c36c1-b0b2-434a-91f9-c53a02f67059-os-release\") pod \"multus-additional-cni-plugins-5hcrw\" (UID: \"050c36c1-b0b2-434a-91f9-c53a02f67059\") " pod="openshift-multus/multus-additional-cni-plugins-5hcrw" Apr 24 16:38:34.943138 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941626 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-host\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm" Apr 24 16:38:34.943138 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941650 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-host-run-netns\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v" Apr 24 16:38:34.943138 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941674 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-host-var-lib-cni-multus\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v" Apr 24 16:38:34.943138 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941699 
2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-etc-kubernetes\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v" Apr 24 16:38:34.943138 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941706 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/52f93581-75a0-4ae1-92b6-3ce3e189cd48-konnectivity-ca\") pod \"konnectivity-agent-phgx4\" (UID: \"52f93581-75a0-4ae1-92b6-3ce3e189cd48\") " pod="kube-system/konnectivity-agent-phgx4" Apr 24 16:38:34.943138 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941727 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/050c36c1-b0b2-434a-91f9-c53a02f67059-cni-binary-copy\") pod \"multus-additional-cni-plugins-5hcrw\" (UID: \"050c36c1-b0b2-434a-91f9-c53a02f67059\") " pod="openshift-multus/multus-additional-cni-plugins-5hcrw" Apr 24 16:38:34.943138 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941742 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-etc-sysctl-conf\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm" Apr 24 16:38:34.943138 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941747 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/34943398-acf4-440b-900d-999cb567a483-serviceca\") pod \"node-ca-psxtk\" (UID: \"34943398-acf4-440b-900d-999cb567a483\") " pod="openshift-image-registry/node-ca-psxtk" Apr 24 16:38:34.943138 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941772 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-host\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm" Apr 24 16:38:34.943138 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941800 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-host-run-netns\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v" Apr 24 16:38:34.943956 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941866 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/050c36c1-b0b2-434a-91f9-c53a02f67059-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-5hcrw\" (UID: \"050c36c1-b0b2-434a-91f9-c53a02f67059\") " pod="openshift-multus/multus-additional-cni-plugins-5hcrw" Apr 24 16:38:34.943956 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941898 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-etc-modprobe-d\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm" Apr 24 16:38:34.943956 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941919 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-host-var-lib-cni-multus\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v" Apr 24 16:38:34.943956 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941943 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/7fd4e710-ea9f-4927-b943-ca92fb5629da-iptables-alerter-script\") pod \"iptables-alerter-f5ds8\" (UID: \"7fd4e710-ea9f-4927-b943-ca92fb5629da\") " pod="openshift-network-operator/iptables-alerter-f5ds8" Apr 24 16:38:34.943956 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.941975 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7fd4e710-ea9f-4927-b943-ca92fb5629da-host-slash\") pod \"iptables-alerter-f5ds8\" (UID: \"7fd4e710-ea9f-4927-b943-ca92fb5629da\") " pod="openshift-network-operator/iptables-alerter-f5ds8" Apr 24 16:38:34.943956 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.942103 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0306168c-6c00-4a89-9e2b-fff3d030b0e2-etc-kubernetes\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v" Apr 24 16:38:34.943956 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.942261 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/050c36c1-b0b2-434a-91f9-c53a02f67059-cni-binary-copy\") pod \"multus-additional-cni-plugins-5hcrw\" (UID: \"050c36c1-b0b2-434a-91f9-c53a02f67059\") " pod="openshift-multus/multus-additional-cni-plugins-5hcrw" Apr 24 16:38:34.943956 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.942827 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0306168c-6c00-4a89-9e2b-fff3d030b0e2-multus-daemon-config\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v" Apr 24 16:38:34.943956 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.943910 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-tmp\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm" Apr 24 16:38:34.944396 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.944235 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-etc-tuned\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm" Apr 24 16:38:34.944396 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.944310 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/52f93581-75a0-4ae1-92b6-3ce3e189cd48-agent-certs\") pod \"konnectivity-agent-phgx4\" (UID: \"52f93581-75a0-4ae1-92b6-3ce3e189cd48\") " pod="kube-system/konnectivity-agent-phgx4" Apr 24 16:38:34.954227 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.954189 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gmcn\" (UniqueName: \"kubernetes.io/projected/7fd4e710-ea9f-4927-b943-ca92fb5629da-kube-api-access-4gmcn\") pod \"iptables-alerter-f5ds8\" (UID: \"7fd4e710-ea9f-4927-b943-ca92fb5629da\") " pod="openshift-network-operator/iptables-alerter-f5ds8" Apr 24 16:38:34.954386 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.954187 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62b72\" (UniqueName: \"kubernetes.io/projected/0306168c-6c00-4a89-9e2b-fff3d030b0e2-kube-api-access-62b72\") pod \"multus-9hv6v\" (UID: \"0306168c-6c00-4a89-9e2b-fff3d030b0e2\") " pod="openshift-multus/multus-9hv6v" Apr 24 16:38:34.954386 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.954185 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7dg8h\" (UniqueName: \"kubernetes.io/projected/59d1c6f6-5f2f-4c93-b15d-47374722a0fe-kube-api-access-7dg8h\") pod \"tuned-8sqwm\" (UID: \"59d1c6f6-5f2f-4c93-b15d-47374722a0fe\") " pod="openshift-cluster-node-tuning-operator/tuned-8sqwm" Apr 24 16:38:34.954488 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.954368 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlzvd\" (UniqueName: \"kubernetes.io/projected/050c36c1-b0b2-434a-91f9-c53a02f67059-kube-api-access-tlzvd\") pod \"multus-additional-cni-plugins-5hcrw\" (UID: \"050c36c1-b0b2-434a-91f9-c53a02f67059\") " pod="openshift-multus/multus-additional-cni-plugins-5hcrw" Apr 24 16:38:34.954488 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:34.954466 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp7lv\" (UniqueName: \"kubernetes.io/projected/34943398-acf4-440b-900d-999cb567a483-kube-api-access-qp7lv\") pod \"node-ca-psxtk\" (UID: \"34943398-acf4-440b-900d-999cb567a483\") " pod="openshift-image-registry/node-ca-psxtk" Apr 24 16:38:35.039699 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:35.039642 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:35.047818 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:35.047609 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" Apr 24 16:38:35.056685 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:35.056648 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8sqwm" Apr 24 16:38:35.063434 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:35.063400 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-psxtk" Apr 24 16:38:35.073280 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:35.073245 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9hv6v" Apr 24 16:38:35.080990 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:35.080961 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-f5ds8" Apr 24 16:38:35.089879 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:35.089780 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-phgx4" Apr 24 16:38:35.096634 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:35.096599 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-5hcrw" Apr 24 16:38:35.345072 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:35.344974 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs\") pod \"network-metrics-daemon-f9bsr\" (UID: \"10ab450a-933f-4b41-8316-09109770ac99\") " pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:38:35.345259 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:35.345160 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:35.345259 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:35.345228 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs podName:10ab450a-933f-4b41-8316-09109770ac99 nodeName:}" failed. No retries permitted until 2026-04-24 16:38:36.345210934 +0000 UTC m=+4.059278948 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs") pod "network-metrics-daemon-f9bsr" (UID: "10ab450a-933f-4b41-8316-09109770ac99") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:35.446170 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:35.446125 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bh54r\" (UniqueName: \"kubernetes.io/projected/afaef099-a861-4606-97c5-485da57c818f-kube-api-access-bh54r\") pod \"network-check-target-85nzt\" (UID: \"afaef099-a861-4606-97c5-485da57c818f\") " pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:38:35.446345 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:35.446283 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:38:35.446345 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:35.446306 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:38:35.446345 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:35.446319 2575 projected.go:194] Error preparing data for projected volume kube-api-access-bh54r for pod openshift-network-diagnostics/network-check-target-85nzt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:35.446476 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:35.446405 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/afaef099-a861-4606-97c5-485da57c818f-kube-api-access-bh54r podName:afaef099-a861-4606-97c5-485da57c818f nodeName:}" failed. 
No retries permitted until 2026-04-24 16:38:36.446381952 +0000 UTC m=+4.160449981 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-bh54r" (UniqueName: "kubernetes.io/projected/afaef099-a861-4606-97c5-485da57c818f-kube-api-access-bh54r") pod "network-check-target-85nzt" (UID: "afaef099-a861-4606-97c5-485da57c818f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:35.645059 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:35.645023 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0306168c_6c00_4a89_9e2b_fff3d030b0e2.slice/crio-2b39ca672b50c4cb840ffb4e79ac4c9947542653f202b273c5ff54053d03bb40 WatchSource:0}: Error finding container 2b39ca672b50c4cb840ffb4e79ac4c9947542653f202b273c5ff54053d03bb40: Status 404 returned error can't find the container with id 2b39ca672b50c4cb840ffb4e79ac4c9947542653f202b273c5ff54053d03bb40 Apr 24 16:38:35.646652 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:35.646623 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod050c36c1_b0b2_434a_91f9_c53a02f67059.slice/crio-0d6bf9a0d75861623c97e3e969eca513cd0db53899748a82cbb593a70a87dc31 WatchSource:0}: Error finding container 0d6bf9a0d75861623c97e3e969eca513cd0db53899748a82cbb593a70a87dc31: Status 404 returned error can't find the container with id 0d6bf9a0d75861623c97e3e969eca513cd0db53899748a82cbb593a70a87dc31 Apr 24 16:38:35.649678 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:35.649650 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34943398_acf4_440b_900d_999cb567a483.slice/crio-6f12a8d853be587a74fa28e4353d4e953a86e8f111cfe5313a95146b23818896 WatchSource:0}: Error finding container 
6f12a8d853be587a74fa28e4353d4e953a86e8f111cfe5313a95146b23818896: Status 404 returned error can't find the container with id 6f12a8d853be587a74fa28e4353d4e953a86e8f111cfe5313a95146b23818896 Apr 24 16:38:35.650720 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:35.650692 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2b71d82_71f3_4cf6_95a9_73b3d509e492.slice/crio-62b2c5f28189e7eac00c3d93b113a7e0eb8129c0f13d489aa127173e7073db4e WatchSource:0}: Error finding container 62b2c5f28189e7eac00c3d93b113a7e0eb8129c0f13d489aa127173e7073db4e: Status 404 returned error can't find the container with id 62b2c5f28189e7eac00c3d93b113a7e0eb8129c0f13d489aa127173e7073db4e Apr 24 16:38:35.680976 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:35.680928 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7587d0da_cb0b_407d_b7b1_2f580492c29e.slice/crio-b9d5561ba62512cff695d26aff525a7f0fc3cc6d25fb5de36d99945b45e72bf6 WatchSource:0}: Error finding container b9d5561ba62512cff695d26aff525a7f0fc3cc6d25fb5de36d99945b45e72bf6: Status 404 returned error can't find the container with id b9d5561ba62512cff695d26aff525a7f0fc3cc6d25fb5de36d99945b45e72bf6 Apr 24 16:38:35.682232 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:35.682175 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59d1c6f6_5f2f_4c93_b15d_47374722a0fe.slice/crio-1ff1162407a819657f887cfa7613177605f96311d7af503e4a266aef922c662d WatchSource:0}: Error finding container 1ff1162407a819657f887cfa7613177605f96311d7af503e4a266aef922c662d: Status 404 returned error can't find the container with id 1ff1162407a819657f887cfa7613177605f96311d7af503e4a266aef922c662d Apr 24 16:38:35.683759 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:35.683657 2575 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52f93581_75a0_4ae1_92b6_3ce3e189cd48.slice/crio-efde6fb5dfce8d5fe8741a94e2d4f2d1ba9b9395313be6281bad70e608bc0021 WatchSource:0}: Error finding container efde6fb5dfce8d5fe8741a94e2d4f2d1ba9b9395313be6281bad70e608bc0021: Status 404 returned error can't find the container with id efde6fb5dfce8d5fe8741a94e2d4f2d1ba9b9395313be6281bad70e608bc0021 Apr 24 16:38:35.684462 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:35.684442 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fd4e710_ea9f_4927_b943_ca92fb5629da.slice/crio-31c88adfcdc5f9f0183184147cf863b9495c6ca93a79aa8348d6e2facd993bee WatchSource:0}: Error finding container 31c88adfcdc5f9f0183184147cf863b9495c6ca93a79aa8348d6e2facd993bee: Status 404 returned error can't find the container with id 31c88adfcdc5f9f0183184147cf863b9495c6ca93a79aa8348d6e2facd993bee Apr 24 16:38:35.764352 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:35.764309 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 16:33:33 +0000 UTC" deadline="2027-10-14 11:21:01.526647332 +0000 UTC" Apr 24 16:38:35.764352 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:35.764346 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12906h42m25.762304611s" Apr 24 16:38:35.827376 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:35.827347 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:38:35.827517 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:35.827458 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f9bsr" podUID="10ab450a-933f-4b41-8316-09109770ac99" Apr 24 16:38:35.835226 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:35.835190 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-phgx4" event={"ID":"52f93581-75a0-4ae1-92b6-3ce3e189cd48","Type":"ContainerStarted","Data":"efde6fb5dfce8d5fe8741a94e2d4f2d1ba9b9395313be6281bad70e608bc0021"} Apr 24 16:38:35.836043 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:35.836013 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8sqwm" event={"ID":"59d1c6f6-5f2f-4c93-b15d-47374722a0fe","Type":"ContainerStarted","Data":"1ff1162407a819657f887cfa7613177605f96311d7af503e4a266aef922c662d"} Apr 24 16:38:35.836891 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:35.836868 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" event={"ID":"7587d0da-cb0b-407d-b7b1-2f580492c29e","Type":"ContainerStarted","Data":"b9d5561ba62512cff695d26aff525a7f0fc3cc6d25fb5de36d99945b45e72bf6"} Apr 24 16:38:35.837773 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:35.837750 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" event={"ID":"c2b71d82-71f3-4cf6-95a9-73b3d509e492","Type":"ContainerStarted","Data":"62b2c5f28189e7eac00c3d93b113a7e0eb8129c0f13d489aa127173e7073db4e"} Apr 24 16:38:35.838777 ip-10-0-131-37 kubenswrapper[2575]: I0424 
16:38:35.838755 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-psxtk" event={"ID":"34943398-acf4-440b-900d-999cb567a483","Type":"ContainerStarted","Data":"6f12a8d853be587a74fa28e4353d4e953a86e8f111cfe5313a95146b23818896"} Apr 24 16:38:35.839684 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:35.839666 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5hcrw" event={"ID":"050c36c1-b0b2-434a-91f9-c53a02f67059","Type":"ContainerStarted","Data":"0d6bf9a0d75861623c97e3e969eca513cd0db53899748a82cbb593a70a87dc31"} Apr 24 16:38:35.840557 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:35.840534 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9hv6v" event={"ID":"0306168c-6c00-4a89-9e2b-fff3d030b0e2","Type":"ContainerStarted","Data":"2b39ca672b50c4cb840ffb4e79ac4c9947542653f202b273c5ff54053d03bb40"} Apr 24 16:38:35.842025 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:35.842004 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-37.ec2.internal" event={"ID":"dcc90d88bf75fe88b8ea0db46e250029","Type":"ContainerStarted","Data":"d417801bb54ecb8490426d9adcbb9f3c5741cc04d1ddf02aee42a302ea545849"} Apr 24 16:38:35.842965 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:35.842946 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-f5ds8" event={"ID":"7fd4e710-ea9f-4927-b943-ca92fb5629da","Type":"ContainerStarted","Data":"31c88adfcdc5f9f0183184147cf863b9495c6ca93a79aa8348d6e2facd993bee"} Apr 24 16:38:35.855712 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:35.855653 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-37.ec2.internal" podStartSLOduration=1.855634942 podStartE2EDuration="1.855634942s" podCreationTimestamp="2026-04-24 16:38:34 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:38:35.854430689 +0000 UTC m=+3.568498727" watchObservedRunningTime="2026-04-24 16:38:35.855634942 +0000 UTC m=+3.569702982" Apr 24 16:38:36.353724 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:36.353677 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs\") pod \"network-metrics-daemon-f9bsr\" (UID: \"10ab450a-933f-4b41-8316-09109770ac99\") " pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:38:36.353907 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:36.353845 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:36.353967 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:36.353909 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs podName:10ab450a-933f-4b41-8316-09109770ac99 nodeName:}" failed. No retries permitted until 2026-04-24 16:38:38.353888763 +0000 UTC m=+6.067956784 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs") pod "network-metrics-daemon-f9bsr" (UID: "10ab450a-933f-4b41-8316-09109770ac99") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:36.454894 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:36.454212 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bh54r\" (UniqueName: \"kubernetes.io/projected/afaef099-a861-4606-97c5-485da57c818f-kube-api-access-bh54r\") pod \"network-check-target-85nzt\" (UID: \"afaef099-a861-4606-97c5-485da57c818f\") " pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:38:36.454894 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:36.454385 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:38:36.454894 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:36.454409 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:38:36.454894 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:36.454424 2575 projected.go:194] Error preparing data for projected volume kube-api-access-bh54r for pod openshift-network-diagnostics/network-check-target-85nzt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:36.454894 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:36.454488 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/afaef099-a861-4606-97c5-485da57c818f-kube-api-access-bh54r podName:afaef099-a861-4606-97c5-485da57c818f nodeName:}" failed. 
No retries permitted until 2026-04-24 16:38:38.454469973 +0000 UTC m=+6.168538003 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-bh54r" (UniqueName: "kubernetes.io/projected/afaef099-a861-4606-97c5-485da57c818f-kube-api-access-bh54r") pod "network-check-target-85nzt" (UID: "afaef099-a861-4606-97c5-485da57c818f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:36.829357 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:36.829306 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:38:36.829888 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:36.829437 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-85nzt" podUID="afaef099-a861-4606-97c5-485da57c818f" Apr 24 16:38:36.853696 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:36.853654 2575 generic.go:358] "Generic (PLEG): container finished" podID="d832024cbf852529dc33765f192190f2" containerID="35a381f34fe07fcc4bb93663824809658ed2b0de3b3eef52b7f03f57ecce4320" exitCode=0 Apr 24 16:38:36.854362 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:36.854273 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-37.ec2.internal" event={"ID":"d832024cbf852529dc33765f192190f2","Type":"ContainerDied","Data":"35a381f34fe07fcc4bb93663824809658ed2b0de3b3eef52b7f03f57ecce4320"} Apr 24 16:38:37.828183 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:37.827649 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:38:37.828183 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:37.827795 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f9bsr" podUID="10ab450a-933f-4b41-8316-09109770ac99" Apr 24 16:38:37.865453 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:37.864867 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-37.ec2.internal" event={"ID":"d832024cbf852529dc33765f192190f2","Type":"ContainerStarted","Data":"7eedfe93b94332c6f5ae8041c010f833d49517f8e506189603ffed0785cf1fef"} Apr 24 16:38:37.998748 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:37.998692 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-37.ec2.internal" podStartSLOduration=3.998669468 podStartE2EDuration="3.998669468s" podCreationTimestamp="2026-04-24 16:38:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:38:37.888210388 +0000 UTC m=+5.602278424" watchObservedRunningTime="2026-04-24 16:38:37.998669468 +0000 UTC m=+5.712737505" Apr 24 16:38:37.999592 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:37.999566 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-7hmxs"] Apr 24 16:38:38.002615 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:38.002582 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-7hmxs" Apr 24 16:38:38.006686 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:38.006663 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-mpw6l\"" Apr 24 16:38:38.006954 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:38.006938 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 16:38:38.007045 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:38.006669 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 16:38:38.068607 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:38.068565 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/06388f4e-daeb-4db0-906e-01adfa3e3b97-hosts-file\") pod \"node-resolver-7hmxs\" (UID: \"06388f4e-daeb-4db0-906e-01adfa3e3b97\") " pod="openshift-dns/node-resolver-7hmxs" Apr 24 16:38:38.068800 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:38.068627 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mrpf\" (UniqueName: \"kubernetes.io/projected/06388f4e-daeb-4db0-906e-01adfa3e3b97-kube-api-access-2mrpf\") pod \"node-resolver-7hmxs\" (UID: \"06388f4e-daeb-4db0-906e-01adfa3e3b97\") " pod="openshift-dns/node-resolver-7hmxs" Apr 24 16:38:38.068800 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:38.068694 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/06388f4e-daeb-4db0-906e-01adfa3e3b97-tmp-dir\") pod \"node-resolver-7hmxs\" (UID: \"06388f4e-daeb-4db0-906e-01adfa3e3b97\") " pod="openshift-dns/node-resolver-7hmxs" Apr 24 16:38:38.169518 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:38.169412 
2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/06388f4e-daeb-4db0-906e-01adfa3e3b97-tmp-dir\") pod \"node-resolver-7hmxs\" (UID: \"06388f4e-daeb-4db0-906e-01adfa3e3b97\") " pod="openshift-dns/node-resolver-7hmxs" Apr 24 16:38:38.169518 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:38.169511 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/06388f4e-daeb-4db0-906e-01adfa3e3b97-hosts-file\") pod \"node-resolver-7hmxs\" (UID: \"06388f4e-daeb-4db0-906e-01adfa3e3b97\") " pod="openshift-dns/node-resolver-7hmxs" Apr 24 16:38:38.169745 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:38.169549 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mrpf\" (UniqueName: \"kubernetes.io/projected/06388f4e-daeb-4db0-906e-01adfa3e3b97-kube-api-access-2mrpf\") pod \"node-resolver-7hmxs\" (UID: \"06388f4e-daeb-4db0-906e-01adfa3e3b97\") " pod="openshift-dns/node-resolver-7hmxs" Apr 24 16:38:38.170261 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:38.170237 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/06388f4e-daeb-4db0-906e-01adfa3e3b97-tmp-dir\") pod \"node-resolver-7hmxs\" (UID: \"06388f4e-daeb-4db0-906e-01adfa3e3b97\") " pod="openshift-dns/node-resolver-7hmxs" Apr 24 16:38:38.170388 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:38.170331 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/06388f4e-daeb-4db0-906e-01adfa3e3b97-hosts-file\") pod \"node-resolver-7hmxs\" (UID: \"06388f4e-daeb-4db0-906e-01adfa3e3b97\") " pod="openshift-dns/node-resolver-7hmxs" Apr 24 16:38:38.187908 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:38.187386 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-2mrpf\" (UniqueName: \"kubernetes.io/projected/06388f4e-daeb-4db0-906e-01adfa3e3b97-kube-api-access-2mrpf\") pod \"node-resolver-7hmxs\" (UID: \"06388f4e-daeb-4db0-906e-01adfa3e3b97\") " pod="openshift-dns/node-resolver-7hmxs" Apr 24 16:38:38.318130 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:38.317803 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7hmxs" Apr 24 16:38:38.371140 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:38.371016 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs\") pod \"network-metrics-daemon-f9bsr\" (UID: \"10ab450a-933f-4b41-8316-09109770ac99\") " pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:38:38.371306 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:38.371229 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:38.371306 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:38.371306 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs podName:10ab450a-933f-4b41-8316-09109770ac99 nodeName:}" failed. No retries permitted until 2026-04-24 16:38:42.371285116 +0000 UTC m=+10.085353133 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs") pod "network-metrics-daemon-f9bsr" (UID: "10ab450a-933f-4b41-8316-09109770ac99") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:38.472251 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:38.471554 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bh54r\" (UniqueName: \"kubernetes.io/projected/afaef099-a861-4606-97c5-485da57c818f-kube-api-access-bh54r\") pod \"network-check-target-85nzt\" (UID: \"afaef099-a861-4606-97c5-485da57c818f\") " pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:38:38.472251 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:38.471764 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:38:38.472251 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:38.471785 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:38:38.472251 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:38.471797 2575 projected.go:194] Error preparing data for projected volume kube-api-access-bh54r for pod openshift-network-diagnostics/network-check-target-85nzt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:38.472251 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:38.471864 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/afaef099-a861-4606-97c5-485da57c818f-kube-api-access-bh54r podName:afaef099-a861-4606-97c5-485da57c818f nodeName:}" failed. 
No retries permitted until 2026-04-24 16:38:42.471843824 +0000 UTC m=+10.185911854 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-bh54r" (UniqueName: "kubernetes.io/projected/afaef099-a861-4606-97c5-485da57c818f-kube-api-access-bh54r") pod "network-check-target-85nzt" (UID: "afaef099-a861-4606-97c5-485da57c818f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:38.827733 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:38.827650 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:38:38.827902 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:38.827788 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-85nzt" podUID="afaef099-a861-4606-97c5-485da57c818f" Apr 24 16:38:39.827078 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:39.827042 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:38:39.827556 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:39.827204 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f9bsr" podUID="10ab450a-933f-4b41-8316-09109770ac99" Apr 24 16:38:40.827993 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:40.827611 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:38:40.827993 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:40.827756 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-85nzt" podUID="afaef099-a861-4606-97c5-485da57c818f" Apr 24 16:38:41.827488 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:41.827449 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:38:41.827683 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:41.827603 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f9bsr" podUID="10ab450a-933f-4b41-8316-09109770ac99" Apr 24 16:38:42.408703 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:42.408030 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs\") pod \"network-metrics-daemon-f9bsr\" (UID: \"10ab450a-933f-4b41-8316-09109770ac99\") " pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:38:42.408703 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:42.408220 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:42.408703 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:42.408345 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs podName:10ab450a-933f-4b41-8316-09109770ac99 nodeName:}" failed. No retries permitted until 2026-04-24 16:38:50.408298643 +0000 UTC m=+18.122366660 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs") pod "network-metrics-daemon-f9bsr" (UID: "10ab450a-933f-4b41-8316-09109770ac99") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:42.509789 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:42.509123 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bh54r\" (UniqueName: \"kubernetes.io/projected/afaef099-a861-4606-97c5-485da57c818f-kube-api-access-bh54r\") pod \"network-check-target-85nzt\" (UID: \"afaef099-a861-4606-97c5-485da57c818f\") " pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:38:42.509789 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:42.509332 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:38:42.509789 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:42.509349 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:38:42.509789 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:42.509364 2575 projected.go:194] Error preparing data for projected volume kube-api-access-bh54r for pod openshift-network-diagnostics/network-check-target-85nzt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:42.509789 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:42.509429 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/afaef099-a861-4606-97c5-485da57c818f-kube-api-access-bh54r podName:afaef099-a861-4606-97c5-485da57c818f nodeName:}" failed. 
No retries permitted until 2026-04-24 16:38:50.509410465 +0000 UTC m=+18.223478481 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-bh54r" (UniqueName: "kubernetes.io/projected/afaef099-a861-4606-97c5-485da57c818f-kube-api-access-bh54r") pod "network-check-target-85nzt" (UID: "afaef099-a861-4606-97c5-485da57c818f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:42.828854 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:42.828313 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:38:42.828854 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:42.828430 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-85nzt" podUID="afaef099-a861-4606-97c5-485da57c818f" Apr 24 16:38:43.826829 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:43.826775 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:38:43.827324 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:43.826924 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f9bsr" podUID="10ab450a-933f-4b41-8316-09109770ac99" Apr 24 16:38:44.827913 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:44.827828 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:38:44.828370 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:44.827968 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-85nzt" podUID="afaef099-a861-4606-97c5-485da57c818f" Apr 24 16:38:45.826967 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:45.826930 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:38:45.827179 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:45.827070 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f9bsr" podUID="10ab450a-933f-4b41-8316-09109770ac99" Apr 24 16:38:46.827239 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:46.827199 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:38:46.827701 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:46.827341 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-85nzt" podUID="afaef099-a861-4606-97c5-485da57c818f" Apr 24 16:38:47.827761 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:47.827726 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:38:47.828189 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:47.827856 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f9bsr" podUID="10ab450a-933f-4b41-8316-09109770ac99" Apr 24 16:38:48.827024 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:48.826982 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:38:48.827221 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:48.827142 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-85nzt" podUID="afaef099-a861-4606-97c5-485da57c818f" Apr 24 16:38:49.827338 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:49.827296 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:38:49.827768 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:49.827465 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f9bsr" podUID="10ab450a-933f-4b41-8316-09109770ac99" Apr 24 16:38:50.466161 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:50.466125 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs\") pod \"network-metrics-daemon-f9bsr\" (UID: \"10ab450a-933f-4b41-8316-09109770ac99\") " pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:38:50.466348 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:50.466316 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:50.466410 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:50.466398 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs podName:10ab450a-933f-4b41-8316-09109770ac99 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:06.466376094 +0000 UTC m=+34.180444112 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs") pod "network-metrics-daemon-f9bsr" (UID: "10ab450a-933f-4b41-8316-09109770ac99") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:38:50.566532 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:50.566486 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bh54r\" (UniqueName: \"kubernetes.io/projected/afaef099-a861-4606-97c5-485da57c818f-kube-api-access-bh54r\") pod \"network-check-target-85nzt\" (UID: \"afaef099-a861-4606-97c5-485da57c818f\") " pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:38:50.566722 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:50.566641 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:38:50.566722 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:50.566664 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:38:50.566722 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:50.566677 2575 projected.go:194] Error preparing data for projected volume kube-api-access-bh54r for pod openshift-network-diagnostics/network-check-target-85nzt: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:50.566887 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:50.566740 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/afaef099-a861-4606-97c5-485da57c818f-kube-api-access-bh54r podName:afaef099-a861-4606-97c5-485da57c818f nodeName:}" failed. 
No retries permitted until 2026-04-24 16:39:06.566719031 +0000 UTC m=+34.280787068 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-bh54r" (UniqueName: "kubernetes.io/projected/afaef099-a861-4606-97c5-485da57c818f-kube-api-access-bh54r") pod "network-check-target-85nzt" (UID: "afaef099-a861-4606-97c5-485da57c818f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:38:50.620402 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:50.620371 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-cwpk5"] Apr 24 16:38:50.671312 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:50.671280 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cwpk5" Apr 24 16:38:50.671570 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:50.671372 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-cwpk5" podUID="f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1" Apr 24 16:38:50.768060 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:50.767975 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1-kubelet-config\") pod \"global-pull-secret-syncer-cwpk5\" (UID: \"f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1\") " pod="kube-system/global-pull-secret-syncer-cwpk5" Apr 24 16:38:50.768272 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:50.768104 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1-dbus\") pod \"global-pull-secret-syncer-cwpk5\" (UID: \"f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1\") " pod="kube-system/global-pull-secret-syncer-cwpk5" Apr 24 16:38:50.768272 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:50.768149 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1-original-pull-secret\") pod \"global-pull-secret-syncer-cwpk5\" (UID: \"f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1\") " pod="kube-system/global-pull-secret-syncer-cwpk5" Apr 24 16:38:50.827628 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:50.827579 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:38:50.828110 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:50.827718 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-85nzt" podUID="afaef099-a861-4606-97c5-485da57c818f" Apr 24 16:38:50.868464 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:50.868431 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1-kubelet-config\") pod \"global-pull-secret-syncer-cwpk5\" (UID: \"f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1\") " pod="kube-system/global-pull-secret-syncer-cwpk5" Apr 24 16:38:50.868635 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:50.868487 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1-dbus\") pod \"global-pull-secret-syncer-cwpk5\" (UID: \"f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1\") " pod="kube-system/global-pull-secret-syncer-cwpk5" Apr 24 16:38:50.868635 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:50.868519 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1-original-pull-secret\") pod \"global-pull-secret-syncer-cwpk5\" (UID: \"f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1\") " pod="kube-system/global-pull-secret-syncer-cwpk5" Apr 24 16:38:50.868635 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:50.868558 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1-kubelet-config\") pod \"global-pull-secret-syncer-cwpk5\" (UID: \"f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1\") " pod="kube-system/global-pull-secret-syncer-cwpk5" Apr 24 16:38:50.868749 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:50.868640 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:38:50.868749 
ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:50.868688 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1-original-pull-secret podName:f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1 nodeName:}" failed. No retries permitted until 2026-04-24 16:38:51.368674592 +0000 UTC m=+19.082742607 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1-original-pull-secret") pod "global-pull-secret-syncer-cwpk5" (UID: "f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:38:50.868749 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:50.868701 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1-dbus\") pod \"global-pull-secret-syncer-cwpk5\" (UID: \"f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1\") " pod="kube-system/global-pull-secret-syncer-cwpk5" Apr 24 16:38:51.373045 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:51.372995 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1-original-pull-secret\") pod \"global-pull-secret-syncer-cwpk5\" (UID: \"f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1\") " pod="kube-system/global-pull-secret-syncer-cwpk5" Apr 24 16:38:51.373209 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:51.373160 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:38:51.373268 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:51.373237 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1-original-pull-secret podName:f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1 nodeName:}" failed. 
No retries permitted until 2026-04-24 16:38:52.373219361 +0000 UTC m=+20.087287377 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1-original-pull-secret") pod "global-pull-secret-syncer-cwpk5" (UID: "f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:38:51.827065 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:51.826972 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:38:51.827249 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:51.827135 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f9bsr" podUID="10ab450a-933f-4b41-8316-09109770ac99" Apr 24 16:38:51.827249 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:51.827180 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cwpk5" Apr 24 16:38:51.827354 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:51.827298 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-cwpk5" podUID="f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1" Apr 24 16:38:52.130998 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:38:52.130916 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06388f4e_daeb_4db0_906e_01adfa3e3b97.slice/crio-81a14a44d5097bf979097f001eb35332e1a7bcb23c2c0d286f233ee36bb489e8 WatchSource:0}: Error finding container 81a14a44d5097bf979097f001eb35332e1a7bcb23c2c0d286f233ee36bb489e8: Status 404 returned error can't find the container with id 81a14a44d5097bf979097f001eb35332e1a7bcb23c2c0d286f233ee36bb489e8 Apr 24 16:38:52.380461 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:52.380415 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1-original-pull-secret\") pod \"global-pull-secret-syncer-cwpk5\" (UID: \"f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1\") " pod="kube-system/global-pull-secret-syncer-cwpk5" Apr 24 16:38:52.380648 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:52.380607 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:38:52.380714 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:52.380673 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1-original-pull-secret podName:f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1 nodeName:}" failed. No retries permitted until 2026-04-24 16:38:54.380655316 +0000 UTC m=+22.094723353 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1-original-pull-secret") pod "global-pull-secret-syncer-cwpk5" (UID: "f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:38:52.827906 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:52.827876 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:38:52.828044 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:52.827959 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-85nzt" podUID="afaef099-a861-4606-97c5-485da57c818f" Apr 24 16:38:52.894363 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:52.894242 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-phgx4" event={"ID":"52f93581-75a0-4ae1-92b6-3ce3e189cd48","Type":"ContainerStarted","Data":"8b842c8c9384938b856b9bca62fd3bd8421351b225635ad75603cdc158694595"} Apr 24 16:38:52.895624 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:52.895596 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8sqwm" event={"ID":"59d1c6f6-5f2f-4c93-b15d-47374722a0fe","Type":"ContainerStarted","Data":"66c3aaf7cff2a519595dbd622aab9970b262d521b3a77ccd0358a885ebe2b240"} Apr 24 16:38:52.896887 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:52.896864 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" 
event={"ID":"7587d0da-cb0b-407d-b7b1-2f580492c29e","Type":"ContainerStarted","Data":"0072193d0d40cd22440c1726961be0d6e8d7b9ba9526bcf77937a570e171a005"} Apr 24 16:38:52.898504 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:52.898485 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 16:38:52.898824 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:52.898800 2575 generic.go:358] "Generic (PLEG): container finished" podID="c2b71d82-71f3-4cf6-95a9-73b3d509e492" containerID="484767268ef0cdb34df9400f2961bcfcaf9d178abddd8b06e3acd0ed09153ba0" exitCode=1 Apr 24 16:38:52.898896 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:52.898832 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" event={"ID":"c2b71d82-71f3-4cf6-95a9-73b3d509e492","Type":"ContainerStarted","Data":"89d8d2aad092e48f6a18854bf5b4ffe48e4995fcf414f290f6198da7ea8b2e9c"} Apr 24 16:38:52.898896 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:52.898870 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" event={"ID":"c2b71d82-71f3-4cf6-95a9-73b3d509e492","Type":"ContainerDied","Data":"484767268ef0cdb34df9400f2961bcfcaf9d178abddd8b06e3acd0ed09153ba0"} Apr 24 16:38:52.898896 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:52.898888 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" event={"ID":"c2b71d82-71f3-4cf6-95a9-73b3d509e492","Type":"ContainerStarted","Data":"cb14ab0163d669a6d15171d757f3bdc768b49e0d117a8d06aff5830525d0ddbf"} Apr 24 16:38:52.900137 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:52.900112 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-psxtk" 
event={"ID":"34943398-acf4-440b-900d-999cb567a483","Type":"ContainerStarted","Data":"31f2654be5dde85afa0e6e8583a387a29d2efd137f2f9ae95304db74331f41f0"} Apr 24 16:38:52.901283 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:52.901257 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5hcrw" event={"ID":"050c36c1-b0b2-434a-91f9-c53a02f67059","Type":"ContainerStarted","Data":"10232d39c2c474d30bdc5afa1c3f48d012d39ee7ef34847fd72ac8531a09ea34"} Apr 24 16:38:52.902878 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:52.902857 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9hv6v" event={"ID":"0306168c-6c00-4a89-9e2b-fff3d030b0e2","Type":"ContainerStarted","Data":"d672f63f530d46496a2c941bcabee4c002dd37b64f79ed903fc4301cb1e72ee8"} Apr 24 16:38:52.904766 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:52.904741 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7hmxs" event={"ID":"06388f4e-daeb-4db0-906e-01adfa3e3b97","Type":"ContainerStarted","Data":"6706955f1cf92620be54b2f9796cb9cbc422e0bf816e6d109c7524c4972934de"} Apr 24 16:38:52.904870 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:52.904774 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7hmxs" event={"ID":"06388f4e-daeb-4db0-906e-01adfa3e3b97","Type":"ContainerStarted","Data":"81a14a44d5097bf979097f001eb35332e1a7bcb23c2c0d286f233ee36bb489e8"} Apr 24 16:38:52.914719 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:52.914663 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-psxtk" podStartSLOduration=3.313599589 podStartE2EDuration="19.914645721s" podCreationTimestamp="2026-04-24 16:38:33 +0000 UTC" firstStartedPulling="2026-04-24 16:38:35.651532826 +0000 UTC m=+3.365600844" lastFinishedPulling="2026-04-24 16:38:52.252578957 +0000 UTC m=+19.966646976" observedRunningTime="2026-04-24 
16:38:52.91399097 +0000 UTC m=+20.628059005" watchObservedRunningTime="2026-04-24 16:38:52.914645721 +0000 UTC m=+20.628713758" Apr 24 16:38:52.948773 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:52.948706 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-8sqwm" podStartSLOduration=3.354237172 podStartE2EDuration="19.94868455s" podCreationTimestamp="2026-04-24 16:38:33 +0000 UTC" firstStartedPulling="2026-04-24 16:38:35.707007876 +0000 UTC m=+3.421075902" lastFinishedPulling="2026-04-24 16:38:52.301455252 +0000 UTC m=+20.015523280" observedRunningTime="2026-04-24 16:38:52.94799906 +0000 UTC m=+20.662067097" watchObservedRunningTime="2026-04-24 16:38:52.94868455 +0000 UTC m=+20.662752641" Apr 24 16:38:52.961273 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:52.961220 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-phgx4" podStartSLOduration=3.41573774 podStartE2EDuration="19.961205468s" podCreationTimestamp="2026-04-24 16:38:33 +0000 UTC" firstStartedPulling="2026-04-24 16:38:35.707049865 +0000 UTC m=+3.421117880" lastFinishedPulling="2026-04-24 16:38:52.252517581 +0000 UTC m=+19.966585608" observedRunningTime="2026-04-24 16:38:52.961016476 +0000 UTC m=+20.675084513" watchObservedRunningTime="2026-04-24 16:38:52.961205468 +0000 UTC m=+20.675273526" Apr 24 16:38:52.977499 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:52.977453 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9hv6v" podStartSLOduration=3.305401223 podStartE2EDuration="19.977437497s" podCreationTimestamp="2026-04-24 16:38:33 +0000 UTC" firstStartedPulling="2026-04-24 16:38:35.647368582 +0000 UTC m=+3.361436611" lastFinishedPulling="2026-04-24 16:38:52.319404857 +0000 UTC m=+20.033472885" observedRunningTime="2026-04-24 16:38:52.976474565 +0000 UTC m=+20.690542613" watchObservedRunningTime="2026-04-24 
16:38:52.977437497 +0000 UTC m=+20.691505515" Apr 24 16:38:52.998449 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:52.998389 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7hmxs" podStartSLOduration=15.998370753 podStartE2EDuration="15.998370753s" podCreationTimestamp="2026-04-24 16:38:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:38:52.997795978 +0000 UTC m=+20.711864016" watchObservedRunningTime="2026-04-24 16:38:52.998370753 +0000 UTC m=+20.712438788" Apr 24 16:38:53.677713 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:53.677670 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-phgx4" Apr 24 16:38:53.678413 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:53.678294 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-phgx4" Apr 24 16:38:53.826760 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:53.826722 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cwpk5" Apr 24 16:38:53.826910 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:53.826835 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cwpk5" podUID="f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1" Apr 24 16:38:53.826910 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:53.826886 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:38:53.827032 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:53.826969 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f9bsr" podUID="10ab450a-933f-4b41-8316-09109770ac99" Apr 24 16:38:53.907701 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:53.907667 2575 generic.go:358] "Generic (PLEG): container finished" podID="050c36c1-b0b2-434a-91f9-c53a02f67059" containerID="10232d39c2c474d30bdc5afa1c3f48d012d39ee7ef34847fd72ac8531a09ea34" exitCode=0 Apr 24 16:38:53.907892 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:53.907741 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5hcrw" event={"ID":"050c36c1-b0b2-434a-91f9-c53a02f67059","Type":"ContainerDied","Data":"10232d39c2c474d30bdc5afa1c3f48d012d39ee7ef34847fd72ac8531a09ea34"} Apr 24 16:38:53.908997 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:53.908971 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-f5ds8" event={"ID":"7fd4e710-ea9f-4927-b943-ca92fb5629da","Type":"ContainerStarted","Data":"1f9f3ac69c4d3a58cc7aa041a63a70b33f2f7faf8142067bf7f6e280d55684da"} Apr 24 16:38:53.911314 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:53.911296 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 16:38:53.911656 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:53.911627 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" 
event={"ID":"c2b71d82-71f3-4cf6-95a9-73b3d509e492","Type":"ContainerStarted","Data":"a087cb756a5de1645c63272520aebc7d4e300d406ed0d03dd7f861d6ed37a9aa"} Apr 24 16:38:53.911727 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:53.911663 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" event={"ID":"c2b71d82-71f3-4cf6-95a9-73b3d509e492","Type":"ContainerStarted","Data":"fd06baf76e752911c2811fd4d40931df95107999da9cfe53d4e78d2c45b0540e"} Apr 24 16:38:53.911727 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:53.911678 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" event={"ID":"c2b71d82-71f3-4cf6-95a9-73b3d509e492","Type":"ContainerStarted","Data":"14c8220011b15cdbe441722c753cefcee2ffae63a09c4e526c67f48ba2afcfd9"} Apr 24 16:38:53.911829 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:53.911792 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-phgx4" Apr 24 16:38:53.912393 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:53.912375 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-phgx4" Apr 24 16:38:53.950484 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:53.950431 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-f5ds8" podStartSLOduration=4.355799221 podStartE2EDuration="20.950414821s" podCreationTimestamp="2026-04-24 16:38:33 +0000 UTC" firstStartedPulling="2026-04-24 16:38:35.706832722 +0000 UTC m=+3.420900736" lastFinishedPulling="2026-04-24 16:38:52.301448309 +0000 UTC m=+20.015516336" observedRunningTime="2026-04-24 16:38:53.950312975 +0000 UTC m=+21.664381013" watchObservedRunningTime="2026-04-24 16:38:53.950414821 +0000 UTC m=+21.664482856" Apr 24 16:38:54.205499 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:54.205340 2575 plugin_watcher.go:194] "Adding socket 
path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 16:38:54.399616 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:54.399536 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1-original-pull-secret\") pod \"global-pull-secret-syncer-cwpk5\" (UID: \"f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1\") " pod="kube-system/global-pull-secret-syncer-cwpk5" Apr 24 16:38:54.399782 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:54.399682 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:38:54.399782 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:54.399750 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1-original-pull-secret podName:f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1 nodeName:}" failed. No retries permitted until 2026-04-24 16:38:58.399735194 +0000 UTC m=+26.113803209 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1-original-pull-secret") pod "global-pull-secret-syncer-cwpk5" (UID: "f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:38:54.795133 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:54.794965 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T16:38:54.205493042Z","UUID":"bf101a92-908a-4a37-9d1e-ae1eee9728b1","Handler":null,"Name":"","Endpoint":""} Apr 24 16:38:54.797044 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:54.797001 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 16:38:54.797044 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:54.797035 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 16:38:54.827256 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:54.827222 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:38:54.827435 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:54.827359 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-85nzt" podUID="afaef099-a861-4606-97c5-485da57c818f" Apr 24 16:38:54.915414 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:54.915375 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" event={"ID":"7587d0da-cb0b-407d-b7b1-2f580492c29e","Type":"ContainerStarted","Data":"cd180296ec4ac4cc421cc243ffbdf558a2419576234fa67540f25657512028bd"} Apr 24 16:38:55.827223 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:55.827117 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cwpk5" Apr 24 16:38:55.827223 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:55.827141 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:38:55.827914 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:55.827259 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cwpk5" podUID="f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1" Apr 24 16:38:55.827914 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:55.827293 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-f9bsr" podUID="10ab450a-933f-4b41-8316-09109770ac99" Apr 24 16:38:55.918835 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:55.918797 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" event={"ID":"7587d0da-cb0b-407d-b7b1-2f580492c29e","Type":"ContainerStarted","Data":"41241e49ae9cda727b04256c97d7a4883907ba150f5e33bfb19e0ced37b276af"} Apr 24 16:38:55.921357 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:55.921333 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 16:38:55.921720 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:55.921697 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" event={"ID":"c2b71d82-71f3-4cf6-95a9-73b3d509e492","Type":"ContainerStarted","Data":"15137966d611dcf15e9e262464c68e27c24db52af90edf75582a4a01020c5624"} Apr 24 16:38:55.939356 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:55.939307 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5n7s6" podStartSLOduration=3.048185185 podStartE2EDuration="22.93929436s" podCreationTimestamp="2026-04-24 16:38:33 +0000 UTC" firstStartedPulling="2026-04-24 16:38:35.682751974 +0000 UTC m=+3.396819988" lastFinishedPulling="2026-04-24 16:38:55.57386114 +0000 UTC m=+23.287929163" observedRunningTime="2026-04-24 16:38:55.938988282 +0000 UTC m=+23.653056317" watchObservedRunningTime="2026-04-24 16:38:55.93929436 +0000 UTC m=+23.653362395" Apr 24 16:38:56.827278 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:56.827241 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:38:56.827800 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:56.827360 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-85nzt" podUID="afaef099-a861-4606-97c5-485da57c818f" Apr 24 16:38:57.827852 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:57.827735 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:38:57.827852 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:57.827735 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cwpk5" Apr 24 16:38:57.828309 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:57.827873 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f9bsr" podUID="10ab450a-933f-4b41-8316-09109770ac99" Apr 24 16:38:57.828309 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:57.827936 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-cwpk5" podUID="f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1" Apr 24 16:38:57.928634 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:57.928606 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 16:38:57.929008 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:57.928963 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" event={"ID":"c2b71d82-71f3-4cf6-95a9-73b3d509e492","Type":"ContainerStarted","Data":"bdab80c672c2a4073e32c572a248a6d5a3483f7338f670d2e5b9c4049b63ee54"} Apr 24 16:38:57.929421 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:57.929399 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:57.929565 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:57.929549 2575 scope.go:117] "RemoveContainer" containerID="484767268ef0cdb34df9400f2961bcfcaf9d178abddd8b06e3acd0ed09153ba0" Apr 24 16:38:57.945633 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:57.945606 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:58.432559 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:58.432515 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1-original-pull-secret\") pod \"global-pull-secret-syncer-cwpk5\" (UID: \"f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1\") " pod="kube-system/global-pull-secret-syncer-cwpk5" Apr 24 16:38:58.432759 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:58.432673 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:38:58.432759 ip-10-0-131-37 kubenswrapper[2575]: E0424 
16:38:58.432755 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1-original-pull-secret podName:f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:06.432734355 +0000 UTC m=+34.146802383 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1-original-pull-secret") pod "global-pull-secret-syncer-cwpk5" (UID: "f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:38:58.827238 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:58.827150 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:38:58.827447 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:58.827290 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-85nzt" podUID="afaef099-a861-4606-97c5-485da57c818f" Apr 24 16:38:59.706279 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:59.706198 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-f9bsr"] Apr 24 16:38:59.706678 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:59.706350 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:38:59.706678 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:59.706472 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f9bsr" podUID="10ab450a-933f-4b41-8316-09109770ac99" Apr 24 16:38:59.709139 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:59.709077 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-cwpk5"] Apr 24 16:38:59.709299 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:59.709278 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cwpk5" Apr 24 16:38:59.709439 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:59.709404 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cwpk5" podUID="f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1" Apr 24 16:38:59.709706 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:59.709686 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-85nzt"] Apr 24 16:38:59.709792 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:59.709764 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:38:59.709860 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:38:59.709841 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-85nzt" podUID="afaef099-a861-4606-97c5-485da57c818f" Apr 24 16:38:59.935861 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:59.935835 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 16:38:59.936183 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:59.936157 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" event={"ID":"c2b71d82-71f3-4cf6-95a9-73b3d509e492","Type":"ContainerStarted","Data":"5f374b43e256e6dfcc335f891804c2be1f4eab1c0028d4e69206719f6f783b58"} Apr 24 16:38:59.936402 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:59.936384 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:59.936464 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:59.936411 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:59.937857 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:59.937823 2575 generic.go:358] "Generic (PLEG): container finished" podID="050c36c1-b0b2-434a-91f9-c53a02f67059" containerID="49ba46b4a1662857ee97bd099d6a0ec1565b77d82a4483185ce416789db2d48f" exitCode=0 Apr 24 16:38:59.937956 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:59.937863 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-5hcrw" event={"ID":"050c36c1-b0b2-434a-91f9-c53a02f67059","Type":"ContainerDied","Data":"49ba46b4a1662857ee97bd099d6a0ec1565b77d82a4483185ce416789db2d48f"} Apr 24 16:38:59.951717 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:59.951689 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:38:59.965416 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:38:59.965324 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" podStartSLOduration=10.261775758 podStartE2EDuration="26.965308852s" podCreationTimestamp="2026-04-24 16:38:33 +0000 UTC" firstStartedPulling="2026-04-24 16:38:35.652229494 +0000 UTC m=+3.366297508" lastFinishedPulling="2026-04-24 16:38:52.355762573 +0000 UTC m=+20.069830602" observedRunningTime="2026-04-24 16:38:59.964122774 +0000 UTC m=+27.678190810" watchObservedRunningTime="2026-04-24 16:38:59.965308852 +0000 UTC m=+27.679376889" Apr 24 16:39:00.941688 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:00.941654 2575 generic.go:358] "Generic (PLEG): container finished" podID="050c36c1-b0b2-434a-91f9-c53a02f67059" containerID="b5e4a2f6dcb21d335ee4c09117160ec82edbd42ea9b43d819d93a22b351fefaf" exitCode=0 Apr 24 16:39:00.942056 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:00.941746 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5hcrw" event={"ID":"050c36c1-b0b2-434a-91f9-c53a02f67059","Type":"ContainerDied","Data":"b5e4a2f6dcb21d335ee4c09117160ec82edbd42ea9b43d819d93a22b351fefaf"} Apr 24 16:39:01.827109 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:01.827064 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:39:01.827254 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:01.827199 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:39:01.827254 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:01.827204 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-85nzt" podUID="afaef099-a861-4606-97c5-485da57c818f" Apr 24 16:39:01.827345 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:01.827299 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f9bsr" podUID="10ab450a-933f-4b41-8316-09109770ac99" Apr 24 16:39:01.827345 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:01.827342 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cwpk5" Apr 24 16:39:01.827407 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:01.827394 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-cwpk5" podUID="f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1" Apr 24 16:39:01.945683 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:01.945644 2575 generic.go:358] "Generic (PLEG): container finished" podID="050c36c1-b0b2-434a-91f9-c53a02f67059" containerID="78ab07f3789b025255bc0660054f424f911572a9c9b175e42e6369d70a4c723a" exitCode=0 Apr 24 16:39:01.946060 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:01.945775 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5hcrw" event={"ID":"050c36c1-b0b2-434a-91f9-c53a02f67059","Type":"ContainerDied","Data":"78ab07f3789b025255bc0660054f424f911572a9c9b175e42e6369d70a4c723a"} Apr 24 16:39:03.826851 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:03.826785 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:39:03.826851 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:03.826839 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cwpk5" Apr 24 16:39:03.827755 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:03.826785 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:39:03.827755 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:03.826941 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-85nzt" podUID="afaef099-a861-4606-97c5-485da57c818f" Apr 24 16:39:03.827755 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:03.827014 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-f9bsr" podUID="10ab450a-933f-4b41-8316-09109770ac99" Apr 24 16:39:03.827755 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:03.827108 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cwpk5" podUID="f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1" Apr 24 16:39:05.615506 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.615475 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-37.ec2.internal" event="NodeReady" Apr 24 16:39:05.616121 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.615631 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 16:39:05.650740 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.650701 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66c895d4b7-5gft4"] Apr 24 16:39:05.654888 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.654862 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:05.658841 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.657749 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6j55m\"" Apr 24 16:39:05.658841 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.658065 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 16:39:05.658841 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.658288 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 16:39:05.658841 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.658594 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 16:39:05.663577 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.663551 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 16:39:05.667208 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.667138 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66c895d4b7-5gft4"] Apr 24 16:39:05.667934 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.667908 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ckvvn"] Apr 24 16:39:05.672823 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.672801 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fwvss"] Apr 24 16:39:05.672996 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.672976 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-ckvvn" Apr 24 16:39:05.676266 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.676240 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4d6qk"] Apr 24 16:39:05.676434 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.676410 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fwvss" Apr 24 16:39:05.676782 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.676751 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 16:39:05.676978 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.676762 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 16:39:05.677112 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.676827 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zzj8j\"" Apr 24 16:39:05.679238 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.679214 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4d6qk"] Apr 24 16:39:05.679359 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.679279 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 24 16:39:05.679359 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.679327 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4d6qk" Apr 24 16:39:05.679610 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.679593 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 24 16:39:05.679837 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.679818 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-qzvrt\"" Apr 24 16:39:05.682751 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.682687 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 16:39:05.682848 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.682805 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 16:39:05.683190 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.683042 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-5vrnz\"" Apr 24 16:39:05.683922 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.683902 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ckvvn"] Apr 24 16:39:05.684983 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.684960 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fwvss"] Apr 24 16:39:05.690451 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.690413 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-z6w7t"] Apr 24 16:39:05.693628 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.693602 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z6w7t" Apr 24 16:39:05.696859 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.696622 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 16:39:05.696859 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.696635 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 16:39:05.696859 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.696662 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 16:39:05.696859 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.696759 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lhhc7\"" Apr 24 16:39:05.703654 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.703625 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-z6w7t"] Apr 24 16:39:05.789678 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.789639 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-trusted-ca\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:05.789678 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.789678 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-installation-pull-secrets\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " 
pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:05.789905 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.789700 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtjkm\" (UniqueName: \"kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-kube-api-access-jtjkm\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:05.789905 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.789800 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed7942f7-292f-402b-af38-8f0c16de0ee3-config-volume\") pod \"dns-default-ckvvn\" (UID: \"ed7942f7-292f-402b-af38-8f0c16de0ee3\") " pod="openshift-dns/dns-default-ckvvn" Apr 24 16:39:05.789905 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.789834 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9de18b4c-44be-4c2e-9b15-1b3401784bcd-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-fwvss\" (UID: \"9de18b4c-44be-4c2e-9b15-1b3401784bcd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fwvss" Apr 24 16:39:05.789905 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.789871 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ed7942f7-292f-402b-af38-8f0c16de0ee3-tmp-dir\") pod \"dns-default-ckvvn\" (UID: \"ed7942f7-292f-402b-af38-8f0c16de0ee3\") " pod="openshift-dns/dns-default-ckvvn" Apr 24 16:39:05.790061 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.789911 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl688\" 
(UniqueName: \"kubernetes.io/projected/47d71194-f92b-4ce8-a112-c73134f86aa4-kube-api-access-tl688\") pod \"network-check-source-8894fc9bd-4d6qk\" (UID: \"47d71194-f92b-4ce8-a112-c73134f86aa4\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4d6qk" Apr 24 16:39:05.790061 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.789943 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-image-registry-private-configuration\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:05.790061 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.789969 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:05.790061 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.789988 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert\") pod \"ingress-canary-z6w7t\" (UID: \"cdbe4c96-edde-4285-9466-eeb5fd3f169b\") " pod="openshift-ingress-canary/ingress-canary-z6w7t" Apr 24 16:39:05.790061 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.790003 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls\") pod \"dns-default-ckvvn\" (UID: \"ed7942f7-292f-402b-af38-8f0c16de0ee3\") " 
pod="openshift-dns/dns-default-ckvvn" Apr 24 16:39:05.790061 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.790024 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czgrr\" (UniqueName: \"kubernetes.io/projected/ed7942f7-292f-402b-af38-8f0c16de0ee3-kube-api-access-czgrr\") pod \"dns-default-ckvvn\" (UID: \"ed7942f7-292f-402b-af38-8f0c16de0ee3\") " pod="openshift-dns/dns-default-ckvvn" Apr 24 16:39:05.790348 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.790058 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-certificates\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:05.790348 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.790110 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-bound-sa-token\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:05.790348 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.790136 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxt6r\" (UniqueName: \"kubernetes.io/projected/cdbe4c96-edde-4285-9466-eeb5fd3f169b-kube-api-access-pxt6r\") pod \"ingress-canary-z6w7t\" (UID: \"cdbe4c96-edde-4285-9466-eeb5fd3f169b\") " pod="openshift-ingress-canary/ingress-canary-z6w7t" Apr 24 16:39:05.790348 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.790157 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-ca-trust-extracted\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:05.790348 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.790182 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fwvss\" (UID: \"9de18b4c-44be-4c2e-9b15-1b3401784bcd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fwvss" Apr 24 16:39:05.827034 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.826991 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:39:05.827034 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.827026 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:39:05.827287 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.826994 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-cwpk5" Apr 24 16:39:05.829504 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.829478 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 16:39:05.829655 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.829579 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 16:39:05.829655 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.829624 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zln8m\"" Apr 24 16:39:05.829655 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.829579 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-rvcx9\"" Apr 24 16:39:05.891258 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.891159 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-trusted-ca\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:05.891258 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.891207 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-installation-pull-secrets\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:05.891258 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.891230 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jtjkm\" (UniqueName: 
\"kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-kube-api-access-jtjkm\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:05.891536 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.891275 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed7942f7-292f-402b-af38-8f0c16de0ee3-config-volume\") pod \"dns-default-ckvvn\" (UID: \"ed7942f7-292f-402b-af38-8f0c16de0ee3\") " pod="openshift-dns/dns-default-ckvvn" Apr 24 16:39:05.891536 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.891304 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9de18b4c-44be-4c2e-9b15-1b3401784bcd-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-fwvss\" (UID: \"9de18b4c-44be-4c2e-9b15-1b3401784bcd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fwvss" Apr 24 16:39:05.891536 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.891338 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ed7942f7-292f-402b-af38-8f0c16de0ee3-tmp-dir\") pod \"dns-default-ckvvn\" (UID: \"ed7942f7-292f-402b-af38-8f0c16de0ee3\") " pod="openshift-dns/dns-default-ckvvn" Apr 24 16:39:05.891536 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.891387 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tl688\" (UniqueName: \"kubernetes.io/projected/47d71194-f92b-4ce8-a112-c73134f86aa4-kube-api-access-tl688\") pod \"network-check-source-8894fc9bd-4d6qk\" (UID: \"47d71194-f92b-4ce8-a112-c73134f86aa4\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4d6qk" Apr 24 16:39:05.891536 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.891414 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-image-registry-private-configuration\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:05.891536 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.891441 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:05.891536 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.891472 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert\") pod \"ingress-canary-z6w7t\" (UID: \"cdbe4c96-edde-4285-9466-eeb5fd3f169b\") " pod="openshift-ingress-canary/ingress-canary-z6w7t" Apr 24 16:39:05.891536 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.891496 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls\") pod \"dns-default-ckvvn\" (UID: \"ed7942f7-292f-402b-af38-8f0c16de0ee3\") " pod="openshift-dns/dns-default-ckvvn" Apr 24 16:39:05.891536 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.891526 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-czgrr\" (UniqueName: \"kubernetes.io/projected/ed7942f7-292f-402b-af38-8f0c16de0ee3-kube-api-access-czgrr\") pod \"dns-default-ckvvn\" (UID: \"ed7942f7-292f-402b-af38-8f0c16de0ee3\") " pod="openshift-dns/dns-default-ckvvn" Apr 24 16:39:05.891951 
ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.891550 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-certificates\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:05.891951 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.891576 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-bound-sa-token\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:05.891951 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.891603 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxt6r\" (UniqueName: \"kubernetes.io/projected/cdbe4c96-edde-4285-9466-eeb5fd3f169b-kube-api-access-pxt6r\") pod \"ingress-canary-z6w7t\" (UID: \"cdbe4c96-edde-4285-9466-eeb5fd3f169b\") " pod="openshift-ingress-canary/ingress-canary-z6w7t" Apr 24 16:39:05.891951 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.891630 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-ca-trust-extracted\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:05.891951 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.891659 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fwvss\" (UID: \"9de18b4c-44be-4c2e-9b15-1b3401784bcd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fwvss" Apr 24 16:39:05.891951 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:05.891778 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 16:39:05.891951 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:05.891855 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert podName:9de18b4c-44be-4c2e-9b15-1b3401784bcd nodeName:}" failed. No retries permitted until 2026-04-24 16:39:06.391835286 +0000 UTC m=+34.105903309 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fwvss" (UID: "9de18b4c-44be-4c2e-9b15-1b3401784bcd") : secret "networking-console-plugin-cert" not found Apr 24 16:39:05.892326 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:05.892048 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:39:05.892326 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:05.892163 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert podName:cdbe4c96-edde-4285-9466-eeb5fd3f169b nodeName:}" failed. No retries permitted until 2026-04-24 16:39:06.392142059 +0000 UTC m=+34.106210075 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert") pod "ingress-canary-z6w7t" (UID: "cdbe4c96-edde-4285-9466-eeb5fd3f169b") : secret "canary-serving-cert" not found Apr 24 16:39:05.892326 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:05.892213 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:05.892326 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:05.892261 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls podName:ed7942f7-292f-402b-af38-8f0c16de0ee3 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:06.392246228 +0000 UTC m=+34.106314245 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls") pod "dns-default-ckvvn" (UID: "ed7942f7-292f-402b-af38-8f0c16de0ee3") : secret "dns-default-metrics-tls" not found Apr 24 16:39:05.892528 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.892393 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-trusted-ca\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:05.892627 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:05.892580 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:39:05.892627 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:05.892602 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66c895d4b7-5gft4: secret "image-registry-tls" not found Apr 24 
16:39:05.892809 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:05.892659 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls podName:a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:06.392638953 +0000 UTC m=+34.106706971 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls") pod "image-registry-66c895d4b7-5gft4" (UID: "a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8") : secret "image-registry-tls" not found Apr 24 16:39:05.892964 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.892941 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-certificates\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:05.893031 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.892965 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-ca-trust-extracted\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:05.893031 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.892971 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed7942f7-292f-402b-af38-8f0c16de0ee3-config-volume\") pod \"dns-default-ckvvn\" (UID: \"ed7942f7-292f-402b-af38-8f0c16de0ee3\") " pod="openshift-dns/dns-default-ckvvn" Apr 24 16:39:05.893149 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.893064 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ed7942f7-292f-402b-af38-8f0c16de0ee3-tmp-dir\") pod \"dns-default-ckvvn\" (UID: \"ed7942f7-292f-402b-af38-8f0c16de0ee3\") " pod="openshift-dns/dns-default-ckvvn" Apr 24 16:39:05.893402 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.893380 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9de18b4c-44be-4c2e-9b15-1b3401784bcd-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-fwvss\" (UID: \"9de18b4c-44be-4c2e-9b15-1b3401784bcd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fwvss" Apr 24 16:39:05.896438 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.896410 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-image-registry-private-configuration\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:05.896567 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.896437 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-installation-pull-secrets\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:05.905136 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.904968 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-czgrr\" (UniqueName: \"kubernetes.io/projected/ed7942f7-292f-402b-af38-8f0c16de0ee3-kube-api-access-czgrr\") pod \"dns-default-ckvvn\" (UID: \"ed7942f7-292f-402b-af38-8f0c16de0ee3\") " 
pod="openshift-dns/dns-default-ckvvn" Apr 24 16:39:05.905136 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.905119 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxt6r\" (UniqueName: \"kubernetes.io/projected/cdbe4c96-edde-4285-9466-eeb5fd3f169b-kube-api-access-pxt6r\") pod \"ingress-canary-z6w7t\" (UID: \"cdbe4c96-edde-4285-9466-eeb5fd3f169b\") " pod="openshift-ingress-canary/ingress-canary-z6w7t" Apr 24 16:39:05.905511 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.905483 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtjkm\" (UniqueName: \"kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-kube-api-access-jtjkm\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:05.905599 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.905546 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl688\" (UniqueName: \"kubernetes.io/projected/47d71194-f92b-4ce8-a112-c73134f86aa4-kube-api-access-tl688\") pod \"network-check-source-8894fc9bd-4d6qk\" (UID: \"47d71194-f92b-4ce8-a112-c73134f86aa4\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4d6qk" Apr 24 16:39:05.906255 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:05.906224 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-bound-sa-token\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:06.004460 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:06.004422 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4d6qk" Apr 24 16:39:06.396539 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:06.396279 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:06.396751 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:06.396573 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert\") pod \"ingress-canary-z6w7t\" (UID: \"cdbe4c96-edde-4285-9466-eeb5fd3f169b\") " pod="openshift-ingress-canary/ingress-canary-z6w7t" Apr 24 16:39:06.396751 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:06.396466 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:39:06.396751 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:06.396637 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66c895d4b7-5gft4: secret "image-registry-tls" not found Apr 24 16:39:06.396751 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:06.396701 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:06.396751 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:06.396709 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls podName:a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:07.39668954 +0000 UTC m=+35.110757559 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls") pod "image-registry-66c895d4b7-5gft4" (UID: "a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8") : secret "image-registry-tls" not found Apr 24 16:39:06.396751 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:06.396608 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls\") pod \"dns-default-ckvvn\" (UID: \"ed7942f7-292f-402b-af38-8f0c16de0ee3\") " pod="openshift-dns/dns-default-ckvvn" Apr 24 16:39:06.397069 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:06.396707 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:39:06.397069 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:06.396763 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls podName:ed7942f7-292f-402b-af38-8f0c16de0ee3 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:07.396746413 +0000 UTC m=+35.110814430 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls") pod "dns-default-ckvvn" (UID: "ed7942f7-292f-402b-af38-8f0c16de0ee3") : secret "dns-default-metrics-tls" not found Apr 24 16:39:06.397069 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:06.396832 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fwvss\" (UID: \"9de18b4c-44be-4c2e-9b15-1b3401784bcd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fwvss" Apr 24 16:39:06.397069 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:06.396874 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert podName:cdbe4c96-edde-4285-9466-eeb5fd3f169b nodeName:}" failed. No retries permitted until 2026-04-24 16:39:07.39686177 +0000 UTC m=+35.110929785 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert") pod "ingress-canary-z6w7t" (UID: "cdbe4c96-edde-4285-9466-eeb5fd3f169b") : secret "canary-serving-cert" not found Apr 24 16:39:06.397069 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:06.396914 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 16:39:06.397069 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:06.396968 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert podName:9de18b4c-44be-4c2e-9b15-1b3401784bcd nodeName:}" failed. No retries permitted until 2026-04-24 16:39:07.396954078 +0000 UTC m=+35.111022106 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fwvss" (UID: "9de18b4c-44be-4c2e-9b15-1b3401784bcd") : secret "networking-console-plugin-cert" not found Apr 24 16:39:06.498212 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:06.498165 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1-original-pull-secret\") pod \"global-pull-secret-syncer-cwpk5\" (UID: \"f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1\") " pod="kube-system/global-pull-secret-syncer-cwpk5" Apr 24 16:39:06.498417 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:06.498227 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs\") pod \"network-metrics-daemon-f9bsr\" (UID: \"10ab450a-933f-4b41-8316-09109770ac99\") " pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:39:06.498417 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:06.498355 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 16:39:06.498417 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:06.498417 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs podName:10ab450a-933f-4b41-8316-09109770ac99 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:38.498402358 +0000 UTC m=+66.212470372 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs") pod "network-metrics-daemon-f9bsr" (UID: "10ab450a-933f-4b41-8316-09109770ac99") : secret "metrics-daemon-secret" not found Apr 24 16:39:06.501458 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:06.501427 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1-original-pull-secret\") pod \"global-pull-secret-syncer-cwpk5\" (UID: \"f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1\") " pod="kube-system/global-pull-secret-syncer-cwpk5" Apr 24 16:39:06.598794 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:06.598756 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bh54r\" (UniqueName: \"kubernetes.io/projected/afaef099-a861-4606-97c5-485da57c818f-kube-api-access-bh54r\") pod \"network-check-target-85nzt\" (UID: \"afaef099-a861-4606-97c5-485da57c818f\") " pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:39:06.601876 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:06.601840 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh54r\" (UniqueName: \"kubernetes.io/projected/afaef099-a861-4606-97c5-485da57c818f-kube-api-access-bh54r\") pod \"network-check-target-85nzt\" (UID: \"afaef099-a861-4606-97c5-485da57c818f\") " pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:39:06.739414 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:06.739281 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:39:06.747201 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:06.747170 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-cwpk5" Apr 24 16:39:07.404920 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:07.404869 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fwvss\" (UID: \"9de18b4c-44be-4c2e-9b15-1b3401784bcd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fwvss" Apr 24 16:39:07.405190 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:07.405035 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 16:39:07.405190 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:07.405144 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert podName:9de18b4c-44be-4c2e-9b15-1b3401784bcd nodeName:}" failed. No retries permitted until 2026-04-24 16:39:09.405119644 +0000 UTC m=+37.119187661 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fwvss" (UID: "9de18b4c-44be-4c2e-9b15-1b3401784bcd") : secret "networking-console-plugin-cert" not found Apr 24 16:39:07.405190 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:07.405169 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:07.405373 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:07.405199 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert\") pod \"ingress-canary-z6w7t\" (UID: \"cdbe4c96-edde-4285-9466-eeb5fd3f169b\") " pod="openshift-ingress-canary/ingress-canary-z6w7t" Apr 24 16:39:07.405373 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:07.405225 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls\") pod \"dns-default-ckvvn\" (UID: \"ed7942f7-292f-402b-af38-8f0c16de0ee3\") " pod="openshift-dns/dns-default-ckvvn" Apr 24 16:39:07.405373 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:07.405320 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:39:07.405373 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:07.405329 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:39:07.405373 ip-10-0-131-37 kubenswrapper[2575]: E0424 
16:39:07.405343 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66c895d4b7-5gft4: secret "image-registry-tls" not found Apr 24 16:39:07.405605 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:07.405331 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:07.405605 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:07.405390 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert podName:cdbe4c96-edde-4285-9466-eeb5fd3f169b nodeName:}" failed. No retries permitted until 2026-04-24 16:39:09.405374002 +0000 UTC m=+37.119442021 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert") pod "ingress-canary-z6w7t" (UID: "cdbe4c96-edde-4285-9466-eeb5fd3f169b") : secret "canary-serving-cert" not found Apr 24 16:39:07.405605 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:07.405414 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls podName:ed7942f7-292f-402b-af38-8f0c16de0ee3 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:09.405399725 +0000 UTC m=+37.119467740 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls") pod "dns-default-ckvvn" (UID: "ed7942f7-292f-402b-af38-8f0c16de0ee3") : secret "dns-default-metrics-tls" not found Apr 24 16:39:07.405605 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:07.405430 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls podName:a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8 nodeName:}" failed. 
No retries permitted until 2026-04-24 16:39:09.405421732 +0000 UTC m=+37.119489747 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls") pod "image-registry-66c895d4b7-5gft4" (UID: "a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8") : secret "image-registry-tls" not found Apr 24 16:39:07.739748 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:07.739715 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4d6qk"] Apr 24 16:39:07.741772 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:07.741751 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-cwpk5"] Apr 24 16:39:07.744207 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:39:07.744160 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47d71194_f92b_4ce8_a112_c73134f86aa4.slice/crio-5161a121724f468b620ad00e60cf5c767136a32af6776083f5f16fc782d13e68 WatchSource:0}: Error finding container 5161a121724f468b620ad00e60cf5c767136a32af6776083f5f16fc782d13e68: Status 404 returned error can't find the container with id 5161a121724f468b620ad00e60cf5c767136a32af6776083f5f16fc782d13e68 Apr 24 16:39:07.745145 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:07.745107 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-85nzt"] Apr 24 16:39:07.745845 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:39:07.745818 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7bcaaf9_0dfc_4e23_9bc0_fd3beb2ecbd1.slice/crio-e97bf2eea4c829b567fc6e03b38342280ec95394e0204eb6ecbd5399a0820226 WatchSource:0}: Error finding container e97bf2eea4c829b567fc6e03b38342280ec95394e0204eb6ecbd5399a0820226: Status 404 returned error can't 
find the container with id e97bf2eea4c829b567fc6e03b38342280ec95394e0204eb6ecbd5399a0820226 Apr 24 16:39:07.748583 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:39:07.748559 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafaef099_a861_4606_97c5_485da57c818f.slice/crio-eb95076866c801ffc476d89cbb7a85fc662215d1800a428523a66372c7df0190 WatchSource:0}: Error finding container eb95076866c801ffc476d89cbb7a85fc662215d1800a428523a66372c7df0190: Status 404 returned error can't find the container with id eb95076866c801ffc476d89cbb7a85fc662215d1800a428523a66372c7df0190 Apr 24 16:39:07.959330 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:07.959227 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-85nzt" event={"ID":"afaef099-a861-4606-97c5-485da57c818f","Type":"ContainerStarted","Data":"eb95076866c801ffc476d89cbb7a85fc662215d1800a428523a66372c7df0190"} Apr 24 16:39:07.960286 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:07.960252 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4d6qk" event={"ID":"47d71194-f92b-4ce8-a112-c73134f86aa4","Type":"ContainerStarted","Data":"5161a121724f468b620ad00e60cf5c767136a32af6776083f5f16fc782d13e68"} Apr 24 16:39:07.961129 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:07.961108 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-cwpk5" event={"ID":"f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1","Type":"ContainerStarted","Data":"e97bf2eea4c829b567fc6e03b38342280ec95394e0204eb6ecbd5399a0820226"} Apr 24 16:39:07.963983 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:07.963957 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5hcrw" 
event={"ID":"050c36c1-b0b2-434a-91f9-c53a02f67059","Type":"ContainerStarted","Data":"afa97635e8bf904c02fbaf9f2d0a23ec5b3c490bbc67206ad75f79b6e0dd2c76"} Apr 24 16:39:08.970590 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:08.970314 2575 generic.go:358] "Generic (PLEG): container finished" podID="050c36c1-b0b2-434a-91f9-c53a02f67059" containerID="afa97635e8bf904c02fbaf9f2d0a23ec5b3c490bbc67206ad75f79b6e0dd2c76" exitCode=0 Apr 24 16:39:08.971060 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:08.970624 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5hcrw" event={"ID":"050c36c1-b0b2-434a-91f9-c53a02f67059","Type":"ContainerDied","Data":"afa97635e8bf904c02fbaf9f2d0a23ec5b3c490bbc67206ad75f79b6e0dd2c76"} Apr 24 16:39:09.424719 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:09.424674 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:09.424902 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:09.424729 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert\") pod \"ingress-canary-z6w7t\" (UID: \"cdbe4c96-edde-4285-9466-eeb5fd3f169b\") " pod="openshift-ingress-canary/ingress-canary-z6w7t" Apr 24 16:39:09.424902 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:09.424758 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls\") pod \"dns-default-ckvvn\" (UID: \"ed7942f7-292f-402b-af38-8f0c16de0ee3\") " pod="openshift-dns/dns-default-ckvvn" Apr 24 16:39:09.424902 ip-10-0-131-37 
kubenswrapper[2575]: I0424 16:39:09.424799 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fwvss\" (UID: \"9de18b4c-44be-4c2e-9b15-1b3401784bcd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fwvss" Apr 24 16:39:09.424902 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:09.424843 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:39:09.424902 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:09.424869 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66c895d4b7-5gft4: secret "image-registry-tls" not found Apr 24 16:39:09.425169 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:09.424926 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 16:39:09.425169 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:09.424938 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls podName:a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:13.424915838 +0000 UTC m=+41.138983872 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls") pod "image-registry-66c895d4b7-5gft4" (UID: "a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8") : secret "image-registry-tls" not found Apr 24 16:39:09.425169 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:09.424981 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert podName:9de18b4c-44be-4c2e-9b15-1b3401784bcd nodeName:}" failed. No retries permitted until 2026-04-24 16:39:13.424964488 +0000 UTC m=+41.139032506 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fwvss" (UID: "9de18b4c-44be-4c2e-9b15-1b3401784bcd") : secret "networking-console-plugin-cert" not found Apr 24 16:39:09.425169 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:09.425042 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:39:09.425169 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:09.425074 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert podName:cdbe4c96-edde-4285-9466-eeb5fd3f169b nodeName:}" failed. No retries permitted until 2026-04-24 16:39:13.425062177 +0000 UTC m=+41.139130197 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert") pod "ingress-canary-z6w7t" (UID: "cdbe4c96-edde-4285-9466-eeb5fd3f169b") : secret "canary-serving-cert" not found Apr 24 16:39:09.425169 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:09.425145 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:09.425390 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:09.425176 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls podName:ed7942f7-292f-402b-af38-8f0c16de0ee3 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:13.425165676 +0000 UTC m=+41.139233694 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls") pod "dns-default-ckvvn" (UID: "ed7942f7-292f-402b-af38-8f0c16de0ee3") : secret "dns-default-metrics-tls" not found Apr 24 16:39:09.977270 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:09.977230 2575 generic.go:358] "Generic (PLEG): container finished" podID="050c36c1-b0b2-434a-91f9-c53a02f67059" containerID="bf1f66e0acd98956e2f84ace102bcb6bd8428f8313c48786d255549b9a52c0fe" exitCode=0 Apr 24 16:39:09.977794 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:09.977307 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5hcrw" event={"ID":"050c36c1-b0b2-434a-91f9-c53a02f67059","Type":"ContainerDied","Data":"bf1f66e0acd98956e2f84ace102bcb6bd8428f8313c48786d255549b9a52c0fe"} Apr 24 16:39:13.459112 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:13.458990 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls\") pod 
\"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:13.459112 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:13.459038 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert\") pod \"ingress-canary-z6w7t\" (UID: \"cdbe4c96-edde-4285-9466-eeb5fd3f169b\") " pod="openshift-ingress-canary/ingress-canary-z6w7t" Apr 24 16:39:13.459112 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:13.459058 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls\") pod \"dns-default-ckvvn\" (UID: \"ed7942f7-292f-402b-af38-8f0c16de0ee3\") " pod="openshift-dns/dns-default-ckvvn" Apr 24 16:39:13.459112 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:13.459109 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fwvss\" (UID: \"9de18b4c-44be-4c2e-9b15-1b3401784bcd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fwvss" Apr 24 16:39:13.459586 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:13.459190 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 16:39:13.459586 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:13.459196 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:39:13.459586 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:13.459224 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-66c895d4b7-5gft4: secret "image-registry-tls" not found Apr 24 16:39:13.459586 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:13.459238 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:39:13.459586 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:13.459254 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert podName:9de18b4c-44be-4c2e-9b15-1b3401784bcd nodeName:}" failed. No retries permitted until 2026-04-24 16:39:21.459235496 +0000 UTC m=+49.173303513 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fwvss" (UID: "9de18b4c-44be-4c2e-9b15-1b3401784bcd") : secret "networking-console-plugin-cert" not found Apr 24 16:39:13.459586 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:13.459282 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls podName:a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:21.459264274 +0000 UTC m=+49.173332291 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls") pod "image-registry-66c895d4b7-5gft4" (UID: "a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8") : secret "image-registry-tls" not found Apr 24 16:39:13.459586 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:13.459190 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:13.459586 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:13.459299 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert podName:cdbe4c96-edde-4285-9466-eeb5fd3f169b nodeName:}" failed. No retries permitted until 2026-04-24 16:39:21.459289818 +0000 UTC m=+49.173357836 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert") pod "ingress-canary-z6w7t" (UID: "cdbe4c96-edde-4285-9466-eeb5fd3f169b") : secret "canary-serving-cert" not found Apr 24 16:39:13.459586 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:13.459330 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls podName:ed7942f7-292f-402b-af38-8f0c16de0ee3 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:21.459318275 +0000 UTC m=+49.173386289 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls") pod "dns-default-ckvvn" (UID: "ed7942f7-292f-402b-af38-8f0c16de0ee3") : secret "dns-default-metrics-tls" not found Apr 24 16:39:13.987361 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:13.987321 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-85nzt" event={"ID":"afaef099-a861-4606-97c5-485da57c818f","Type":"ContainerStarted","Data":"af56ed1bb8cfc4c88f2efd3ea23bac32e6b2f576bc82e9925888c8d94160c312"} Apr 24 16:39:13.987571 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:13.987419 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:39:13.988593 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:13.988566 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4d6qk" event={"ID":"47d71194-f92b-4ce8-a112-c73134f86aa4","Type":"ContainerStarted","Data":"ec7d1ea2aba4a80bb6e8671410e795f08e5755d33e0b50f63f8a318757335901"} Apr 24 16:39:13.989846 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:13.989821 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-cwpk5" event={"ID":"f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1","Type":"ContainerStarted","Data":"42f91ef1712be998f9133f46ff62a3653490046ab22347faac3f9e7e41547176"} Apr 24 16:39:13.992704 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:13.992682 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-5hcrw" event={"ID":"050c36c1-b0b2-434a-91f9-c53a02f67059","Type":"ContainerStarted","Data":"a69421ef88c7e4989fe62e8fdb860645d88231f9bdbb3f454c779d4228345002"} Apr 24 16:39:14.002358 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:14.002298 2575 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-85nzt" podStartSLOduration=36.76693374 podStartE2EDuration="42.002282141s" podCreationTimestamp="2026-04-24 16:38:32 +0000 UTC" firstStartedPulling="2026-04-24 16:39:07.750787651 +0000 UTC m=+35.464855668" lastFinishedPulling="2026-04-24 16:39:12.986136041 +0000 UTC m=+40.700204069" observedRunningTime="2026-04-24 16:39:14.002195534 +0000 UTC m=+41.716263548" watchObservedRunningTime="2026-04-24 16:39:14.002282141 +0000 UTC m=+41.716350177" Apr 24 16:39:14.024069 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:14.024010 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-5hcrw" podStartSLOduration=8.905848986 podStartE2EDuration="41.023989278s" podCreationTimestamp="2026-04-24 16:38:33 +0000 UTC" firstStartedPulling="2026-04-24 16:38:35.648814575 +0000 UTC m=+3.362882589" lastFinishedPulling="2026-04-24 16:39:07.766954863 +0000 UTC m=+35.481022881" observedRunningTime="2026-04-24 16:39:14.022021784 +0000 UTC m=+41.736089820" watchObservedRunningTime="2026-04-24 16:39:14.023989278 +0000 UTC m=+41.738057315" Apr 24 16:39:14.036199 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:14.036147 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4d6qk" podStartSLOduration=28.791950907 podStartE2EDuration="34.036132007s" podCreationTimestamp="2026-04-24 16:38:40 +0000 UTC" firstStartedPulling="2026-04-24 16:39:07.746413317 +0000 UTC m=+35.460481334" lastFinishedPulling="2026-04-24 16:39:12.990594419 +0000 UTC m=+40.704662434" observedRunningTime="2026-04-24 16:39:14.035266793 +0000 UTC m=+41.749334827" watchObservedRunningTime="2026-04-24 16:39:14.036132007 +0000 UTC m=+41.750200042" Apr 24 16:39:21.519658 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:21.519609 2575 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:21.519658 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:21.519658 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert\") pod \"ingress-canary-z6w7t\" (UID: \"cdbe4c96-edde-4285-9466-eeb5fd3f169b\") " pod="openshift-ingress-canary/ingress-canary-z6w7t" Apr 24 16:39:21.520249 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:21.519675 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls\") pod \"dns-default-ckvvn\" (UID: \"ed7942f7-292f-402b-af38-8f0c16de0ee3\") " pod="openshift-dns/dns-default-ckvvn" Apr 24 16:39:21.520249 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:21.519759 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:21.520249 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:21.519768 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:39:21.520249 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:21.519812 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls podName:ed7942f7-292f-402b-af38-8f0c16de0ee3 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:37.519797179 +0000 UTC m=+65.233865213 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls") pod "dns-default-ckvvn" (UID: "ed7942f7-292f-402b-af38-8f0c16de0ee3") : secret "dns-default-metrics-tls" not found Apr 24 16:39:21.520249 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:21.519770 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:39:21.520249 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:21.519825 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert podName:cdbe4c96-edde-4285-9466-eeb5fd3f169b nodeName:}" failed. No retries permitted until 2026-04-24 16:39:37.519819742 +0000 UTC m=+65.233887756 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert") pod "ingress-canary-z6w7t" (UID: "cdbe4c96-edde-4285-9466-eeb5fd3f169b") : secret "canary-serving-cert" not found Apr 24 16:39:21.520249 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:21.519826 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66c895d4b7-5gft4: secret "image-registry-tls" not found Apr 24 16:39:21.520249 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:21.519860 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls podName:a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:37.519844331 +0000 UTC m=+65.233912344 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls") pod "image-registry-66c895d4b7-5gft4" (UID: "a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8") : secret "image-registry-tls" not found Apr 24 16:39:21.520249 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:21.519817 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fwvss\" (UID: \"9de18b4c-44be-4c2e-9b15-1b3401784bcd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fwvss" Apr 24 16:39:21.520249 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:21.519895 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 16:39:21.520249 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:21.519965 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert podName:9de18b4c-44be-4c2e-9b15-1b3401784bcd nodeName:}" failed. No retries permitted until 2026-04-24 16:39:37.519949394 +0000 UTC m=+65.234017409 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fwvss" (UID: "9de18b4c-44be-4c2e-9b15-1b3401784bcd") : secret "networking-console-plugin-cert" not found Apr 24 16:39:31.959883 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:31.959850 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jt59q" Apr 24 16:39:31.989532 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:31.989467 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-cwpk5" podStartSLOduration=36.751152944 podStartE2EDuration="41.989446044s" podCreationTimestamp="2026-04-24 16:38:50 +0000 UTC" firstStartedPulling="2026-04-24 16:39:07.748059053 +0000 UTC m=+35.462127070" lastFinishedPulling="2026-04-24 16:39:12.986352148 +0000 UTC m=+40.700420170" observedRunningTime="2026-04-24 16:39:14.051300849 +0000 UTC m=+41.765368883" watchObservedRunningTime="2026-04-24 16:39:31.989446044 +0000 UTC m=+59.703514081" Apr 24 16:39:37.540489 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:37.540425 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:39:37.540964 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:37.540508 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert\") pod \"ingress-canary-z6w7t\" (UID: \"cdbe4c96-edde-4285-9466-eeb5fd3f169b\") " pod="openshift-ingress-canary/ingress-canary-z6w7t" Apr 24 
16:39:37.540964 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:37.540534 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls\") pod \"dns-default-ckvvn\" (UID: \"ed7942f7-292f-402b-af38-8f0c16de0ee3\") " pod="openshift-dns/dns-default-ckvvn" Apr 24 16:39:37.540964 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:37.540565 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fwvss\" (UID: \"9de18b4c-44be-4c2e-9b15-1b3401784bcd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fwvss" Apr 24 16:39:37.540964 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:37.540596 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:39:37.540964 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:37.540620 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66c895d4b7-5gft4: secret "image-registry-tls" not found Apr 24 16:39:37.540964 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:37.540656 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:39:37.540964 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:37.540671 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 16:39:37.540964 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:37.540678 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:39:37.540964 ip-10-0-131-37 
kubenswrapper[2575]: E0424 16:39:37.540674 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls podName:a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:09.540657752 +0000 UTC m=+97.254725766 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls") pod "image-registry-66c895d4b7-5gft4" (UID: "a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8") : secret "image-registry-tls" not found Apr 24 16:39:37.540964 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:37.540724 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert podName:cdbe4c96-edde-4285-9466-eeb5fd3f169b nodeName:}" failed. No retries permitted until 2026-04-24 16:40:09.540709204 +0000 UTC m=+97.254777218 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert") pod "ingress-canary-z6w7t" (UID: "cdbe4c96-edde-4285-9466-eeb5fd3f169b") : secret "canary-serving-cert" not found Apr 24 16:39:37.540964 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:37.540740 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert podName:9de18b4c-44be-4c2e-9b15-1b3401784bcd nodeName:}" failed. No retries permitted until 2026-04-24 16:40:09.540730187 +0000 UTC m=+97.254798207 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fwvss" (UID: "9de18b4c-44be-4c2e-9b15-1b3401784bcd") : secret "networking-console-plugin-cert" not found Apr 24 16:39:37.540964 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:37.540763 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls podName:ed7942f7-292f-402b-af38-8f0c16de0ee3 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:09.540753667 +0000 UTC m=+97.254821689 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls") pod "dns-default-ckvvn" (UID: "ed7942f7-292f-402b-af38-8f0c16de0ee3") : secret "dns-default-metrics-tls" not found Apr 24 16:39:38.548980 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:38.548935 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs\") pod \"network-metrics-daemon-f9bsr\" (UID: \"10ab450a-933f-4b41-8316-09109770ac99\") " pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:39:38.549540 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:38.549126 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 16:39:38.549540 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:39:38.549216 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs podName:10ab450a-933f-4b41-8316-09109770ac99 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:42.549193994 +0000 UTC m=+130.263262010 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs") pod "network-metrics-daemon-f9bsr" (UID: "10ab450a-933f-4b41-8316-09109770ac99") : secret "metrics-daemon-secret" not found Apr 24 16:39:44.997457 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:39:44.997425 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-85nzt" Apr 24 16:40:09.580908 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:40:09.580868 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:40:09.580908 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:40:09.580914 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert\") pod \"ingress-canary-z6w7t\" (UID: \"cdbe4c96-edde-4285-9466-eeb5fd3f169b\") " pod="openshift-ingress-canary/ingress-canary-z6w7t" Apr 24 16:40:09.581398 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:40:09.580932 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls\") pod \"dns-default-ckvvn\" (UID: \"ed7942f7-292f-402b-af38-8f0c16de0ee3\") " pod="openshift-dns/dns-default-ckvvn" Apr 24 16:40:09.581398 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:40:09.580994 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert\") pod 
\"networking-console-plugin-cb95c66f6-fwvss\" (UID: \"9de18b4c-44be-4c2e-9b15-1b3401784bcd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fwvss" Apr 24 16:40:09.581398 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:40:09.581032 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:40:09.581398 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:40:09.581083 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:40:09.581398 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:40:09.581112 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert podName:cdbe4c96-edde-4285-9466-eeb5fd3f169b nodeName:}" failed. No retries permitted until 2026-04-24 16:41:13.58107632 +0000 UTC m=+161.295144353 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert") pod "ingress-canary-z6w7t" (UID: "cdbe4c96-edde-4285-9466-eeb5fd3f169b") : secret "canary-serving-cert" not found Apr 24 16:40:09.581398 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:40:09.581120 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66c895d4b7-5gft4: secret "image-registry-tls" not found Apr 24 16:40:09.581398 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:40:09.581173 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls podName:a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:13.581161131 +0000 UTC m=+161.295229166 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls") pod "image-registry-66c895d4b7-5gft4" (UID: "a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8") : secret "image-registry-tls" not found Apr 24 16:40:09.581398 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:40:09.581175 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 16:40:09.581398 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:40:09.581196 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:40:09.581398 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:40:09.581266 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert podName:9de18b4c-44be-4c2e-9b15-1b3401784bcd nodeName:}" failed. No retries permitted until 2026-04-24 16:41:13.581248409 +0000 UTC m=+161.295316425 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fwvss" (UID: "9de18b4c-44be-4c2e-9b15-1b3401784bcd") : secret "networking-console-plugin-cert" not found Apr 24 16:40:09.581398 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:40:09.581292 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls podName:ed7942f7-292f-402b-af38-8f0c16de0ee3 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:13.581280568 +0000 UTC m=+161.295348581 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls") pod "dns-default-ckvvn" (UID: "ed7942f7-292f-402b-af38-8f0c16de0ee3") : secret "dns-default-metrics-tls" not found Apr 24 16:40:42.623215 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:40:42.623156 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs\") pod \"network-metrics-daemon-f9bsr\" (UID: \"10ab450a-933f-4b41-8316-09109770ac99\") " pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:40:42.623771 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:40:42.623321 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 16:40:42.624864 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:40:42.624836 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs podName:10ab450a-933f-4b41-8316-09109770ac99 nodeName:}" failed. No retries permitted until 2026-04-24 16:42:44.624254516 +0000 UTC m=+252.338322544 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs") pod "network-metrics-daemon-f9bsr" (UID: "10ab450a-933f-4b41-8316-09109770ac99") : secret "metrics-daemon-secret" not found Apr 24 16:41:00.974288 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:00.974254 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-ccsnq"] Apr 24 16:41:00.977103 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:00.977074 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ccsnq" Apr 24 16:41:00.979704 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:00.979674 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 24 16:41:00.981857 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:00.981819 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 16:41:00.982065 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:00.982047 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 16:41:00.982065 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:00.982050 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 24 16:41:00.982241 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:00.982187 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-hp5ct\"" Apr 24 16:41:00.992411 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:00.992379 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-ccsnq"] Apr 24 16:41:01.062243 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.062192 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/52705a67-f97f-488e-adc3-2f562fd2fd0e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ccsnq\" (UID: \"52705a67-f97f-488e-adc3-2f562fd2fd0e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ccsnq" Apr 24 16:41:01.062243 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.062231 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/52705a67-f97f-488e-adc3-2f562fd2fd0e-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-ccsnq\" (UID: \"52705a67-f97f-488e-adc3-2f562fd2fd0e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ccsnq" Apr 24 16:41:01.062451 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.062253 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8qbm\" (UniqueName: \"kubernetes.io/projected/52705a67-f97f-488e-adc3-2f562fd2fd0e-kube-api-access-k8qbm\") pod \"cluster-monitoring-operator-75587bd455-ccsnq\" (UID: \"52705a67-f97f-488e-adc3-2f562fd2fd0e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ccsnq" Apr 24 16:41:01.079045 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.079010 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-df7xx"] Apr 24 16:41:01.081960 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.081941 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-df7xx" Apr 24 16:41:01.084416 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.084394 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 24 16:41:01.084604 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.084584 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 16:41:01.084689 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.084628 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 16:41:01.084787 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.084766 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 24 16:41:01.084923 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.084910 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-hzkcl\"" Apr 24 16:41:01.091225 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.091203 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 24 16:41:01.094027 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.094002 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-df7xx"] Apr 24 16:41:01.163377 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.163336 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1786d47-e613-4796-a98f-1ea71904bff8-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-df7xx\" (UID: \"c1786d47-e613-4796-a98f-1ea71904bff8\") " 
pod="openshift-insights/insights-operator-585dfdc468-df7xx" Apr 24 16:41:01.163377 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.163378 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1786d47-e613-4796-a98f-1ea71904bff8-service-ca-bundle\") pod \"insights-operator-585dfdc468-df7xx\" (UID: \"c1786d47-e613-4796-a98f-1ea71904bff8\") " pod="openshift-insights/insights-operator-585dfdc468-df7xx" Apr 24 16:41:01.163609 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.163400 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c1786d47-e613-4796-a98f-1ea71904bff8-tmp\") pod \"insights-operator-585dfdc468-df7xx\" (UID: \"c1786d47-e613-4796-a98f-1ea71904bff8\") " pod="openshift-insights/insights-operator-585dfdc468-df7xx" Apr 24 16:41:01.163609 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.163417 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfpkv\" (UniqueName: \"kubernetes.io/projected/c1786d47-e613-4796-a98f-1ea71904bff8-kube-api-access-qfpkv\") pod \"insights-operator-585dfdc468-df7xx\" (UID: \"c1786d47-e613-4796-a98f-1ea71904bff8\") " pod="openshift-insights/insights-operator-585dfdc468-df7xx" Apr 24 16:41:01.163609 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.163565 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/52705a67-f97f-488e-adc3-2f562fd2fd0e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ccsnq\" (UID: \"52705a67-f97f-488e-adc3-2f562fd2fd0e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ccsnq" Apr 24 16:41:01.163609 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.163596 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c1786d47-e613-4796-a98f-1ea71904bff8-snapshots\") pod \"insights-operator-585dfdc468-df7xx\" (UID: \"c1786d47-e613-4796-a98f-1ea71904bff8\") " pod="openshift-insights/insights-operator-585dfdc468-df7xx" Apr 24 16:41:01.163733 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.163621 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/52705a67-f97f-488e-adc3-2f562fd2fd0e-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-ccsnq\" (UID: \"52705a67-f97f-488e-adc3-2f562fd2fd0e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ccsnq" Apr 24 16:41:01.163733 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.163647 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8qbm\" (UniqueName: \"kubernetes.io/projected/52705a67-f97f-488e-adc3-2f562fd2fd0e-kube-api-access-k8qbm\") pod \"cluster-monitoring-operator-75587bd455-ccsnq\" (UID: \"52705a67-f97f-488e-adc3-2f562fd2fd0e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ccsnq" Apr 24 16:41:01.163733 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.163665 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1786d47-e613-4796-a98f-1ea71904bff8-serving-cert\") pod \"insights-operator-585dfdc468-df7xx\" (UID: \"c1786d47-e613-4796-a98f-1ea71904bff8\") " pod="openshift-insights/insights-operator-585dfdc468-df7xx" Apr 24 16:41:01.163820 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:01.163759 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:01.163852 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:01.163823 2575 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52705a67-f97f-488e-adc3-2f562fd2fd0e-cluster-monitoring-operator-tls podName:52705a67-f97f-488e-adc3-2f562fd2fd0e nodeName:}" failed. No retries permitted until 2026-04-24 16:41:01.663808206 +0000 UTC m=+149.377876220 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/52705a67-f97f-488e-adc3-2f562fd2fd0e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-ccsnq" (UID: "52705a67-f97f-488e-adc3-2f562fd2fd0e") : secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:01.164411 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.164392 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/52705a67-f97f-488e-adc3-2f562fd2fd0e-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-ccsnq\" (UID: \"52705a67-f97f-488e-adc3-2f562fd2fd0e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ccsnq" Apr 24 16:41:01.174282 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.174249 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8qbm\" (UniqueName: \"kubernetes.io/projected/52705a67-f97f-488e-adc3-2f562fd2fd0e-kube-api-access-k8qbm\") pod \"cluster-monitoring-operator-75587bd455-ccsnq\" (UID: \"52705a67-f97f-488e-adc3-2f562fd2fd0e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ccsnq" Apr 24 16:41:01.264404 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.264305 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1786d47-e613-4796-a98f-1ea71904bff8-service-ca-bundle\") pod \"insights-operator-585dfdc468-df7xx\" (UID: \"c1786d47-e613-4796-a98f-1ea71904bff8\") " pod="openshift-insights/insights-operator-585dfdc468-df7xx" Apr 
24 16:41:01.264404 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.264362 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c1786d47-e613-4796-a98f-1ea71904bff8-tmp\") pod \"insights-operator-585dfdc468-df7xx\" (UID: \"c1786d47-e613-4796-a98f-1ea71904bff8\") " pod="openshift-insights/insights-operator-585dfdc468-df7xx" Apr 24 16:41:01.264404 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.264387 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfpkv\" (UniqueName: \"kubernetes.io/projected/c1786d47-e613-4796-a98f-1ea71904bff8-kube-api-access-qfpkv\") pod \"insights-operator-585dfdc468-df7xx\" (UID: \"c1786d47-e613-4796-a98f-1ea71904bff8\") " pod="openshift-insights/insights-operator-585dfdc468-df7xx" Apr 24 16:41:01.264660 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.264506 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c1786d47-e613-4796-a98f-1ea71904bff8-snapshots\") pod \"insights-operator-585dfdc468-df7xx\" (UID: \"c1786d47-e613-4796-a98f-1ea71904bff8\") " pod="openshift-insights/insights-operator-585dfdc468-df7xx" Apr 24 16:41:01.264660 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.264630 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1786d47-e613-4796-a98f-1ea71904bff8-serving-cert\") pod \"insights-operator-585dfdc468-df7xx\" (UID: \"c1786d47-e613-4796-a98f-1ea71904bff8\") " pod="openshift-insights/insights-operator-585dfdc468-df7xx" Apr 24 16:41:01.264757 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.264695 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1786d47-e613-4796-a98f-1ea71904bff8-trusted-ca-bundle\") pod 
\"insights-operator-585dfdc468-df7xx\" (UID: \"c1786d47-e613-4796-a98f-1ea71904bff8\") " pod="openshift-insights/insights-operator-585dfdc468-df7xx" Apr 24 16:41:01.265037 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.265007 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1786d47-e613-4796-a98f-1ea71904bff8-service-ca-bundle\") pod \"insights-operator-585dfdc468-df7xx\" (UID: \"c1786d47-e613-4796-a98f-1ea71904bff8\") " pod="openshift-insights/insights-operator-585dfdc468-df7xx" Apr 24 16:41:01.265145 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.265035 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c1786d47-e613-4796-a98f-1ea71904bff8-tmp\") pod \"insights-operator-585dfdc468-df7xx\" (UID: \"c1786d47-e613-4796-a98f-1ea71904bff8\") " pod="openshift-insights/insights-operator-585dfdc468-df7xx" Apr 24 16:41:01.265185 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.265159 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/c1786d47-e613-4796-a98f-1ea71904bff8-snapshots\") pod \"insights-operator-585dfdc468-df7xx\" (UID: \"c1786d47-e613-4796-a98f-1ea71904bff8\") " pod="openshift-insights/insights-operator-585dfdc468-df7xx" Apr 24 16:41:01.265702 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.265684 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c1786d47-e613-4796-a98f-1ea71904bff8-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-df7xx\" (UID: \"c1786d47-e613-4796-a98f-1ea71904bff8\") " pod="openshift-insights/insights-operator-585dfdc468-df7xx" Apr 24 16:41:01.266955 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.266933 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c1786d47-e613-4796-a98f-1ea71904bff8-serving-cert\") pod \"insights-operator-585dfdc468-df7xx\" (UID: \"c1786d47-e613-4796-a98f-1ea71904bff8\") " pod="openshift-insights/insights-operator-585dfdc468-df7xx" Apr 24 16:41:01.277812 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.277775 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfpkv\" (UniqueName: \"kubernetes.io/projected/c1786d47-e613-4796-a98f-1ea71904bff8-kube-api-access-qfpkv\") pod \"insights-operator-585dfdc468-df7xx\" (UID: \"c1786d47-e613-4796-a98f-1ea71904bff8\") " pod="openshift-insights/insights-operator-585dfdc468-df7xx" Apr 24 16:41:01.392621 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.392578 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-df7xx" Apr 24 16:41:01.513023 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.512991 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-df7xx"] Apr 24 16:41:01.668231 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:01.668193 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/52705a67-f97f-488e-adc3-2f562fd2fd0e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ccsnq\" (UID: \"52705a67-f97f-488e-adc3-2f562fd2fd0e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ccsnq" Apr 24 16:41:01.668427 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:01.668360 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:01.668489 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:01.668444 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/52705a67-f97f-488e-adc3-2f562fd2fd0e-cluster-monitoring-operator-tls podName:52705a67-f97f-488e-adc3-2f562fd2fd0e nodeName:}" failed. No retries permitted until 2026-04-24 16:41:02.668420254 +0000 UTC m=+150.382488289 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/52705a67-f97f-488e-adc3-2f562fd2fd0e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-ccsnq" (UID: "52705a67-f97f-488e-adc3-2f562fd2fd0e") : secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:02.209990 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:02.209951 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-df7xx" event={"ID":"c1786d47-e613-4796-a98f-1ea71904bff8","Type":"ContainerStarted","Data":"9de4a5a861a4bd631ad476eb0fcb89884007605bc2f45194f6fb28588e120a1d"} Apr 24 16:41:02.677669 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:02.677630 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/52705a67-f97f-488e-adc3-2f562fd2fd0e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ccsnq\" (UID: \"52705a67-f97f-488e-adc3-2f562fd2fd0e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ccsnq" Apr 24 16:41:02.677858 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:02.677773 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:02.677858 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:02.677835 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52705a67-f97f-488e-adc3-2f562fd2fd0e-cluster-monitoring-operator-tls podName:52705a67-f97f-488e-adc3-2f562fd2fd0e nodeName:}" failed. 
No retries permitted until 2026-04-24 16:41:04.677817944 +0000 UTC m=+152.391885960 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/52705a67-f97f-488e-adc3-2f562fd2fd0e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-ccsnq" (UID: "52705a67-f97f-488e-adc3-2f562fd2fd0e") : secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:03.217574 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:03.217535 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-df7xx" event={"ID":"c1786d47-e613-4796-a98f-1ea71904bff8","Type":"ContainerStarted","Data":"3b9ae78b21e42abdc6034e548b7291c53378cfece72678fa2ea056a002bcec69"} Apr 24 16:41:03.234382 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:03.234324 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-df7xx" podStartSLOduration=0.637851548 podStartE2EDuration="2.234301289s" podCreationTimestamp="2026-04-24 16:41:01 +0000 UTC" firstStartedPulling="2026-04-24 16:41:01.517207385 +0000 UTC m=+149.231275398" lastFinishedPulling="2026-04-24 16:41:03.113657123 +0000 UTC m=+150.827725139" observedRunningTime="2026-04-24 16:41:03.233927996 +0000 UTC m=+150.947996033" watchObservedRunningTime="2026-04-24 16:41:03.234301289 +0000 UTC m=+150.948369324" Apr 24 16:41:04.693082 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:04.693033 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/52705a67-f97f-488e-adc3-2f562fd2fd0e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ccsnq\" (UID: \"52705a67-f97f-488e-adc3-2f562fd2fd0e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ccsnq" Apr 24 16:41:04.693505 ip-10-0-131-37 kubenswrapper[2575]: E0424 
16:41:04.693217 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:04.693505 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:04.693304 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52705a67-f97f-488e-adc3-2f562fd2fd0e-cluster-monitoring-operator-tls podName:52705a67-f97f-488e-adc3-2f562fd2fd0e nodeName:}" failed. No retries permitted until 2026-04-24 16:41:08.693287664 +0000 UTC m=+156.407355678 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/52705a67-f97f-488e-adc3-2f562fd2fd0e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-ccsnq" (UID: "52705a67-f97f-488e-adc3-2f562fd2fd0e") : secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:06.138522 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:06.138493 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7hmxs_06388f4e-daeb-4db0-906e-01adfa3e3b97/dns-node-resolver/0.log" Apr 24 16:41:07.138189 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:07.138164 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-psxtk_34943398-acf4-440b-900d-999cb567a483/node-ca/0.log" Apr 24 16:41:08.668812 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:08.668766 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" podUID="a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8" Apr 24 16:41:08.686701 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:08.686657 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline 
exceeded" pod="openshift-dns/dns-default-ckvvn" podUID="ed7942f7-292f-402b-af38-8f0c16de0ee3" Apr 24 16:41:08.696868 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:08.696830 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fwvss" podUID="9de18b4c-44be-4c2e-9b15-1b3401784bcd" Apr 24 16:41:08.713199 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:08.713164 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-z6w7t" podUID="cdbe4c96-edde-4285-9466-eeb5fd3f169b" Apr 24 16:41:08.721448 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:08.721412 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/52705a67-f97f-488e-adc3-2f562fd2fd0e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ccsnq\" (UID: \"52705a67-f97f-488e-adc3-2f562fd2fd0e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ccsnq" Apr 24 16:41:08.721588 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:08.721568 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:08.721644 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:08.721635 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52705a67-f97f-488e-adc3-2f562fd2fd0e-cluster-monitoring-operator-tls podName:52705a67-f97f-488e-adc3-2f562fd2fd0e nodeName:}" failed. No retries permitted until 2026-04-24 16:41:16.721619871 +0000 UTC m=+164.435687884 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/52705a67-f97f-488e-adc3-2f562fd2fd0e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-ccsnq" (UID: "52705a67-f97f-488e-adc3-2f562fd2fd0e") : secret "cluster-monitoring-operator-tls" not found Apr 24 16:41:08.854444 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:08.854396 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-f9bsr" podUID="10ab450a-933f-4b41-8316-09109770ac99" Apr 24 16:41:09.229764 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:09.229732 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ckvvn" Apr 24 16:41:09.229937 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:09.229732 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z6w7t" Apr 24 16:41:09.229937 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:09.229732 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fwvss" Apr 24 16:41:09.230009 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:09.229732 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:41:11.084482 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.084440 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-whl7q"] Apr 24 16:41:11.087830 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.087804 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-whl7q" Apr 24 16:41:11.090271 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.090249 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 24 16:41:11.090271 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.090261 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 24 16:41:11.090466 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.090276 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:41:11.091176 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.091160 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-5d9jm\"" Apr 24 16:41:11.091259 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.091244 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 24 16:41:11.099800 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.099775 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-whl7q"] Apr 24 16:41:11.139904 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.139862 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7678b4b-fef3-4d8d-92a1-96d074b744a0-config\") pod \"service-ca-operator-d6fc45fc5-whl7q\" (UID: \"a7678b4b-fef3-4d8d-92a1-96d074b744a0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-whl7q" Apr 24 16:41:11.139904 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.139900 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7678b4b-fef3-4d8d-92a1-96d074b744a0-serving-cert\") pod \"service-ca-operator-d6fc45fc5-whl7q\" (UID: \"a7678b4b-fef3-4d8d-92a1-96d074b744a0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-whl7q" Apr 24 16:41:11.140150 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.140066 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxktn\" (UniqueName: \"kubernetes.io/projected/a7678b4b-fef3-4d8d-92a1-96d074b744a0-kube-api-access-jxktn\") pod \"service-ca-operator-d6fc45fc5-whl7q\" (UID: \"a7678b4b-fef3-4d8d-92a1-96d074b744a0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-whl7q" Apr 24 16:41:11.188850 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.188816 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-cf274"] Apr 24 16:41:11.191927 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.191908 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-cf274" Apr 24 16:41:11.195265 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.195235 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:41:11.195414 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.195265 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 24 16:41:11.195607 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.195275 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-7dksk\"" Apr 24 16:41:11.195759 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.195297 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 24 16:41:11.195869 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.195818 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 24 16:41:11.196441 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.196424 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qtzff"] Apr 24 16:41:11.199322 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.199209 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wxbj2"] Apr 24 16:41:11.199580 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.199558 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qtzff" Apr 24 16:41:11.201813 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.201796 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 24 16:41:11.202271 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.202252 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:41:11.202374 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.202276 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-d5f4p\"" Apr 24 16:41:11.202434 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.202396 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-cf274"] Apr 24 16:41:11.202534 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.202520 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wxbj2" Apr 24 16:41:11.204886 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.204867 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 24 16:41:11.204886 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.204897 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-st957\"" Apr 24 16:41:11.205050 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.205012 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 24 16:41:11.205526 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.205508 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 24 16:41:11.210962 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.210938 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qtzff"] Apr 24 16:41:11.212050 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.212030 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wxbj2"] Apr 24 16:41:11.240715 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.240680 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxktn\" (UniqueName: \"kubernetes.io/projected/a7678b4b-fef3-4d8d-92a1-96d074b744a0-kube-api-access-jxktn\") pod \"service-ca-operator-d6fc45fc5-whl7q\" (UID: \"a7678b4b-fef3-4d8d-92a1-96d074b744a0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-whl7q" Apr 24 16:41:11.240715 ip-10-0-131-37 kubenswrapper[2575]: I0424 
16:41:11.240721 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vr2t\" (UniqueName: \"kubernetes.io/projected/a3504429-9f84-4f23-a196-d187ad6d16d6-kube-api-access-9vr2t\") pod \"kube-storage-version-migrator-operator-6769c5d45-cf274\" (UID: \"a3504429-9f84-4f23-a196-d187ad6d16d6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-cf274"
Apr 24 16:41:11.240976 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.240756 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3504429-9f84-4f23-a196-d187ad6d16d6-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-cf274\" (UID: \"a3504429-9f84-4f23-a196-d187ad6d16d6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-cf274"
Apr 24 16:41:11.240976 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.240778 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7678b4b-fef3-4d8d-92a1-96d074b744a0-config\") pod \"service-ca-operator-d6fc45fc5-whl7q\" (UID: \"a7678b4b-fef3-4d8d-92a1-96d074b744a0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-whl7q"
Apr 24 16:41:11.240976 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.240795 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7678b4b-fef3-4d8d-92a1-96d074b744a0-serving-cert\") pod \"service-ca-operator-d6fc45fc5-whl7q\" (UID: \"a7678b4b-fef3-4d8d-92a1-96d074b744a0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-whl7q"
Apr 24 16:41:11.240976 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.240819 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3504429-9f84-4f23-a196-d187ad6d16d6-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-cf274\" (UID: \"a3504429-9f84-4f23-a196-d187ad6d16d6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-cf274"
Apr 24 16:41:11.241394 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.241371 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a7678b4b-fef3-4d8d-92a1-96d074b744a0-config\") pod \"service-ca-operator-d6fc45fc5-whl7q\" (UID: \"a7678b4b-fef3-4d8d-92a1-96d074b744a0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-whl7q"
Apr 24 16:41:11.243250 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.243224 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a7678b4b-fef3-4d8d-92a1-96d074b744a0-serving-cert\") pod \"service-ca-operator-d6fc45fc5-whl7q\" (UID: \"a7678b4b-fef3-4d8d-92a1-96d074b744a0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-whl7q"
Apr 24 16:41:11.249062 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.249036 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxktn\" (UniqueName: \"kubernetes.io/projected/a7678b4b-fef3-4d8d-92a1-96d074b744a0-kube-api-access-jxktn\") pod \"service-ca-operator-d6fc45fc5-whl7q\" (UID: \"a7678b4b-fef3-4d8d-92a1-96d074b744a0\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-whl7q"
Apr 24 16:41:11.341580 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.341479 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mvbj\" (UniqueName: \"kubernetes.io/projected/d199c90e-ff14-4334-8ee4-d6f19aa8c243-kube-api-access-8mvbj\") pod \"volume-data-source-validator-7c6cbb6c87-qtzff\" (UID: \"d199c90e-ff14-4334-8ee4-d6f19aa8c243\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qtzff"
Apr 24 16:41:11.341580 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.341530 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jbdj\" (UniqueName: \"kubernetes.io/projected/fd3fdb31-2e0f-48dc-8731-82c3abae5caf-kube-api-access-9jbdj\") pod \"cluster-samples-operator-6dc5bdb6b4-wxbj2\" (UID: \"fd3fdb31-2e0f-48dc-8731-82c3abae5caf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wxbj2"
Apr 24 16:41:11.341580 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.341549 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vr2t\" (UniqueName: \"kubernetes.io/projected/a3504429-9f84-4f23-a196-d187ad6d16d6-kube-api-access-9vr2t\") pod \"kube-storage-version-migrator-operator-6769c5d45-cf274\" (UID: \"a3504429-9f84-4f23-a196-d187ad6d16d6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-cf274"
Apr 24 16:41:11.341790 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.341625 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3504429-9f84-4f23-a196-d187ad6d16d6-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-cf274\" (UID: \"a3504429-9f84-4f23-a196-d187ad6d16d6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-cf274"
Apr 24 16:41:11.341790 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.341678 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3504429-9f84-4f23-a196-d187ad6d16d6-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-cf274\" (UID: \"a3504429-9f84-4f23-a196-d187ad6d16d6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-cf274"
Apr 24 16:41:11.341790 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.341708 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fd3fdb31-2e0f-48dc-8731-82c3abae5caf-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wxbj2\" (UID: \"fd3fdb31-2e0f-48dc-8731-82c3abae5caf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wxbj2"
Apr 24 16:41:11.342265 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.342240 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3504429-9f84-4f23-a196-d187ad6d16d6-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-cf274\" (UID: \"a3504429-9f84-4f23-a196-d187ad6d16d6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-cf274"
Apr 24 16:41:11.343810 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.343792 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3504429-9f84-4f23-a196-d187ad6d16d6-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-cf274\" (UID: \"a3504429-9f84-4f23-a196-d187ad6d16d6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-cf274"
Apr 24 16:41:11.349736 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.349714 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vr2t\" (UniqueName: \"kubernetes.io/projected/a3504429-9f84-4f23-a196-d187ad6d16d6-kube-api-access-9vr2t\") pod \"kube-storage-version-migrator-operator-6769c5d45-cf274\" (UID: \"a3504429-9f84-4f23-a196-d187ad6d16d6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-cf274"
Apr 24 16:41:11.396749 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.396707 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-whl7q"
Apr 24 16:41:11.443135 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.442939 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mvbj\" (UniqueName: \"kubernetes.io/projected/d199c90e-ff14-4334-8ee4-d6f19aa8c243-kube-api-access-8mvbj\") pod \"volume-data-source-validator-7c6cbb6c87-qtzff\" (UID: \"d199c90e-ff14-4334-8ee4-d6f19aa8c243\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qtzff"
Apr 24 16:41:11.443135 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.442989 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jbdj\" (UniqueName: \"kubernetes.io/projected/fd3fdb31-2e0f-48dc-8731-82c3abae5caf-kube-api-access-9jbdj\") pod \"cluster-samples-operator-6dc5bdb6b4-wxbj2\" (UID: \"fd3fdb31-2e0f-48dc-8731-82c3abae5caf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wxbj2"
Apr 24 16:41:11.443135 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.443104 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fd3fdb31-2e0f-48dc-8731-82c3abae5caf-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wxbj2\" (UID: \"fd3fdb31-2e0f-48dc-8731-82c3abae5caf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wxbj2"
Apr 24 16:41:11.443327 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:11.443239 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 16:41:11.443327 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:11.443302 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd3fdb31-2e0f-48dc-8731-82c3abae5caf-samples-operator-tls podName:fd3fdb31-2e0f-48dc-8731-82c3abae5caf nodeName:}" failed. No retries permitted until 2026-04-24 16:41:11.943282815 +0000 UTC m=+159.657350841 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/fd3fdb31-2e0f-48dc-8731-82c3abae5caf-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wxbj2" (UID: "fd3fdb31-2e0f-48dc-8731-82c3abae5caf") : secret "samples-operator-tls" not found
Apr 24 16:41:11.455149 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.455071 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jbdj\" (UniqueName: \"kubernetes.io/projected/fd3fdb31-2e0f-48dc-8731-82c3abae5caf-kube-api-access-9jbdj\") pod \"cluster-samples-operator-6dc5bdb6b4-wxbj2\" (UID: \"fd3fdb31-2e0f-48dc-8731-82c3abae5caf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wxbj2"
Apr 24 16:41:11.455497 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.455473 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mvbj\" (UniqueName: \"kubernetes.io/projected/d199c90e-ff14-4334-8ee4-d6f19aa8c243-kube-api-access-8mvbj\") pod \"volume-data-source-validator-7c6cbb6c87-qtzff\" (UID: \"d199c90e-ff14-4334-8ee4-d6f19aa8c243\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qtzff"
Apr 24 16:41:11.502596 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.502563 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-cf274"
Apr 24 16:41:11.513423 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.513394 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qtzff"
Apr 24 16:41:11.513851 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.513827 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-whl7q"]
Apr 24 16:41:11.518725 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:41:11.518695 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7678b4b_fef3_4d8d_92a1_96d074b744a0.slice/crio-3135cd1c6c5e75ece2d7d04167b83c9662ae1e59432ceefa41afdc40d2fe74b0 WatchSource:0}: Error finding container 3135cd1c6c5e75ece2d7d04167b83c9662ae1e59432ceefa41afdc40d2fe74b0: Status 404 returned error can't find the container with id 3135cd1c6c5e75ece2d7d04167b83c9662ae1e59432ceefa41afdc40d2fe74b0
Apr 24 16:41:11.629019 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.628971 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-cf274"]
Apr 24 16:41:11.633302 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:41:11.633273 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3504429_9f84_4f23_a196_d187ad6d16d6.slice/crio-c48e3dbb9efbd3b8a8723fd6f21782bd3bda136295ebd7f1d15da645380d56e7 WatchSource:0}: Error finding container c48e3dbb9efbd3b8a8723fd6f21782bd3bda136295ebd7f1d15da645380d56e7: Status 404 returned error can't find the container with id c48e3dbb9efbd3b8a8723fd6f21782bd3bda136295ebd7f1d15da645380d56e7
Apr 24 16:41:11.645798 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.645773 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qtzff"]
Apr 24 16:41:11.648702 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:41:11.648669 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd199c90e_ff14_4334_8ee4_d6f19aa8c243.slice/crio-32794d84aa551188fc41a268eb8b4fc36ae549789a4c2cba16df830e6acded9e WatchSource:0}: Error finding container 32794d84aa551188fc41a268eb8b4fc36ae549789a4c2cba16df830e6acded9e: Status 404 returned error can't find the container with id 32794d84aa551188fc41a268eb8b4fc36ae549789a4c2cba16df830e6acded9e
Apr 24 16:41:11.947207 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:11.947167 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fd3fdb31-2e0f-48dc-8731-82c3abae5caf-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wxbj2\" (UID: \"fd3fdb31-2e0f-48dc-8731-82c3abae5caf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wxbj2"
Apr 24 16:41:11.947389 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:11.947292 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 16:41:11.947389 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:11.947347 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd3fdb31-2e0f-48dc-8731-82c3abae5caf-samples-operator-tls podName:fd3fdb31-2e0f-48dc-8731-82c3abae5caf nodeName:}" failed. No retries permitted until 2026-04-24 16:41:12.947332942 +0000 UTC m=+160.661400958 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/fd3fdb31-2e0f-48dc-8731-82c3abae5caf-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wxbj2" (UID: "fd3fdb31-2e0f-48dc-8731-82c3abae5caf") : secret "samples-operator-tls" not found
Apr 24 16:41:12.237920 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:12.237794 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-cf274" event={"ID":"a3504429-9f84-4f23-a196-d187ad6d16d6","Type":"ContainerStarted","Data":"c48e3dbb9efbd3b8a8723fd6f21782bd3bda136295ebd7f1d15da645380d56e7"}
Apr 24 16:41:12.241135 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:12.241061 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qtzff" event={"ID":"d199c90e-ff14-4334-8ee4-d6f19aa8c243","Type":"ContainerStarted","Data":"32794d84aa551188fc41a268eb8b4fc36ae549789a4c2cba16df830e6acded9e"}
Apr 24 16:41:12.243616 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:12.243581 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-whl7q" event={"ID":"a7678b4b-fef3-4d8d-92a1-96d074b744a0","Type":"ContainerStarted","Data":"3135cd1c6c5e75ece2d7d04167b83c9662ae1e59432ceefa41afdc40d2fe74b0"}
Apr 24 16:41:12.954265 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:12.954217 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fd3fdb31-2e0f-48dc-8731-82c3abae5caf-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wxbj2\" (UID: \"fd3fdb31-2e0f-48dc-8731-82c3abae5caf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wxbj2"
Apr 24 16:41:12.954444 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:12.954396 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 16:41:12.954496 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:12.954484 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd3fdb31-2e0f-48dc-8731-82c3abae5caf-samples-operator-tls podName:fd3fdb31-2e0f-48dc-8731-82c3abae5caf nodeName:}" failed. No retries permitted until 2026-04-24 16:41:14.954466194 +0000 UTC m=+162.668534208 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/fd3fdb31-2e0f-48dc-8731-82c3abae5caf-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wxbj2" (UID: "fd3fdb31-2e0f-48dc-8731-82c3abae5caf") : secret "samples-operator-tls" not found
Apr 24 16:41:13.660870 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:13.660818 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert\") pod \"ingress-canary-z6w7t\" (UID: \"cdbe4c96-edde-4285-9466-eeb5fd3f169b\") " pod="openshift-ingress-canary/ingress-canary-z6w7t"
Apr 24 16:41:13.660870 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:13.660876 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls\") pod \"dns-default-ckvvn\" (UID: \"ed7942f7-292f-402b-af38-8f0c16de0ee3\") " pod="openshift-dns/dns-default-ckvvn"
Apr 24 16:41:13.661497 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:13.660954 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fwvss\" (UID: \"9de18b4c-44be-4c2e-9b15-1b3401784bcd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fwvss"
Apr 24 16:41:13.661497 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:13.660979 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:41:13.661497 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:13.661062 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 16:41:13.661497 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:13.661067 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:41:13.661497 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:13.660990 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls\") pod \"image-registry-66c895d4b7-5gft4\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " pod="openshift-image-registry/image-registry-66c895d4b7-5gft4"
Apr 24 16:41:13.661497 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:13.661064 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 16:41:13.661497 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:13.661150 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-66c895d4b7-5gft4: secret "image-registry-tls" not found
Apr 24 16:41:13.661497 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:13.661079 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert podName:cdbe4c96-edde-4285-9466-eeb5fd3f169b nodeName:}" failed. No retries permitted until 2026-04-24 16:43:15.661057506 +0000 UTC m=+283.375125524 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert") pod "ingress-canary-z6w7t" (UID: "cdbe4c96-edde-4285-9466-eeb5fd3f169b") : secret "canary-serving-cert" not found
Apr 24 16:41:13.661497 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:13.661210 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert podName:9de18b4c-44be-4c2e-9b15-1b3401784bcd nodeName:}" failed. No retries permitted until 2026-04-24 16:43:15.661190895 +0000 UTC m=+283.375258913 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-fwvss" (UID: "9de18b4c-44be-4c2e-9b15-1b3401784bcd") : secret "networking-console-plugin-cert" not found
Apr 24 16:41:13.661497 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:13.661235 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls podName:ed7942f7-292f-402b-af38-8f0c16de0ee3 nodeName:}" failed. No retries permitted until 2026-04-24 16:43:15.661222738 +0000 UTC m=+283.375290761 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls") pod "dns-default-ckvvn" (UID: "ed7942f7-292f-402b-af38-8f0c16de0ee3") : secret "dns-default-metrics-tls" not found
Apr 24 16:41:13.661497 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:13.661259 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls podName:a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8 nodeName:}" failed. No retries permitted until 2026-04-24 16:43:15.661249973 +0000 UTC m=+283.375317993 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls") pod "image-registry-66c895d4b7-5gft4" (UID: "a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8") : secret "image-registry-tls" not found
Apr 24 16:41:14.249878 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:14.249784 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qtzff" event={"ID":"d199c90e-ff14-4334-8ee4-d6f19aa8c243","Type":"ContainerStarted","Data":"9ca0eed0c785d8514ae9a0bdabeacce473d716934911a454f43f4120a052b674"}
Apr 24 16:41:14.251068 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:14.251038 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-whl7q" event={"ID":"a7678b4b-fef3-4d8d-92a1-96d074b744a0","Type":"ContainerStarted","Data":"bcaee9df0a0c9628a978e5653890f4780ed6b0a225badf1fe560f26684f6a913"}
Apr 24 16:41:14.252490 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:14.252457 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-cf274" event={"ID":"a3504429-9f84-4f23-a196-d187ad6d16d6","Type":"ContainerStarted","Data":"cc67f02e23624804425f350846ae37ae17e382d960c37d5e8e80f95db3101174"}
Apr 24 16:41:14.265058 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:14.265004 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-qtzff" podStartSLOduration=0.942015229 podStartE2EDuration="3.2649895s" podCreationTimestamp="2026-04-24 16:41:11 +0000 UTC" firstStartedPulling="2026-04-24 16:41:11.650617561 +0000 UTC m=+159.364685578" lastFinishedPulling="2026-04-24 16:41:13.973591834 +0000 UTC m=+161.687659849" observedRunningTime="2026-04-24 16:41:14.263880688 +0000 UTC m=+161.977948723" watchObservedRunningTime="2026-04-24 16:41:14.2649895 +0000 UTC m=+161.979057574"
Apr 24 16:41:14.282221 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:14.282157 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-cf274" podStartSLOduration=0.935384933 podStartE2EDuration="3.282082336s" podCreationTimestamp="2026-04-24 16:41:11 +0000 UTC" firstStartedPulling="2026-04-24 16:41:11.635211923 +0000 UTC m=+159.349279952" lastFinishedPulling="2026-04-24 16:41:13.98190934 +0000 UTC m=+161.695977355" observedRunningTime="2026-04-24 16:41:14.281348673 +0000 UTC m=+161.995416712" watchObservedRunningTime="2026-04-24 16:41:14.282082336 +0000 UTC m=+161.996150370"
Apr 24 16:41:14.296356 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:14.296305 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-whl7q" podStartSLOduration=0.836889052 podStartE2EDuration="3.296287833s" podCreationTimestamp="2026-04-24 16:41:11 +0000 UTC" firstStartedPulling="2026-04-24 16:41:11.520869181 +0000 UTC m=+159.234937195" lastFinishedPulling="2026-04-24 16:41:13.980267952 +0000 UTC m=+161.694335976" observedRunningTime="2026-04-24 16:41:14.295443777 +0000 UTC m=+162.009511816" watchObservedRunningTime="2026-04-24 16:41:14.296287833 +0000 UTC m=+162.010355868"
Apr 24 16:41:14.925936 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:14.925893 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-wsp7g"]
Apr 24 16:41:14.929310 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:14.929288 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wsp7g"
Apr 24 16:41:14.931628 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:14.931603 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 24 16:41:14.932596 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:14.932575 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 24 16:41:14.932688 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:14.932620 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-bvfjc\""
Apr 24 16:41:14.940116 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:14.939675 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-wsp7g"]
Apr 24 16:41:14.973377 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:14.973333 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fd3fdb31-2e0f-48dc-8731-82c3abae5caf-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wxbj2\" (UID: \"fd3fdb31-2e0f-48dc-8731-82c3abae5caf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wxbj2"
Apr 24 16:41:14.973569 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:14.973486 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 16:41:14.973628 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:14.973570 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd3fdb31-2e0f-48dc-8731-82c3abae5caf-samples-operator-tls podName:fd3fdb31-2e0f-48dc-8731-82c3abae5caf nodeName:}" failed. No retries permitted until 2026-04-24 16:41:18.97354763 +0000 UTC m=+166.687615653 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/fd3fdb31-2e0f-48dc-8731-82c3abae5caf-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wxbj2" (UID: "fd3fdb31-2e0f-48dc-8731-82c3abae5caf") : secret "samples-operator-tls" not found
Apr 24 16:41:15.074567 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:15.074520 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g9sg\" (UniqueName: \"kubernetes.io/projected/6675872f-5466-4ba3-93fa-2d8f6edfc801-kube-api-access-6g9sg\") pod \"migrator-74bb7799d9-wsp7g\" (UID: \"6675872f-5466-4ba3-93fa-2d8f6edfc801\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wsp7g"
Apr 24 16:41:15.175450 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:15.175408 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6g9sg\" (UniqueName: \"kubernetes.io/projected/6675872f-5466-4ba3-93fa-2d8f6edfc801-kube-api-access-6g9sg\") pod \"migrator-74bb7799d9-wsp7g\" (UID: \"6675872f-5466-4ba3-93fa-2d8f6edfc801\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wsp7g"
Apr 24 16:41:15.183851 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:15.183786 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g9sg\" (UniqueName: \"kubernetes.io/projected/6675872f-5466-4ba3-93fa-2d8f6edfc801-kube-api-access-6g9sg\") pod \"migrator-74bb7799d9-wsp7g\" (UID: \"6675872f-5466-4ba3-93fa-2d8f6edfc801\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wsp7g"
Apr 24 16:41:15.244162 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:15.244112 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wsp7g"
Apr 24 16:41:15.366082 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:15.366045 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-wsp7g"]
Apr 24 16:41:15.369359 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:41:15.369323 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6675872f_5466_4ba3_93fa_2d8f6edfc801.slice/crio-7b35a6d0eb941981f3d2d7a8bddbc397b538a39191fe2cc5a00ffe7608a550ed WatchSource:0}: Error finding container 7b35a6d0eb941981f3d2d7a8bddbc397b538a39191fe2cc5a00ffe7608a550ed: Status 404 returned error can't find the container with id 7b35a6d0eb941981f3d2d7a8bddbc397b538a39191fe2cc5a00ffe7608a550ed
Apr 24 16:41:16.259956 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:16.259918 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wsp7g" event={"ID":"6675872f-5466-4ba3-93fa-2d8f6edfc801","Type":"ContainerStarted","Data":"7b35a6d0eb941981f3d2d7a8bddbc397b538a39191fe2cc5a00ffe7608a550ed"}
Apr 24 16:41:16.790789 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:16.790692 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/52705a67-f97f-488e-adc3-2f562fd2fd0e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ccsnq\" (UID: \"52705a67-f97f-488e-adc3-2f562fd2fd0e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ccsnq"
Apr 24 16:41:16.790931 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:16.790842 2575 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 16:41:16.790931 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:16.790911 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52705a67-f97f-488e-adc3-2f562fd2fd0e-cluster-monitoring-operator-tls podName:52705a67-f97f-488e-adc3-2f562fd2fd0e nodeName:}" failed. No retries permitted until 2026-04-24 16:41:32.790895865 +0000 UTC m=+180.504963878 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/52705a67-f97f-488e-adc3-2f562fd2fd0e-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-ccsnq" (UID: "52705a67-f97f-488e-adc3-2f562fd2fd0e") : secret "cluster-monitoring-operator-tls" not found
Apr 24 16:41:17.264200 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:17.264151 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wsp7g" event={"ID":"6675872f-5466-4ba3-93fa-2d8f6edfc801","Type":"ContainerStarted","Data":"c488af3dcf0ddce498b7af05542eddef1e8d4ec1cf75cf527cb80d54fb32a9ba"}
Apr 24 16:41:17.264200 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:17.264196 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wsp7g" event={"ID":"6675872f-5466-4ba3-93fa-2d8f6edfc801","Type":"ContainerStarted","Data":"f7f4093ec21882bcc156a1f9d7577c73267736e6a6d5ccac80ae62a55b22be3f"}
Apr 24 16:41:17.280453 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:17.280390 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wsp7g" podStartSLOduration=2.200660291 podStartE2EDuration="3.280373472s" podCreationTimestamp="2026-04-24 16:41:14 +0000 UTC" firstStartedPulling="2026-04-24 16:41:15.371622448 +0000 UTC m=+163.085690462" lastFinishedPulling="2026-04-24 16:41:16.451335616 +0000 UTC m=+164.165403643" observedRunningTime="2026-04-24 16:41:17.28005366 +0000 UTC m=+164.994121698" watchObservedRunningTime="2026-04-24 16:41:17.280373472 +0000 UTC m=+164.994441507"
Apr 24 16:41:17.735997 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:17.735964 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-wttnf"]
Apr 24 16:41:17.740480 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:17.740452 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wttnf"
Apr 24 16:41:17.747563 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:17.747536 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8vx6n\""
Apr 24 16:41:17.747697 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:17.747618 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 16:41:17.748290 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:17.748264 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 16:41:17.750229 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:17.750205 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wttnf"]
Apr 24 16:41:17.800053 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:17.800008 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blvlr\" (UniqueName: \"kubernetes.io/projected/8a70b0e4-967f-4814-87bb-2bc980391a01-kube-api-access-blvlr\") pod \"insights-runtime-extractor-wttnf\" (UID: \"8a70b0e4-967f-4814-87bb-2bc980391a01\") " pod="openshift-insights/insights-runtime-extractor-wttnf"
Apr 24 16:41:17.800246 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:17.800071 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8a70b0e4-967f-4814-87bb-2bc980391a01-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wttnf\" (UID: \"8a70b0e4-967f-4814-87bb-2bc980391a01\") " pod="openshift-insights/insights-runtime-extractor-wttnf"
Apr 24 16:41:17.800246 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:17.800162 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8a70b0e4-967f-4814-87bb-2bc980391a01-crio-socket\") pod \"insights-runtime-extractor-wttnf\" (UID: \"8a70b0e4-967f-4814-87bb-2bc980391a01\") " pod="openshift-insights/insights-runtime-extractor-wttnf"
Apr 24 16:41:17.800246 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:17.800224 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8a70b0e4-967f-4814-87bb-2bc980391a01-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wttnf\" (UID: \"8a70b0e4-967f-4814-87bb-2bc980391a01\") " pod="openshift-insights/insights-runtime-extractor-wttnf"
Apr 24 16:41:17.800343 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:17.800269 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8a70b0e4-967f-4814-87bb-2bc980391a01-data-volume\") pod \"insights-runtime-extractor-wttnf\" (UID: \"8a70b0e4-967f-4814-87bb-2bc980391a01\") " pod="openshift-insights/insights-runtime-extractor-wttnf"
Apr 24 16:41:17.901551 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:17.901506 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-blvlr\" (UniqueName: \"kubernetes.io/projected/8a70b0e4-967f-4814-87bb-2bc980391a01-kube-api-access-blvlr\") pod \"insights-runtime-extractor-wttnf\" (UID: \"8a70b0e4-967f-4814-87bb-2bc980391a01\") " pod="openshift-insights/insights-runtime-extractor-wttnf"
Apr 24 16:41:17.901701 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:17.901580 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8a70b0e4-967f-4814-87bb-2bc980391a01-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wttnf\" (UID: \"8a70b0e4-967f-4814-87bb-2bc980391a01\") " pod="openshift-insights/insights-runtime-extractor-wttnf"
Apr 24 16:41:17.901701 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:17.901614 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8a70b0e4-967f-4814-87bb-2bc980391a01-crio-socket\") pod \"insights-runtime-extractor-wttnf\" (UID: \"8a70b0e4-967f-4814-87bb-2bc980391a01\") " pod="openshift-insights/insights-runtime-extractor-wttnf"
Apr 24 16:41:17.901701 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:17.901675 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8a70b0e4-967f-4814-87bb-2bc980391a01-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wttnf\" (UID: \"8a70b0e4-967f-4814-87bb-2bc980391a01\") " pod="openshift-insights/insights-runtime-extractor-wttnf"
Apr 24 16:41:17.901859 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:17.901742 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8a70b0e4-967f-4814-87bb-2bc980391a01-data-volume\") pod \"insights-runtime-extractor-wttnf\" (UID: \"8a70b0e4-967f-4814-87bb-2bc980391a01\") " pod="openshift-insights/insights-runtime-extractor-wttnf"
Apr 24 16:41:17.901859 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:17.901747 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/8a70b0e4-967f-4814-87bb-2bc980391a01-crio-socket\") pod \"insights-runtime-extractor-wttnf\" (UID: \"8a70b0e4-967f-4814-87bb-2bc980391a01\") " pod="openshift-insights/insights-runtime-extractor-wttnf" Apr 24 16:41:17.901955 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:17.901870 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 16:41:17.902004 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:17.901965 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a70b0e4-967f-4814-87bb-2bc980391a01-insights-runtime-extractor-tls podName:8a70b0e4-967f-4814-87bb-2bc980391a01 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:18.401941816 +0000 UTC m=+166.116009837 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/8a70b0e4-967f-4814-87bb-2bc980391a01-insights-runtime-extractor-tls") pod "insights-runtime-extractor-wttnf" (UID: "8a70b0e4-967f-4814-87bb-2bc980391a01") : secret "insights-runtime-extractor-tls" not found Apr 24 16:41:17.902061 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:17.902047 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8a70b0e4-967f-4814-87bb-2bc980391a01-data-volume\") pod \"insights-runtime-extractor-wttnf\" (UID: \"8a70b0e4-967f-4814-87bb-2bc980391a01\") " pod="openshift-insights/insights-runtime-extractor-wttnf" Apr 24 16:41:17.902200 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:17.902182 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8a70b0e4-967f-4814-87bb-2bc980391a01-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-wttnf\" (UID: \"8a70b0e4-967f-4814-87bb-2bc980391a01\") " 
pod="openshift-insights/insights-runtime-extractor-wttnf" Apr 24 16:41:17.912004 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:17.911974 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-blvlr\" (UniqueName: \"kubernetes.io/projected/8a70b0e4-967f-4814-87bb-2bc980391a01-kube-api-access-blvlr\") pod \"insights-runtime-extractor-wttnf\" (UID: \"8a70b0e4-967f-4814-87bb-2bc980391a01\") " pod="openshift-insights/insights-runtime-extractor-wttnf" Apr 24 16:41:18.405597 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:18.405548 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8a70b0e4-967f-4814-87bb-2bc980391a01-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wttnf\" (UID: \"8a70b0e4-967f-4814-87bb-2bc980391a01\") " pod="openshift-insights/insights-runtime-extractor-wttnf" Apr 24 16:41:18.405994 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:18.405721 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 16:41:18.405994 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:18.405796 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a70b0e4-967f-4814-87bb-2bc980391a01-insights-runtime-extractor-tls podName:8a70b0e4-967f-4814-87bb-2bc980391a01 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:19.40577742 +0000 UTC m=+167.119845454 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/8a70b0e4-967f-4814-87bb-2bc980391a01-insights-runtime-extractor-tls") pod "insights-runtime-extractor-wttnf" (UID: "8a70b0e4-967f-4814-87bb-2bc980391a01") : secret "insights-runtime-extractor-tls" not found Apr 24 16:41:19.011234 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:19.011186 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fd3fdb31-2e0f-48dc-8731-82c3abae5caf-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wxbj2\" (UID: \"fd3fdb31-2e0f-48dc-8731-82c3abae5caf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wxbj2" Apr 24 16:41:19.011389 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:19.011316 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 16:41:19.011389 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:19.011379 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd3fdb31-2e0f-48dc-8731-82c3abae5caf-samples-operator-tls podName:fd3fdb31-2e0f-48dc-8731-82c3abae5caf nodeName:}" failed. No retries permitted until 2026-04-24 16:41:27.011362284 +0000 UTC m=+174.725430314 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/fd3fdb31-2e0f-48dc-8731-82c3abae5caf-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-wxbj2" (UID: "fd3fdb31-2e0f-48dc-8731-82c3abae5caf") : secret "samples-operator-tls" not found Apr 24 16:41:19.414833 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:19.414740 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8a70b0e4-967f-4814-87bb-2bc980391a01-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wttnf\" (UID: \"8a70b0e4-967f-4814-87bb-2bc980391a01\") " pod="openshift-insights/insights-runtime-extractor-wttnf" Apr 24 16:41:19.415235 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:19.414892 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 16:41:19.415235 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:19.414961 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a70b0e4-967f-4814-87bb-2bc980391a01-insights-runtime-extractor-tls podName:8a70b0e4-967f-4814-87bb-2bc980391a01 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:21.414945533 +0000 UTC m=+169.129013547 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/8a70b0e4-967f-4814-87bb-2bc980391a01-insights-runtime-extractor-tls") pod "insights-runtime-extractor-wttnf" (UID: "8a70b0e4-967f-4814-87bb-2bc980391a01") : secret "insights-runtime-extractor-tls" not found Apr 24 16:41:21.432713 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:21.432679 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8a70b0e4-967f-4814-87bb-2bc980391a01-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wttnf\" (UID: \"8a70b0e4-967f-4814-87bb-2bc980391a01\") " pod="openshift-insights/insights-runtime-extractor-wttnf" Apr 24 16:41:21.433132 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:21.432828 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 16:41:21.433132 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:21.432894 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a70b0e4-967f-4814-87bb-2bc980391a01-insights-runtime-extractor-tls podName:8a70b0e4-967f-4814-87bb-2bc980391a01 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:25.432877821 +0000 UTC m=+173.146945840 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/8a70b0e4-967f-4814-87bb-2bc980391a01-insights-runtime-extractor-tls") pod "insights-runtime-extractor-wttnf" (UID: "8a70b0e4-967f-4814-87bb-2bc980391a01") : secret "insights-runtime-extractor-tls" not found Apr 24 16:41:22.828053 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:22.828015 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:41:25.468050 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:25.468012 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8a70b0e4-967f-4814-87bb-2bc980391a01-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wttnf\" (UID: \"8a70b0e4-967f-4814-87bb-2bc980391a01\") " pod="openshift-insights/insights-runtime-extractor-wttnf" Apr 24 16:41:25.468459 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:25.468193 2575 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 24 16:41:25.468459 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:25.468281 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a70b0e4-967f-4814-87bb-2bc980391a01-insights-runtime-extractor-tls podName:8a70b0e4-967f-4814-87bb-2bc980391a01 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:33.468259187 +0000 UTC m=+181.182327207 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/8a70b0e4-967f-4814-87bb-2bc980391a01-insights-runtime-extractor-tls") pod "insights-runtime-extractor-wttnf" (UID: "8a70b0e4-967f-4814-87bb-2bc980391a01") : secret "insights-runtime-extractor-tls" not found Apr 24 16:41:27.084765 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:27.084731 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fd3fdb31-2e0f-48dc-8731-82c3abae5caf-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wxbj2\" (UID: \"fd3fdb31-2e0f-48dc-8731-82c3abae5caf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wxbj2" Apr 24 16:41:27.087283 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:27.087251 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/fd3fdb31-2e0f-48dc-8731-82c3abae5caf-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-wxbj2\" (UID: \"fd3fdb31-2e0f-48dc-8731-82c3abae5caf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wxbj2" Apr 24 16:41:27.118810 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:27.118772 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wxbj2" Apr 24 16:41:27.241461 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:27.241430 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wxbj2"] Apr 24 16:41:27.292293 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:27.292251 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wxbj2" event={"ID":"fd3fdb31-2e0f-48dc-8731-82c3abae5caf","Type":"ContainerStarted","Data":"b11941e622e87890d9ff6732baf6e208d7d7626eb6e6ed38ff7ad66926910650"} Apr 24 16:41:29.298976 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:29.298940 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wxbj2" event={"ID":"fd3fdb31-2e0f-48dc-8731-82c3abae5caf","Type":"ContainerStarted","Data":"a8cdb4b7659073bfb06c80b3460e8a10a91d197cd0309c346f270bd3dd9fd2ea"} Apr 24 16:41:29.298976 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:29.298981 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wxbj2" event={"ID":"fd3fdb31-2e0f-48dc-8731-82c3abae5caf","Type":"ContainerStarted","Data":"c1a1e2e427c4c71ad79f78c5b951de2e87125eb910f93ffe16e5354ce4b9eb40"} Apr 24 16:41:29.315636 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:29.315583 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-wxbj2" podStartSLOduration=16.991582326 podStartE2EDuration="18.315566281s" podCreationTimestamp="2026-04-24 16:41:11 +0000 UTC" firstStartedPulling="2026-04-24 16:41:27.286226064 +0000 UTC m=+175.000294079" lastFinishedPulling="2026-04-24 16:41:28.610210017 +0000 UTC m=+176.324278034" observedRunningTime="2026-04-24 
16:41:29.314930685 +0000 UTC m=+177.028998723" watchObservedRunningTime="2026-04-24 16:41:29.315566281 +0000 UTC m=+177.029634317" Apr 24 16:41:32.838252 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:32.838211 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/52705a67-f97f-488e-adc3-2f562fd2fd0e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ccsnq\" (UID: \"52705a67-f97f-488e-adc3-2f562fd2fd0e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ccsnq" Apr 24 16:41:32.840722 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:32.840698 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/52705a67-f97f-488e-adc3-2f562fd2fd0e-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-ccsnq\" (UID: \"52705a67-f97f-488e-adc3-2f562fd2fd0e\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ccsnq" Apr 24 16:41:33.088525 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:33.088437 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-hp5ct\"" Apr 24 16:41:33.096696 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:33.096673 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ccsnq" Apr 24 16:41:33.218748 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:33.218721 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-ccsnq"] Apr 24 16:41:33.221161 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:41:33.221127 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52705a67_f97f_488e_adc3_2f562fd2fd0e.slice/crio-4075a831043950329e2531a2bf078991d021ee1477b2591bd10cf93df9773853 WatchSource:0}: Error finding container 4075a831043950329e2531a2bf078991d021ee1477b2591bd10cf93df9773853: Status 404 returned error can't find the container with id 4075a831043950329e2531a2bf078991d021ee1477b2591bd10cf93df9773853 Apr 24 16:41:33.311613 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:33.311578 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ccsnq" event={"ID":"52705a67-f97f-488e-adc3-2f562fd2fd0e","Type":"ContainerStarted","Data":"4075a831043950329e2531a2bf078991d021ee1477b2591bd10cf93df9773853"} Apr 24 16:41:33.545874 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:33.545829 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8a70b0e4-967f-4814-87bb-2bc980391a01-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-wttnf\" (UID: \"8a70b0e4-967f-4814-87bb-2bc980391a01\") " pod="openshift-insights/insights-runtime-extractor-wttnf" Apr 24 16:41:33.548255 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:33.548232 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8a70b0e4-967f-4814-87bb-2bc980391a01-insights-runtime-extractor-tls\") pod 
\"insights-runtime-extractor-wttnf\" (UID: \"8a70b0e4-967f-4814-87bb-2bc980391a01\") " pod="openshift-insights/insights-runtime-extractor-wttnf" Apr 24 16:41:33.653239 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:33.653210 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8vx6n\"" Apr 24 16:41:33.661353 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:33.661321 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-wttnf" Apr 24 16:41:33.796464 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:33.796377 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-wttnf"] Apr 24 16:41:33.799472 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:41:33.799437 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a70b0e4_967f_4814_87bb_2bc980391a01.slice/crio-77938b58f6cb89d3e0e9cdeaa8811de0bbf6f1d5c62602c82c30cedc61514cd0 WatchSource:0}: Error finding container 77938b58f6cb89d3e0e9cdeaa8811de0bbf6f1d5c62602c82c30cedc61514cd0: Status 404 returned error can't find the container with id 77938b58f6cb89d3e0e9cdeaa8811de0bbf6f1d5c62602c82c30cedc61514cd0 Apr 24 16:41:34.315312 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:34.315272 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wttnf" event={"ID":"8a70b0e4-967f-4814-87bb-2bc980391a01","Type":"ContainerStarted","Data":"09fc9a80f1c212e3e2ee8f1656d18c4d9930b99b35713a1fec716826d9387335"} Apr 24 16:41:34.315312 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:34.315317 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wttnf" 
event={"ID":"8a70b0e4-967f-4814-87bb-2bc980391a01","Type":"ContainerStarted","Data":"77938b58f6cb89d3e0e9cdeaa8811de0bbf6f1d5c62602c82c30cedc61514cd0"} Apr 24 16:41:35.319988 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:35.319890 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ccsnq" event={"ID":"52705a67-f97f-488e-adc3-2f562fd2fd0e","Type":"ContainerStarted","Data":"7f673ba56c330e88272138ecf25546dc31310197f643cf5e36b6ff3a72c2483e"} Apr 24 16:41:35.321772 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:35.321738 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wttnf" event={"ID":"8a70b0e4-967f-4814-87bb-2bc980391a01","Type":"ContainerStarted","Data":"c9eb9bbae38437fc8810ab7b3150db288ed6ebd3d2a0630c0c4bc2ce234105c6"} Apr 24 16:41:35.337116 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:35.337034 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-ccsnq" podStartSLOduration=33.757605738 podStartE2EDuration="35.337014375s" podCreationTimestamp="2026-04-24 16:41:00 +0000 UTC" firstStartedPulling="2026-04-24 16:41:33.223156905 +0000 UTC m=+180.937224919" lastFinishedPulling="2026-04-24 16:41:34.802565534 +0000 UTC m=+182.516633556" observedRunningTime="2026-04-24 16:41:35.336001176 +0000 UTC m=+183.050069216" watchObservedRunningTime="2026-04-24 16:41:35.337014375 +0000 UTC m=+183.051082413" Apr 24 16:41:36.326719 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:36.326684 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-wttnf" event={"ID":"8a70b0e4-967f-4814-87bb-2bc980391a01","Type":"ContainerStarted","Data":"25f523add2301b5fa953324281999eee2c69bb68d48083d8836a12ba60b2ec79"} Apr 24 16:41:36.349769 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:36.349720 2575 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-wttnf" podStartSLOduration=17.093888994 podStartE2EDuration="19.349702093s" podCreationTimestamp="2026-04-24 16:41:17 +0000 UTC" firstStartedPulling="2026-04-24 16:41:33.886049867 +0000 UTC m=+181.600117896" lastFinishedPulling="2026-04-24 16:41:36.141862981 +0000 UTC m=+183.855930995" observedRunningTime="2026-04-24 16:41:36.34798884 +0000 UTC m=+184.062056876" watchObservedRunningTime="2026-04-24 16:41:36.349702093 +0000 UTC m=+184.063770156" Apr 24 16:41:37.354836 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:37.354793 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-q5bbn"] Apr 24 16:41:37.357906 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:37.357888 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-q5bbn" Apr 24 16:41:37.362307 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:37.362276 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 16:41:37.362454 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:37.362317 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 16:41:37.366809 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:37.363355 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-bvdlr\"" Apr 24 16:41:37.370622 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:37.370592 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-q5bbn"] Apr 24 16:41:37.379755 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:37.379728 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhc8g\" (UniqueName: 
\"kubernetes.io/projected/ed3c3ab2-f165-47a6-8669-c3408d5c908f-kube-api-access-mhc8g\") pod \"downloads-6bcc868b7-q5bbn\" (UID: \"ed3c3ab2-f165-47a6-8669-c3408d5c908f\") " pod="openshift-console/downloads-6bcc868b7-q5bbn" Apr 24 16:41:37.480793 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:37.480750 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhc8g\" (UniqueName: \"kubernetes.io/projected/ed3c3ab2-f165-47a6-8669-c3408d5c908f-kube-api-access-mhc8g\") pod \"downloads-6bcc868b7-q5bbn\" (UID: \"ed3c3ab2-f165-47a6-8669-c3408d5c908f\") " pod="openshift-console/downloads-6bcc868b7-q5bbn" Apr 24 16:41:37.502790 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:37.502756 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhc8g\" (UniqueName: \"kubernetes.io/projected/ed3c3ab2-f165-47a6-8669-c3408d5c908f-kube-api-access-mhc8g\") pod \"downloads-6bcc868b7-q5bbn\" (UID: \"ed3c3ab2-f165-47a6-8669-c3408d5c908f\") " pod="openshift-console/downloads-6bcc868b7-q5bbn" Apr 24 16:41:37.672553 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:37.672454 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-q5bbn" Apr 24 16:41:37.808953 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:37.808920 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-q5bbn"] Apr 24 16:41:37.811867 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:41:37.811833 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded3c3ab2_f165_47a6_8669_c3408d5c908f.slice/crio-541d32fa28bf41beab178473a1060d60650434626ec7771cea963f739253c283 WatchSource:0}: Error finding container 541d32fa28bf41beab178473a1060d60650434626ec7771cea963f739253c283: Status 404 returned error can't find the container with id 541d32fa28bf41beab178473a1060d60650434626ec7771cea963f739253c283 Apr 24 16:41:38.332479 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:38.332435 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-q5bbn" event={"ID":"ed3c3ab2-f165-47a6-8669-c3408d5c908f","Type":"ContainerStarted","Data":"541d32fa28bf41beab178473a1060d60650434626ec7771cea963f739253c283"} Apr 24 16:41:44.768494 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.768453 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-bxf9q"] Apr 24 16:41:44.788417 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.788386 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-bxf9q"] Apr 24 16:41:44.788561 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.788546 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" Apr 24 16:41:44.792509 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.792482 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 24 16:41:44.792658 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.792571 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-wt52q\"" Apr 24 16:41:44.792658 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.792640 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 16:41:44.792762 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.792580 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 24 16:41:44.792813 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.792794 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 24 16:41:44.810202 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.810174 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-92qsp"] Apr 24 16:41:44.820282 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.820253 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:44.824887 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.824863 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 16:41:44.825024 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.824985 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 16:41:44.825469 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.825446 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 16:41:44.828630 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.828609 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-sv478\"" Apr 24 16:41:44.848543 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.848510 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d2c85512-9eaa-47a4-93c7-088001707109-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-bxf9q\" (UID: \"d2c85512-9eaa-47a4-93c7-088001707109\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" Apr 24 16:41:44.848680 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.848553 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/d2c85512-9eaa-47a4-93c7-088001707109-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-bxf9q\" (UID: \"d2c85512-9eaa-47a4-93c7-088001707109\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" Apr 24 16:41:44.848680 ip-10-0-131-37 kubenswrapper[2575]: 
I0424 16:41:44.848645 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp69j\" (UniqueName: \"kubernetes.io/projected/d2c85512-9eaa-47a4-93c7-088001707109-kube-api-access-lp69j\") pod \"kube-state-metrics-69db897b98-bxf9q\" (UID: \"d2c85512-9eaa-47a4-93c7-088001707109\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" Apr 24 16:41:44.848805 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.848715 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/d2c85512-9eaa-47a4-93c7-088001707109-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-bxf9q\" (UID: \"d2c85512-9eaa-47a4-93c7-088001707109\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" Apr 24 16:41:44.848805 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.848746 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d2c85512-9eaa-47a4-93c7-088001707109-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-bxf9q\" (UID: \"d2c85512-9eaa-47a4-93c7-088001707109\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" Apr 24 16:41:44.848805 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.848785 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2c85512-9eaa-47a4-93c7-088001707109-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-bxf9q\" (UID: \"d2c85512-9eaa-47a4-93c7-088001707109\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" Apr 24 16:41:44.949580 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.949548 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d2c85512-9eaa-47a4-93c7-088001707109-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-bxf9q\" (UID: \"d2c85512-9eaa-47a4-93c7-088001707109\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" Apr 24 16:41:44.949771 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.949592 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/d2c85512-9eaa-47a4-93c7-088001707109-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-bxf9q\" (UID: \"d2c85512-9eaa-47a4-93c7-088001707109\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" Apr 24 16:41:44.949771 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.949728 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a083e521-de46-463d-921a-44495e3f3333-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:44.949890 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.949788 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a083e521-de46-463d-921a-44495e3f3333-root\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:44.949890 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.949817 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a083e521-de46-463d-921a-44495e3f3333-metrics-client-ca\") pod \"node-exporter-92qsp\" (UID: 
\"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:44.949890 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.949859 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlv8x\" (UniqueName: \"kubernetes.io/projected/a083e521-de46-463d-921a-44495e3f3333-kube-api-access-vlv8x\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:44.950044 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.949908 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a083e521-de46-463d-921a-44495e3f3333-node-exporter-accelerators-collector-config\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:44.950044 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.949994 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lp69j\" (UniqueName: \"kubernetes.io/projected/d2c85512-9eaa-47a4-93c7-088001707109-kube-api-access-lp69j\") pod \"kube-state-metrics-69db897b98-bxf9q\" (UID: \"d2c85512-9eaa-47a4-93c7-088001707109\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" Apr 24 16:41:44.950171 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.950130 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/d2c85512-9eaa-47a4-93c7-088001707109-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-bxf9q\" (UID: \"d2c85512-9eaa-47a4-93c7-088001707109\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" Apr 24 16:41:44.950225 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.950169 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d2c85512-9eaa-47a4-93c7-088001707109-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-bxf9q\" (UID: \"d2c85512-9eaa-47a4-93c7-088001707109\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" Apr 24 16:41:44.950225 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.950203 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a083e521-de46-463d-921a-44495e3f3333-node-exporter-tls\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:44.950345 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.950239 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2c85512-9eaa-47a4-93c7-088001707109-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-bxf9q\" (UID: \"d2c85512-9eaa-47a4-93c7-088001707109\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" Apr 24 16:41:44.950345 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.950272 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a083e521-de46-463d-921a-44495e3f3333-node-exporter-textfile\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:44.950345 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.950300 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/a083e521-de46-463d-921a-44495e3f3333-node-exporter-wtmp\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:44.950345 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.950337 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a083e521-de46-463d-921a-44495e3f3333-sys\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:44.950546 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.950389 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d2c85512-9eaa-47a4-93c7-088001707109-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-bxf9q\" (UID: \"d2c85512-9eaa-47a4-93c7-088001707109\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" Apr 24 16:41:44.950546 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:44.950443 2575 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 24 16:41:44.950546 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.950477 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/d2c85512-9eaa-47a4-93c7-088001707109-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-bxf9q\" (UID: \"d2c85512-9eaa-47a4-93c7-088001707109\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" Apr 24 16:41:44.950546 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:44.950524 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d2c85512-9eaa-47a4-93c7-088001707109-kube-state-metrics-tls 
podName:d2c85512-9eaa-47a4-93c7-088001707109 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:45.450503717 +0000 UTC m=+193.164571731 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/d2c85512-9eaa-47a4-93c7-088001707109-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-bxf9q" (UID: "d2c85512-9eaa-47a4-93c7-088001707109") : secret "kube-state-metrics-tls" not found Apr 24 16:41:44.950760 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.950497 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/d2c85512-9eaa-47a4-93c7-088001707109-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-bxf9q\" (UID: \"d2c85512-9eaa-47a4-93c7-088001707109\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" Apr 24 16:41:44.952892 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.952866 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d2c85512-9eaa-47a4-93c7-088001707109-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-bxf9q\" (UID: \"d2c85512-9eaa-47a4-93c7-088001707109\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" Apr 24 16:41:44.959803 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:44.959758 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp69j\" (UniqueName: \"kubernetes.io/projected/d2c85512-9eaa-47a4-93c7-088001707109-kube-api-access-lp69j\") pod \"kube-state-metrics-69db897b98-bxf9q\" (UID: \"d2c85512-9eaa-47a4-93c7-088001707109\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" Apr 24 16:41:45.051818 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.051729 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a083e521-de46-463d-921a-44495e3f3333-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:45.051818 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.051809 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a083e521-de46-463d-921a-44495e3f3333-root\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:45.052032 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.051842 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a083e521-de46-463d-921a-44495e3f3333-metrics-client-ca\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:45.052032 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.051881 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlv8x\" (UniqueName: \"kubernetes.io/projected/a083e521-de46-463d-921a-44495e3f3333-kube-api-access-vlv8x\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:45.052032 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.051880 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a083e521-de46-463d-921a-44495e3f3333-root\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:45.052032 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.051912 2575 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a083e521-de46-463d-921a-44495e3f3333-node-exporter-accelerators-collector-config\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:45.052032 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.052022 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a083e521-de46-463d-921a-44495e3f3333-node-exporter-tls\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:45.052384 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.052075 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a083e521-de46-463d-921a-44495e3f3333-node-exporter-textfile\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:45.052384 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.052117 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a083e521-de46-463d-921a-44495e3f3333-node-exporter-wtmp\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:45.052384 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.052144 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a083e521-de46-463d-921a-44495e3f3333-sys\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:45.052384 ip-10-0-131-37 kubenswrapper[2575]: I0424 
16:41:45.052220 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a083e521-de46-463d-921a-44495e3f3333-sys\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:45.052585 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.052418 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a083e521-de46-463d-921a-44495e3f3333-node-exporter-wtmp\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:45.052585 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.052521 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a083e521-de46-463d-921a-44495e3f3333-node-exporter-textfile\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:45.052688 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.052636 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a083e521-de46-463d-921a-44495e3f3333-node-exporter-accelerators-collector-config\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:45.052688 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.052673 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a083e521-de46-463d-921a-44495e3f3333-metrics-client-ca\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:45.054629 ip-10-0-131-37 
kubenswrapper[2575]: I0424 16:41:45.054608 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a083e521-de46-463d-921a-44495e3f3333-node-exporter-tls\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:45.054861 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.054840 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a083e521-de46-463d-921a-44495e3f3333-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:45.060682 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.060657 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlv8x\" (UniqueName: \"kubernetes.io/projected/a083e521-de46-463d-921a-44495e3f3333-kube-api-access-vlv8x\") pod \"node-exporter-92qsp\" (UID: \"a083e521-de46-463d-921a-44495e3f3333\") " pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:45.132061 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.131820 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-92qsp" Apr 24 16:41:45.142043 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:41:45.142011 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda083e521_de46_463d_921a_44495e3f3333.slice/crio-1035d9a0745f298f4b0483aa93eaa29dcfa0fa7d785dce26a5cb12d2dadf799f WatchSource:0}: Error finding container 1035d9a0745f298f4b0483aa93eaa29dcfa0fa7d785dce26a5cb12d2dadf799f: Status 404 returned error can't find the container with id 1035d9a0745f298f4b0483aa93eaa29dcfa0fa7d785dce26a5cb12d2dadf799f Apr 24 16:41:45.353527 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.353485 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-92qsp" event={"ID":"a083e521-de46-463d-921a-44495e3f3333","Type":"ContainerStarted","Data":"1035d9a0745f298f4b0483aa93eaa29dcfa0fa7d785dce26a5cb12d2dadf799f"} Apr 24 16:41:45.456262 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.456185 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2c85512-9eaa-47a4-93c7-088001707109-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-bxf9q\" (UID: \"d2c85512-9eaa-47a4-93c7-088001707109\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" Apr 24 16:41:45.459262 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.459206 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d2c85512-9eaa-47a4-93c7-088001707109-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-bxf9q\" (UID: \"d2c85512-9eaa-47a4-93c7-088001707109\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" Apr 24 16:41:45.700371 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.700271 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" Apr 24 16:41:45.867911 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.867882 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 16:41:45.872776 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.872748 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:45.877875 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.877375 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 16:41:45.877875 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.877597 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 16:41:45.877875 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.877614 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 16:41:45.877875 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.877774 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 16:41:45.877875 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.877787 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 16:41:45.878235 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.877951 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 16:41:45.878235 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.878153 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-mmfwp\"" Apr 24 16:41:45.878235 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.878199 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 16:41:45.878392 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.878154 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 16:41:45.878392 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.878370 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 16:41:45.903407 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.903370 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 16:41:45.949402 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.949374 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-bxf9q"] Apr 24 16:41:45.952515 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:41:45.952446 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2c85512_9eaa_47a4_93c7_088001707109.slice/crio-390b52c53c8fc43ab084ce9338487d9255847ebde5767ef0a031c8d29423e6ef WatchSource:0}: Error finding container 390b52c53c8fc43ab084ce9338487d9255847ebde5767ef0a031c8d29423e6ef: Status 404 returned error can't find the container with id 390b52c53c8fc43ab084ce9338487d9255847ebde5767ef0a031c8d29423e6ef Apr 24 16:41:45.962013 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.961968 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bd069ff6-010f-4efa-b07d-01a8f02d7c42-config-out\") pod \"alertmanager-main-0\" 
(UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:45.962157 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.962017 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:45.962157 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.962044 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bd069ff6-010f-4efa-b07d-01a8f02d7c42-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:45.962157 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.962114 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:45.962337 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.962154 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:45.962337 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.962190 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s75nb\" (UniqueName: \"kubernetes.io/projected/bd069ff6-010f-4efa-b07d-01a8f02d7c42-kube-api-access-s75nb\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:45.962337 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.962224 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bd069ff6-010f-4efa-b07d-01a8f02d7c42-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:45.962337 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.962288 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-config-volume\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:45.962337 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.962315 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:45.962559 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.962388 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd069ff6-010f-4efa-b07d-01a8f02d7c42-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:45.962559 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.962439 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:45.962559 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.962490 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-web-config\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:45.962559 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:45.962542 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd069ff6-010f-4efa-b07d-01a8f02d7c42-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.063730 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.063687 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd069ff6-010f-4efa-b07d-01a8f02d7c42-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.063907 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.063751 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/bd069ff6-010f-4efa-b07d-01a8f02d7c42-config-out\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.063907 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.063789 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.063907 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.063818 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bd069ff6-010f-4efa-b07d-01a8f02d7c42-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.063907 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.063848 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.063907 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.063875 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.064189 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.063912 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s75nb\" (UniqueName: \"kubernetes.io/projected/bd069ff6-010f-4efa-b07d-01a8f02d7c42-kube-api-access-s75nb\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.064189 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.063943 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bd069ff6-010f-4efa-b07d-01a8f02d7c42-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.064189 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.063999 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-config-volume\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.064189 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.064025 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.064189 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.064071 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd069ff6-010f-4efa-b07d-01a8f02d7c42-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.064564 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.064537 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd069ff6-010f-4efa-b07d-01a8f02d7c42-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.065547 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.065079 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bd069ff6-010f-4efa-b07d-01a8f02d7c42-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.065547 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.065184 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.065547 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.065267 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-web-config\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.065547 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.065286 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd069ff6-010f-4efa-b07d-01a8f02d7c42-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.068381 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.068110 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.068381 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.068307 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bd069ff6-010f-4efa-b07d-01a8f02d7c42-config-out\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.069585 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.069551 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-web-config\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.070021 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.069969 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-config-volume\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.070575 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.070513 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-secret-alertmanager-kube-rbac-proxy-web\") pod 
\"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.070575 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.070523 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.070575 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.070529 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bd069ff6-010f-4efa-b07d-01a8f02d7c42-tls-assets\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.070923 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.070902 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.071527 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.071502 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.073383 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.073349 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s75nb\" (UniqueName: 
\"kubernetes.io/projected/bd069ff6-010f-4efa-b07d-01a8f02d7c42-kube-api-access-s75nb\") pod \"alertmanager-main-0\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.189921 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.189878 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:41:46.347590 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.347549 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 16:41:46.350671 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:41:46.350622 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd069ff6_010f_4efa_b07d_01a8f02d7c42.slice/crio-92ce1aacc9c516b998d0619546251cf5dfd610f7b53574fa41bdcaf4c9518d96 WatchSource:0}: Error finding container 92ce1aacc9c516b998d0619546251cf5dfd610f7b53574fa41bdcaf4c9518d96: Status 404 returned error can't find the container with id 92ce1aacc9c516b998d0619546251cf5dfd610f7b53574fa41bdcaf4c9518d96 Apr 24 16:41:46.357150 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.357113 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bd069ff6-010f-4efa-b07d-01a8f02d7c42","Type":"ContainerStarted","Data":"92ce1aacc9c516b998d0619546251cf5dfd610f7b53574fa41bdcaf4c9518d96"} Apr 24 16:41:46.358804 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.358773 2575 generic.go:358] "Generic (PLEG): container finished" podID="a083e521-de46-463d-921a-44495e3f3333" containerID="efb2e4f916b92d5beafd7476d5ad17ac7b7d6eb48691b33800a50aebc73f481c" exitCode=0 Apr 24 16:41:46.358927 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.358861 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-92qsp" 
event={"ID":"a083e521-de46-463d-921a-44495e3f3333","Type":"ContainerDied","Data":"efb2e4f916b92d5beafd7476d5ad17ac7b7d6eb48691b33800a50aebc73f481c"} Apr 24 16:41:46.360147 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:46.360117 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" event={"ID":"d2c85512-9eaa-47a4-93c7-088001707109","Type":"ContainerStarted","Data":"390b52c53c8fc43ab084ce9338487d9255847ebde5767ef0a031c8d29423e6ef"} Apr 24 16:41:47.365867 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.365828 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-92qsp" event={"ID":"a083e521-de46-463d-921a-44495e3f3333","Type":"ContainerStarted","Data":"788de3b4d8067164311352d7ff1af126761e68e896745b05d4dd37938eccde53"} Apr 24 16:41:47.783891 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.783799 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-cb954bcc8-kmqk4"] Apr 24 16:41:47.787778 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.787752 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:47.791449 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.791255 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 24 16:41:47.791449 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.791276 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 24 16:41:47.791449 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.791304 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 24 16:41:47.791449 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.791319 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 24 16:41:47.791449 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.791255 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 24 16:41:47.791806 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.791669 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-bm0o4fhvao4c3\"" Apr 24 16:41:47.791806 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.791717 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-vqp7b\"" Apr 24 16:41:47.807698 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.807661 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-cb954bcc8-kmqk4"] Apr 24 16:41:47.884852 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.884811 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/7045a425-fd49-4f04-8bdc-44a088056f4d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-cb954bcc8-kmqk4\" (UID: \"7045a425-fd49-4f04-8bdc-44a088056f4d\") " pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:47.885044 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.884935 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7045a425-fd49-4f04-8bdc-44a088056f4d-metrics-client-ca\") pod \"thanos-querier-cb954bcc8-kmqk4\" (UID: \"7045a425-fd49-4f04-8bdc-44a088056f4d\") " pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:47.885044 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.885019 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7045a425-fd49-4f04-8bdc-44a088056f4d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-cb954bcc8-kmqk4\" (UID: \"7045a425-fd49-4f04-8bdc-44a088056f4d\") " pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:47.885184 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.885135 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/7045a425-fd49-4f04-8bdc-44a088056f4d-secret-thanos-querier-tls\") pod \"thanos-querier-cb954bcc8-kmqk4\" (UID: \"7045a425-fd49-4f04-8bdc-44a088056f4d\") " pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:47.885184 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.885168 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/7045a425-fd49-4f04-8bdc-44a088056f4d-secret-grpc-tls\") pod \"thanos-querier-cb954bcc8-kmqk4\" (UID: \"7045a425-fd49-4f04-8bdc-44a088056f4d\") " pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:47.885276 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.885196 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrfbs\" (UniqueName: \"kubernetes.io/projected/7045a425-fd49-4f04-8bdc-44a088056f4d-kube-api-access-nrfbs\") pod \"thanos-querier-cb954bcc8-kmqk4\" (UID: \"7045a425-fd49-4f04-8bdc-44a088056f4d\") " pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:47.885321 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.885276 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/7045a425-fd49-4f04-8bdc-44a088056f4d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-cb954bcc8-kmqk4\" (UID: \"7045a425-fd49-4f04-8bdc-44a088056f4d\") " pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:47.885353 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.885323 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7045a425-fd49-4f04-8bdc-44a088056f4d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-cb954bcc8-kmqk4\" (UID: \"7045a425-fd49-4f04-8bdc-44a088056f4d\") " pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:47.986503 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.986462 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/7045a425-fd49-4f04-8bdc-44a088056f4d-secret-thanos-querier-tls\") pod 
\"thanos-querier-cb954bcc8-kmqk4\" (UID: \"7045a425-fd49-4f04-8bdc-44a088056f4d\") " pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:47.986709 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.986513 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7045a425-fd49-4f04-8bdc-44a088056f4d-secret-grpc-tls\") pod \"thanos-querier-cb954bcc8-kmqk4\" (UID: \"7045a425-fd49-4f04-8bdc-44a088056f4d\") " pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:47.986775 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.986726 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrfbs\" (UniqueName: \"kubernetes.io/projected/7045a425-fd49-4f04-8bdc-44a088056f4d-kube-api-access-nrfbs\") pod \"thanos-querier-cb954bcc8-kmqk4\" (UID: \"7045a425-fd49-4f04-8bdc-44a088056f4d\") " pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:47.986833 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.986806 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/7045a425-fd49-4f04-8bdc-44a088056f4d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-cb954bcc8-kmqk4\" (UID: \"7045a425-fd49-4f04-8bdc-44a088056f4d\") " pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:47.986887 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.986846 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7045a425-fd49-4f04-8bdc-44a088056f4d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-cb954bcc8-kmqk4\" (UID: \"7045a425-fd49-4f04-8bdc-44a088056f4d\") " pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:47.986934 ip-10-0-131-37 
kubenswrapper[2575]: I0424 16:41:47.986894 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/7045a425-fd49-4f04-8bdc-44a088056f4d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-cb954bcc8-kmqk4\" (UID: \"7045a425-fd49-4f04-8bdc-44a088056f4d\") " pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:47.986984 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.986959 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7045a425-fd49-4f04-8bdc-44a088056f4d-metrics-client-ca\") pod \"thanos-querier-cb954bcc8-kmqk4\" (UID: \"7045a425-fd49-4f04-8bdc-44a088056f4d\") " pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:47.987049 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.987032 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7045a425-fd49-4f04-8bdc-44a088056f4d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-cb954bcc8-kmqk4\" (UID: \"7045a425-fd49-4f04-8bdc-44a088056f4d\") " pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:47.987883 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.987816 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7045a425-fd49-4f04-8bdc-44a088056f4d-metrics-client-ca\") pod \"thanos-querier-cb954bcc8-kmqk4\" (UID: \"7045a425-fd49-4f04-8bdc-44a088056f4d\") " pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:47.989843 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.989815 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/7045a425-fd49-4f04-8bdc-44a088056f4d-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-cb954bcc8-kmqk4\" (UID: \"7045a425-fd49-4f04-8bdc-44a088056f4d\") " pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:47.989984 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.989897 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7045a425-fd49-4f04-8bdc-44a088056f4d-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-cb954bcc8-kmqk4\" (UID: \"7045a425-fd49-4f04-8bdc-44a088056f4d\") " pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:47.990514 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.990437 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/7045a425-fd49-4f04-8bdc-44a088056f4d-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-cb954bcc8-kmqk4\" (UID: \"7045a425-fd49-4f04-8bdc-44a088056f4d\") " pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:47.990514 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.990458 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7045a425-fd49-4f04-8bdc-44a088056f4d-secret-grpc-tls\") pod \"thanos-querier-cb954bcc8-kmqk4\" (UID: \"7045a425-fd49-4f04-8bdc-44a088056f4d\") " pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:47.990514 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.990470 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/7045a425-fd49-4f04-8bdc-44a088056f4d-secret-thanos-querier-tls\") pod \"thanos-querier-cb954bcc8-kmqk4\" (UID: \"7045a425-fd49-4f04-8bdc-44a088056f4d\") " 
pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:47.990859 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:47.990837 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7045a425-fd49-4f04-8bdc-44a088056f4d-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-cb954bcc8-kmqk4\" (UID: \"7045a425-fd49-4f04-8bdc-44a088056f4d\") " pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:48.026922 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:48.026883 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrfbs\" (UniqueName: \"kubernetes.io/projected/7045a425-fd49-4f04-8bdc-44a088056f4d-kube-api-access-nrfbs\") pod \"thanos-querier-cb954bcc8-kmqk4\" (UID: \"7045a425-fd49-4f04-8bdc-44a088056f4d\") " pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:48.100188 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:48.100123 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:41:49.541437 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:49.541395 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-xlvqq"] Apr 24 16:41:49.544968 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:49.544944 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xlvqq" Apr 24 16:41:49.547617 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:49.547585 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-rssd9\"" Apr 24 16:41:49.547993 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:49.547973 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 24 16:41:49.558034 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:49.558008 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-xlvqq"] Apr 24 16:41:49.604825 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:49.604778 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a6af0dd3-dc36-494a-b420-bb7b1ac5fd89-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-xlvqq\" (UID: \"a6af0dd3-dc36-494a-b420-bb7b1ac5fd89\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xlvqq" Apr 24 16:41:49.706527 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:49.706483 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a6af0dd3-dc36-494a-b420-bb7b1ac5fd89-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-xlvqq\" (UID: \"a6af0dd3-dc36-494a-b420-bb7b1ac5fd89\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xlvqq" Apr 24 16:41:49.706735 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:49.706656 2575 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 24 16:41:49.706735 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:41:49.706732 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a6af0dd3-dc36-494a-b420-bb7b1ac5fd89-monitoring-plugin-cert podName:a6af0dd3-dc36-494a-b420-bb7b1ac5fd89 nodeName:}" failed. No retries permitted until 2026-04-24 16:41:50.206714907 +0000 UTC m=+197.920782920 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/a6af0dd3-dc36-494a-b420-bb7b1ac5fd89-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-xlvqq" (UID: "a6af0dd3-dc36-494a-b420-bb7b1ac5fd89") : secret "monitoring-plugin-cert" not found Apr 24 16:41:50.212182 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:50.212145 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a6af0dd3-dc36-494a-b420-bb7b1ac5fd89-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-xlvqq\" (UID: \"a6af0dd3-dc36-494a-b420-bb7b1ac5fd89\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xlvqq" Apr 24 16:41:50.214797 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:50.214762 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a6af0dd3-dc36-494a-b420-bb7b1ac5fd89-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-xlvqq\" (UID: \"a6af0dd3-dc36-494a-b420-bb7b1ac5fd89\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xlvqq" Apr 24 16:41:50.456756 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:50.456702 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xlvqq" Apr 24 16:41:50.989030 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:50.988993 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 16:41:50.994393 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:50.994360 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:50.997039 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:50.997015 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 16:41:50.997222 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:50.997202 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-ammrmejd5ifce\"" Apr 24 16:41:50.998227 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:50.997747 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 16:41:50.998583 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:50.998509 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 24 16:41:50.998583 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:50.998509 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 16:41:50.998939 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:50.998798 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 16:41:50.998939 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:50.998823 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 16:41:50.999078 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:50.999052 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 16:41:50.999169 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:50.999118 2575 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-k4mxp\"" Apr 24 16:41:50.999537 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:50.999277 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 16:41:50.999537 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:50.999390 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 16:41:50.999537 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:50.999405 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 16:41:51.000021 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:50.999983 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 16:41:51.001286 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.001268 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 16:41:51.004748 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.004727 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 16:41:51.010150 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.010120 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 16:41:51.120932 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.120891 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-config-out\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.120932 ip-10-0-131-37 
kubenswrapper[2575]: I0424 16:41:51.120931 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.121187 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.120963 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.121187 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.121056 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4p5b\" (UniqueName: \"kubernetes.io/projected/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-kube-api-access-f4p5b\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.121187 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.121173 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-config\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.121308 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.121214 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.121308 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.121254 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.121308 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.121284 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.121436 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.121313 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.121436 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.121342 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.121436 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.121366 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.121436 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.121400 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-web-config\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.121587 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.121461 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.121587 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.121529 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.121587 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.121556 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.121587 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.121581 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.121740 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.121665 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.121740 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.121695 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.222290 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.222245 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.222290 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.222299 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.222551 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.222335 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.222551 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.222394 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.222551 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.222427 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.222551 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.222468 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-config-out\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.222551 ip-10-0-131-37 
kubenswrapper[2575]: I0424 16:41:51.222492 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.222788 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.222673 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.222788 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.222714 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4p5b\" (UniqueName: \"kubernetes.io/projected/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-kube-api-access-f4p5b\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.222788 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.222771 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-config\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.222933 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.222801 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.222933 ip-10-0-131-37 
kubenswrapper[2575]: I0424 16:41:51.222846 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.222933 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.222872 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.222933 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.222900 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.222933 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.222926 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.223206 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.222948 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: 
\"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.223206 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.222990 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-web-config\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.223206 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.223028 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.224999 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.223447 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.224999 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.224058 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.224999 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.224331 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.225798 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.225764 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.225880 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.225816 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.226199 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.226125 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.226496 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.226472 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.226588 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.226551 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.227364 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.226984 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.227364 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.227218 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.227571 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.227522 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-config-out\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.228029 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.228004 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.232618 ip-10-0-131-37 
kubenswrapper[2575]: I0424 16:41:51.232505 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.232618 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.232587 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-config\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.232798 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.232642 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-web-config\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.232798 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.232712 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.232905 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.232878 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.235256 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.235230 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-f4p5b\" (UniqueName: \"kubernetes.io/projected/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-kube-api-access-f4p5b\") pod \"prometheus-k8s-0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:51.308749 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:51.308655 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:41:53.856464 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:53.856406 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-xlvqq"] Apr 24 16:41:53.864062 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:41:53.863901 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6af0dd3_dc36_494a_b420_bb7b1ac5fd89.slice/crio-cf0a3efcf73199475ed3213b0d0cd18c91edaf5b395fda908343df911fd50f49 WatchSource:0}: Error finding container cf0a3efcf73199475ed3213b0d0cd18c91edaf5b395fda908343df911fd50f49: Status 404 returned error can't find the container with id cf0a3efcf73199475ed3213b0d0cd18c91edaf5b395fda908343df911fd50f49 Apr 24 16:41:53.895793 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:53.895739 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-cb954bcc8-kmqk4"] Apr 24 16:41:53.901608 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:41:53.901190 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7045a425_fd49_4f04_8bdc_44a088056f4d.slice/crio-4b8ea3ea0f323ab18cbca0d0eaa2227d43bf45ec2e8f25d76e4a501357ff172d WatchSource:0}: Error finding container 4b8ea3ea0f323ab18cbca0d0eaa2227d43bf45ec2e8f25d76e4a501357ff172d: Status 404 returned error can't find the container with id 4b8ea3ea0f323ab18cbca0d0eaa2227d43bf45ec2e8f25d76e4a501357ff172d Apr 
24 16:41:54.101839 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:54.101803 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 16:41:54.106349 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:41:54.106303 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb0669f0_9212_4ad7_9ba8_15161d2b0dd0.slice/crio-b11a520f0353f93ad11cddba4e55be55bf1f3a8c76ad7c635a314fc841efd492 WatchSource:0}: Error finding container b11a520f0353f93ad11cddba4e55be55bf1f3a8c76ad7c635a314fc841efd492: Status 404 returned error can't find the container with id b11a520f0353f93ad11cddba4e55be55bf1f3a8c76ad7c635a314fc841efd492 Apr 24 16:41:54.396443 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:54.396394 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0","Type":"ContainerStarted","Data":"b11a520f0353f93ad11cddba4e55be55bf1f3a8c76ad7c635a314fc841efd492"} Apr 24 16:41:54.402058 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:54.401006 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" event={"ID":"d2c85512-9eaa-47a4-93c7-088001707109","Type":"ContainerStarted","Data":"04e72482f01528716922629b0f1ef753c9f29add83183d59d1d7bb1b412ac99c"} Apr 24 16:41:54.402058 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:54.401053 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" event={"ID":"d2c85512-9eaa-47a4-93c7-088001707109","Type":"ContainerStarted","Data":"29468b180216d1ae4e7b66d7d5888f4ba49cd36b33ec2e91f722dac7876d5d10"} Apr 24 16:41:54.402058 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:54.401069 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" 
event={"ID":"d2c85512-9eaa-47a4-93c7-088001707109","Type":"ContainerStarted","Data":"e1fd4557f907656fc32399d1fd9648890e4dda0f0e7d2030cc90840f602f100b"} Apr 24 16:41:54.405650 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:54.404549 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-q5bbn" event={"ID":"ed3c3ab2-f165-47a6-8669-c3408d5c908f","Type":"ContainerStarted","Data":"2a8351599bc080a346eb5fabc3db651bf5048260c138987fe9c7ed2d61430b96"} Apr 24 16:41:54.405882 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:54.405683 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-q5bbn" Apr 24 16:41:54.409228 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:54.409152 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xlvqq" event={"ID":"a6af0dd3-dc36-494a-b420-bb7b1ac5fd89","Type":"ContainerStarted","Data":"cf0a3efcf73199475ed3213b0d0cd18c91edaf5b395fda908343df911fd50f49"} Apr 24 16:41:54.411972 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:54.411812 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-92qsp" event={"ID":"a083e521-de46-463d-921a-44495e3f3333","Type":"ContainerStarted","Data":"9d2b6cf7cfecb2e6fb1f5497c9cc6591f8097efcf7065140959bad5a41cd6fa4"} Apr 24 16:41:54.413895 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:54.413861 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" event={"ID":"7045a425-fd49-4f04-8bdc-44a088056f4d","Type":"ContainerStarted","Data":"4b8ea3ea0f323ab18cbca0d0eaa2227d43bf45ec2e8f25d76e4a501357ff172d"} Apr 24 16:41:54.417047 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:54.416995 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-q5bbn" Apr 24 16:41:54.423029 ip-10-0-131-37 kubenswrapper[2575]: I0424 
16:41:54.422660 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-bxf9q" podStartSLOduration=2.699107574 podStartE2EDuration="10.422642049s" podCreationTimestamp="2026-04-24 16:41:44 +0000 UTC" firstStartedPulling="2026-04-24 16:41:45.955155659 +0000 UTC m=+193.669223687" lastFinishedPulling="2026-04-24 16:41:53.678690135 +0000 UTC m=+201.392758162" observedRunningTime="2026-04-24 16:41:54.422456014 +0000 UTC m=+202.136524052" watchObservedRunningTime="2026-04-24 16:41:54.422642049 +0000 UTC m=+202.136710087" Apr 24 16:41:54.466747 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:54.465015 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-q5bbn" podStartSLOduration=1.558718969 podStartE2EDuration="17.464993277s" podCreationTimestamp="2026-04-24 16:41:37 +0000 UTC" firstStartedPulling="2026-04-24 16:41:37.813631648 +0000 UTC m=+185.527699662" lastFinishedPulling="2026-04-24 16:41:53.719905939 +0000 UTC m=+201.433973970" observedRunningTime="2026-04-24 16:41:54.439001537 +0000 UTC m=+202.153069573" watchObservedRunningTime="2026-04-24 16:41:54.464993277 +0000 UTC m=+202.179061314" Apr 24 16:41:54.466747 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:54.466516 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-92qsp" podStartSLOduration=9.753245743 podStartE2EDuration="10.46649822s" podCreationTimestamp="2026-04-24 16:41:44 +0000 UTC" firstStartedPulling="2026-04-24 16:41:45.144220155 +0000 UTC m=+192.858288184" lastFinishedPulling="2026-04-24 16:41:45.857472632 +0000 UTC m=+193.571540661" observedRunningTime="2026-04-24 16:41:54.464058021 +0000 UTC m=+202.178126071" watchObservedRunningTime="2026-04-24 16:41:54.46649822 +0000 UTC m=+202.180566257" Apr 24 16:41:55.420158 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:55.420117 2575 generic.go:358] "Generic (PLEG): 
container finished" podID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerID="8c914f11936fb24f8e57b4955f905235df16c1eb02532b4854b200fd39de9fbf" exitCode=0 Apr 24 16:41:55.420622 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:55.420221 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0","Type":"ContainerDied","Data":"8c914f11936fb24f8e57b4955f905235df16c1eb02532b4854b200fd39de9fbf"} Apr 24 16:41:55.422278 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:55.422209 2575 generic.go:358] "Generic (PLEG): container finished" podID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerID="b2308835e13d9206b51ec9f28786d5054d421643c2d62be84a6cf21e4ca4c3de" exitCode=0 Apr 24 16:41:55.422744 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:55.422265 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bd069ff6-010f-4efa-b07d-01a8f02d7c42","Type":"ContainerDied","Data":"b2308835e13d9206b51ec9f28786d5054d421643c2d62be84a6cf21e4ca4c3de"} Apr 24 16:41:57.434257 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:57.434169 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xlvqq" event={"ID":"a6af0dd3-dc36-494a-b420-bb7b1ac5fd89","Type":"ContainerStarted","Data":"fad21ce266a343b401a92bce731301db887321a333eb17af5bde16bdeb01b877"} Apr 24 16:41:57.435924 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:57.435896 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xlvqq" Apr 24 16:41:57.439335 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:57.439279 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" 
event={"ID":"7045a425-fd49-4f04-8bdc-44a088056f4d","Type":"ContainerStarted","Data":"0ffa19907a5954d1f9b195d6ae58d420edeaaadcfe53618f929e0de15019a732"} Apr 24 16:41:57.439508 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:57.439337 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" event={"ID":"7045a425-fd49-4f04-8bdc-44a088056f4d","Type":"ContainerStarted","Data":"267baccd23cfae782e5868764c8b668752fe53115d34b4dfddb195dde082666e"} Apr 24 16:41:57.439508 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:57.439356 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" event={"ID":"7045a425-fd49-4f04-8bdc-44a088056f4d","Type":"ContainerStarted","Data":"1d88bfbae658134ccf0e4ab015ccb7444cea76d66df5e69cacf9d3faf9fddd09"} Apr 24 16:41:57.443461 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:57.443436 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xlvqq" Apr 24 16:41:57.456377 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:57.456314 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-xlvqq" podStartSLOduration=5.688529129 podStartE2EDuration="8.456293072s" podCreationTimestamp="2026-04-24 16:41:49 +0000 UTC" firstStartedPulling="2026-04-24 16:41:53.870403268 +0000 UTC m=+201.584471298" lastFinishedPulling="2026-04-24 16:41:56.638167211 +0000 UTC m=+204.352235241" observedRunningTime="2026-04-24 16:41:57.454810567 +0000 UTC m=+205.168878595" watchObservedRunningTime="2026-04-24 16:41:57.456293072 +0000 UTC m=+205.170361109" Apr 24 16:41:58.446559 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:41:58.446458 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"bd069ff6-010f-4efa-b07d-01a8f02d7c42","Type":"ContainerStarted","Data":"6cbe5cd1786c0d38509cc63fea883233897dae543e2d80dcff148bd27be6d477"} Apr 24 16:42:00.454949 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:00.454901 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bd069ff6-010f-4efa-b07d-01a8f02d7c42","Type":"ContainerStarted","Data":"b7964686604c7a2c353d731078aaa6ab004f9c662e5a1942af44e16edaad7fa4"} Apr 24 16:42:00.457535 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:00.457500 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" event={"ID":"7045a425-fd49-4f04-8bdc-44a088056f4d","Type":"ContainerStarted","Data":"0e234810fd412b28623cf17e6211e3b7956558358307baa2a94ad27e1a7034fb"} Apr 24 16:42:00.459519 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:00.459482 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0","Type":"ContainerStarted","Data":"6c315aeca48985093c24fa5da9fc942cc6ba4fb823406f118e9a19c88320ecec"} Apr 24 16:42:00.724523 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:00.724487 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-66c895d4b7-5gft4"] Apr 24 16:42:00.724880 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:42:00.724807 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" podUID="a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8" Apr 24 16:42:01.474478 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.474432 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" 
event={"ID":"7045a425-fd49-4f04-8bdc-44a088056f4d","Type":"ContainerStarted","Data":"b51397abccacd40bde2095aa42c677315b97801dbb0b5983188960e98780e384"} Apr 24 16:42:01.474967 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.474490 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" event={"ID":"7045a425-fd49-4f04-8bdc-44a088056f4d","Type":"ContainerStarted","Data":"fc88a5896902e5c7146215c4e33789c8887035bb7824bb345a087d2fb3bacb33"} Apr 24 16:42:01.474967 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.474793 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:42:01.477973 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.477929 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0","Type":"ContainerStarted","Data":"732f681dd41a49c95cd0c1a388bae1cf06f7357e413afb883da58f52f31aaac0"} Apr 24 16:42:01.478239 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.478197 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0","Type":"ContainerStarted","Data":"22e97e75c90f184d16631f202528babc86e724d9febc3f79d417642cd8f9a01a"} Apr 24 16:42:01.478382 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.478243 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0","Type":"ContainerStarted","Data":"6bd0f79278121356b75fca3cf0361407765ea3dd744b00b0c9a643da2446ad23"} Apr 24 16:42:01.478382 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.478258 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0","Type":"ContainerStarted","Data":"5e7ada27d474228359bf257944306014aa0a5c7f680affd28293b80e33b9cd3c"} Apr 24 16:42:01.478382 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.478270 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0","Type":"ContainerStarted","Data":"34d095b4daa46114c6218d6006227e217eeeb5978208a95cf996e8d23b2dcc4f"} Apr 24 16:42:01.482132 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.482084 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" Apr 24 16:42:01.483001 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.482977 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:42:01.483126 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.482967 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bd069ff6-010f-4efa-b07d-01a8f02d7c42","Type":"ContainerStarted","Data":"d470fc8213cf6f9a886226b3d94ac501d7ab602d20b0eb5fb29b78a03a566aad"} Apr 24 16:42:01.483126 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.483114 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bd069ff6-010f-4efa-b07d-01a8f02d7c42","Type":"ContainerStarted","Data":"10a8335d7301185294f0b3980c954355edfe82b7d65e06e8e90c18f4ff2abe44"} Apr 24 16:42:01.483227 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.483138 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bd069ff6-010f-4efa-b07d-01a8f02d7c42","Type":"ContainerStarted","Data":"58603faf17bbc87517d1ca7276a5c1a2baffd0460bb2c2d16a4aed2ed31a3117"} Apr 24 16:42:01.483227 ip-10-0-131-37 kubenswrapper[2575]: I0424 
16:42:01.483153 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bd069ff6-010f-4efa-b07d-01a8f02d7c42","Type":"ContainerStarted","Data":"fc1f13c22e649d123b47119bfe6ae454a7ef5ff3c89630d30f83d1a9fdd0f572"} Apr 24 16:42:01.489047 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.489018 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:42:01.504825 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.504763 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-cb954bcc8-kmqk4" podStartSLOduration=8.187170834 podStartE2EDuration="14.504744113s" podCreationTimestamp="2026-04-24 16:41:47 +0000 UTC" firstStartedPulling="2026-04-24 16:41:53.903534341 +0000 UTC m=+201.617602356" lastFinishedPulling="2026-04-24 16:42:00.221107603 +0000 UTC m=+207.935175635" observedRunningTime="2026-04-24 16:42:01.503555491 +0000 UTC m=+209.217623545" watchObservedRunningTime="2026-04-24 16:42:01.504744113 +0000 UTC m=+209.218812150" Apr 24 16:42:01.534699 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.534634 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=4.732192458 podStartE2EDuration="16.534615745s" podCreationTimestamp="2026-04-24 16:41:45 +0000 UTC" firstStartedPulling="2026-04-24 16:41:46.353131314 +0000 UTC m=+194.067199333" lastFinishedPulling="2026-04-24 16:41:58.155554607 +0000 UTC m=+205.869622620" observedRunningTime="2026-04-24 16:42:01.534157416 +0000 UTC m=+209.248225454" watchObservedRunningTime="2026-04-24 16:42:01.534615745 +0000 UTC m=+209.248683781" Apr 24 16:42:01.542808 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.542768 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-certificates\") pod \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " Apr 24 16:42:01.542999 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.542827 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-trusted-ca\") pod \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " Apr 24 16:42:01.542999 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.542878 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-bound-sa-token\") pod \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " Apr 24 16:42:01.543166 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.543026 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-installation-pull-secrets\") pod \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " Apr 24 16:42:01.543166 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.543111 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtjkm\" (UniqueName: \"kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-kube-api-access-jtjkm\") pod \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " Apr 24 16:42:01.543166 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.543158 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-ca-trust-extracted\") pod 
\"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " Apr 24 16:42:01.543311 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.543237 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-image-registry-private-configuration\") pod \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\" (UID: \"a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8\") " Apr 24 16:42:01.543400 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.543378 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8" (UID: "a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:42:01.545956 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.545292 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8" (UID: "a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:42:01.546103 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.546023 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-certificates\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:42:01.546347 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.546295 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-kube-api-access-jtjkm" (OuterVolumeSpecName: "kube-api-access-jtjkm") pod "a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8" (UID: "a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8"). InnerVolumeSpecName "kube-api-access-jtjkm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:42:01.546482 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.546456 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8" (UID: "a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:42:01.547201 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.547176 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8" (UID: "a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:42:01.547350 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.547240 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8" (UID: "a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:42:01.548990 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.548950 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8" (UID: "a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:42:01.576599 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.576534 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=5.466094437 podStartE2EDuration="11.576515012s" podCreationTimestamp="2026-04-24 16:41:50 +0000 UTC" firstStartedPulling="2026-04-24 16:41:54.108683543 +0000 UTC m=+201.822751577" lastFinishedPulling="2026-04-24 16:42:00.219104125 +0000 UTC m=+207.933172152" observedRunningTime="2026-04-24 16:42:01.574501181 +0000 UTC m=+209.288569275" watchObservedRunningTime="2026-04-24 16:42:01.576515012 +0000 UTC m=+209.290583048" Apr 24 16:42:01.647140 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.647107 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-image-registry-private-configuration\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 
16:42:01.647140 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.647140 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-trusted-ca\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:42:01.647140 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.647152 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-bound-sa-token\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:42:01.647396 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.647161 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-installation-pull-secrets\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:42:01.647396 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.647170 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jtjkm\" (UniqueName: \"kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-kube-api-access-jtjkm\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:42:01.647396 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:01.647181 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-ca-trust-extracted\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:42:02.486804 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:02.486771 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66c895d4b7-5gft4" Apr 24 16:42:02.522155 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:02.522117 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-66c895d4b7-5gft4"] Apr 24 16:42:02.525852 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:02.525818 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-66c895d4b7-5gft4"] Apr 24 16:42:02.556753 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:02.556711 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8-registry-tls\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:42:02.832934 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:02.832851 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8" path="/var/lib/kubelet/pods/a6bc94c0-c2aa-4ac7-a2c3-ca0e4d96e6e8/volumes" Apr 24 16:42:06.309258 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:06.309217 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:29.569141 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:29.569100 2575 generic.go:358] "Generic (PLEG): container finished" podID="c1786d47-e613-4796-a98f-1ea71904bff8" containerID="3b9ae78b21e42abdc6034e548b7291c53378cfece72678fa2ea056a002bcec69" exitCode=0 Apr 24 16:42:29.569610 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:29.569148 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-df7xx" event={"ID":"c1786d47-e613-4796-a98f-1ea71904bff8","Type":"ContainerDied","Data":"3b9ae78b21e42abdc6034e548b7291c53378cfece72678fa2ea056a002bcec69"} Apr 24 16:42:29.569610 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:29.569460 2575 scope.go:117] 
"RemoveContainer" containerID="3b9ae78b21e42abdc6034e548b7291c53378cfece72678fa2ea056a002bcec69" Apr 24 16:42:30.573617 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:30.573580 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-df7xx" event={"ID":"c1786d47-e613-4796-a98f-1ea71904bff8","Type":"ContainerStarted","Data":"2fdb80f293dee09a7f334bab182aa69867240450d793265ee073933e0a80fa07"} Apr 24 16:42:44.660418 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:44.660368 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs\") pod \"network-metrics-daemon-f9bsr\" (UID: \"10ab450a-933f-4b41-8316-09109770ac99\") " pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:42:44.662742 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:44.662712 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/10ab450a-933f-4b41-8316-09109770ac99-metrics-certs\") pod \"network-metrics-daemon-f9bsr\" (UID: \"10ab450a-933f-4b41-8316-09109770ac99\") " pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:42:44.731518 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:44.731486 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-zln8m\"" Apr 24 16:42:44.739756 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:44.739729 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-f9bsr" Apr 24 16:42:44.919158 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:44.918969 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-f9bsr"] Apr 24 16:42:44.922306 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:42:44.922275 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10ab450a_933f_4b41_8316_09109770ac99.slice/crio-4d3a721870ec84b98a9950bd88391a87ba453345d01f77a9f2f2644fde7e0d5a WatchSource:0}: Error finding container 4d3a721870ec84b98a9950bd88391a87ba453345d01f77a9f2f2644fde7e0d5a: Status 404 returned error can't find the container with id 4d3a721870ec84b98a9950bd88391a87ba453345d01f77a9f2f2644fde7e0d5a Apr 24 16:42:45.626839 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:45.626804 2575 generic.go:358] "Generic (PLEG): container finished" podID="a7678b4b-fef3-4d8d-92a1-96d074b744a0" containerID="bcaee9df0a0c9628a978e5653890f4780ed6b0a225badf1fe560f26684f6a913" exitCode=0 Apr 24 16:42:45.627030 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:45.626893 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-whl7q" event={"ID":"a7678b4b-fef3-4d8d-92a1-96d074b744a0","Type":"ContainerDied","Data":"bcaee9df0a0c9628a978e5653890f4780ed6b0a225badf1fe560f26684f6a913"} Apr 24 16:42:45.627391 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:45.627344 2575 scope.go:117] "RemoveContainer" containerID="bcaee9df0a0c9628a978e5653890f4780ed6b0a225badf1fe560f26684f6a913" Apr 24 16:42:45.628410 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:45.628382 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f9bsr" event={"ID":"10ab450a-933f-4b41-8316-09109770ac99","Type":"ContainerStarted","Data":"4d3a721870ec84b98a9950bd88391a87ba453345d01f77a9f2f2644fde7e0d5a"} Apr 24 
16:42:46.633823 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:46.633782 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-whl7q" event={"ID":"a7678b4b-fef3-4d8d-92a1-96d074b744a0","Type":"ContainerStarted","Data":"5e238487645742187d43d2e744a41854819b42f9e4d6601d2ab38c1434e8cacd"} Apr 24 16:42:46.635430 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:46.635401 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f9bsr" event={"ID":"10ab450a-933f-4b41-8316-09109770ac99","Type":"ContainerStarted","Data":"c98453ccf7222fd179fb7ffc14011f27a6ba7ef5e1667c57c75d5daf78c08201"} Apr 24 16:42:46.635551 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:46.635438 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-f9bsr" event={"ID":"10ab450a-933f-4b41-8316-09109770ac99","Type":"ContainerStarted","Data":"c196154545609ca35ea7f71e69d8147d126923bc303047213b10f4b98fb9d7ca"} Apr 24 16:42:46.665365 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:46.664990 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-f9bsr" podStartSLOduration=252.542902576 podStartE2EDuration="4m13.664968238s" podCreationTimestamp="2026-04-24 16:38:33 +0000 UTC" firstStartedPulling="2026-04-24 16:42:44.924710862 +0000 UTC m=+252.638778876" lastFinishedPulling="2026-04-24 16:42:46.04677651 +0000 UTC m=+253.760844538" observedRunningTime="2026-04-24 16:42:46.664642723 +0000 UTC m=+254.378710761" watchObservedRunningTime="2026-04-24 16:42:46.664968238 +0000 UTC m=+254.379036276" Apr 24 16:42:49.646500 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:49.646463 2575 generic.go:358] "Generic (PLEG): container finished" podID="a3504429-9f84-4f23-a196-d187ad6d16d6" containerID="cc67f02e23624804425f350846ae37ae17e382d960c37d5e8e80f95db3101174" exitCode=0 Apr 24 16:42:49.646881 ip-10-0-131-37 
kubenswrapper[2575]: I0424 16:42:49.646535 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-cf274" event={"ID":"a3504429-9f84-4f23-a196-d187ad6d16d6","Type":"ContainerDied","Data":"cc67f02e23624804425f350846ae37ae17e382d960c37d5e8e80f95db3101174"} Apr 24 16:42:49.646881 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:49.646875 2575 scope.go:117] "RemoveContainer" containerID="cc67f02e23624804425f350846ae37ae17e382d960c37d5e8e80f95db3101174" Apr 24 16:42:50.651053 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:50.651018 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-cf274" event={"ID":"a3504429-9f84-4f23-a196-d187ad6d16d6","Type":"ContainerStarted","Data":"1fd1fb5f8061ecb7725e4a09b63c44781bd7c30a98870e571f3ae5f14f339eea"} Apr 24 16:42:51.308906 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:51.308858 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:51.328277 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:51.328249 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:42:51.669576 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:42:51.669547 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:43:05.168638 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:05.168601 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 16:43:05.169189 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:05.169052 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" 
containerName="alertmanager" containerID="cri-o://6cbe5cd1786c0d38509cc63fea883233897dae543e2d80dcff148bd27be6d477" gracePeriod=120 Apr 24 16:43:05.169267 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:05.169151 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerName="prom-label-proxy" containerID="cri-o://d470fc8213cf6f9a886226b3d94ac501d7ab602d20b0eb5fb29b78a03a566aad" gracePeriod=120 Apr 24 16:43:05.169267 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:05.169181 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerName="kube-rbac-proxy" containerID="cri-o://58603faf17bbc87517d1ca7276a5c1a2baffd0460bb2c2d16a4aed2ed31a3117" gracePeriod=120 Apr 24 16:43:05.169267 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:05.169183 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerName="kube-rbac-proxy-web" containerID="cri-o://fc1f13c22e649d123b47119bfe6ae454a7ef5ff3c89630d30f83d1a9fdd0f572" gracePeriod=120 Apr 24 16:43:05.169267 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:05.169082 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerName="kube-rbac-proxy-metric" containerID="cri-o://10a8335d7301185294f0b3980c954355edfe82b7d65e06e8e90c18f4ff2abe44" gracePeriod=120 Apr 24 16:43:05.169495 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:05.169297 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerName="config-reloader" 
containerID="cri-o://b7964686604c7a2c353d731078aaa6ab004f9c662e5a1942af44e16edaad7fa4" gracePeriod=120 Apr 24 16:43:05.698997 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:05.698951 2575 generic.go:358] "Generic (PLEG): container finished" podID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerID="d470fc8213cf6f9a886226b3d94ac501d7ab602d20b0eb5fb29b78a03a566aad" exitCode=0 Apr 24 16:43:05.698997 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:05.698991 2575 generic.go:358] "Generic (PLEG): container finished" podID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerID="58603faf17bbc87517d1ca7276a5c1a2baffd0460bb2c2d16a4aed2ed31a3117" exitCode=0 Apr 24 16:43:05.698997 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:05.699000 2575 generic.go:358] "Generic (PLEG): container finished" podID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerID="b7964686604c7a2c353d731078aaa6ab004f9c662e5a1942af44e16edaad7fa4" exitCode=0 Apr 24 16:43:05.698997 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:05.699007 2575 generic.go:358] "Generic (PLEG): container finished" podID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerID="6cbe5cd1786c0d38509cc63fea883233897dae543e2d80dcff148bd27be6d477" exitCode=0 Apr 24 16:43:05.699309 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:05.699024 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bd069ff6-010f-4efa-b07d-01a8f02d7c42","Type":"ContainerDied","Data":"d470fc8213cf6f9a886226b3d94ac501d7ab602d20b0eb5fb29b78a03a566aad"} Apr 24 16:43:05.699309 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:05.699066 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bd069ff6-010f-4efa-b07d-01a8f02d7c42","Type":"ContainerDied","Data":"58603faf17bbc87517d1ca7276a5c1a2baffd0460bb2c2d16a4aed2ed31a3117"} Apr 24 16:43:05.699309 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:05.699080 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bd069ff6-010f-4efa-b07d-01a8f02d7c42","Type":"ContainerDied","Data":"b7964686604c7a2c353d731078aaa6ab004f9c662e5a1942af44e16edaad7fa4"} Apr 24 16:43:05.699309 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:05.699108 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bd069ff6-010f-4efa-b07d-01a8f02d7c42","Type":"ContainerDied","Data":"6cbe5cd1786c0d38509cc63fea883233897dae543e2d80dcff148bd27be6d477"} Apr 24 16:43:06.420059 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.420034 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.580589 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.580492 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-secret-alertmanager-kube-rbac-proxy-metric\") pod \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " Apr 24 16:43:06.580589 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.580535 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-secret-alertmanager-kube-rbac-proxy-web\") pod \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " Apr 24 16:43:06.580589 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.580557 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-cluster-tls-config\") pod \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " Apr 24 16:43:06.580872 ip-10-0-131-37 
kubenswrapper[2575]: I0424 16:43:06.580592 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-secret-alertmanager-main-tls\") pod \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " Apr 24 16:43:06.580872 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.580763 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd069ff6-010f-4efa-b07d-01a8f02d7c42-metrics-client-ca\") pod \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " Apr 24 16:43:06.580872 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.580808 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bd069ff6-010f-4efa-b07d-01a8f02d7c42-config-out\") pod \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " Apr 24 16:43:06.580872 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.580839 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-secret-alertmanager-kube-rbac-proxy\") pod \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " Apr 24 16:43:06.580872 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.580864 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bd069ff6-010f-4efa-b07d-01a8f02d7c42-alertmanager-main-db\") pod \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " Apr 24 16:43:06.581198 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.580946 2575 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bd069ff6-010f-4efa-b07d-01a8f02d7c42-tls-assets\") pod \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " Apr 24 16:43:06.581198 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.581017 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-config-volume\") pod \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " Apr 24 16:43:06.581198 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.581051 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s75nb\" (UniqueName: \"kubernetes.io/projected/bd069ff6-010f-4efa-b07d-01a8f02d7c42-kube-api-access-s75nb\") pod \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " Apr 24 16:43:06.581198 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.581077 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd069ff6-010f-4efa-b07d-01a8f02d7c42-alertmanager-trusted-ca-bundle\") pod \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " Apr 24 16:43:06.581198 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.581149 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-web-config\") pod \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\" (UID: \"bd069ff6-010f-4efa-b07d-01a8f02d7c42\") " Apr 24 16:43:06.581435 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.581312 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/bd069ff6-010f-4efa-b07d-01a8f02d7c42-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "bd069ff6-010f-4efa-b07d-01a8f02d7c42" (UID: "bd069ff6-010f-4efa-b07d-01a8f02d7c42"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:43:06.581517 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.581495 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/bd069ff6-010f-4efa-b07d-01a8f02d7c42-alertmanager-main-db\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:06.581926 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.581308 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd069ff6-010f-4efa-b07d-01a8f02d7c42-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "bd069ff6-010f-4efa-b07d-01a8f02d7c42" (UID: "bd069ff6-010f-4efa-b07d-01a8f02d7c42"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:43:06.582357 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.582317 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd069ff6-010f-4efa-b07d-01a8f02d7c42-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "bd069ff6-010f-4efa-b07d-01a8f02d7c42" (UID: "bd069ff6-010f-4efa-b07d-01a8f02d7c42"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:43:06.584137 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.584072 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "bd069ff6-010f-4efa-b07d-01a8f02d7c42" (UID: "bd069ff6-010f-4efa-b07d-01a8f02d7c42"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:43:06.584244 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.584135 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "bd069ff6-010f-4efa-b07d-01a8f02d7c42" (UID: "bd069ff6-010f-4efa-b07d-01a8f02d7c42"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:43:06.584302 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.584262 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "bd069ff6-010f-4efa-b07d-01a8f02d7c42" (UID: "bd069ff6-010f-4efa-b07d-01a8f02d7c42"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:43:06.584554 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.584534 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-config-volume" (OuterVolumeSpecName: "config-volume") pod "bd069ff6-010f-4efa-b07d-01a8f02d7c42" (UID: "bd069ff6-010f-4efa-b07d-01a8f02d7c42"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:43:06.584911 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.584875 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd069ff6-010f-4efa-b07d-01a8f02d7c42-config-out" (OuterVolumeSpecName: "config-out") pod "bd069ff6-010f-4efa-b07d-01a8f02d7c42" (UID: "bd069ff6-010f-4efa-b07d-01a8f02d7c42"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:43:06.585029 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.584993 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd069ff6-010f-4efa-b07d-01a8f02d7c42-kube-api-access-s75nb" (OuterVolumeSpecName: "kube-api-access-s75nb") pod "bd069ff6-010f-4efa-b07d-01a8f02d7c42" (UID: "bd069ff6-010f-4efa-b07d-01a8f02d7c42"). InnerVolumeSpecName "kube-api-access-s75nb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:43:06.585161 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.585047 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd069ff6-010f-4efa-b07d-01a8f02d7c42-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "bd069ff6-010f-4efa-b07d-01a8f02d7c42" (UID: "bd069ff6-010f-4efa-b07d-01a8f02d7c42"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:43:06.585543 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.585518 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "bd069ff6-010f-4efa-b07d-01a8f02d7c42" (UID: "bd069ff6-010f-4efa-b07d-01a8f02d7c42"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:43:06.588360 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.588335 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "bd069ff6-010f-4efa-b07d-01a8f02d7c42" (UID: "bd069ff6-010f-4efa-b07d-01a8f02d7c42"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:43:06.596233 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.596202 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-web-config" (OuterVolumeSpecName: "web-config") pod "bd069ff6-010f-4efa-b07d-01a8f02d7c42" (UID: "bd069ff6-010f-4efa-b07d-01a8f02d7c42"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:43:06.682531 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.682494 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:06.682531 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.682526 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:06.682531 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.682538 2575 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-cluster-tls-config\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 
24 16:43:06.682756 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.682547 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-secret-alertmanager-main-tls\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:06.682756 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.682557 2575 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bd069ff6-010f-4efa-b07d-01a8f02d7c42-metrics-client-ca\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:06.682756 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.682566 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/bd069ff6-010f-4efa-b07d-01a8f02d7c42-config-out\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:06.682756 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.682575 2575 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:06.682756 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.682583 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/bd069ff6-010f-4efa-b07d-01a8f02d7c42-tls-assets\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:06.682756 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.682591 2575 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-config-volume\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:06.682756 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.682599 2575 
reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s75nb\" (UniqueName: \"kubernetes.io/projected/bd069ff6-010f-4efa-b07d-01a8f02d7c42-kube-api-access-s75nb\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:06.682756 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.682608 2575 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bd069ff6-010f-4efa-b07d-01a8f02d7c42-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:06.682756 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.682617 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/bd069ff6-010f-4efa-b07d-01a8f02d7c42-web-config\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:06.704442 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.704403 2575 generic.go:358] "Generic (PLEG): container finished" podID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerID="10a8335d7301185294f0b3980c954355edfe82b7d65e06e8e90c18f4ff2abe44" exitCode=0 Apr 24 16:43:06.704442 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.704427 2575 generic.go:358] "Generic (PLEG): container finished" podID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerID="fc1f13c22e649d123b47119bfe6ae454a7ef5ff3c89630d30f83d1a9fdd0f572" exitCode=0 Apr 24 16:43:06.704640 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.704471 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bd069ff6-010f-4efa-b07d-01a8f02d7c42","Type":"ContainerDied","Data":"10a8335d7301185294f0b3980c954355edfe82b7d65e06e8e90c18f4ff2abe44"} Apr 24 16:43:06.704640 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.704499 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"bd069ff6-010f-4efa-b07d-01a8f02d7c42","Type":"ContainerDied","Data":"fc1f13c22e649d123b47119bfe6ae454a7ef5ff3c89630d30f83d1a9fdd0f572"} Apr 24 16:43:06.704640 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.704510 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"bd069ff6-010f-4efa-b07d-01a8f02d7c42","Type":"ContainerDied","Data":"92ce1aacc9c516b998d0619546251cf5dfd610f7b53574fa41bdcaf4c9518d96"} Apr 24 16:43:06.704640 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.704508 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.704640 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.704521 2575 scope.go:117] "RemoveContainer" containerID="d470fc8213cf6f9a886226b3d94ac501d7ab602d20b0eb5fb29b78a03a566aad" Apr 24 16:43:06.713211 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.713193 2575 scope.go:117] "RemoveContainer" containerID="10a8335d7301185294f0b3980c954355edfe82b7d65e06e8e90c18f4ff2abe44" Apr 24 16:43:06.720637 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.720616 2575 scope.go:117] "RemoveContainer" containerID="58603faf17bbc87517d1ca7276a5c1a2baffd0460bb2c2d16a4aed2ed31a3117" Apr 24 16:43:06.727796 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.727766 2575 scope.go:117] "RemoveContainer" containerID="fc1f13c22e649d123b47119bfe6ae454a7ef5ff3c89630d30f83d1a9fdd0f572" Apr 24 16:43:06.730355 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.730321 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 16:43:06.735815 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.735781 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 16:43:06.736762 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.736741 2575 scope.go:117] "RemoveContainer" 
containerID="b7964686604c7a2c353d731078aaa6ab004f9c662e5a1942af44e16edaad7fa4" Apr 24 16:43:06.744206 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.744184 2575 scope.go:117] "RemoveContainer" containerID="6cbe5cd1786c0d38509cc63fea883233897dae543e2d80dcff148bd27be6d477" Apr 24 16:43:06.751294 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.751274 2575 scope.go:117] "RemoveContainer" containerID="b2308835e13d9206b51ec9f28786d5054d421643c2d62be84a6cf21e4ca4c3de" Apr 24 16:43:06.758700 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.758681 2575 scope.go:117] "RemoveContainer" containerID="d470fc8213cf6f9a886226b3d94ac501d7ab602d20b0eb5fb29b78a03a566aad" Apr 24 16:43:06.758977 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:43:06.758957 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d470fc8213cf6f9a886226b3d94ac501d7ab602d20b0eb5fb29b78a03a566aad\": container with ID starting with d470fc8213cf6f9a886226b3d94ac501d7ab602d20b0eb5fb29b78a03a566aad not found: ID does not exist" containerID="d470fc8213cf6f9a886226b3d94ac501d7ab602d20b0eb5fb29b78a03a566aad" Apr 24 16:43:06.759048 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.758984 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d470fc8213cf6f9a886226b3d94ac501d7ab602d20b0eb5fb29b78a03a566aad"} err="failed to get container status \"d470fc8213cf6f9a886226b3d94ac501d7ab602d20b0eb5fb29b78a03a566aad\": rpc error: code = NotFound desc = could not find container \"d470fc8213cf6f9a886226b3d94ac501d7ab602d20b0eb5fb29b78a03a566aad\": container with ID starting with d470fc8213cf6f9a886226b3d94ac501d7ab602d20b0eb5fb29b78a03a566aad not found: ID does not exist" Apr 24 16:43:06.759048 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.759019 2575 scope.go:117] "RemoveContainer" containerID="10a8335d7301185294f0b3980c954355edfe82b7d65e06e8e90c18f4ff2abe44" Apr 24 16:43:06.759301 ip-10-0-131-37 
kubenswrapper[2575]: E0424 16:43:06.759277 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10a8335d7301185294f0b3980c954355edfe82b7d65e06e8e90c18f4ff2abe44\": container with ID starting with 10a8335d7301185294f0b3980c954355edfe82b7d65e06e8e90c18f4ff2abe44 not found: ID does not exist" containerID="10a8335d7301185294f0b3980c954355edfe82b7d65e06e8e90c18f4ff2abe44" Apr 24 16:43:06.759347 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.759313 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10a8335d7301185294f0b3980c954355edfe82b7d65e06e8e90c18f4ff2abe44"} err="failed to get container status \"10a8335d7301185294f0b3980c954355edfe82b7d65e06e8e90c18f4ff2abe44\": rpc error: code = NotFound desc = could not find container \"10a8335d7301185294f0b3980c954355edfe82b7d65e06e8e90c18f4ff2abe44\": container with ID starting with 10a8335d7301185294f0b3980c954355edfe82b7d65e06e8e90c18f4ff2abe44 not found: ID does not exist" Apr 24 16:43:06.759347 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.759336 2575 scope.go:117] "RemoveContainer" containerID="58603faf17bbc87517d1ca7276a5c1a2baffd0460bb2c2d16a4aed2ed31a3117" Apr 24 16:43:06.759608 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:43:06.759590 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58603faf17bbc87517d1ca7276a5c1a2baffd0460bb2c2d16a4aed2ed31a3117\": container with ID starting with 58603faf17bbc87517d1ca7276a5c1a2baffd0460bb2c2d16a4aed2ed31a3117 not found: ID does not exist" containerID="58603faf17bbc87517d1ca7276a5c1a2baffd0460bb2c2d16a4aed2ed31a3117" Apr 24 16:43:06.759666 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.759617 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58603faf17bbc87517d1ca7276a5c1a2baffd0460bb2c2d16a4aed2ed31a3117"} err="failed to 
get container status \"58603faf17bbc87517d1ca7276a5c1a2baffd0460bb2c2d16a4aed2ed31a3117\": rpc error: code = NotFound desc = could not find container \"58603faf17bbc87517d1ca7276a5c1a2baffd0460bb2c2d16a4aed2ed31a3117\": container with ID starting with 58603faf17bbc87517d1ca7276a5c1a2baffd0460bb2c2d16a4aed2ed31a3117 not found: ID does not exist" Apr 24 16:43:06.759666 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.759644 2575 scope.go:117] "RemoveContainer" containerID="fc1f13c22e649d123b47119bfe6ae454a7ef5ff3c89630d30f83d1a9fdd0f572" Apr 24 16:43:06.759861 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:43:06.759844 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc1f13c22e649d123b47119bfe6ae454a7ef5ff3c89630d30f83d1a9fdd0f572\": container with ID starting with fc1f13c22e649d123b47119bfe6ae454a7ef5ff3c89630d30f83d1a9fdd0f572 not found: ID does not exist" containerID="fc1f13c22e649d123b47119bfe6ae454a7ef5ff3c89630d30f83d1a9fdd0f572" Apr 24 16:43:06.759901 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.759869 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc1f13c22e649d123b47119bfe6ae454a7ef5ff3c89630d30f83d1a9fdd0f572"} err="failed to get container status \"fc1f13c22e649d123b47119bfe6ae454a7ef5ff3c89630d30f83d1a9fdd0f572\": rpc error: code = NotFound desc = could not find container \"fc1f13c22e649d123b47119bfe6ae454a7ef5ff3c89630d30f83d1a9fdd0f572\": container with ID starting with fc1f13c22e649d123b47119bfe6ae454a7ef5ff3c89630d30f83d1a9fdd0f572 not found: ID does not exist" Apr 24 16:43:06.759901 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.759891 2575 scope.go:117] "RemoveContainer" containerID="b7964686604c7a2c353d731078aaa6ab004f9c662e5a1942af44e16edaad7fa4" Apr 24 16:43:06.760154 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:43:06.760134 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"b7964686604c7a2c353d731078aaa6ab004f9c662e5a1942af44e16edaad7fa4\": container with ID starting with b7964686604c7a2c353d731078aaa6ab004f9c662e5a1942af44e16edaad7fa4 not found: ID does not exist" containerID="b7964686604c7a2c353d731078aaa6ab004f9c662e5a1942af44e16edaad7fa4" Apr 24 16:43:06.760226 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.760160 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7964686604c7a2c353d731078aaa6ab004f9c662e5a1942af44e16edaad7fa4"} err="failed to get container status \"b7964686604c7a2c353d731078aaa6ab004f9c662e5a1942af44e16edaad7fa4\": rpc error: code = NotFound desc = could not find container \"b7964686604c7a2c353d731078aaa6ab004f9c662e5a1942af44e16edaad7fa4\": container with ID starting with b7964686604c7a2c353d731078aaa6ab004f9c662e5a1942af44e16edaad7fa4 not found: ID does not exist" Apr 24 16:43:06.760226 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.760180 2575 scope.go:117] "RemoveContainer" containerID="6cbe5cd1786c0d38509cc63fea883233897dae543e2d80dcff148bd27be6d477" Apr 24 16:43:06.760401 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:43:06.760381 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cbe5cd1786c0d38509cc63fea883233897dae543e2d80dcff148bd27be6d477\": container with ID starting with 6cbe5cd1786c0d38509cc63fea883233897dae543e2d80dcff148bd27be6d477 not found: ID does not exist" containerID="6cbe5cd1786c0d38509cc63fea883233897dae543e2d80dcff148bd27be6d477" Apr 24 16:43:06.760440 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.760406 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cbe5cd1786c0d38509cc63fea883233897dae543e2d80dcff148bd27be6d477"} err="failed to get container status \"6cbe5cd1786c0d38509cc63fea883233897dae543e2d80dcff148bd27be6d477\": rpc error: code = NotFound desc = 
could not find container \"6cbe5cd1786c0d38509cc63fea883233897dae543e2d80dcff148bd27be6d477\": container with ID starting with 6cbe5cd1786c0d38509cc63fea883233897dae543e2d80dcff148bd27be6d477 not found: ID does not exist" Apr 24 16:43:06.760440 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.760419 2575 scope.go:117] "RemoveContainer" containerID="b2308835e13d9206b51ec9f28786d5054d421643c2d62be84a6cf21e4ca4c3de" Apr 24 16:43:06.760634 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:43:06.760620 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2308835e13d9206b51ec9f28786d5054d421643c2d62be84a6cf21e4ca4c3de\": container with ID starting with b2308835e13d9206b51ec9f28786d5054d421643c2d62be84a6cf21e4ca4c3de not found: ID does not exist" containerID="b2308835e13d9206b51ec9f28786d5054d421643c2d62be84a6cf21e4ca4c3de" Apr 24 16:43:06.760681 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.760636 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2308835e13d9206b51ec9f28786d5054d421643c2d62be84a6cf21e4ca4c3de"} err="failed to get container status \"b2308835e13d9206b51ec9f28786d5054d421643c2d62be84a6cf21e4ca4c3de\": rpc error: code = NotFound desc = could not find container \"b2308835e13d9206b51ec9f28786d5054d421643c2d62be84a6cf21e4ca4c3de\": container with ID starting with b2308835e13d9206b51ec9f28786d5054d421643c2d62be84a6cf21e4ca4c3de not found: ID does not exist" Apr 24 16:43:06.760681 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.760648 2575 scope.go:117] "RemoveContainer" containerID="d470fc8213cf6f9a886226b3d94ac501d7ab602d20b0eb5fb29b78a03a566aad" Apr 24 16:43:06.760869 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.760848 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d470fc8213cf6f9a886226b3d94ac501d7ab602d20b0eb5fb29b78a03a566aad"} err="failed to get container status 
\"d470fc8213cf6f9a886226b3d94ac501d7ab602d20b0eb5fb29b78a03a566aad\": rpc error: code = NotFound desc = could not find container \"d470fc8213cf6f9a886226b3d94ac501d7ab602d20b0eb5fb29b78a03a566aad\": container with ID starting with d470fc8213cf6f9a886226b3d94ac501d7ab602d20b0eb5fb29b78a03a566aad not found: ID does not exist" Apr 24 16:43:06.760913 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.760870 2575 scope.go:117] "RemoveContainer" containerID="10a8335d7301185294f0b3980c954355edfe82b7d65e06e8e90c18f4ff2abe44" Apr 24 16:43:06.761078 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.761061 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10a8335d7301185294f0b3980c954355edfe82b7d65e06e8e90c18f4ff2abe44"} err="failed to get container status \"10a8335d7301185294f0b3980c954355edfe82b7d65e06e8e90c18f4ff2abe44\": rpc error: code = NotFound desc = could not find container \"10a8335d7301185294f0b3980c954355edfe82b7d65e06e8e90c18f4ff2abe44\": container with ID starting with 10a8335d7301185294f0b3980c954355edfe82b7d65e06e8e90c18f4ff2abe44 not found: ID does not exist" Apr 24 16:43:06.761143 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.761078 2575 scope.go:117] "RemoveContainer" containerID="58603faf17bbc87517d1ca7276a5c1a2baffd0460bb2c2d16a4aed2ed31a3117" Apr 24 16:43:06.761299 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.761280 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58603faf17bbc87517d1ca7276a5c1a2baffd0460bb2c2d16a4aed2ed31a3117"} err="failed to get container status \"58603faf17bbc87517d1ca7276a5c1a2baffd0460bb2c2d16a4aed2ed31a3117\": rpc error: code = NotFound desc = could not find container \"58603faf17bbc87517d1ca7276a5c1a2baffd0460bb2c2d16a4aed2ed31a3117\": container with ID starting with 58603faf17bbc87517d1ca7276a5c1a2baffd0460bb2c2d16a4aed2ed31a3117 not found: ID does not exist" Apr 24 16:43:06.761360 ip-10-0-131-37 
kubenswrapper[2575]: I0424 16:43:06.761301 2575 scope.go:117] "RemoveContainer" containerID="fc1f13c22e649d123b47119bfe6ae454a7ef5ff3c89630d30f83d1a9fdd0f572" Apr 24 16:43:06.761494 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.761478 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc1f13c22e649d123b47119bfe6ae454a7ef5ff3c89630d30f83d1a9fdd0f572"} err="failed to get container status \"fc1f13c22e649d123b47119bfe6ae454a7ef5ff3c89630d30f83d1a9fdd0f572\": rpc error: code = NotFound desc = could not find container \"fc1f13c22e649d123b47119bfe6ae454a7ef5ff3c89630d30f83d1a9fdd0f572\": container with ID starting with fc1f13c22e649d123b47119bfe6ae454a7ef5ff3c89630d30f83d1a9fdd0f572 not found: ID does not exist" Apr 24 16:43:06.761543 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.761493 2575 scope.go:117] "RemoveContainer" containerID="b7964686604c7a2c353d731078aaa6ab004f9c662e5a1942af44e16edaad7fa4" Apr 24 16:43:06.761656 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.761643 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7964686604c7a2c353d731078aaa6ab004f9c662e5a1942af44e16edaad7fa4"} err="failed to get container status \"b7964686604c7a2c353d731078aaa6ab004f9c662e5a1942af44e16edaad7fa4\": rpc error: code = NotFound desc = could not find container \"b7964686604c7a2c353d731078aaa6ab004f9c662e5a1942af44e16edaad7fa4\": container with ID starting with b7964686604c7a2c353d731078aaa6ab004f9c662e5a1942af44e16edaad7fa4 not found: ID does not exist" Apr 24 16:43:06.761705 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.761656 2575 scope.go:117] "RemoveContainer" containerID="6cbe5cd1786c0d38509cc63fea883233897dae543e2d80dcff148bd27be6d477" Apr 24 16:43:06.761811 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.761798 2575 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6cbe5cd1786c0d38509cc63fea883233897dae543e2d80dcff148bd27be6d477"} err="failed to get container status \"6cbe5cd1786c0d38509cc63fea883233897dae543e2d80dcff148bd27be6d477\": rpc error: code = NotFound desc = could not find container \"6cbe5cd1786c0d38509cc63fea883233897dae543e2d80dcff148bd27be6d477\": container with ID starting with 6cbe5cd1786c0d38509cc63fea883233897dae543e2d80dcff148bd27be6d477 not found: ID does not exist" Apr 24 16:43:06.761856 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.761810 2575 scope.go:117] "RemoveContainer" containerID="b2308835e13d9206b51ec9f28786d5054d421643c2d62be84a6cf21e4ca4c3de" Apr 24 16:43:06.761971 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.761958 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2308835e13d9206b51ec9f28786d5054d421643c2d62be84a6cf21e4ca4c3de"} err="failed to get container status \"b2308835e13d9206b51ec9f28786d5054d421643c2d62be84a6cf21e4ca4c3de\": rpc error: code = NotFound desc = could not find container \"b2308835e13d9206b51ec9f28786d5054d421643c2d62be84a6cf21e4ca4c3de\": container with ID starting with b2308835e13d9206b51ec9f28786d5054d421643c2d62be84a6cf21e4ca4c3de not found: ID does not exist" Apr 24 16:43:06.765506 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.765485 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 16:43:06.765850 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.765837 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerName="kube-rbac-proxy-metric" Apr 24 16:43:06.765891 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.765852 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerName="kube-rbac-proxy-metric" Apr 24 16:43:06.765891 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.765863 2575 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerName="init-config-reloader" Apr 24 16:43:06.765891 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.765870 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerName="init-config-reloader" Apr 24 16:43:06.765891 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.765879 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerName="config-reloader" Apr 24 16:43:06.765891 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.765884 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerName="config-reloader" Apr 24 16:43:06.766034 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.765894 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerName="kube-rbac-proxy" Apr 24 16:43:06.766034 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.765900 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerName="kube-rbac-proxy" Apr 24 16:43:06.766034 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.765911 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerName="alertmanager" Apr 24 16:43:06.766034 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.765916 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerName="alertmanager" Apr 24 16:43:06.766034 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.765924 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerName="prom-label-proxy" Apr 24 16:43:06.766034 ip-10-0-131-37 
kubenswrapper[2575]: I0424 16:43:06.765930 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerName="prom-label-proxy" Apr 24 16:43:06.766034 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.765937 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerName="kube-rbac-proxy-web" Apr 24 16:43:06.766034 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.765942 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerName="kube-rbac-proxy-web" Apr 24 16:43:06.766034 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.765985 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerName="kube-rbac-proxy-web" Apr 24 16:43:06.766034 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.765994 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerName="kube-rbac-proxy-metric" Apr 24 16:43:06.766034 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.766000 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerName="prom-label-proxy" Apr 24 16:43:06.766034 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.766006 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerName="kube-rbac-proxy" Apr 24 16:43:06.766034 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.766013 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerName="alertmanager" Apr 24 16:43:06.766034 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.766019 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" containerName="config-reloader" Apr 24 16:43:06.771052 
ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.771033 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.775524 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.775504 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 24 16:43:06.775524 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.775517 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 24 16:43:06.775701 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.775507 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 24 16:43:06.775701 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.775650 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 24 16:43:06.775701 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.775690 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 24 16:43:06.775883 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.775868 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 24 16:43:06.775930 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.775899 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-mmfwp\"" Apr 24 16:43:06.775971 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.775954 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 24 16:43:06.776014 ip-10-0-131-37 
kubenswrapper[2575]: I0424 16:43:06.775967 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 24 16:43:06.781867 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.781847 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 24 16:43:06.788685 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.788657 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 16:43:06.836605 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.832925 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd069ff6-010f-4efa-b07d-01a8f02d7c42" path="/var/lib/kubelet/pods/bd069ff6-010f-4efa-b07d-01a8f02d7c42/volumes" Apr 24 16:43:06.884446 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.884406 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0818fb2f-39c7-434f-a46e-242055a30017-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.884446 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.884447 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0818fb2f-39c7-434f-a46e-242055a30017-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.884706 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.884512 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0818fb2f-39c7-434f-a46e-242055a30017-config-out\") pod 
\"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.884706 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.884545 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0818fb2f-39c7-434f-a46e-242055a30017-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.884706 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.884614 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0818fb2f-39c7-434f-a46e-242055a30017-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.884706 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.884646 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0818fb2f-39c7-434f-a46e-242055a30017-config-volume\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.884706 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.884668 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sf5p\" (UniqueName: \"kubernetes.io/projected/0818fb2f-39c7-434f-a46e-242055a30017-kube-api-access-2sf5p\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.884905 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.884735 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0818fb2f-39c7-434f-a46e-242055a30017-web-config\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.884905 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.884813 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0818fb2f-39c7-434f-a46e-242055a30017-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.884905 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.884837 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0818fb2f-39c7-434f-a46e-242055a30017-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.884905 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.884882 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0818fb2f-39c7-434f-a46e-242055a30017-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.885048 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.884908 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0818fb2f-39c7-434f-a46e-242055a30017-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.885048 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.884943 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0818fb2f-39c7-434f-a46e-242055a30017-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.985648 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.985603 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0818fb2f-39c7-434f-a46e-242055a30017-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.985648 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.985651 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0818fb2f-39c7-434f-a46e-242055a30017-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.985901 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.985686 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0818fb2f-39c7-434f-a46e-242055a30017-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.985901 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.985720 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0818fb2f-39c7-434f-a46e-242055a30017-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.985901 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.985744 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0818fb2f-39c7-434f-a46e-242055a30017-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.986042 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.985897 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0818fb2f-39c7-434f-a46e-242055a30017-config-out\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.986042 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.985959 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0818fb2f-39c7-434f-a46e-242055a30017-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.986042 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.986010 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0818fb2f-39c7-434f-a46e-242055a30017-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.986243 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.986052 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0818fb2f-39c7-434f-a46e-242055a30017-config-volume\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.986243 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.986075 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2sf5p\" (UniqueName: \"kubernetes.io/projected/0818fb2f-39c7-434f-a46e-242055a30017-kube-api-access-2sf5p\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.986243 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.986145 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0818fb2f-39c7-434f-a46e-242055a30017-web-config\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.986243 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.986183 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0818fb2f-39c7-434f-a46e-242055a30017-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.986243 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.986217 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0818fb2f-39c7-434f-a46e-242055a30017-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.986500 ip-10-0-131-37 
kubenswrapper[2575]: I0424 16:43:06.986401 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0818fb2f-39c7-434f-a46e-242055a30017-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.986613 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.986589 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0818fb2f-39c7-434f-a46e-242055a30017-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.986671 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.986643 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0818fb2f-39c7-434f-a46e-242055a30017-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.988944 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.988908 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0818fb2f-39c7-434f-a46e-242055a30017-config-out\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.988944 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.988923 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0818fb2f-39c7-434f-a46e-242055a30017-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.989137 
ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.989010 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0818fb2f-39c7-434f-a46e-242055a30017-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.989137 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.989027 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0818fb2f-39c7-434f-a46e-242055a30017-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.989137 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.989037 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0818fb2f-39c7-434f-a46e-242055a30017-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.989311 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.989288 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0818fb2f-39c7-434f-a46e-242055a30017-web-config\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.989425 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.989407 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0818fb2f-39c7-434f-a46e-242055a30017-config-volume\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 
16:43:06.989783 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.989766 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0818fb2f-39c7-434f-a46e-242055a30017-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.990793 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.990777 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0818fb2f-39c7-434f-a46e-242055a30017-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:06.995363 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:06.995339 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sf5p\" (UniqueName: \"kubernetes.io/projected/0818fb2f-39c7-434f-a46e-242055a30017-kube-api-access-2sf5p\") pod \"alertmanager-main-0\" (UID: \"0818fb2f-39c7-434f-a46e-242055a30017\") " pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:07.081121 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:07.081052 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 24 16:43:07.236371 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:07.236340 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 24 16:43:07.239998 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:43:07.239969 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0818fb2f_39c7_434f_a46e_242055a30017.slice/crio-c0841f7fb4fb2d7a6db6d84bcb6c3ea3554344c457a67c193d4597edb649a3be WatchSource:0}: Error finding container c0841f7fb4fb2d7a6db6d84bcb6c3ea3554344c457a67c193d4597edb649a3be: Status 404 returned error can't find the container with id c0841f7fb4fb2d7a6db6d84bcb6c3ea3554344c457a67c193d4597edb649a3be Apr 24 16:43:07.708897 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:07.708862 2575 generic.go:358] "Generic (PLEG): container finished" podID="0818fb2f-39c7-434f-a46e-242055a30017" containerID="d715d33c256042e7e2b91db2750028177f371b426273122a9361075b54f657b0" exitCode=0 Apr 24 16:43:07.709374 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:07.708952 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0818fb2f-39c7-434f-a46e-242055a30017","Type":"ContainerDied","Data":"d715d33c256042e7e2b91db2750028177f371b426273122a9361075b54f657b0"} Apr 24 16:43:07.709374 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:07.708994 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0818fb2f-39c7-434f-a46e-242055a30017","Type":"ContainerStarted","Data":"c0841f7fb4fb2d7a6db6d84bcb6c3ea3554344c457a67c193d4597edb649a3be"} Apr 24 16:43:08.715787 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:08.715747 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"0818fb2f-39c7-434f-a46e-242055a30017","Type":"ContainerStarted","Data":"41ae49b4e0217f56c969b143dbd74a52b9edc33ee8fb76ab80c7f3adec624c4d"} Apr 24 16:43:08.715787 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:08.715789 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0818fb2f-39c7-434f-a46e-242055a30017","Type":"ContainerStarted","Data":"6904107fb5c28ae7c5f3211e634ceed1a2a2bbeda396b607fe23cd84eec22c52"} Apr 24 16:43:08.716218 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:08.715800 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0818fb2f-39c7-434f-a46e-242055a30017","Type":"ContainerStarted","Data":"3d76bdd77afd813a4415d8e723f2e33c019d17867dba54a9af0a93999b66a13b"} Apr 24 16:43:08.716218 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:08.715809 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0818fb2f-39c7-434f-a46e-242055a30017","Type":"ContainerStarted","Data":"4e4d73247c94f3d26cf7ce2a38d9777d6e1803cc40dffce861eb499b3ff700b0"} Apr 24 16:43:08.716218 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:08.715817 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0818fb2f-39c7-434f-a46e-242055a30017","Type":"ContainerStarted","Data":"579355850c46be53dc9c210edf0a7f6fbabff4693e6a4c67a9aa490def907bfa"} Apr 24 16:43:08.716218 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:08.715826 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0818fb2f-39c7-434f-a46e-242055a30017","Type":"ContainerStarted","Data":"f42738927188e42aa1fdd743207f162ff066d68dd37d3a4369920a538e43ede2"} Apr 24 16:43:08.747332 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:08.747287 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.747268707 podStartE2EDuration="2.747268707s" podCreationTimestamp="2026-04-24 16:43:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:43:08.745901428 +0000 UTC m=+276.459969465" watchObservedRunningTime="2026-04-24 16:43:08.747268707 +0000 UTC m=+276.461336743" Apr 24 16:43:09.510362 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.510321 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 16:43:09.510809 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.510780 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="prometheus" containerID="cri-o://6c315aeca48985093c24fa5da9fc942cc6ba4fb823406f118e9a19c88320ecec" gracePeriod=600 Apr 24 16:43:09.510899 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.510792 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="kube-rbac-proxy" containerID="cri-o://22e97e75c90f184d16631f202528babc86e724d9febc3f79d417642cd8f9a01a" gracePeriod=600 Apr 24 16:43:09.510899 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.510828 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="kube-rbac-proxy-web" containerID="cri-o://6bd0f79278121356b75fca3cf0361407765ea3dd744b00b0c9a643da2446ad23" gracePeriod=600 Apr 24 16:43:09.510899 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.510839 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" 
podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="kube-rbac-proxy-thanos" containerID="cri-o://732f681dd41a49c95cd0c1a388bae1cf06f7357e413afb883da58f52f31aaac0" gracePeriod=600 Apr 24 16:43:09.511042 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.510902 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="thanos-sidecar" containerID="cri-o://5e7ada27d474228359bf257944306014aa0a5c7f680affd28293b80e33b9cd3c" gracePeriod=600 Apr 24 16:43:09.511042 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.510861 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="config-reloader" containerID="cri-o://34d095b4daa46114c6218d6006227e217eeeb5978208a95cf996e8d23b2dcc4f" gracePeriod=600 Apr 24 16:43:09.722273 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.722230 2575 generic.go:358] "Generic (PLEG): container finished" podID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerID="732f681dd41a49c95cd0c1a388bae1cf06f7357e413afb883da58f52f31aaac0" exitCode=0 Apr 24 16:43:09.722273 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.722254 2575 generic.go:358] "Generic (PLEG): container finished" podID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerID="22e97e75c90f184d16631f202528babc86e724d9febc3f79d417642cd8f9a01a" exitCode=0 Apr 24 16:43:09.722273 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.722259 2575 generic.go:358] "Generic (PLEG): container finished" podID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerID="6bd0f79278121356b75fca3cf0361407765ea3dd744b00b0c9a643da2446ad23" exitCode=0 Apr 24 16:43:09.722273 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.722265 2575 generic.go:358] "Generic (PLEG): container finished" podID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" 
containerID="5e7ada27d474228359bf257944306014aa0a5c7f680affd28293b80e33b9cd3c" exitCode=0 Apr 24 16:43:09.722273 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.722271 2575 generic.go:358] "Generic (PLEG): container finished" podID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerID="34d095b4daa46114c6218d6006227e217eeeb5978208a95cf996e8d23b2dcc4f" exitCode=0 Apr 24 16:43:09.722273 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.722276 2575 generic.go:358] "Generic (PLEG): container finished" podID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerID="6c315aeca48985093c24fa5da9fc942cc6ba4fb823406f118e9a19c88320ecec" exitCode=0 Apr 24 16:43:09.722818 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.722297 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0","Type":"ContainerDied","Data":"732f681dd41a49c95cd0c1a388bae1cf06f7357e413afb883da58f52f31aaac0"} Apr 24 16:43:09.722818 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.722343 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0","Type":"ContainerDied","Data":"22e97e75c90f184d16631f202528babc86e724d9febc3f79d417642cd8f9a01a"} Apr 24 16:43:09.722818 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.722378 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0","Type":"ContainerDied","Data":"6bd0f79278121356b75fca3cf0361407765ea3dd744b00b0c9a643da2446ad23"} Apr 24 16:43:09.722818 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.722399 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0","Type":"ContainerDied","Data":"5e7ada27d474228359bf257944306014aa0a5c7f680affd28293b80e33b9cd3c"} Apr 24 16:43:09.722818 ip-10-0-131-37 
kubenswrapper[2575]: I0424 16:43:09.722416 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0","Type":"ContainerDied","Data":"34d095b4daa46114c6218d6006227e217eeeb5978208a95cf996e8d23b2dcc4f"} Apr 24 16:43:09.722818 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.722429 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0","Type":"ContainerDied","Data":"6c315aeca48985093c24fa5da9fc942cc6ba4fb823406f118e9a19c88320ecec"} Apr 24 16:43:09.747500 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.747473 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:43:09.914591 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.914542 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-grpc-tls\") pod \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " Apr 24 16:43:09.914774 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.914604 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-metrics-client-certs\") pod \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " Apr 24 16:43:09.914774 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.914634 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-prometheus-k8s-rulefiles-0\") pod \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " Apr 24 
16:43:09.914774 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.914668 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-thanos-prometheus-http-client-file\") pod \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " Apr 24 16:43:09.914774 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.914699 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-configmap-serving-certs-ca-bundle\") pod \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " Apr 24 16:43:09.914774 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.914772 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-prometheus-k8s-db\") pod \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " Apr 24 16:43:09.915026 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.914797 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " Apr 24 16:43:09.915026 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.914833 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-configmap-kubelet-serving-ca-bundle\") pod \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\" (UID: 
\"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " Apr 24 16:43:09.915026 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.914862 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-config\") pod \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " Apr 24 16:43:09.915026 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.914890 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4p5b\" (UniqueName: \"kubernetes.io/projected/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-kube-api-access-f4p5b\") pod \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " Apr 24 16:43:09.915026 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.914913 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-configmap-metrics-client-ca\") pod \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " Apr 24 16:43:09.915026 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.914939 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-tls-assets\") pod \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " Apr 24 16:43:09.915026 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.914961 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-web-config\") pod \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " Apr 24 16:43:09.915026 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.914996 2575 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-kube-rbac-proxy\") pod \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " Apr 24 16:43:09.915026 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.915023 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-prometheus-k8s-tls\") pod \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " Apr 24 16:43:09.915535 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.915052 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-prometheus-trusted-ca-bundle\") pod \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " Apr 24 16:43:09.915535 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.915118 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-config-out\") pod \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " Apr 24 16:43:09.915535 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.915154 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\" (UID: \"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0\") " Apr 24 16:43:09.915535 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.915428 2575 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" (UID: "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:43:09.915740 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.915647 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-configmap-metrics-client-ca\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:09.915847 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.915814 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" (UID: "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:43:09.916375 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.916082 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" (UID: "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:43:09.916375 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.916348 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" (UID: "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:43:09.917524 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.917491 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" (UID: "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:43:09.917619 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.917520 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" (UID: "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:43:09.917619 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.917555 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" (UID: "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:43:09.917865 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.917839 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" (UID: "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:43:09.918470 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.918229 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" (UID: "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:43:09.918960 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.918709 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" (UID: "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:43:09.918960 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.918821 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-config" (OuterVolumeSpecName: "config") pod "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" (UID: "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:43:09.919139 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.919102 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" (UID: "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:43:09.919376 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.919347 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" (UID: "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:43:09.919772 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.919749 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-kube-api-access-f4p5b" (OuterVolumeSpecName: "kube-api-access-f4p5b") pod "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" (UID: "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0"). InnerVolumeSpecName "kube-api-access-f4p5b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:43:09.919844 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.919751 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-config-out" (OuterVolumeSpecName: "config-out") pod "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" (UID: "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 16:43:09.920034 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.920020 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" (UID: "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:43:09.920191 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.920169 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" (UID: "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:43:09.930459 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:09.930423 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-web-config" (OuterVolumeSpecName: "web-config") pod "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" (UID: "fb0669f0-9212-4ad7-9ba8-15161d2b0dd0"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:43:10.016889 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.016849 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-prometheus-k8s-db\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:10.016889 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.016882 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:10.016889 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.016895 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:10.017159 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.016906 2575 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-config\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:10.017159 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.016915 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f4p5b\" (UniqueName: \"kubernetes.io/projected/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-kube-api-access-f4p5b\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:10.017159 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.016924 2575 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-tls-assets\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" 
Apr 24 16:43:10.017159 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.016933 2575 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-web-config\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:10.017159 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.016941 2575 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-kube-rbac-proxy\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:10.017159 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.016951 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-prometheus-k8s-tls\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:10.017159 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.016961 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-prometheus-trusted-ca-bundle\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:10.017159 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.016970 2575 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-config-out\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:10.017159 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.016980 2575 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:10.017159 ip-10-0-131-37 
kubenswrapper[2575]: I0424 16:43:10.016988 2575 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-grpc-tls\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:10.017159 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.016997 2575 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-secret-metrics-client-certs\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:10.017159 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.017006 2575 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:10.017159 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.017016 2575 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-thanos-prometheus-http-client-file\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:10.017159 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.017025 2575 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:43:10.728747 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.728710 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fb0669f0-9212-4ad7-9ba8-15161d2b0dd0","Type":"ContainerDied","Data":"b11a520f0353f93ad11cddba4e55be55bf1f3a8c76ad7c635a314fc841efd492"} Apr 24 16:43:10.729191 ip-10-0-131-37 
kubenswrapper[2575]: I0424 16:43:10.728763 2575 scope.go:117] "RemoveContainer" containerID="732f681dd41a49c95cd0c1a388bae1cf06f7357e413afb883da58f52f31aaac0" Apr 24 16:43:10.729191 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.728783 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:43:10.736720 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.736698 2575 scope.go:117] "RemoveContainer" containerID="22e97e75c90f184d16631f202528babc86e724d9febc3f79d417642cd8f9a01a" Apr 24 16:43:10.744452 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.744428 2575 scope.go:117] "RemoveContainer" containerID="6bd0f79278121356b75fca3cf0361407765ea3dd744b00b0c9a643da2446ad23" Apr 24 16:43:10.752118 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.752065 2575 scope.go:117] "RemoveContainer" containerID="5e7ada27d474228359bf257944306014aa0a5c7f680affd28293b80e33b9cd3c" Apr 24 16:43:10.752383 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.752353 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 16:43:10.756292 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.756266 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 16:43:10.760105 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.760062 2575 scope.go:117] "RemoveContainer" containerID="34d095b4daa46114c6218d6006227e217eeeb5978208a95cf996e8d23b2dcc4f" Apr 24 16:43:10.767401 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.767381 2575 scope.go:117] "RemoveContainer" containerID="6c315aeca48985093c24fa5da9fc942cc6ba4fb823406f118e9a19c88320ecec" Apr 24 16:43:10.775427 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.775369 2575 scope.go:117] "RemoveContainer" containerID="8c914f11936fb24f8e57b4955f905235df16c1eb02532b4854b200fd39de9fbf" Apr 24 16:43:10.783598 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.783574 2575 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 16:43:10.783959 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.783942 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="kube-rbac-proxy-web" Apr 24 16:43:10.784035 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.783962 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="kube-rbac-proxy-web" Apr 24 16:43:10.784035 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.783983 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="thanos-sidecar" Apr 24 16:43:10.784035 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.783992 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="thanos-sidecar" Apr 24 16:43:10.784035 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.784002 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="kube-rbac-proxy" Apr 24 16:43:10.784035 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.784011 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="kube-rbac-proxy" Apr 24 16:43:10.784035 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.784026 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="kube-rbac-proxy-thanos" Apr 24 16:43:10.784035 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.784034 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="kube-rbac-proxy-thanos" Apr 24 16:43:10.784409 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.784048 2575 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="init-config-reloader" Apr 24 16:43:10.784409 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.784057 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="init-config-reloader" Apr 24 16:43:10.784409 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.784081 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="prometheus" Apr 24 16:43:10.784409 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.784107 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="prometheus" Apr 24 16:43:10.784409 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.784117 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="config-reloader" Apr 24 16:43:10.784409 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.784126 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="config-reloader" Apr 24 16:43:10.784409 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.784216 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="prometheus" Apr 24 16:43:10.784409 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.784230 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="kube-rbac-proxy-web" Apr 24 16:43:10.784409 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.784243 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="kube-rbac-proxy" Apr 24 16:43:10.784409 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.784252 
2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="config-reloader"
Apr 24 16:43:10.784409 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.784262 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="thanos-sidecar"
Apr 24 16:43:10.784409 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.784272 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" containerName="kube-rbac-proxy-thanos"
Apr 24 16:43:10.789885 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.789856 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:10.792596 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.792571 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 24 16:43:10.792741 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.792571 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-ammrmejd5ifce\""
Apr 24 16:43:10.792741 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.792615 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 24 16:43:10.792741 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.792572 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-k4mxp\""
Apr 24 16:43:10.792934 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.792920 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 24 16:43:10.792996 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.792930 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 24 16:43:10.793161 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.793128 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 24 16:43:10.793267 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.793171 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 24 16:43:10.793267 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.793205 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 24 16:43:10.793267 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.793225 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 24 16:43:10.793430 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.793267 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 24 16:43:10.793473 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.793439 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 24 16:43:10.793473 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.793463 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 24 16:43:10.797019 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.796995 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 24 16:43:10.802606 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.802195 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 24 16:43:10.803700 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.803677 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 16:43:10.831760 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.831719 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb0669f0-9212-4ad7-9ba8-15161d2b0dd0" path="/var/lib/kubelet/pods/fb0669f0-9212-4ad7-9ba8-15161d2b0dd0/volumes"
Apr 24 16:43:10.927850 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.927791 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:10.927850 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.927853 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:10.928066 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.927884 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ffzq\" (UniqueName: \"kubernetes.io/projected/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-kube-api-access-6ffzq\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:10.928066 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.927913 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:10.928066 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.927929 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:10.928066 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.927959 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:10.928066 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.927981 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:10.928066 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.928012 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:10.928066 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.928037 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:10.928303 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.928066 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:10.928303 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.928119 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-web-config\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:10.928303 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.928221 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-config\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:10.928303 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.928261 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-config-out\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:10.928303 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.928280 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:10.928303 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.928297 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:10.928479 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.928332 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:10.928479 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.928370 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:10.928479 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:10.928404 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.028941 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.028843 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-config\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.028941 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.028898 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-config-out\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.028941 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.028916 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.028941 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.028934 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.029242 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.028971 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.029242 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.028989 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.029242 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.029010 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.029242 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.029168 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.029242 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.029216 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.029485 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.029265 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ffzq\" (UniqueName: \"kubernetes.io/projected/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-kube-api-access-6ffzq\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.029485 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.029318 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.029485 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.029345 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.029485 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.029399 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.029485 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.029426 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.029485 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.029465 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.029759 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.029499 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.029759 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.029528 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.029759 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.029566 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-web-config\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.029912 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.029886 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.030402 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.030244 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.031782 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.030858 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.031782 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.031552 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.032335 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.031990 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-config-out\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.032335 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.032226 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.032335 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.032258 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.032560 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.032377 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-web-config\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.032560 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.032539 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.033345 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.032929 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.033345 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.033311 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.033823 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.033769 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-config\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.034756 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.034730 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.034880 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.034860 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.034942 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.034900 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.034980 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.034959 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.035582 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.035566 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.041785 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.041754 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ffzq\" (UniqueName: \"kubernetes.io/projected/6cd4eebe-82c0-4ce9-9b33-8bbea44303bb-kube-api-access-6ffzq\") pod \"prometheus-k8s-0\" (UID: \"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.104430 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.104387 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:11.244916 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.244880 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 16:43:11.245617 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:43:11.245584 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cd4eebe_82c0_4ce9_9b33_8bbea44303bb.slice/crio-dc2f8b2c9f54d5da6f3328f9f3550fbebe87ce784642c9e035d71041b4aad668 WatchSource:0}: Error finding container dc2f8b2c9f54d5da6f3328f9f3550fbebe87ce784642c9e035d71041b4aad668: Status 404 returned error can't find the container with id dc2f8b2c9f54d5da6f3328f9f3550fbebe87ce784642c9e035d71041b4aad668
Apr 24 16:43:11.733683 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.733648 2575 generic.go:358] "Generic (PLEG): container finished" podID="6cd4eebe-82c0-4ce9-9b33-8bbea44303bb" containerID="079da4251b8958c8bd7dbeda70b295941a8849754e8f1d554b56d38be58ad34c" exitCode=0
Apr 24 16:43:11.734185 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.733730 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb","Type":"ContainerDied","Data":"079da4251b8958c8bd7dbeda70b295941a8849754e8f1d554b56d38be58ad34c"}
Apr 24 16:43:11.734185 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:11.733766 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb","Type":"ContainerStarted","Data":"dc2f8b2c9f54d5da6f3328f9f3550fbebe87ce784642c9e035d71041b4aad668"}
Apr 24 16:43:12.230844 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:43:12.230797 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-z6w7t" podUID="cdbe4c96-edde-4285-9466-eeb5fd3f169b"
Apr 24 16:43:12.230844 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:43:12.230834 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fwvss" podUID="9de18b4c-44be-4c2e-9b15-1b3401784bcd"
Apr 24 16:43:12.231053 ip-10-0-131-37 kubenswrapper[2575]: E0424 16:43:12.230833 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-ckvvn" podUID="ed7942f7-292f-402b-af38-8f0c16de0ee3"
Apr 24 16:43:12.740736 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:12.740693 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb","Type":"ContainerStarted","Data":"cac02f0c109af484e2fc863670ef76eff7c1db079734ebd5eeeec349e1cb0764"}
Apr 24 16:43:12.740736 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:12.740739 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ckvvn"
Apr 24 16:43:12.741278 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:12.740747 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fwvss"
Apr 24 16:43:12.741278 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:12.740744 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z6w7t"
Apr 24 16:43:12.741278 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:12.740742 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb","Type":"ContainerStarted","Data":"111b814abf7e558e4f84963e76306b4704a5bb19f60d4b4a1d1a9d98ac7c3b6a"}
Apr 24 16:43:12.741278 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:12.740833 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb","Type":"ContainerStarted","Data":"e70a2b1c32e67efb974bcb9aae17652bb4938827cbf0766272c778a318647b17"}
Apr 24 16:43:12.741278 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:12.740847 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb","Type":"ContainerStarted","Data":"db12bcb4198d1e19a09d3f0b6c9ba45b824540786eb1bf2c4b5f3036309c9211"}
Apr 24 16:43:12.741278 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:12.740859 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb","Type":"ContainerStarted","Data":"636b1fae7e8571c272b5190cf3ff618bf42d7f4bc5825b511f959fcc23c26a52"}
Apr 24 16:43:12.741278 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:12.740871 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6cd4eebe-82c0-4ce9-9b33-8bbea44303bb","Type":"ContainerStarted","Data":"2c82e37075831a437795a84c566731eff72e665bdc4cb2ebfec87fe414095ed8"}
Apr 24 16:43:12.770455 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:12.770399 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.770378363 podStartE2EDuration="2.770378363s" podCreationTimestamp="2026-04-24 16:43:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:43:12.768269188 +0000 UTC m=+280.482337249" watchObservedRunningTime="2026-04-24 16:43:12.770378363 +0000 UTC m=+280.484446402"
Apr 24 16:43:15.673896 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:15.673852 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fwvss\" (UID: \"9de18b4c-44be-4c2e-9b15-1b3401784bcd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fwvss"
Apr 24 16:43:15.674442 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:15.673937 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert\") pod \"ingress-canary-z6w7t\" (UID: \"cdbe4c96-edde-4285-9466-eeb5fd3f169b\") " pod="openshift-ingress-canary/ingress-canary-z6w7t"
Apr 24 16:43:15.674442 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:15.673956 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls\") pod \"dns-default-ckvvn\" (UID: \"ed7942f7-292f-402b-af38-8f0c16de0ee3\") " pod="openshift-dns/dns-default-ckvvn"
Apr 24 16:43:15.676383 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:15.676346 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed7942f7-292f-402b-af38-8f0c16de0ee3-metrics-tls\") pod \"dns-default-ckvvn\" (UID: \"ed7942f7-292f-402b-af38-8f0c16de0ee3\") " pod="openshift-dns/dns-default-ckvvn"
Apr 24 16:43:15.676503 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:15.676389 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cdbe4c96-edde-4285-9466-eeb5fd3f169b-cert\") pod \"ingress-canary-z6w7t\" (UID: \"cdbe4c96-edde-4285-9466-eeb5fd3f169b\") " pod="openshift-ingress-canary/ingress-canary-z6w7t"
Apr 24 16:43:15.676503 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:15.676389 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9de18b4c-44be-4c2e-9b15-1b3401784bcd-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-fwvss\" (UID: \"9de18b4c-44be-4c2e-9b15-1b3401784bcd\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-fwvss"
Apr 24 16:43:15.744421 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:15.744388 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-lhhc7\""
Apr 24 16:43:15.745381 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:15.745361 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-zzj8j\""
Apr 24 16:43:15.745595 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:15.745580 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-qzvrt\""
Apr 24 16:43:15.752187 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:15.752159 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ckvvn"
Apr 24 16:43:15.752302 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:15.752159 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fwvss"
Apr 24 16:43:15.752362 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:15.752174 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-z6w7t"
Apr 24 16:43:15.928577 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:15.928545 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-fwvss"]
Apr 24 16:43:15.930166 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:43:15.930133 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9de18b4c_44be_4c2e_9b15_1b3401784bcd.slice/crio-c535c401af3afdfcb34bbbcac3f2a9365da147c3796bf2b93fc98024be2cccf1 WatchSource:0}: Error finding container c535c401af3afdfcb34bbbcac3f2a9365da147c3796bf2b93fc98024be2cccf1: Status 404 returned error can't find the container with id c535c401af3afdfcb34bbbcac3f2a9365da147c3796bf2b93fc98024be2cccf1
Apr 24 16:43:15.947880 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:15.947764 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-z6w7t"]
Apr 24 16:43:15.950566 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:43:15.950536 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdbe4c96_edde_4285_9466_eeb5fd3f169b.slice/crio-0ccce07e1b0fd9c950499053279731e105a7b6ccaf4a08d576cdb09d97a17b42 WatchSource:0}: Error finding container 0ccce07e1b0fd9c950499053279731e105a7b6ccaf4a08d576cdb09d97a17b42: Status 404 returned error can't find the container with id 0ccce07e1b0fd9c950499053279731e105a7b6ccaf4a08d576cdb09d97a17b42
Apr 24 16:43:16.104566 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:16.104523 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 16:43:16.149572 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:16.149539 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ckvvn"]
Apr 24 16:43:16.152947 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:43:16.152917 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded7942f7_292f_402b_af38_8f0c16de0ee3.slice/crio-54f2d7b3bff8b07e136838823a45d732e8b8376d5477ebdba4aabf37bdf26ead WatchSource:0}: Error finding container 54f2d7b3bff8b07e136838823a45d732e8b8376d5477ebdba4aabf37bdf26ead: Status 404 returned error can't find the container with id 54f2d7b3bff8b07e136838823a45d732e8b8376d5477ebdba4aabf37bdf26ead
Apr 24 16:43:16.756424 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:16.756354 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-z6w7t" event={"ID":"cdbe4c96-edde-4285-9466-eeb5fd3f169b","Type":"ContainerStarted","Data":"0ccce07e1b0fd9c950499053279731e105a7b6ccaf4a08d576cdb09d97a17b42"}
Apr 24 16:43:16.758188 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:16.758132 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fwvss" event={"ID":"9de18b4c-44be-4c2e-9b15-1b3401784bcd","Type":"ContainerStarted","Data":"c535c401af3afdfcb34bbbcac3f2a9365da147c3796bf2b93fc98024be2cccf1"}
Apr 24 16:43:16.760106 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:16.760045 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ckvvn" event={"ID":"ed7942f7-292f-402b-af38-8f0c16de0ee3","Type":"ContainerStarted","Data":"54f2d7b3bff8b07e136838823a45d732e8b8376d5477ebdba4aabf37bdf26ead"}
Apr 24 16:43:18.770302 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:18.770258 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-z6w7t"
event={"ID":"cdbe4c96-edde-4285-9466-eeb5fd3f169b","Type":"ContainerStarted","Data":"d5ed06f16458ade4d42fa32109b4cba8a0a81ab7244a4922059ee9353898c4e3"} Apr 24 16:43:18.771634 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:18.771608 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fwvss" event={"ID":"9de18b4c-44be-4c2e-9b15-1b3401784bcd","Type":"ContainerStarted","Data":"ed587f0e06dffab6b324bcdeca1e8521b55403fea48448645b52811c90067b9e"} Apr 24 16:43:18.773139 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:18.773111 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ckvvn" event={"ID":"ed7942f7-292f-402b-af38-8f0c16de0ee3","Type":"ContainerStarted","Data":"7a4ea63752e93480ac1b2bf51ad06f2f85015f1c41a3c322eead5d8e678168ea"} Apr 24 16:43:18.773235 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:18.773143 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ckvvn" event={"ID":"ed7942f7-292f-402b-af38-8f0c16de0ee3","Type":"ContainerStarted","Data":"3e21c7657b54719ff62264306812891629c9ffe65fb504b61d86043c8b5c5f30"} Apr 24 16:43:18.773235 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:18.773202 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-ckvvn" Apr 24 16:43:18.788505 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:18.788450 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-z6w7t" podStartSLOduration=251.67773264 podStartE2EDuration="4m13.788433295s" podCreationTimestamp="2026-04-24 16:39:05 +0000 UTC" firstStartedPulling="2026-04-24 16:43:15.952787469 +0000 UTC m=+283.666855486" lastFinishedPulling="2026-04-24 16:43:18.063488127 +0000 UTC m=+285.777556141" observedRunningTime="2026-04-24 16:43:18.787347516 +0000 UTC m=+286.501415545" watchObservedRunningTime="2026-04-24 16:43:18.788433295 +0000 UTC 
m=+286.502501321" Apr 24 16:43:18.804527 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:18.804472 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ckvvn" podStartSLOduration=251.902059248 podStartE2EDuration="4m13.804456861s" podCreationTimestamp="2026-04-24 16:39:05 +0000 UTC" firstStartedPulling="2026-04-24 16:43:16.154788047 +0000 UTC m=+283.868856061" lastFinishedPulling="2026-04-24 16:43:18.057185647 +0000 UTC m=+285.771253674" observedRunningTime="2026-04-24 16:43:18.802990452 +0000 UTC m=+286.517058488" watchObservedRunningTime="2026-04-24 16:43:18.804456861 +0000 UTC m=+286.518524897" Apr 24 16:43:18.825419 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:18.825348 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-fwvss" podStartSLOduration=274.705905707 podStartE2EDuration="4m36.825328118s" podCreationTimestamp="2026-04-24 16:38:42 +0000 UTC" firstStartedPulling="2026-04-24 16:43:15.932609709 +0000 UTC m=+283.646677724" lastFinishedPulling="2026-04-24 16:43:18.052032117 +0000 UTC m=+285.766100135" observedRunningTime="2026-04-24 16:43:18.824243262 +0000 UTC m=+286.538311314" watchObservedRunningTime="2026-04-24 16:43:18.825328118 +0000 UTC m=+286.539396156" Apr 24 16:43:28.778889 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:28.778859 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ckvvn" Apr 24 16:43:32.747163 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:32.747134 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 16:43:32.747649 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:32.747620 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 16:43:32.755816 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:43:32.755781 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 16:44:11.105543 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:44:11.105490 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:44:11.121655 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:44:11.121619 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:44:11.978862 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:44:11.978830 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 16:48:32.775569 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:48:32.775537 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 16:48:32.776976 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:48:32.776957 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 16:49:17.737474 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:49:17.737437 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-n6cxb"] Apr 24 16:49:17.740724 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:49:17.740709 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-n6cxb" Apr 24 16:49:17.743148 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:49:17.743124 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 16:49:17.743295 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:49:17.743162 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 16:49:17.743295 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:49:17.743162 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-z9xpl\"" Apr 24 16:49:17.744076 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:49:17.744057 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 16:49:17.747893 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:49:17.747839 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-n6cxb"] Apr 24 16:49:17.877702 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:49:17.877666 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvgvj\" (UniqueName: \"kubernetes.io/projected/34711f0e-be64-4a74-815d-7a4a00dd0413-kube-api-access-hvgvj\") pod \"s3-init-n6cxb\" (UID: \"34711f0e-be64-4a74-815d-7a4a00dd0413\") " pod="kserve/s3-init-n6cxb" Apr 24 16:49:17.978890 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:49:17.978860 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hvgvj\" (UniqueName: \"kubernetes.io/projected/34711f0e-be64-4a74-815d-7a4a00dd0413-kube-api-access-hvgvj\") pod \"s3-init-n6cxb\" (UID: \"34711f0e-be64-4a74-815d-7a4a00dd0413\") " pod="kserve/s3-init-n6cxb" Apr 24 16:49:17.989562 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:49:17.989508 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvgvj\" (UniqueName: 
\"kubernetes.io/projected/34711f0e-be64-4a74-815d-7a4a00dd0413-kube-api-access-hvgvj\") pod \"s3-init-n6cxb\" (UID: \"34711f0e-be64-4a74-815d-7a4a00dd0413\") " pod="kserve/s3-init-n6cxb" Apr 24 16:49:18.067205 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:49:18.067172 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-n6cxb" Apr 24 16:49:18.181655 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:49:18.181628 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-n6cxb"] Apr 24 16:49:18.184412 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:49:18.184384 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34711f0e_be64_4a74_815d_7a4a00dd0413.slice/crio-435725b090c2f5c7189e32ee3ca29e39a0208584f3a7345e6128cc9100d4b08c WatchSource:0}: Error finding container 435725b090c2f5c7189e32ee3ca29e39a0208584f3a7345e6128cc9100d4b08c: Status 404 returned error can't find the container with id 435725b090c2f5c7189e32ee3ca29e39a0208584f3a7345e6128cc9100d4b08c Apr 24 16:49:18.186058 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:49:18.186044 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 16:49:18.914604 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:49:18.914551 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-n6cxb" event={"ID":"34711f0e-be64-4a74-815d-7a4a00dd0413","Type":"ContainerStarted","Data":"435725b090c2f5c7189e32ee3ca29e39a0208584f3a7345e6128cc9100d4b08c"} Apr 24 16:49:22.930446 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:49:22.930399 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-n6cxb" event={"ID":"34711f0e-be64-4a74-815d-7a4a00dd0413","Type":"ContainerStarted","Data":"2137294760e73e62633c20d2f93f0b842989bd0b7382e0e1a66dc9573cb66137"} Apr 24 16:49:22.945584 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:49:22.945528 2575 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-n6cxb" podStartSLOduration=1.6144552220000001 podStartE2EDuration="5.945508977s" podCreationTimestamp="2026-04-24 16:49:17 +0000 UTC" firstStartedPulling="2026-04-24 16:49:18.186189955 +0000 UTC m=+645.900257968" lastFinishedPulling="2026-04-24 16:49:22.517243699 +0000 UTC m=+650.231311723" observedRunningTime="2026-04-24 16:49:22.943985129 +0000 UTC m=+650.658053173" watchObservedRunningTime="2026-04-24 16:49:22.945508977 +0000 UTC m=+650.659577013" Apr 24 16:49:25.940572 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:49:25.940540 2575 generic.go:358] "Generic (PLEG): container finished" podID="34711f0e-be64-4a74-815d-7a4a00dd0413" containerID="2137294760e73e62633c20d2f93f0b842989bd0b7382e0e1a66dc9573cb66137" exitCode=0 Apr 24 16:49:25.940954 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:49:25.940612 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-n6cxb" event={"ID":"34711f0e-be64-4a74-815d-7a4a00dd0413","Type":"ContainerDied","Data":"2137294760e73e62633c20d2f93f0b842989bd0b7382e0e1a66dc9573cb66137"} Apr 24 16:49:27.067181 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:49:27.067159 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-n6cxb" Apr 24 16:49:27.166722 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:49:27.166691 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvgvj\" (UniqueName: \"kubernetes.io/projected/34711f0e-be64-4a74-815d-7a4a00dd0413-kube-api-access-hvgvj\") pod \"34711f0e-be64-4a74-815d-7a4a00dd0413\" (UID: \"34711f0e-be64-4a74-815d-7a4a00dd0413\") " Apr 24 16:49:27.168821 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:49:27.168796 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34711f0e-be64-4a74-815d-7a4a00dd0413-kube-api-access-hvgvj" (OuterVolumeSpecName: "kube-api-access-hvgvj") pod "34711f0e-be64-4a74-815d-7a4a00dd0413" (UID: "34711f0e-be64-4a74-815d-7a4a00dd0413"). InnerVolumeSpecName "kube-api-access-hvgvj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:49:27.267932 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:49:27.267848 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hvgvj\" (UniqueName: \"kubernetes.io/projected/34711f0e-be64-4a74-815d-7a4a00dd0413-kube-api-access-hvgvj\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:49:27.948419 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:49:27.948393 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-n6cxb" Apr 24 16:49:27.948419 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:49:27.948406 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-n6cxb" event={"ID":"34711f0e-be64-4a74-815d-7a4a00dd0413","Type":"ContainerDied","Data":"435725b090c2f5c7189e32ee3ca29e39a0208584f3a7345e6128cc9100d4b08c"} Apr 24 16:49:27.948617 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:49:27.948431 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="435725b090c2f5c7189e32ee3ca29e39a0208584f3a7345e6128cc9100d4b08c" Apr 24 16:50:02.371762 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:02.371725 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-8b82n"] Apr 24 16:50:02.372278 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:02.372134 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34711f0e-be64-4a74-815d-7a4a00dd0413" containerName="s3-init" Apr 24 16:50:02.372278 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:02.372147 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="34711f0e-be64-4a74-815d-7a4a00dd0413" containerName="s3-init" Apr 24 16:50:02.372278 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:02.372208 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="34711f0e-be64-4a74-815d-7a4a00dd0413" containerName="s3-init" Apr 24 16:50:02.375258 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:02.375240 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-8b82n" Apr 24 16:50:02.377788 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:02.377765 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 16:50:02.377928 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:02.377768 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 24 16:50:02.377928 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:02.377846 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-z9xpl\"" Apr 24 16:50:02.378576 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:02.378561 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 16:50:02.383157 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:02.383135 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-8b82n"] Apr 24 16:50:02.474815 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:02.474774 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsnxh\" (UniqueName: \"kubernetes.io/projected/edccb082-bc45-4540-a39a-9f307d807d40-kube-api-access-tsnxh\") pod \"s3-tls-init-custom-8b82n\" (UID: \"edccb082-bc45-4540-a39a-9f307d807d40\") " pod="kserve/s3-tls-init-custom-8b82n" Apr 24 16:50:02.576105 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:02.576068 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsnxh\" (UniqueName: \"kubernetes.io/projected/edccb082-bc45-4540-a39a-9f307d807d40-kube-api-access-tsnxh\") pod \"s3-tls-init-custom-8b82n\" (UID: \"edccb082-bc45-4540-a39a-9f307d807d40\") " pod="kserve/s3-tls-init-custom-8b82n" Apr 24 16:50:02.584638 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:02.584608 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tsnxh\" (UniqueName: \"kubernetes.io/projected/edccb082-bc45-4540-a39a-9f307d807d40-kube-api-access-tsnxh\") pod \"s3-tls-init-custom-8b82n\" (UID: \"edccb082-bc45-4540-a39a-9f307d807d40\") " pod="kserve/s3-tls-init-custom-8b82n" Apr 24 16:50:02.693150 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:02.693050 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-8b82n" Apr 24 16:50:02.811858 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:02.811834 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-8b82n"] Apr 24 16:50:02.814034 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:50:02.814005 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedccb082_bc45_4540_a39a_9f307d807d40.slice/crio-d6af94502d20b585ef4a7b9aa2fa365dde9ec71118d3b1edd0110eb4e8629999 WatchSource:0}: Error finding container d6af94502d20b585ef4a7b9aa2fa365dde9ec71118d3b1edd0110eb4e8629999: Status 404 returned error can't find the container with id d6af94502d20b585ef4a7b9aa2fa365dde9ec71118d3b1edd0110eb4e8629999 Apr 24 16:50:03.052131 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:03.052032 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-8b82n" event={"ID":"edccb082-bc45-4540-a39a-9f307d807d40","Type":"ContainerStarted","Data":"3299d7d1bcc771cee5a3eab1b6cbe4a2046c642e6a426fcaab1ae9209bb2292a"} Apr 24 16:50:03.052131 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:03.052067 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-8b82n" event={"ID":"edccb082-bc45-4540-a39a-9f307d807d40","Type":"ContainerStarted","Data":"d6af94502d20b585ef4a7b9aa2fa365dde9ec71118d3b1edd0110eb4e8629999"} Apr 24 16:50:03.066693 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:03.066627 2575 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kserve/s3-tls-init-custom-8b82n" podStartSLOduration=1.06660795 podStartE2EDuration="1.06660795s" podCreationTimestamp="2026-04-24 16:50:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:50:03.066301707 +0000 UTC m=+690.780369744" watchObservedRunningTime="2026-04-24 16:50:03.06660795 +0000 UTC m=+690.780675989" Apr 24 16:50:08.072207 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:08.072179 2575 generic.go:358] "Generic (PLEG): container finished" podID="edccb082-bc45-4540-a39a-9f307d807d40" containerID="3299d7d1bcc771cee5a3eab1b6cbe4a2046c642e6a426fcaab1ae9209bb2292a" exitCode=0 Apr 24 16:50:08.072582 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:08.072252 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-8b82n" event={"ID":"edccb082-bc45-4540-a39a-9f307d807d40","Type":"ContainerDied","Data":"3299d7d1bcc771cee5a3eab1b6cbe4a2046c642e6a426fcaab1ae9209bb2292a"} Apr 24 16:50:09.202126 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:09.202102 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-8b82n" Apr 24 16:50:09.234356 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:09.234304 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsnxh\" (UniqueName: \"kubernetes.io/projected/edccb082-bc45-4540-a39a-9f307d807d40-kube-api-access-tsnxh\") pod \"edccb082-bc45-4540-a39a-9f307d807d40\" (UID: \"edccb082-bc45-4540-a39a-9f307d807d40\") " Apr 24 16:50:09.236367 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:09.236337 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edccb082-bc45-4540-a39a-9f307d807d40-kube-api-access-tsnxh" (OuterVolumeSpecName: "kube-api-access-tsnxh") pod "edccb082-bc45-4540-a39a-9f307d807d40" (UID: "edccb082-bc45-4540-a39a-9f307d807d40"). InnerVolumeSpecName "kube-api-access-tsnxh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:50:09.335536 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:09.335451 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tsnxh\" (UniqueName: \"kubernetes.io/projected/edccb082-bc45-4540-a39a-9f307d807d40-kube-api-access-tsnxh\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:50:10.079956 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:10.079919 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-8b82n" event={"ID":"edccb082-bc45-4540-a39a-9f307d807d40","Type":"ContainerDied","Data":"d6af94502d20b585ef4a7b9aa2fa365dde9ec71118d3b1edd0110eb4e8629999"} Apr 24 16:50:10.079956 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:10.079955 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6af94502d20b585ef4a7b9aa2fa365dde9ec71118d3b1edd0110eb4e8629999" Apr 24 16:50:10.080230 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:10.079969 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-8b82n" Apr 24 16:50:15.024416 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:15.024380 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-t5r9z"] Apr 24 16:50:15.024900 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:15.024880 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edccb082-bc45-4540-a39a-9f307d807d40" containerName="s3-tls-init-custom" Apr 24 16:50:15.024976 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:15.024905 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="edccb082-bc45-4540-a39a-9f307d807d40" containerName="s3-tls-init-custom" Apr 24 16:50:15.025028 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:15.025011 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="edccb082-bc45-4540-a39a-9f307d807d40" containerName="s3-tls-init-custom" Apr 24 16:50:15.028246 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:15.028214 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-t5r9z" Apr 24 16:50:15.030704 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:15.030673 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 24 16:50:15.030820 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:15.030679 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 16:50:15.030820 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:15.030726 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 16:50:15.031475 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:15.031460 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-z9xpl\"" Apr 24 16:50:15.036501 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:15.036477 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-t5r9z"] Apr 24 16:50:15.073876 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:15.073842 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlrlv\" (UniqueName: \"kubernetes.io/projected/741965f2-4863-4fc9-9231-dd2bb9ab650a-kube-api-access-zlrlv\") pod \"s3-tls-init-serving-t5r9z\" (UID: \"741965f2-4863-4fc9-9231-dd2bb9ab650a\") " pod="kserve/s3-tls-init-serving-t5r9z" Apr 24 16:50:15.175218 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:15.175184 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlrlv\" (UniqueName: \"kubernetes.io/projected/741965f2-4863-4fc9-9231-dd2bb9ab650a-kube-api-access-zlrlv\") pod \"s3-tls-init-serving-t5r9z\" (UID: \"741965f2-4863-4fc9-9231-dd2bb9ab650a\") " pod="kserve/s3-tls-init-serving-t5r9z" Apr 24 16:50:15.187541 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:15.187499 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zlrlv\" (UniqueName: \"kubernetes.io/projected/741965f2-4863-4fc9-9231-dd2bb9ab650a-kube-api-access-zlrlv\") pod \"s3-tls-init-serving-t5r9z\" (UID: \"741965f2-4863-4fc9-9231-dd2bb9ab650a\") " pod="kserve/s3-tls-init-serving-t5r9z" Apr 24 16:50:15.347845 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:15.347816 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-t5r9z" Apr 24 16:50:15.470625 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:15.470536 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-t5r9z"] Apr 24 16:50:15.472900 ip-10-0-131-37 kubenswrapper[2575]: W0424 16:50:15.472873 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod741965f2_4863_4fc9_9231_dd2bb9ab650a.slice/crio-baed4f93e7d5e4b952b7b3ac84f14b5a32eba4792dfd2772965f42b1ce7da9b0 WatchSource:0}: Error finding container baed4f93e7d5e4b952b7b3ac84f14b5a32eba4792dfd2772965f42b1ce7da9b0: Status 404 returned error can't find the container with id baed4f93e7d5e4b952b7b3ac84f14b5a32eba4792dfd2772965f42b1ce7da9b0 Apr 24 16:50:16.098935 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:16.098899 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-t5r9z" event={"ID":"741965f2-4863-4fc9-9231-dd2bb9ab650a","Type":"ContainerStarted","Data":"24ab920ecf65454857f652f2aad72e8427d6cf36dab78e37b72fa3f3bf2ae3d5"} Apr 24 16:50:16.098935 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:16.098941 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-t5r9z" event={"ID":"741965f2-4863-4fc9-9231-dd2bb9ab650a","Type":"ContainerStarted","Data":"baed4f93e7d5e4b952b7b3ac84f14b5a32eba4792dfd2772965f42b1ce7da9b0"} Apr 24 16:50:16.120396 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:16.120352 2575 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kserve/s3-tls-init-serving-t5r9z" podStartSLOduration=1.120337369 podStartE2EDuration="1.120337369s" podCreationTimestamp="2026-04-24 16:50:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:50:16.119507498 +0000 UTC m=+703.833575535" watchObservedRunningTime="2026-04-24 16:50:16.120337369 +0000 UTC m=+703.834405618" Apr 24 16:50:19.114295 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:19.114256 2575 generic.go:358] "Generic (PLEG): container finished" podID="741965f2-4863-4fc9-9231-dd2bb9ab650a" containerID="24ab920ecf65454857f652f2aad72e8427d6cf36dab78e37b72fa3f3bf2ae3d5" exitCode=0 Apr 24 16:50:19.114666 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:19.114346 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-t5r9z" event={"ID":"741965f2-4863-4fc9-9231-dd2bb9ab650a","Type":"ContainerDied","Data":"24ab920ecf65454857f652f2aad72e8427d6cf36dab78e37b72fa3f3bf2ae3d5"} Apr 24 16:50:20.244373 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:20.244351 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-t5r9z" Apr 24 16:50:20.321038 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:20.320996 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlrlv\" (UniqueName: \"kubernetes.io/projected/741965f2-4863-4fc9-9231-dd2bb9ab650a-kube-api-access-zlrlv\") pod \"741965f2-4863-4fc9-9231-dd2bb9ab650a\" (UID: \"741965f2-4863-4fc9-9231-dd2bb9ab650a\") " Apr 24 16:50:20.323167 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:20.323142 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/741965f2-4863-4fc9-9231-dd2bb9ab650a-kube-api-access-zlrlv" (OuterVolumeSpecName: "kube-api-access-zlrlv") pod "741965f2-4863-4fc9-9231-dd2bb9ab650a" (UID: "741965f2-4863-4fc9-9231-dd2bb9ab650a"). InnerVolumeSpecName "kube-api-access-zlrlv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:50:20.421890 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:20.421802 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zlrlv\" (UniqueName: \"kubernetes.io/projected/741965f2-4863-4fc9-9231-dd2bb9ab650a-kube-api-access-zlrlv\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 16:50:21.122404 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:21.122373 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-t5r9z" event={"ID":"741965f2-4863-4fc9-9231-dd2bb9ab650a","Type":"ContainerDied","Data":"baed4f93e7d5e4b952b7b3ac84f14b5a32eba4792dfd2772965f42b1ce7da9b0"} Apr 24 16:50:21.122404 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:21.122391 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-t5r9z" Apr 24 16:50:21.122404 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:50:21.122405 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="baed4f93e7d5e4b952b7b3ac84f14b5a32eba4792dfd2772965f42b1ce7da9b0" Apr 24 16:53:32.800806 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:53:32.800727 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 16:53:32.803288 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:53:32.803265 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 16:58:32.826902 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:58:32.826869 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 16:58:32.830646 ip-10-0-131-37 kubenswrapper[2575]: I0424 16:58:32.830620 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 17:03:32.851539 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:03:32.851504 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 17:03:32.861826 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:03:32.861799 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 17:08:32.876932 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:08:32.876902 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 17:08:32.887469 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:08:32.887439 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 17:13:32.901732 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:13:32.901705 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 17:13:32.913058 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:13:32.913037 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 17:18:32.930057 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:18:32.930029 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 17:18:32.940850 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:18:32.940830 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 17:23:32.954813 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:23:32.954656 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 17:23:32.966156 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:23:32.966135 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 17:28:32.982292 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:28:32.982180 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 17:28:32.990356 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:28:32.990338 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 17:33:33.012442 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:33:33.012324 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 17:33:33.017210 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:33:33.017191 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 17:38:33.036975 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:38:33.036853 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 17:38:33.043797 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:38:33.042297 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 17:43:33.063203 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:43:33.063081 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 17:43:33.069315 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:43:33.069294 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log" Apr 24 17:46:00.972566 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:00.972528 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-xnml2/must-gather-dfjm2"] Apr 24 17:46:00.973074 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:00.973043 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="741965f2-4863-4fc9-9231-dd2bb9ab650a" containerName="s3-tls-init-serving" Apr 24 17:46:00.973074 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:00.973062 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="741965f2-4863-4fc9-9231-dd2bb9ab650a" containerName="s3-tls-init-serving" Apr 24 17:46:00.973243 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:00.973171 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="741965f2-4863-4fc9-9231-dd2bb9ab650a" containerName="s3-tls-init-serving" Apr 24 17:46:00.976316 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:00.976295 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xnml2/must-gather-dfjm2" Apr 24 17:46:00.978583 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:00.978559 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xnml2\"/\"openshift-service-ca.crt\"" Apr 24 17:46:00.978696 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:00.978660 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xnml2\"/\"kube-root-ca.crt\"" Apr 24 17:46:00.978762 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:00.978720 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xnml2\"/\"default-dockercfg-nf29k\"" Apr 24 17:46:00.989766 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:00.986406 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xnml2/must-gather-dfjm2"] Apr 24 17:46:01.080701 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:01.080662 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56qmg\" 
(UniqueName: \"kubernetes.io/projected/1a8079f4-26b7-40f0-8490-e77758d0e393-kube-api-access-56qmg\") pod \"must-gather-dfjm2\" (UID: \"1a8079f4-26b7-40f0-8490-e77758d0e393\") " pod="openshift-must-gather-xnml2/must-gather-dfjm2" Apr 24 17:46:01.080701 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:01.080706 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a8079f4-26b7-40f0-8490-e77758d0e393-must-gather-output\") pod \"must-gather-dfjm2\" (UID: \"1a8079f4-26b7-40f0-8490-e77758d0e393\") " pod="openshift-must-gather-xnml2/must-gather-dfjm2" Apr 24 17:46:01.181576 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:01.181537 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-56qmg\" (UniqueName: \"kubernetes.io/projected/1a8079f4-26b7-40f0-8490-e77758d0e393-kube-api-access-56qmg\") pod \"must-gather-dfjm2\" (UID: \"1a8079f4-26b7-40f0-8490-e77758d0e393\") " pod="openshift-must-gather-xnml2/must-gather-dfjm2" Apr 24 17:46:01.181576 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:01.181578 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a8079f4-26b7-40f0-8490-e77758d0e393-must-gather-output\") pod \"must-gather-dfjm2\" (UID: \"1a8079f4-26b7-40f0-8490-e77758d0e393\") " pod="openshift-must-gather-xnml2/must-gather-dfjm2" Apr 24 17:46:01.181905 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:01.181887 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a8079f4-26b7-40f0-8490-e77758d0e393-must-gather-output\") pod \"must-gather-dfjm2\" (UID: \"1a8079f4-26b7-40f0-8490-e77758d0e393\") " pod="openshift-must-gather-xnml2/must-gather-dfjm2" Apr 24 17:46:01.189604 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:01.189572 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-56qmg\" (UniqueName: \"kubernetes.io/projected/1a8079f4-26b7-40f0-8490-e77758d0e393-kube-api-access-56qmg\") pod \"must-gather-dfjm2\" (UID: \"1a8079f4-26b7-40f0-8490-e77758d0e393\") " pod="openshift-must-gather-xnml2/must-gather-dfjm2" Apr 24 17:46:01.302769 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:01.302663 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xnml2/must-gather-dfjm2" Apr 24 17:46:01.425839 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:01.425795 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xnml2/must-gather-dfjm2"] Apr 24 17:46:01.429520 ip-10-0-131-37 kubenswrapper[2575]: W0424 17:46:01.429485 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a8079f4_26b7_40f0_8490_e77758d0e393.slice/crio-a3f374d57c38e77b116d2ab47e5a9e7114660b9b1aa93ec2e29df596b5bbd42a WatchSource:0}: Error finding container a3f374d57c38e77b116d2ab47e5a9e7114660b9b1aa93ec2e29df596b5bbd42a: Status 404 returned error can't find the container with id a3f374d57c38e77b116d2ab47e5a9e7114660b9b1aa93ec2e29df596b5bbd42a Apr 24 17:46:01.431688 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:01.431670 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 17:46:02.368437 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:02.368394 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xnml2/must-gather-dfjm2" event={"ID":"1a8079f4-26b7-40f0-8490-e77758d0e393","Type":"ContainerStarted","Data":"a3f374d57c38e77b116d2ab47e5a9e7114660b9b1aa93ec2e29df596b5bbd42a"} Apr 24 17:46:06.384846 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:06.384800 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xnml2/must-gather-dfjm2" 
event={"ID":"1a8079f4-26b7-40f0-8490-e77758d0e393","Type":"ContainerStarted","Data":"44b1423d1d49f4a62440af6dcd51382017c7289fb21a08fd77a74748840d7f48"} Apr 24 17:46:07.390889 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:07.390848 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xnml2/must-gather-dfjm2" event={"ID":"1a8079f4-26b7-40f0-8490-e77758d0e393","Type":"ContainerStarted","Data":"1cbeb2c19afbbb572478832e24baca7d3164858b0fa473a889a3a6eb3d277064"} Apr 24 17:46:07.407872 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:07.407816 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xnml2/must-gather-dfjm2" podStartSLOduration=2.620177363 podStartE2EDuration="7.407800923s" podCreationTimestamp="2026-04-24 17:46:00 +0000 UTC" firstStartedPulling="2026-04-24 17:46:01.431857801 +0000 UTC m=+4049.145925817" lastFinishedPulling="2026-04-24 17:46:06.219481359 +0000 UTC m=+4053.933549377" observedRunningTime="2026-04-24 17:46:07.406107701 +0000 UTC m=+4055.120175734" watchObservedRunningTime="2026-04-24 17:46:07.407800923 +0000 UTC m=+4055.121868957" Apr 24 17:46:28.475821 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:28.475786 2575 generic.go:358] "Generic (PLEG): container finished" podID="1a8079f4-26b7-40f0-8490-e77758d0e393" containerID="44b1423d1d49f4a62440af6dcd51382017c7289fb21a08fd77a74748840d7f48" exitCode=0 Apr 24 17:46:28.476314 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:28.475859 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xnml2/must-gather-dfjm2" event={"ID":"1a8079f4-26b7-40f0-8490-e77758d0e393","Type":"ContainerDied","Data":"44b1423d1d49f4a62440af6dcd51382017c7289fb21a08fd77a74748840d7f48"} Apr 24 17:46:28.476314 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:28.476184 2575 scope.go:117] "RemoveContainer" containerID="44b1423d1d49f4a62440af6dcd51382017c7289fb21a08fd77a74748840d7f48" Apr 24 17:46:28.745416 ip-10-0-131-37 
kubenswrapper[2575]: I0424 17:46:28.745321 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xnml2_must-gather-dfjm2_1a8079f4-26b7-40f0-8490-e77758d0e393/gather/0.log" Apr 24 17:46:32.444658 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:32.444622 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-cwpk5_f7bcaaf9-0dfc-4e23-9bc0-fd3beb2ecbd1/global-pull-secret-syncer/0.log" Apr 24 17:46:32.648450 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:32.648414 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-phgx4_52f93581-75a0-4ae1-92b6-3ce3e189cd48/konnectivity-agent/0.log" Apr 24 17:46:32.671339 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:32.671308 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-37.ec2.internal_dcc90d88bf75fe88b8ea0db46e250029/haproxy/0.log" Apr 24 17:46:34.218930 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:34.218897 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xnml2/must-gather-dfjm2"] Apr 24 17:46:34.219397 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:34.219119 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-xnml2/must-gather-dfjm2" podUID="1a8079f4-26b7-40f0-8490-e77758d0e393" containerName="copy" containerID="cri-o://1cbeb2c19afbbb572478832e24baca7d3164858b0fa473a889a3a6eb3d277064" gracePeriod=2 Apr 24 17:46:34.222943 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:34.222914 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xnml2/must-gather-dfjm2"] Apr 24 17:46:34.459253 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:34.459227 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xnml2_must-gather-dfjm2_1a8079f4-26b7-40f0-8490-e77758d0e393/copy/0.log" Apr 24 17:46:34.459645 ip-10-0-131-37 
kubenswrapper[2575]: I0424 17:46:34.459627 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xnml2/must-gather-dfjm2" Apr 24 17:46:34.461840 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:34.461810 2575 status_manager.go:895] "Failed to get status for pod" podUID="1a8079f4-26b7-40f0-8490-e77758d0e393" pod="openshift-must-gather-xnml2/must-gather-dfjm2" err="pods \"must-gather-dfjm2\" is forbidden: User \"system:node:ip-10-0-131-37.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-xnml2\": no relationship found between node 'ip-10-0-131-37.ec2.internal' and this object" Apr 24 17:46:34.502847 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:34.502768 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xnml2_must-gather-dfjm2_1a8079f4-26b7-40f0-8490-e77758d0e393/copy/0.log" Apr 24 17:46:34.503113 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:34.503074 2575 generic.go:358] "Generic (PLEG): container finished" podID="1a8079f4-26b7-40f0-8490-e77758d0e393" containerID="1cbeb2c19afbbb572478832e24baca7d3164858b0fa473a889a3a6eb3d277064" exitCode=143 Apr 24 17:46:34.503250 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:34.503140 2575 scope.go:117] "RemoveContainer" containerID="1cbeb2c19afbbb572478832e24baca7d3164858b0fa473a889a3a6eb3d277064" Apr 24 17:46:34.503250 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:34.503144 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xnml2/must-gather-dfjm2" Apr 24 17:46:34.505221 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:34.505194 2575 status_manager.go:895] "Failed to get status for pod" podUID="1a8079f4-26b7-40f0-8490-e77758d0e393" pod="openshift-must-gather-xnml2/must-gather-dfjm2" err="pods \"must-gather-dfjm2\" is forbidden: User \"system:node:ip-10-0-131-37.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-xnml2\": no relationship found between node 'ip-10-0-131-37.ec2.internal' and this object" Apr 24 17:46:34.510834 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:34.510812 2575 scope.go:117] "RemoveContainer" containerID="44b1423d1d49f4a62440af6dcd51382017c7289fb21a08fd77a74748840d7f48" Apr 24 17:46:34.523858 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:34.523835 2575 scope.go:117] "RemoveContainer" containerID="1cbeb2c19afbbb572478832e24baca7d3164858b0fa473a889a3a6eb3d277064" Apr 24 17:46:34.524189 ip-10-0-131-37 kubenswrapper[2575]: E0424 17:46:34.524163 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cbeb2c19afbbb572478832e24baca7d3164858b0fa473a889a3a6eb3d277064\": container with ID starting with 1cbeb2c19afbbb572478832e24baca7d3164858b0fa473a889a3a6eb3d277064 not found: ID does not exist" containerID="1cbeb2c19afbbb572478832e24baca7d3164858b0fa473a889a3a6eb3d277064" Apr 24 17:46:34.524256 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:34.524204 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cbeb2c19afbbb572478832e24baca7d3164858b0fa473a889a3a6eb3d277064"} err="failed to get container status \"1cbeb2c19afbbb572478832e24baca7d3164858b0fa473a889a3a6eb3d277064\": rpc error: code = NotFound desc = could not find container \"1cbeb2c19afbbb572478832e24baca7d3164858b0fa473a889a3a6eb3d277064\": container with ID starting with 
1cbeb2c19afbbb572478832e24baca7d3164858b0fa473a889a3a6eb3d277064 not found: ID does not exist" Apr 24 17:46:34.524256 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:34.524233 2575 scope.go:117] "RemoveContainer" containerID="44b1423d1d49f4a62440af6dcd51382017c7289fb21a08fd77a74748840d7f48" Apr 24 17:46:34.524527 ip-10-0-131-37 kubenswrapper[2575]: E0424 17:46:34.524510 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44b1423d1d49f4a62440af6dcd51382017c7289fb21a08fd77a74748840d7f48\": container with ID starting with 44b1423d1d49f4a62440af6dcd51382017c7289fb21a08fd77a74748840d7f48 not found: ID does not exist" containerID="44b1423d1d49f4a62440af6dcd51382017c7289fb21a08fd77a74748840d7f48" Apr 24 17:46:34.524573 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:34.524532 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44b1423d1d49f4a62440af6dcd51382017c7289fb21a08fd77a74748840d7f48"} err="failed to get container status \"44b1423d1d49f4a62440af6dcd51382017c7289fb21a08fd77a74748840d7f48\": rpc error: code = NotFound desc = could not find container \"44b1423d1d49f4a62440af6dcd51382017c7289fb21a08fd77a74748840d7f48\": container with ID starting with 44b1423d1d49f4a62440af6dcd51382017c7289fb21a08fd77a74748840d7f48 not found: ID does not exist" Apr 24 17:46:34.591942 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:34.591899 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a8079f4-26b7-40f0-8490-e77758d0e393-must-gather-output\") pod \"1a8079f4-26b7-40f0-8490-e77758d0e393\" (UID: \"1a8079f4-26b7-40f0-8490-e77758d0e393\") " Apr 24 17:46:34.591942 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:34.591950 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56qmg\" (UniqueName: 
\"kubernetes.io/projected/1a8079f4-26b7-40f0-8490-e77758d0e393-kube-api-access-56qmg\") pod \"1a8079f4-26b7-40f0-8490-e77758d0e393\" (UID: \"1a8079f4-26b7-40f0-8490-e77758d0e393\") " Apr 24 17:46:34.593530 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:34.593499 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a8079f4-26b7-40f0-8490-e77758d0e393-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "1a8079f4-26b7-40f0-8490-e77758d0e393" (UID: "1a8079f4-26b7-40f0-8490-e77758d0e393"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 17:46:34.594302 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:34.594278 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a8079f4-26b7-40f0-8490-e77758d0e393-kube-api-access-56qmg" (OuterVolumeSpecName: "kube-api-access-56qmg") pod "1a8079f4-26b7-40f0-8490-e77758d0e393" (UID: "1a8079f4-26b7-40f0-8490-e77758d0e393"). InnerVolumeSpecName "kube-api-access-56qmg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 17:46:34.693494 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:34.693452 2575 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a8079f4-26b7-40f0-8490-e77758d0e393-must-gather-output\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 17:46:34.693494 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:34.693487 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-56qmg\" (UniqueName: \"kubernetes.io/projected/1a8079f4-26b7-40f0-8490-e77758d0e393-kube-api-access-56qmg\") on node \"ip-10-0-131-37.ec2.internal\" DevicePath \"\"" Apr 24 17:46:34.813576 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:34.813544 2575 status_manager.go:895] "Failed to get status for pod" podUID="1a8079f4-26b7-40f0-8490-e77758d0e393" pod="openshift-must-gather-xnml2/must-gather-dfjm2" err="pods \"must-gather-dfjm2\" is forbidden: User \"system:node:ip-10-0-131-37.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-xnml2\": no relationship found between node 'ip-10-0-131-37.ec2.internal' and this object" Apr 24 17:46:34.831151 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:34.831116 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a8079f4-26b7-40f0-8490-e77758d0e393" path="/var/lib/kubelet/pods/1a8079f4-26b7-40f0-8490-e77758d0e393/volumes" Apr 24 17:46:36.084576 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:36.084547 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0818fb2f-39c7-434f-a46e-242055a30017/alertmanager/0.log" Apr 24 17:46:36.109108 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:36.109073 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0818fb2f-39c7-434f-a46e-242055a30017/config-reloader/0.log" Apr 24 17:46:36.134564 
ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:36.134515 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0818fb2f-39c7-434f-a46e-242055a30017/kube-rbac-proxy-web/0.log" Apr 24 17:46:36.164177 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:36.163106 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0818fb2f-39c7-434f-a46e-242055a30017/kube-rbac-proxy/0.log" Apr 24 17:46:36.197333 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:36.197290 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0818fb2f-39c7-434f-a46e-242055a30017/kube-rbac-proxy-metric/0.log" Apr 24 17:46:36.224970 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:36.224941 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0818fb2f-39c7-434f-a46e-242055a30017/prom-label-proxy/0.log" Apr 24 17:46:36.253749 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:36.253724 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_0818fb2f-39c7-434f-a46e-242055a30017/init-config-reloader/0.log" Apr 24 17:46:36.297722 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:36.297692 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-ccsnq_52705a67-f97f-488e-adc3-2f562fd2fd0e/cluster-monitoring-operator/0.log" Apr 24 17:46:36.326574 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:36.326535 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-bxf9q_d2c85512-9eaa-47a4-93c7-088001707109/kube-state-metrics/0.log" Apr 24 17:46:36.364633 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:36.364608 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-bxf9q_d2c85512-9eaa-47a4-93c7-088001707109/kube-rbac-proxy-main/0.log" Apr 24 17:46:36.412378 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:36.412353 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-bxf9q_d2c85512-9eaa-47a4-93c7-088001707109/kube-rbac-proxy-self/0.log" Apr 24 17:46:36.493409 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:36.493376 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-xlvqq_a6af0dd3-dc36-494a-b420-bb7b1ac5fd89/monitoring-plugin/0.log" Apr 24 17:46:36.728472 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:36.728391 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-92qsp_a083e521-de46-463d-921a-44495e3f3333/node-exporter/0.log" Apr 24 17:46:36.761711 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:36.761680 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-92qsp_a083e521-de46-463d-921a-44495e3f3333/kube-rbac-proxy/0.log" Apr 24 17:46:36.793978 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:36.793955 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-92qsp_a083e521-de46-463d-921a-44495e3f3333/init-textfile/0.log" Apr 24 17:46:36.970274 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:36.970234 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6cd4eebe-82c0-4ce9-9b33-8bbea44303bb/prometheus/0.log" Apr 24 17:46:36.992142 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:36.992043 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6cd4eebe-82c0-4ce9-9b33-8bbea44303bb/config-reloader/0.log" Apr 24 17:46:37.016826 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:37.016799 2575 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6cd4eebe-82c0-4ce9-9b33-8bbea44303bb/thanos-sidecar/0.log" Apr 24 17:46:37.042602 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:37.042551 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6cd4eebe-82c0-4ce9-9b33-8bbea44303bb/kube-rbac-proxy-web/0.log" Apr 24 17:46:37.067946 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:37.067920 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6cd4eebe-82c0-4ce9-9b33-8bbea44303bb/kube-rbac-proxy/0.log" Apr 24 17:46:37.091201 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:37.091151 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6cd4eebe-82c0-4ce9-9b33-8bbea44303bb/kube-rbac-proxy-thanos/0.log" Apr 24 17:46:37.113767 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:37.113727 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_6cd4eebe-82c0-4ce9-9b33-8bbea44303bb/init-config-reloader/0.log" Apr 24 17:46:37.366080 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:37.366050 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-cb954bcc8-kmqk4_7045a425-fd49-4f04-8bdc-44a088056f4d/thanos-query/0.log" Apr 24 17:46:37.392064 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:37.392036 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-cb954bcc8-kmqk4_7045a425-fd49-4f04-8bdc-44a088056f4d/kube-rbac-proxy-web/0.log" Apr 24 17:46:37.425571 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:37.425547 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-cb954bcc8-kmqk4_7045a425-fd49-4f04-8bdc-44a088056f4d/kube-rbac-proxy/0.log" Apr 24 17:46:37.452121 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:37.452076 2575 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-cb954bcc8-kmqk4_7045a425-fd49-4f04-8bdc-44a088056f4d/prom-label-proxy/0.log"
Apr 24 17:46:37.486561 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:37.486527 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-cb954bcc8-kmqk4_7045a425-fd49-4f04-8bdc-44a088056f4d/kube-rbac-proxy-rules/0.log"
Apr 24 17:46:37.522176 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:37.522150 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-cb954bcc8-kmqk4_7045a425-fd49-4f04-8bdc-44a088056f4d/kube-rbac-proxy-metrics/0.log"
Apr 24 17:46:38.501774 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:38.501742 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-fwvss_9de18b4c-44be-4c2e-9b15-1b3401784bcd/networking-console-plugin/0.log"
Apr 24 17:46:39.292343 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.292310 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-q5bbn_ed3c3ab2-f165-47a6-8669-c3408d5c908f/download-server/0.log"
Apr 24 17:46:39.442216 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.442180 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-twkp6/perf-node-gather-daemonset-2l2dq"]
Apr 24 17:46:39.442551 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.442538 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a8079f4-26b7-40f0-8490-e77758d0e393" containerName="gather"
Apr 24 17:46:39.442595 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.442552 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8079f4-26b7-40f0-8490-e77758d0e393" containerName="gather"
Apr 24 17:46:39.442595 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.442574 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a8079f4-26b7-40f0-8490-e77758d0e393" containerName="copy"
Apr 24 17:46:39.442595 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.442580 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8079f4-26b7-40f0-8490-e77758d0e393" containerName="copy"
Apr 24 17:46:39.442689 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.442635 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a8079f4-26b7-40f0-8490-e77758d0e393" containerName="gather"
Apr 24 17:46:39.442689 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.442646 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a8079f4-26b7-40f0-8490-e77758d0e393" containerName="copy"
Apr 24 17:46:39.446726 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.446702 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-2l2dq"
Apr 24 17:46:39.448987 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.448964 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-twkp6\"/\"kube-root-ca.crt\""
Apr 24 17:46:39.449813 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.449792 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-twkp6\"/\"default-dockercfg-7txp2\""
Apr 24 17:46:39.449880 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.449792 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-twkp6\"/\"openshift-service-ca.crt\""
Apr 24 17:46:39.454727 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.454707 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-twkp6/perf-node-gather-daemonset-2l2dq"]
Apr 24 17:46:39.530227 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.530197 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/30026867-2d9f-49e5-9785-4e3f5e7e8a13-podres\") pod \"perf-node-gather-daemonset-2l2dq\" (UID: \"30026867-2d9f-49e5-9785-4e3f5e7e8a13\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-2l2dq"
Apr 24 17:46:39.530616 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.530237 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/30026867-2d9f-49e5-9785-4e3f5e7e8a13-proc\") pod \"perf-node-gather-daemonset-2l2dq\" (UID: \"30026867-2d9f-49e5-9785-4e3f5e7e8a13\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-2l2dq"
Apr 24 17:46:39.530616 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.530258 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/30026867-2d9f-49e5-9785-4e3f5e7e8a13-lib-modules\") pod \"perf-node-gather-daemonset-2l2dq\" (UID: \"30026867-2d9f-49e5-9785-4e3f5e7e8a13\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-2l2dq"
Apr 24 17:46:39.530616 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.530325 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/30026867-2d9f-49e5-9785-4e3f5e7e8a13-sys\") pod \"perf-node-gather-daemonset-2l2dq\" (UID: \"30026867-2d9f-49e5-9785-4e3f5e7e8a13\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-2l2dq"
Apr 24 17:46:39.530616 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.530350 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m5gc\" (UniqueName: \"kubernetes.io/projected/30026867-2d9f-49e5-9785-4e3f5e7e8a13-kube-api-access-5m5gc\") pod \"perf-node-gather-daemonset-2l2dq\" (UID: \"30026867-2d9f-49e5-9785-4e3f5e7e8a13\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-2l2dq"
Apr 24 17:46:39.631693 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.631652 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/30026867-2d9f-49e5-9785-4e3f5e7e8a13-sys\") pod \"perf-node-gather-daemonset-2l2dq\" (UID: \"30026867-2d9f-49e5-9785-4e3f5e7e8a13\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-2l2dq"
Apr 24 17:46:39.631693 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.631695 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m5gc\" (UniqueName: \"kubernetes.io/projected/30026867-2d9f-49e5-9785-4e3f5e7e8a13-kube-api-access-5m5gc\") pod \"perf-node-gather-daemonset-2l2dq\" (UID: \"30026867-2d9f-49e5-9785-4e3f5e7e8a13\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-2l2dq"
Apr 24 17:46:39.631867 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.631757 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/30026867-2d9f-49e5-9785-4e3f5e7e8a13-podres\") pod \"perf-node-gather-daemonset-2l2dq\" (UID: \"30026867-2d9f-49e5-9785-4e3f5e7e8a13\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-2l2dq"
Apr 24 17:46:39.631867 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.631781 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/30026867-2d9f-49e5-9785-4e3f5e7e8a13-proc\") pod \"perf-node-gather-daemonset-2l2dq\" (UID: \"30026867-2d9f-49e5-9785-4e3f5e7e8a13\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-2l2dq"
Apr 24 17:46:39.631867 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.631793 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/30026867-2d9f-49e5-9785-4e3f5e7e8a13-sys\") pod \"perf-node-gather-daemonset-2l2dq\" (UID: \"30026867-2d9f-49e5-9785-4e3f5e7e8a13\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-2l2dq"
Apr 24 17:46:39.631867 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.631797 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/30026867-2d9f-49e5-9785-4e3f5e7e8a13-lib-modules\") pod \"perf-node-gather-daemonset-2l2dq\" (UID: \"30026867-2d9f-49e5-9785-4e3f5e7e8a13\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-2l2dq"
Apr 24 17:46:39.631867 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.631847 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/30026867-2d9f-49e5-9785-4e3f5e7e8a13-proc\") pod \"perf-node-gather-daemonset-2l2dq\" (UID: \"30026867-2d9f-49e5-9785-4e3f5e7e8a13\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-2l2dq"
Apr 24 17:46:39.632016 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.631898 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/30026867-2d9f-49e5-9785-4e3f5e7e8a13-podres\") pod \"perf-node-gather-daemonset-2l2dq\" (UID: \"30026867-2d9f-49e5-9785-4e3f5e7e8a13\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-2l2dq"
Apr 24 17:46:39.632016 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.631931 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/30026867-2d9f-49e5-9785-4e3f5e7e8a13-lib-modules\") pod \"perf-node-gather-daemonset-2l2dq\" (UID: \"30026867-2d9f-49e5-9785-4e3f5e7e8a13\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-2l2dq"
Apr 24 17:46:39.639803 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.639776 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m5gc\" (UniqueName: \"kubernetes.io/projected/30026867-2d9f-49e5-9785-4e3f5e7e8a13-kube-api-access-5m5gc\") pod \"perf-node-gather-daemonset-2l2dq\" (UID: \"30026867-2d9f-49e5-9785-4e3f5e7e8a13\") " pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-2l2dq"
Apr 24 17:46:39.691562 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.691530 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-qtzff_d199c90e-ff14-4334-8ee4-d6f19aa8c243/volume-data-source-validator/0.log"
Apr 24 17:46:39.757420 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.757381 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-2l2dq"
Apr 24 17:46:39.881455 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:39.881423 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-twkp6/perf-node-gather-daemonset-2l2dq"]
Apr 24 17:46:39.884487 ip-10-0-131-37 kubenswrapper[2575]: W0424 17:46:39.884459 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod30026867_2d9f_49e5_9785_4e3f5e7e8a13.slice/crio-7be8f566063296d47ba6139b18995511b375ac9a98899fd1afc8a6d05a7ac558 WatchSource:0}: Error finding container 7be8f566063296d47ba6139b18995511b375ac9a98899fd1afc8a6d05a7ac558: Status 404 returned error can't find the container with id 7be8f566063296d47ba6139b18995511b375ac9a98899fd1afc8a6d05a7ac558
Apr 24 17:46:40.373964 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:40.373932 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ckvvn_ed7942f7-292f-402b-af38-8f0c16de0ee3/dns/0.log"
Apr 24 17:46:40.395325 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:40.395294 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ckvvn_ed7942f7-292f-402b-af38-8f0c16de0ee3/kube-rbac-proxy/0.log"
Apr 24 17:46:40.469781 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:40.469753 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7hmxs_06388f4e-daeb-4db0-906e-01adfa3e3b97/dns-node-resolver/0.log"
Apr 24 17:46:40.527131 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:40.527074 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-2l2dq" event={"ID":"30026867-2d9f-49e5-9785-4e3f5e7e8a13","Type":"ContainerStarted","Data":"b46a80ea978e17cbbfc6d0eacd84e6cfecab921c92d67fa60ae7157cc6d3630c"}
Apr 24 17:46:40.527131 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:40.527133 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-2l2dq" event={"ID":"30026867-2d9f-49e5-9785-4e3f5e7e8a13","Type":"ContainerStarted","Data":"7be8f566063296d47ba6139b18995511b375ac9a98899fd1afc8a6d05a7ac558"}
Apr 24 17:46:40.527347 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:40.527215 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-2l2dq"
Apr 24 17:46:40.543407 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:40.543343 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-2l2dq" podStartSLOduration=1.543329084 podStartE2EDuration="1.543329084s" podCreationTimestamp="2026-04-24 17:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 17:46:40.541939907 +0000 UTC m=+4088.256007944" watchObservedRunningTime="2026-04-24 17:46:40.543329084 +0000 UTC m=+4088.257397120"
Apr 24 17:46:41.027315 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:41.027273 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-psxtk_34943398-acf4-440b-900d-999cb567a483/node-ca/0.log"
Apr 24 17:46:42.133511 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:42.133478 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-z6w7t_cdbe4c96-edde-4285-9466-eeb5fd3f169b/serve-healthcheck-canary/0.log"
Apr 24 17:46:42.467595 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:42.467507 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-df7xx_c1786d47-e613-4796-a98f-1ea71904bff8/insights-operator/0.log"
Apr 24 17:46:42.468444 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:42.468426 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-df7xx_c1786d47-e613-4796-a98f-1ea71904bff8/insights-operator/1.log"
Apr 24 17:46:42.628396 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:42.628365 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wttnf_8a70b0e4-967f-4814-87bb-2bc980391a01/kube-rbac-proxy/0.log"
Apr 24 17:46:42.648954 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:42.648927 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wttnf_8a70b0e4-967f-4814-87bb-2bc980391a01/exporter/0.log"
Apr 24 17:46:42.669761 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:42.669733 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-wttnf_8a70b0e4-967f-4814-87bb-2bc980391a01/extractor/0.log"
Apr 24 17:46:45.237684 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:45.237658 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-n6cxb_34711f0e-be64-4a74-815d-7a4a00dd0413/s3-init/0.log"
Apr 24 17:46:45.260456 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:45.260419 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-8b82n_edccb082-bc45-4540-a39a-9f307d807d40/s3-tls-init-custom/0.log"
Apr 24 17:46:45.285778 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:45.285730 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-t5r9z_741965f2-4863-4fc9-9231-dd2bb9ab650a/s3-tls-init-serving/0.log"
Apr 24 17:46:46.540838 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:46.540809 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-twkp6/perf-node-gather-daemonset-2l2dq"
Apr 24 17:46:49.205870 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:49.205832 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-wsp7g_6675872f-5466-4ba3-93fa-2d8f6edfc801/migrator/0.log"
Apr 24 17:46:49.235045 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:49.235018 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-wsp7g_6675872f-5466-4ba3-93fa-2d8f6edfc801/graceful-termination/0.log"
Apr 24 17:46:49.565035 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:49.565001 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-cf274_a3504429-9f84-4f23-a196-d187ad6d16d6/kube-storage-version-migrator-operator/1.log"
Apr 24 17:46:49.565917 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:49.565894 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-cf274_a3504429-9f84-4f23-a196-d187ad6d16d6/kube-storage-version-migrator-operator/0.log"
Apr 24 17:46:50.401651 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:50.401622 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9hv6v_0306168c-6c00-4a89-9e2b-fff3d030b0e2/kube-multus/0.log"
Apr 24 17:46:50.428066 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:50.428041 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5hcrw_050c36c1-b0b2-434a-91f9-c53a02f67059/kube-multus-additional-cni-plugins/0.log"
Apr 24 17:46:50.456795 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:50.456770 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5hcrw_050c36c1-b0b2-434a-91f9-c53a02f67059/egress-router-binary-copy/0.log"
Apr 24 17:46:50.481365 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:50.481331 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5hcrw_050c36c1-b0b2-434a-91f9-c53a02f67059/cni-plugins/0.log"
Apr 24 17:46:50.505499 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:50.505468 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5hcrw_050c36c1-b0b2-434a-91f9-c53a02f67059/bond-cni-plugin/0.log"
Apr 24 17:46:50.525952 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:50.525918 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5hcrw_050c36c1-b0b2-434a-91f9-c53a02f67059/routeoverride-cni/0.log"
Apr 24 17:46:50.545900 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:50.545869 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5hcrw_050c36c1-b0b2-434a-91f9-c53a02f67059/whereabouts-cni-bincopy/0.log"
Apr 24 17:46:50.570583 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:50.570551 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-5hcrw_050c36c1-b0b2-434a-91f9-c53a02f67059/whereabouts-cni/0.log"
Apr 24 17:46:51.001755 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:51.001679 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-f9bsr_10ab450a-933f-4b41-8316-09109770ac99/network-metrics-daemon/0.log"
Apr 24 17:46:51.021994 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:51.021964 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-f9bsr_10ab450a-933f-4b41-8316-09109770ac99/kube-rbac-proxy/0.log"
Apr 24 17:46:52.118456 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:52.118423 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-controller/0.log"
Apr 24 17:46:52.137535 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:52.137507 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/0.log"
Apr 24 17:46:52.153847 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:52.153813 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovn-acl-logging/1.log"
Apr 24 17:46:52.173960 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:52.173925 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/kube-rbac-proxy-node/0.log"
Apr 24 17:46:52.196006 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:52.195977 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 17:46:52.215196 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:52.215170 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/northd/0.log"
Apr 24 17:46:52.237666 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:52.237636 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/nbdb/0.log"
Apr 24 17:46:52.258533 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:52.258501 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/sbdb/0.log"
Apr 24 17:46:52.360829 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:52.360789 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jt59q_c2b71d82-71f3-4cf6-95a9-73b3d509e492/ovnkube-controller/0.log"
Apr 24 17:46:53.717912 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:53.717874 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-4d6qk_47d71194-f92b-4ce8-a112-c73134f86aa4/check-endpoints/0.log"
Apr 24 17:46:53.740924 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:53.740894 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-85nzt_afaef099-a861-4606-97c5-485da57c818f/network-check-target-container/0.log"
Apr 24 17:46:54.636769 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:54.636737 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-f5ds8_7fd4e710-ea9f-4927-b943-ca92fb5629da/iptables-alerter/0.log"
Apr 24 17:46:55.242346 ip-10-0-131-37 kubenswrapper[2575]: I0424 17:46:55.242313 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-8sqwm_59d1c6f6-5f2f-4c93-b15d-47374722a0fe/tuned/0.log"