Apr 16 16:02:37.104459 ip-10-0-129-182 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 16:02:37.630555 ip-10-0-129-182 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:02:37.630555 ip-10-0-129-182 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 16:02:37.630555 ip-10-0-129-182 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:02:37.630555 ip-10-0-129-182 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 16:02:37.630555 ip-10-0-129-182 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:02:37.632535 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.632397 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 16:02:37.638694 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638658 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:02:37.638694 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638683 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:02:37.638694 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638689 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:02:37.638694 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638694 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:02:37.638694 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638697 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:02:37.638694 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638702 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:02:37.639067 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638706 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:02:37.639067 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638710 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:02:37.639067 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638714 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:02:37.639067 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638718 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:02:37.639067 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638721 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:02:37.639067 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638725 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:02:37.639067 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638729 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:02:37.639067 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638733 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:02:37.639067 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638736 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:02:37.639067 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638739 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:02:37.639067 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638743 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:02:37.639067 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638747 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:02:37.639067 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638750 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:02:37.639067 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638756 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:02:37.639067 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638764 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:02:37.639067 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638770 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:02:37.639067 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638774 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:02:37.639067 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638778 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:02:37.639067 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638782 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:02:37.639845 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638787 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:02:37.639845 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638790 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:02:37.639845 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638801 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:02:37.639845 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638806 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:02:37.639845 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638811 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:02:37.639845 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638815 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:02:37.639845 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638820 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:02:37.639845 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638825 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:02:37.639845 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638829 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:02:37.639845 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638833 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:02:37.639845 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638837 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:02:37.639845 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638840 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:02:37.639845 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638844 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:02:37.639845 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638848 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:02:37.639845 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638852 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:02:37.639845 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638856 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:02:37.639845 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638860 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:02:37.639845 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638864 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:02:37.639845 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638868 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:02:37.639845 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638873 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:02:37.640732 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638877 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:02:37.640732 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638882 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:02:37.640732 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638886 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:02:37.640732 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638891 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:02:37.640732 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638895 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:02:37.640732 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638899 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:02:37.640732 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638903 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:02:37.640732 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638908 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:02:37.640732 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638913 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:02:37.640732 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638917 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:02:37.640732 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638921 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:02:37.640732 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638926 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:02:37.640732 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638930 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:02:37.640732 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638934 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:02:37.640732 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638938 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:02:37.640732 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638942 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:02:37.640732 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638947 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:02:37.640732 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638951 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:02:37.640732 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638956 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:02:37.640732 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638961 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:02:37.641596 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638965 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:02:37.641596 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638969 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:02:37.641596 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638973 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:02:37.641596 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638977 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:02:37.641596 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638982 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:02:37.641596 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638986 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:02:37.641596 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638991 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:02:37.641596 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.638996 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:02:37.641596 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639000 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:02:37.641596 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639006 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:02:37.641596 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639010 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:02:37.641596 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639014 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:02:37.641596 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639021 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:02:37.641596 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639026 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:02:37.641596 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639031 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:02:37.641596 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639036 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:02:37.641596 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639040 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:02:37.641596 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639044 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:02:37.641596 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639048 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:02:37.641596 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639053 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:02:37.642186 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639057 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:02:37.642186 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639722 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:02:37.642186 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639731 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:02:37.642186 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639736 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:02:37.642186 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639741 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:02:37.642186 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639745 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:02:37.642186 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639749 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:02:37.642186 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639754 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:02:37.642186 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639759 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:02:37.642186 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639763 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:02:37.642186 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639769 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:02:37.642186 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639773 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:02:37.642186 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639778 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:02:37.642186 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639782 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:02:37.642186 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639786 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:02:37.642186 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639790 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:02:37.642186 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639794 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:02:37.642186 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639801 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:02:37.642186 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639809 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:02:37.642742 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639814 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:02:37.642742 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639819 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:02:37.642742 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639824 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:02:37.642742 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639828 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:02:37.642742 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639833 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:02:37.642742 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639837 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:02:37.642742 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639842 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:02:37.642742 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639847 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:02:37.642742 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639851 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:02:37.642742 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639857 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:02:37.642742 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639861 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:02:37.642742 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639866 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:02:37.642742 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639870 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:02:37.642742 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639874 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:02:37.642742 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639878 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:02:37.642742 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639883 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:02:37.642742 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639887 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:02:37.642742 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639891 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:02:37.642742 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639896 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:02:37.643352 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639902 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:02:37.643352 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639906 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:02:37.643352 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639910 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:02:37.643352 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639915 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:02:37.643352 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639920 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:02:37.643352 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639924 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:02:37.643352 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639928 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:02:37.643352 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639932 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:02:37.643352 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639937 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:02:37.643352 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639941 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:02:37.643352 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639945 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:02:37.643352 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639949 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:02:37.643352 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639955 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:02:37.643352 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639959 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:02:37.643352 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639963 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:02:37.643352 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639967 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:02:37.643352 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639971 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:02:37.643352 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639975 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:02:37.643352 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639979 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:02:37.643352 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639983 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:02:37.644055 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639988 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:02:37.644055 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639993 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:02:37.644055 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.639997 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:02:37.644055 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640001 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:02:37.644055 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640005 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:02:37.644055 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640010 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:02:37.644055 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640014 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:02:37.644055 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640018 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:02:37.644055 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640023 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:02:37.644055 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640027 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:02:37.644055 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640033 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:02:37.644055 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640037 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:02:37.644055 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640041 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:02:37.644055 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640046 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:02:37.644055 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640050 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:02:37.644055 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640055 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:02:37.644055 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640059 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:02:37.644055 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640063 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:02:37.644055 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640067 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:02:37.644055 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640071 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:02:37.644578 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640075 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:02:37.644578 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640079 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:02:37.644578 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640084 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:02:37.644578 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640088 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:02:37.644578 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640093 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:02:37.644578 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640098 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:02:37.644578 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640102 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:02:37.644578 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640106 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:02:37.644578 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.640110 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:02:37.644578 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640253 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 16:02:37.644578 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640271 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 16:02:37.644578 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640283 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 16:02:37.644578 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640289 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 16:02:37.644578 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640297 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 16:02:37.644578 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640302 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 16:02:37.644578 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640309 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 16:02:37.644578 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640317 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 16:02:37.644578 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640322 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 16:02:37.644578 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640327 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 16:02:37.644578 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640333 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 16:02:37.644578 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640338 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 16:02:37.644578 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640343 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 16:02:37.645222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640348 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 16 16:02:37.645222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640353 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 16:02:37.645222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640358 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 16 16:02:37.645222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640363 2577 flags.go:64] FLAG: --cloud-config=""
Apr 16 16:02:37.645222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640367 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 16:02:37.645222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640372 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 16:02:37.645222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640379 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 16 16:02:37.645222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640383 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 16:02:37.645222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640388 2577 flags.go:64] FLAG: --config-dir=""
Apr 16 16:02:37.645222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640393 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 16:02:37.645222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640398 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 16:02:37.645222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640405 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 16:02:37.645222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640410 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 16:02:37.645222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640415 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 16:02:37.645222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640420 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 16:02:37.645222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640425 2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 16:02:37.645222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640430 2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 16:02:37.645222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640435 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 16:02:37.645222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640440 2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 16:02:37.645222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640445 2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 16:02:37.645222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640452 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 16:02:37.645222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640457 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 16:02:37.645222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640462 2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 16:02:37.645222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640466 2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 16:02:37.645222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640472 2577 flags.go:64] FLAG: --enable-server="true"
Apr 16 16:02:37.645855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640477 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 16:02:37.645855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640485 2577 flags.go:64] FLAG: --event-burst="100"
Apr 16 16:02:37.645855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640491 2577 flags.go:64] FLAG: --event-qps="50"
Apr 16 16:02:37.645855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640495 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 16:02:37.645855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640500 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 16:02:37.645855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640505 2577 flags.go:64] FLAG: --eviction-hard=""
Apr 16 16:02:37.645855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640511 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 16:02:37.645855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640516 2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 16:02:37.645855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640521 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 16:02:37.645855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640526 2577 flags.go:64] FLAG: --eviction-soft=""
Apr 16 16:02:37.645855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640531 2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 16:02:37.645855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640542 2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 16:02:37.645855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640547 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 16:02:37.645855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640551 2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 16:02:37.645855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640556 2577 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 16:02:37.645855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640561 2577 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 16:02:37.645855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640566 2577 flags.go:64] FLAG: --feature-gates=""
Apr 16 16:02:37.645855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640572 2577 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 16:02:37.645855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640577 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 16:02:37.645855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640582 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 16:02:37.645855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640588 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 16:02:37.645855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640593 2577 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 16:02:37.645855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640599 2577 flags.go:64] FLAG: --help="false"
Apr 16 16:02:37.645855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640603 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-129-182.ec2.internal"
Apr 16 16:02:37.645855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640608 2577 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 16:02:37.646493 ip-10-0-129-182
kubenswrapper[2577]: I0416 16:02:37.640614 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 16:02:37.646493 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640619 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 16:02:37.646493 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640624 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 16:02:37.646493 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640630 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 16:02:37.646493 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640635 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 16:02:37.646493 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640640 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 16:02:37.646493 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640644 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 16:02:37.646493 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640650 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 16:02:37.646493 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640656 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 16:02:37.646493 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640662 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 16:02:37.646493 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640666 2577 flags.go:64] FLAG: --kube-reserved="" Apr 16 16:02:37.646493 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640671 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 16:02:37.646493 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640676 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 16:02:37.646493 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640681 2577 flags.go:64] FLAG: 
--kubelet-cgroups="" Apr 16 16:02:37.646493 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640686 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 16:02:37.646493 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640690 2577 flags.go:64] FLAG: --lock-file="" Apr 16 16:02:37.646493 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640695 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 16:02:37.646493 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640700 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 16:02:37.646493 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640707 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 16:02:37.646493 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640716 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 16:02:37.646493 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640721 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 16:02:37.646493 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640726 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 16:02:37.646493 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640731 2577 flags.go:64] FLAG: --logging-format="text" Apr 16 16:02:37.647056 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640735 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 16:02:37.647056 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640740 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 16:02:37.647056 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640745 2577 flags.go:64] FLAG: --manifest-url="" Apr 16 16:02:37.647056 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640749 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 16 16:02:37.647056 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640756 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 16:02:37.647056 ip-10-0-129-182 
kubenswrapper[2577]: I0416 16:02:37.640764 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 16:02:37.647056 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640771 2577 flags.go:64] FLAG: --max-pods="110" Apr 16 16:02:37.647056 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640776 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 16:02:37.647056 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640781 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 16:02:37.647056 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640785 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 16:02:37.647056 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640790 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 16:02:37.647056 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640795 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 16:02:37.647056 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640800 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 16:02:37.647056 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640806 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 16:02:37.647056 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640818 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 16:02:37.647056 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640824 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 16:02:37.647056 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640828 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 16:02:37.647056 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640834 2577 flags.go:64] FLAG: --pod-cidr="" Apr 16 16:02:37.647056 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640839 2577 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 16:02:37.647056 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640849 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 16:02:37.647056 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640854 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 16:02:37.647056 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640859 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 16 16:02:37.647056 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640864 2577 flags.go:64] FLAG: --port="10250" Apr 16 16:02:37.647056 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640869 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640873 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0073621a631d53b9d" Apr 16 16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640878 2577 flags.go:64] FLAG: --qos-reserved="" Apr 16 16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640883 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 16 16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640889 2577 flags.go:64] FLAG: --register-node="true" Apr 16 16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640894 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 16 16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640898 2577 flags.go:64] FLAG: --register-with-taints="" Apr 16 16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640904 2577 flags.go:64] FLAG: --registry-burst="10" Apr 16 16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640909 2577 flags.go:64] FLAG: --registry-qps="5" Apr 16 16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640913 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 16 
16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640918 2577 flags.go:64] FLAG: --reserved-memory="" Apr 16 16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640924 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640929 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640934 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640941 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640945 2577 flags.go:64] FLAG: --runonce="false" Apr 16 16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640950 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640955 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640959 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 16 16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640964 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640969 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640974 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640978 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640984 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 
16:02:37.640988 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640993 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 16:02:37.647686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.640997 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 16:02:37.648398 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.641003 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 16:02:37.648398 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.641008 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 16:02:37.648398 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.641013 2577 flags.go:64] FLAG: --system-cgroups="" Apr 16 16:02:37.648398 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.641017 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 16:02:37.648398 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.641025 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 16:02:37.648398 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.641030 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 16 16:02:37.648398 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.641034 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 16:02:37.648398 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.641041 2577 flags.go:64] FLAG: --tls-min-version="" Apr 16 16:02:37.648398 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.641045 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 16:02:37.648398 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.641050 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 16:02:37.648398 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.641055 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 16:02:37.648398 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.641060 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 
16:02:37.648398 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.641065 2577 flags.go:64] FLAG: --v="2" Apr 16 16:02:37.648398 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.641072 2577 flags.go:64] FLAG: --version="false" Apr 16 16:02:37.648398 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.641078 2577 flags.go:64] FLAG: --vmodule="" Apr 16 16:02:37.648398 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.641085 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 16:02:37.648398 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.641090 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 16:02:37.648398 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641262 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:02:37.648398 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641270 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:02:37.648398 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641280 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
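The `flags.go:64] FLAG:` lines above dump every kubelet flag with its effective value, one per record. As an illustrative sketch (not part of the log), they can be collected into a dictionary for easier comparison against the node's expected configuration; the regex assumes the quoted `FLAG: --name="value"` shape seen in this log:

```python
import re

# Matches the 'FLAG: --name="value"' tail of a flags.go:64 journal record.
flag_re = re.compile(r'FLAG: --([A-Za-z0-9-]+)="(.*)"')

def parse_flags(lines):
    """Collect kubelet flag values from 'flags.go:64] FLAG: ...' journal lines."""
    flags = {}
    for line in lines:
        m = flag_re.search(line)
        if m:
            flags[m.group(1)] = m.group(2)
    return flags

# Sample records copied from the log above (prefixes trimmed for brevity).
sample = [
    'I0416 16:02:37.640348 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"',
    'I0416 16:02:37.640676 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"',
    'I0416 16:02:37.641065 2577 flags.go:64] FLAG: --v="2"',
]
print(parse_flags(sample))
```

Non-flag records simply fail the regex and are skipped, so the function can be fed the whole journal stream (e.g. from `journalctl -u kubelet -o cat`).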
Apr 16 16:02:37.648398 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641287 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:02:37.648398 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641292 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:02:37.648398 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641297 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:02:37.648965 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641302 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:02:37.648965 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641306 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:02:37.648965 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641310 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:02:37.648965 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641315 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:02:37.648965 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641319 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:02:37.648965 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641324 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:02:37.648965 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641328 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:02:37.648965 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641333 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:02:37.648965 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641337 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:02:37.648965 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641341 2577 feature_gate.go:328] unrecognized feature gate: 
GCPCustomAPIEndpointsInstall Apr 16 16:02:37.648965 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641347 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:02:37.648965 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641352 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:02:37.648965 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641356 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:02:37.648965 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641361 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:02:37.648965 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641365 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:02:37.648965 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641370 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:02:37.648965 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641374 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:02:37.648965 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641378 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:02:37.648965 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641382 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:02:37.648965 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641388 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:02:37.649498 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641392 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:02:37.649498 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641397 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:02:37.649498 ip-10-0-129-182 
kubenswrapper[2577]: W0416 16:02:37.641401 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:02:37.649498 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641405 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:02:37.649498 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641409 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:02:37.649498 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641413 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:02:37.649498 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641419 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 16:02:37.649498 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641425 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:02:37.649498 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641431 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:02:37.649498 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641436 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:02:37.649498 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641441 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:02:37.649498 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641446 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:02:37.649498 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641450 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:02:37.649498 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641454 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:02:37.649498 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641459 2577 
feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:02:37.649498 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641463 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:02:37.649498 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641467 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:02:37.649498 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641471 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:02:37.649498 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641476 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:02:37.649498 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641480 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:02:37.650010 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641484 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:02:37.650010 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641488 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:02:37.650010 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641493 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:02:37.650010 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641498 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:02:37.650010 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641502 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:02:37.650010 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641506 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:02:37.650010 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641510 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:02:37.650010 
ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641514 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:02:37.650010 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641519 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:02:37.650010 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641523 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:02:37.650010 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641527 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:02:37.650010 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641532 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:02:37.650010 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641536 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:02:37.650010 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641540 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:02:37.650010 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641545 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:02:37.650010 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641549 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:02:37.650010 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641553 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:02:37.650010 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641557 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:02:37.650010 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641561 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:02:37.650010 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641565 2577 feature_gate.go:328] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Apr 16 16:02:37.650532 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641571 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:02:37.650532 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641575 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:02:37.650532 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641580 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:02:37.650532 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641584 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:02:37.650532 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641588 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:02:37.650532 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641592 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:02:37.650532 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641596 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:02:37.650532 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641600 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:02:37.650532 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641604 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:02:37.650532 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641608 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:02:37.650532 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641612 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:02:37.650532 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641616 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:02:37.650532 ip-10-0-129-182 
kubenswrapper[2577]: W0416 16:02:37.641620 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:02:37.650532 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641626 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:02:37.650532 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641630 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:02:37.650532 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641634 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:02:37.650532 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641640 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:02:37.650532 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641644 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:02:37.650532 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641648 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:02:37.650532 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.641652 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:02:37.651061 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.642390 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:02:37.651061 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.650609 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 16:02:37.651061 ip-10-0-129-182 
kubenswrapper[2577]: I0416 16:02:37.650628 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 16:02:37.651061 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650682 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:02:37.651061 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650687 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:02:37.651061 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650690 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:02:37.651061 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650694 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:02:37.651061 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650697 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:02:37.651061 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650700 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:02:37.651061 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650703 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:02:37.651061 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650706 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:02:37.651061 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650708 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:02:37.651061 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650711 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:02:37.651061 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650714 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:02:37.651061 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650717 2577 feature_gate.go:328] 
unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:02:37.651462 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650720 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:02:37.651462 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650722 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:02:37.651462 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650725 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:02:37.651462 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650728 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:02:37.651462 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650730 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:02:37.651462 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650733 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:02:37.651462 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650736 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:02:37.651462 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650738 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:02:37.651462 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650741 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:02:37.651462 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650744 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:02:37.651462 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650746 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:02:37.651462 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650749 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:02:37.651462 
ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650752 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:02:37.651462 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650754 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:02:37.651462 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650757 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:02:37.651462 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650760 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:02:37.651462 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650762 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:02:37.651462 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650765 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:02:37.651462 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650769 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 16:02:37.651952 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650774 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:02:37.651952 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650778 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:02:37.651952 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650781 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:02:37.651952 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650784 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:02:37.651952 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650787 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:02:37.651952 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650790 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:02:37.651952 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650793 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:02:37.651952 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650795 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:02:37.651952 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650798 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:02:37.651952 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650801 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:02:37.651952 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650804 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:02:37.651952 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650807 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:02:37.651952 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650810 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS 
Apr 16 16:02:37.651952 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650812 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:02:37.651952 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650815 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:02:37.651952 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650818 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:02:37.651952 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650820 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:02:37.651952 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650823 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:02:37.651952 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650827 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 16:02:37.651952 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650830 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:02:37.652459 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650833 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:02:37.652459 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650836 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:02:37.652459 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650838 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:02:37.652459 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650841 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:02:37.652459 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650844 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:02:37.652459 ip-10-0-129-182 kubenswrapper[2577]: W0416 
16:02:37.650847 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:02:37.652459 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650849 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:02:37.652459 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650852 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:02:37.652459 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650854 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:02:37.652459 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650857 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:02:37.652459 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650859 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:02:37.652459 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650862 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:02:37.652459 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650865 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:02:37.652459 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650868 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:02:37.652459 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650871 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:02:37.652459 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650874 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:02:37.652459 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650876 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:02:37.652459 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650879 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:02:37.652459 
ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650881 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:02:37.652459 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650884 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:02:37.652941 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650887 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:02:37.652941 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650889 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:02:37.652941 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650892 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:02:37.652941 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650895 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:02:37.652941 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650897 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:02:37.652941 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650900 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:02:37.652941 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650903 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:02:37.652941 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650906 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:02:37.652941 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650909 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:02:37.652941 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650911 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:02:37.652941 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650914 2577 
feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:02:37.652941 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650917 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:02:37.652941 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650919 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:02:37.652941 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650922 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:02:37.652941 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.650924 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:02:37.652941 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.650930 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:02:37.653363 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651035 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:02:37.653363 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651042 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:02:37.653363 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651045 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:02:37.653363 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651049 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:02:37.653363 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651052 2577 feature_gate.go:328] 
unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:02:37.653363 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651054 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:02:37.653363 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651057 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:02:37.653363 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651059 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:02:37.653363 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651062 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:02:37.653363 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651065 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:02:37.653363 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651068 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:02:37.653363 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651071 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:02:37.653363 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651074 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:02:37.653363 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651076 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:02:37.653363 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651079 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:02:37.653363 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651082 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:02:37.653363 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651084 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:02:37.653363 ip-10-0-129-182 kubenswrapper[2577]: W0416 
16:02:37.651087 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:02:37.653363 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651090 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:02:37.653363 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651092 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:02:37.653850 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651095 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:02:37.653850 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651097 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:02:37.653850 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651100 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:02:37.653850 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651102 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:02:37.653850 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651105 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:02:37.653850 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651108 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:02:37.653850 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651110 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:02:37.653850 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651113 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:02:37.653850 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651115 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:02:37.653850 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651118 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 
16:02:37.653850 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651120 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:02:37.653850 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651125 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:02:37.653850 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651144 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:02:37.653850 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651148 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:02:37.653850 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651151 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:02:37.653850 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651155 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:02:37.653850 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651157 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:02:37.653850 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651160 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:02:37.653850 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651163 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:02:37.653850 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651165 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:02:37.654360 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651168 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:02:37.654360 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651170 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:02:37.654360 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651173 2577 
feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:02:37.654360 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651177 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:02:37.654360 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651180 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:02:37.654360 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651182 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:02:37.654360 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651185 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:02:37.654360 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651188 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:02:37.654360 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651190 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:02:37.654360 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651193 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:02:37.654360 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651196 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:02:37.654360 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651198 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:02:37.654360 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651201 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:02:37.654360 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651204 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:02:37.654360 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651206 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 
16:02:37.654360 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651209 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:02:37.654360 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651211 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:02:37.654360 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651214 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:02:37.654360 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651216 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:02:37.654864 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651219 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:02:37.654864 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651221 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:02:37.654864 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651224 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:02:37.654864 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651227 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 16:02:37.654864 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651232 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:02:37.654864 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651235 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:02:37.654864 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651238 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:02:37.654864 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651241 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:02:37.654864 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651243 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:02:37.654864 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651246 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:02:37.654864 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651249 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:02:37.654864 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651251 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:02:37.654864 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651254 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:02:37.654864 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651256 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:02:37.654864 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651259 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:02:37.654864 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651261 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:02:37.654864 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651264 2577 
feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:02:37.654864 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651267 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:02:37.654864 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651269 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:02:37.654864 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651272 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:02:37.655379 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651274 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:02:37.655379 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651277 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:02:37.655379 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651280 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:02:37.655379 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651282 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:02:37.655379 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651285 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:02:37.655379 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651289 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 16:02:37.655379 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:37.651292 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:02:37.655379 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.651298 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:02:37.655379 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.651446 2577 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 16:02:37.655379 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.654258 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 16:02:37.655379 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.655275 2577 server.go:1019] "Starting client certificate rotation" Apr 16 16:02:37.655379 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.655374 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 16:02:37.655680 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.655422 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 16:02:37.686366 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.686337 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 16:02:37.691423 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.691395 2577 dynamic_cafile_content.go:161] "Starting 
controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 16:02:37.709910 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.709884 2577 log.go:25] "Validated CRI v1 runtime API" Apr 16 16:02:37.716497 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.716475 2577 log.go:25] "Validated CRI v1 image API" Apr 16 16:02:37.717792 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.717777 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 16:02:37.723403 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.723372 2577 fs.go:135] Filesystem UUIDs: map[1d3ffd8e-31f1-4453-aaba-8d117d88ddd0:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 a46a4a5a-92e9-4216-b79a-76d0c2bb9b54:/dev/nvme0n1p4] Apr 16 16:02:37.723481 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.723400 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 16:02:37.725819 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.725796 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 16:02:37.729426 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.729302 2577 manager.go:217] Machine: {Timestamp:2026-04-16 16:02:37.727430135 +0000 UTC m=+0.491054837 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098858 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} 
HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec26c2892ec9b413a5c5a39d33347dd9 SystemUUID:ec26c289-2ec9-b413-a5c5-a39d33347dd9 BootID:d06d5a3a-b057-4773-97b7-15293f8d70cf Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:e0:11:e1:15:27 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:e0:11:e1:15:27 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:46:ef:b2:77:6e:80 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified 
Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 16:02:37.729426 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.729421 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 16 16:02:37.729539 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.729512 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 16:02:37.731333 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.731305 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 16:02:37.731476 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.731335 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-129-182.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 16:02:37.731524 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.731484 2577 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 16:02:37.731524 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.731493 2577 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 16:02:37.731524 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.731505 
2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 16:02:37.732338 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.732327 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 16:02:37.734636 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.734624 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 16 16:02:37.734759 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.734750 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 16:02:37.738178 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.738163 2577 kubelet.go:491] "Attempting to sync node with API server" Apr 16 16:02:37.738219 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.738182 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 16:02:37.738219 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.738197 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 16:02:37.738219 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.738208 2577 kubelet.go:397] "Adding apiserver pod source" Apr 16 16:02:37.738322 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.738226 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 16:02:37.740002 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.739983 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 16:02:37.740052 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.740016 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 16:02:37.743874 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.743841 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 16:02:37.745986 ip-10-0-129-182 
kubenswrapper[2577]: I0416 16:02:37.745966 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 16:02:37.747510 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.747495 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 16:02:37.747601 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.747515 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 16:02:37.747601 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.747525 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 16:02:37.747601 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.747533 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 16:02:37.747601 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.747543 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 16:02:37.747601 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.747551 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 16:02:37.747601 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.747559 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 16:02:37.747601 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.747567 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 16:02:37.747601 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.747579 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 16:02:37.747601 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.747587 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 16:02:37.747601 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.747599 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 
16:02:37.747880 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.747613 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 16:02:37.748501 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.748489 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 16:02:37.748551 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.748504 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 16:02:37.751497 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.751466 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-182.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 16:02:37.751594 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:37.751490 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-182.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 16:02:37.751594 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:37.751527 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 16:02:37.752037 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.752023 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 16:02:37.752086 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.752072 2577 server.go:1295] "Started kubelet" Apr 16 16:02:37.752178 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.752154 2577 server.go:180] "Starting to listen" address="0.0.0.0" 
port=10250 Apr 16 16:02:37.752966 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.752864 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 16:02:37.753049 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.752988 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 16:02:37.753716 ip-10-0-129-182 systemd[1]: Started Kubernetes Kubelet. Apr 16 16:02:37.754297 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.754249 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 16:02:37.755516 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.755503 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 16 16:02:37.765348 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.765325 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 16:02:37.765873 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:37.765851 2577 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 16:02:37.766419 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.766404 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 16:02:37.767259 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.767242 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 16:02:37.767259 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.767247 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 16:02:37.767406 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.767274 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 16:02:37.767406 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.767312 2577 factory.go:153] Registering CRI-O factory Apr 16 16:02:37.767406 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.767339 2577 factory.go:223] Registration of the crio container factory successfully Apr 16 16:02:37.767406 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.767359 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 16 16:02:37.767406 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.767369 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 16 16:02:37.767406 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.767397 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 16:02:37.767406 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.767406 2577 factory.go:55] Registering systemd factory Apr 16 16:02:37.767406 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.767411 2577 factory.go:223] Registration of the systemd container factory successfully Apr 16 16:02:37.767744 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.767426 
2577 factory.go:103] Registering Raw factory Apr 16 16:02:37.767744 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.767437 2577 manager.go:1196] Started watching for new ooms in manager Apr 16 16:02:37.767943 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:37.767925 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-182.ec2.internal\" not found" Apr 16 16:02:37.769313 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.769278 2577 manager.go:319] Starting recovery of all containers Apr 16 16:02:37.770743 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:37.770713 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-182.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 16:02:37.770743 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:37.770725 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 16:02:37.771749 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:37.770699 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-182.ec2.internal.18a6e1c94dac1261 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-182.ec2.internal,UID:ip-10-0-129-182.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-182.ec2.internal,},FirstTimestamp:2026-04-16 16:02:37.752037985 +0000 UTC m=+0.515662689,LastTimestamp:2026-04-16 16:02:37.752037985 +0000 UTC m=+0.515662689,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-182.ec2.internal,}" Apr 16 16:02:37.771887 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.771869 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zzv99" Apr 16 16:02:37.779458 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.779299 2577 manager.go:324] Recovery completed Apr 16 16:02:37.780307 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.780291 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zzv99" Apr 16 16:02:37.783548 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.783535 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:02:37.786104 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.786087 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-182.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:02:37.786208 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.786118 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:02:37.786208 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.786144 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-182.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:02:37.786600 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.786586 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 16:02:37.786600 ip-10-0-129-182 kubenswrapper[2577]: I0416 
16:02:37.786598 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 16:02:37.786684 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.786613 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 16 16:02:37.788650 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:37.788575 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-182.ec2.internal.18a6e1c94fb3e1cc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-182.ec2.internal,UID:ip-10-0-129-182.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-129-182.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-129-182.ec2.internal,},FirstTimestamp:2026-04-16 16:02:37.786104268 +0000 UTC m=+0.549728969,LastTimestamp:2026-04-16 16:02:37.786104268 +0000 UTC m=+0.549728969,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-182.ec2.internal,}" Apr 16 16:02:37.789966 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.789952 2577 policy_none.go:49] "None policy: Start" Apr 16 16:02:37.790036 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.789969 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 16:02:37.790036 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.789980 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 16 16:02:37.837800 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.837774 2577 manager.go:341] "Starting Device Plugin manager" Apr 16 16:02:37.843288 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:37.837841 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint 
is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 16:02:37.843288 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.837857 2577 server.go:85] "Starting device plugin registration server" Apr 16 16:02:37.843288 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.838178 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 16:02:37.843288 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.838191 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 16:02:37.843288 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.838281 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 16:02:37.843288 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.838362 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 16:02:37.843288 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.838368 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 16:02:37.843288 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:37.838943 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 16:02:37.843288 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:37.838980 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-182.ec2.internal\" not found" Apr 16 16:02:37.912566 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.912470 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 16:02:37.913762 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.913737 2577 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 16 16:02:37.913873 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.913777 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 16:02:37.913873 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.913802 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 16:02:37.913873 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.913811 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 16:02:37.913873 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:37.913851 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 16:02:37.917658 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.917638 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:02:37.938336 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.938313 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:02:37.939312 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.939294 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-182.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:02:37.939411 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.939329 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:02:37.939411 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.939343 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-182.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:02:37.939411 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:37.939374 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-182.ec2.internal" Apr 16 16:02:37.947549 ip-10-0-129-182 kubenswrapper[2577]: I0416 
16:02:37.947525 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-182.ec2.internal" Apr 16 16:02:37.947549 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:37.947550 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-182.ec2.internal\": node \"ip-10-0-129-182.ec2.internal\" not found" Apr 16 16:02:37.967170 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:37.967145 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-182.ec2.internal\" not found" Apr 16 16:02:38.014353 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.014311 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-182.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-182.ec2.internal"] Apr 16 16:02:38.014453 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.014406 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:02:38.016523 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.016508 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-182.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:02:38.016593 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.016543 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:02:38.016593 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.016553 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-182.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:02:38.017868 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.017856 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:02:38.017998 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.017982 2577 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-182.ec2.internal" Apr 16 16:02:38.018077 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.018011 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:02:38.018615 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.018596 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-182.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:02:38.018718 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.018624 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:02:38.018718 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.018635 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-182.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:02:38.018718 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.018602 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-182.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:02:38.018718 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.018707 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:02:38.018718 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.018721 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-182.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:02:38.020204 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.020188 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-182.ec2.internal" Apr 16 16:02:38.020297 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.020218 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:02:38.020936 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.020920 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-182.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:02:38.021017 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.020947 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-182.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:02:38.021017 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.020959 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-182.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:02:38.045985 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:38.045956 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-182.ec2.internal\" not found" node="ip-10-0-129-182.ec2.internal" Apr 16 16:02:38.050613 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:38.050597 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-182.ec2.internal\" not found" node="ip-10-0-129-182.ec2.internal" Apr 16 16:02:38.068192 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:38.068169 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-182.ec2.internal\" not found" Apr 16 16:02:38.168621 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:38.168539 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-182.ec2.internal\" not found" Apr 16 16:02:38.168763 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.168623 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/406c6ba715b6247850e61153d48508c5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-182.ec2.internal\" (UID: \"406c6ba715b6247850e61153d48508c5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-182.ec2.internal" Apr 16 16:02:38.168763 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.168652 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/406c6ba715b6247850e61153d48508c5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-182.ec2.internal\" (UID: \"406c6ba715b6247850e61153d48508c5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-182.ec2.internal" Apr 16 16:02:38.168763 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.168672 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f8506f0b2e52254b607e38e68b2f4dae-config\") pod \"kube-apiserver-proxy-ip-10-0-129-182.ec2.internal\" (UID: \"f8506f0b2e52254b607e38e68b2f4dae\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-182.ec2.internal" Apr 16 16:02:38.269288 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:38.269260 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-182.ec2.internal\" not found" Apr 16 16:02:38.269288 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.269282 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/406c6ba715b6247850e61153d48508c5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-182.ec2.internal\" (UID: \"406c6ba715b6247850e61153d48508c5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-182.ec2.internal" Apr 16 
16:02:38.269436 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.269307 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/406c6ba715b6247850e61153d48508c5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-182.ec2.internal\" (UID: \"406c6ba715b6247850e61153d48508c5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-182.ec2.internal"
Apr 16 16:02:38.269436 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.269324 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f8506f0b2e52254b607e38e68b2f4dae-config\") pod \"kube-apiserver-proxy-ip-10-0-129-182.ec2.internal\" (UID: \"f8506f0b2e52254b607e38e68b2f4dae\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-182.ec2.internal"
Apr 16 16:02:38.269436 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.269365 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f8506f0b2e52254b607e38e68b2f4dae-config\") pod \"kube-apiserver-proxy-ip-10-0-129-182.ec2.internal\" (UID: \"f8506f0b2e52254b607e38e68b2f4dae\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-182.ec2.internal"
Apr 16 16:02:38.269436 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.269387 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/406c6ba715b6247850e61153d48508c5-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-182.ec2.internal\" (UID: \"406c6ba715b6247850e61153d48508c5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-182.ec2.internal"
Apr 16 16:02:38.269436 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.269394 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/406c6ba715b6247850e61153d48508c5-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-182.ec2.internal\" (UID: \"406c6ba715b6247850e61153d48508c5\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-182.ec2.internal"
Apr 16 16:02:38.349504 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.349466 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-182.ec2.internal"
Apr 16 16:02:38.354211 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.354191 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-182.ec2.internal"
Apr 16 16:02:38.369817 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:38.369782 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-182.ec2.internal\" not found"
Apr 16 16:02:38.470353 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:38.470264 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-182.ec2.internal\" not found"
Apr 16 16:02:38.570724 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:38.570681 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-182.ec2.internal\" not found"
Apr 16 16:02:38.654954 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.654909 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 16:02:38.655496 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.655103 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 16:02:38.671344 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:38.671309 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-182.ec2.internal\" not found"
Apr 16 16:02:38.765952 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.765888 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 16:02:38.772195 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:38.772170 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-182.ec2.internal\" not found"
Apr 16 16:02:38.777471 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.777452 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 16:02:38.782379 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.782354 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 15:57:37 +0000 UTC" deadline="2027-10-26 19:46:41.447788287 +0000 UTC"
Apr 16 16:02:38.782432 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.782380 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13395h44m2.665411463s"
Apr 16 16:02:38.783461 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.783445 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:02:38.804269 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.804240 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-jls8b"
Apr 16 16:02:38.812609 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.812582 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-jls8b"
Apr 16 16:02:38.862521 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.862492 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:02:38.866649 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.866620 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-182.ec2.internal"
Apr 16 16:02:38.867122 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:38.867099 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod406c6ba715b6247850e61153d48508c5.slice/crio-cbf1291ae41e8fdefe160cde02b3b9b258a8c65bf0b86835b027ec3f06d8577b WatchSource:0}: Error finding container cbf1291ae41e8fdefe160cde02b3b9b258a8c65bf0b86835b027ec3f06d8577b: Status 404 returned error can't find the container with id cbf1291ae41e8fdefe160cde02b3b9b258a8c65bf0b86835b027ec3f06d8577b
Apr 16 16:02:38.867330 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:38.867311 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8506f0b2e52254b607e38e68b2f4dae.slice/crio-20d7e5729718473fd0b322e78c74567df49d1ed0e1cfc81dd11f0deed3ecd3f8 WatchSource:0}: Error finding container 20d7e5729718473fd0b322e78c74567df49d1ed0e1cfc81dd11f0deed3ecd3f8: Status 404 returned error can't find the container with id 20d7e5729718473fd0b322e78c74567df49d1ed0e1cfc81dd11f0deed3ecd3f8
Apr 16 16:02:38.872488 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.872472 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 16:02:38.879444 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.879425 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:02:38.880754 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.880737 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-182.ec2.internal"
Apr 16 16:02:38.888889 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.888868 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:02:38.916848 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.916795 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-182.ec2.internal" event={"ID":"f8506f0b2e52254b607e38e68b2f4dae","Type":"ContainerStarted","Data":"20d7e5729718473fd0b322e78c74567df49d1ed0e1cfc81dd11f0deed3ecd3f8"}
Apr 16 16:02:38.917750 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:38.917715 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-182.ec2.internal" event={"ID":"406c6ba715b6247850e61153d48508c5","Type":"ContainerStarted","Data":"cbf1291ae41e8fdefe160cde02b3b9b258a8c65bf0b86835b027ec3f06d8577b"}
Apr 16 16:02:39.202728 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.202694 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:02:39.598516 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.598345 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:02:39.739057 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.739025 2577 apiserver.go:52] "Watching apiserver"
Apr 16 16:02:39.747869 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.747838 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 16:02:39.748921 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.748891 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-vjpkx","openshift-network-operator/iptables-alerter-8bxsq","openshift-ovn-kubernetes/ovnkube-node-m2jms","kube-system/kube-apiserver-proxy-ip-10-0-129-182.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-182.ec2.internal","openshift-multus/multus-additional-cni-plugins-jftn9","openshift-multus/multus-wfqhd","openshift-multus/network-metrics-daemon-2b9mp","openshift-network-diagnostics/network-check-target-j96wv","kube-system/konnectivity-agent-dkplw","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w","openshift-cluster-node-tuning-operator/tuned-4mstk"]
Apr 16 16:02:39.751817 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.751797 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4mstk"
Apr 16 16:02:39.753313 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.753283 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vjpkx"
Apr 16 16:02:39.753423 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.753379 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8bxsq"
Apr 16 16:02:39.754193 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.754174 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:02:39.754295 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.754172 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-bbkpr\""
Apr 16 16:02:39.754295 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.754268 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 16:02:39.754767 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.754751 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m2jms"
Apr 16 16:02:39.755572 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.755515 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 16:02:39.756117 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.755714 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 16:02:39.756117 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.755852 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 16:02:39.756117 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.755855 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:02:39.756117 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.755925 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 16:02:39.756117 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.756100 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jftn9"
Apr 16 16:02:39.756538 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.756254 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-dpq8v\""
Apr 16 16:02:39.756886 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.756664 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 16:02:39.756886 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.756688 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-7lvrj\""
Apr 16 16:02:39.756886 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.756741 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 16:02:39.757186 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.756982 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-8nt5q\""
Apr 16 16:02:39.757186 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.757028 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 16:02:39.757186 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.757033 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 16:02:39.757757 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.757474 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:39.757757 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.757583 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 16:02:39.757757 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.757609 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 16:02:39.757757 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.757615 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 16:02:39.758125 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.758106 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 16:02:39.758709 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.758689 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 16:02:39.758983 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.758868 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 16:02:39.758983 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.758882 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 16:02:39.758983 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.758869 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 16:02:39.759246 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.759069 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-kkxt5\""
Apr 16 16:02:39.759652 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.759636 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 16:02:39.759897 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.759880 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-rrbwl\""
Apr 16 16:02:39.760560 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.760526 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2b9mp"
Apr 16 16:02:39.760664 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.760605 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j96wv"
Apr 16 16:02:39.760664 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:39.760603 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2b9mp" podUID="64649692-472e-4f06-9640-7e6075d1e84f"
Apr 16 16:02:39.760664 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:39.760652 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j96wv" podUID="2d37db46-a278-4fec-8cea-0900b1dfb12d"
Apr 16 16:02:39.762033 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.762014 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dkplw"
Apr 16 16:02:39.763659 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.763640 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w"
Apr 16 16:02:39.764104 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.764081 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 16:02:39.764212 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.764201 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-5h28f\""
Apr 16 16:02:39.764291 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.764270 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 16:02:39.765651 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.765633 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 16:02:39.765760 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.765738 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 16:02:39.766323 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.766150 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-crzgr\""
Apr 16 16:02:39.766527 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.766512 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 16:02:39.768591 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.768577 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 16:02:39.777786 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.777552 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-multus-conf-dir\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:39.777786 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.777611 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-host-run-multus-certs\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:39.777786 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.777642 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-etc-sysctl-conf\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk"
Apr 16 16:02:39.777786 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.777676 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-node-log\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms"
Apr 16 16:02:39.777786 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.777709 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms"
Apr 16 16:02:39.777786 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.777767 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-etc-modprobe-d\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk"
Apr 16 16:02:39.778159 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.777810 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-etc-kubernetes\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk"
Apr 16 16:02:39.778159 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.777837 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-etc-sysctl-d\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk"
Apr 16 16:02:39.778159 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.777868 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0f6a5a36-0127-4a26-a722-43d4a0b49496-host-slash\") pod \"iptables-alerter-8bxsq\" (UID: \"0f6a5a36-0127-4a26-a722-43d4a0b49496\") " pod="openshift-network-operator/iptables-alerter-8bxsq"
Apr 16 16:02:39.778159 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.777911 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf-cni-binary-copy\") pod \"multus-additional-cni-plugins-jftn9\" (UID: \"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf\") " pod="openshift-multus/multus-additional-cni-plugins-jftn9"
Apr 16 16:02:39.778159 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.777938 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-multus-cni-dir\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:39.778159 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.777965 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-hostroot\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:39.778159 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.777993 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-567lt\" (UniqueName: \"kubernetes.io/projected/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-kube-api-access-567lt\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk"
Apr 16 16:02:39.778159 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778055 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-run-ovn\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms"
Apr 16 16:02:39.778159 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778093 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-host-var-lib-cni-multus\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:39.778159 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778145 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5148421b-8077-4ac5-bad5-00c661d71b12-registration-dir\") pod \"aws-ebs-csi-driver-node-gcn2w\" (UID: \"5148421b-8077-4ac5-bad5-00c661d71b12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w"
Apr 16 16:02:39.778642 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778170 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-lib-modules\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk"
Apr 16 16:02:39.778642 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778202 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf-cnibin\") pod \"multus-additional-cni-plugins-jftn9\" (UID: \"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf\") " pod="openshift-multus/multus-additional-cni-plugins-jftn9"
Apr 16 16:02:39.778642 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778229 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jftn9\" (UID: \"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf\") " pod="openshift-multus/multus-additional-cni-plugins-jftn9"
Apr 16 16:02:39.778642 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778261 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jftn9\" (UID: \"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf\") " pod="openshift-multus/multus-additional-cni-plugins-jftn9"
Apr 16 16:02:39.778642 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778324 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-host-run-k8s-cni-cncf-io\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:39.778642 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778354 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0f6a5a36-0127-4a26-a722-43d4a0b49496-iptables-alerter-script\") pod \"iptables-alerter-8bxsq\" (UID: \"0f6a5a36-0127-4a26-a722-43d4a0b49496\") " pod="openshift-network-operator/iptables-alerter-8bxsq"
Apr 16 16:02:39.778642 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778377 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjrt7\" (UniqueName: \"kubernetes.io/projected/64649692-472e-4f06-9640-7e6075d1e84f-kube-api-access-vjrt7\") pod \"network-metrics-daemon-2b9mp\" (UID: \"64649692-472e-4f06-9640-7e6075d1e84f\") " pod="openshift-multus/network-metrics-daemon-2b9mp"
Apr 16 16:02:39.778642 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778408 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-system-cni-dir\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:39.778642 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778434 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-os-release\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:39.778642 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778464 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5148421b-8077-4ac5-bad5-00c661d71b12-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gcn2w\" (UID: \"5148421b-8077-4ac5-bad5-00c661d71b12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w"
Apr 16 16:02:39.778642 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778491 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-run-openvswitch\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms"
Apr 16 16:02:39.778642 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778516 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-cnibin\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:39.778642 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778541 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-host-run-netns\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:39.778642 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778568 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-etc-kubernetes\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:39.778642 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778594 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpkck\" (UniqueName: \"kubernetes.io/projected/024bdd5b-3034-463f-aa2d-3e55d292bbd0-kube-api-access-fpkck\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:39.778642 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778622 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/36fdb312-0ed7-418b-b720-16a3c46fff51-agent-certs\") pod \"konnectivity-agent-dkplw\" (UID: \"36fdb312-0ed7-418b-b720-16a3c46fff51\") " pod="kube-system/konnectivity-agent-dkplw"
Apr 16 16:02:39.779650 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778648 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf-os-release\") pod \"multus-additional-cni-plugins-jftn9\" (UID: \"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf\") " pod="openshift-multus/multus-additional-cni-plugins-jftn9"
Apr 16 16:02:39.779650 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778672 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-host-run-netns\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms"
Apr 16 16:02:39.779650 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778705 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-host\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk"
Apr 16 16:02:39.779650 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778732 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-etc-tuned\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk"
Apr 16 16:02:39.779650 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778759 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf-system-cni-dir\") pod \"multus-additional-cni-plugins-jftn9\" (UID: \"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf\") " pod="openshift-multus/multus-additional-cni-plugins-jftn9"
Apr 16 16:02:39.779650 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778786 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-host-cni-netd\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms"
Apr 16 16:02:39.779650 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778936 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a2826faf-f579-4f51-8772-9882e98d4593-ovnkube-script-lib\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms"
Apr 16 16:02:39.779650 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778969 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4dz7\" (UniqueName: \"kubernetes.io/projected/0f6a5a36-0127-4a26-a722-43d4a0b49496-kube-api-access-w4dz7\") pod \"iptables-alerter-8bxsq\" (UID: \"0f6a5a36-0127-4a26-a722-43d4a0b49496\") " pod="openshift-network-operator/iptables-alerter-8bxsq"
Apr 16 16:02:39.779650 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.778997 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-host-run-ovn-kubernetes\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms"
Apr 16 16:02:39.779650 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.779024 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a2826faf-f579-4f51-8772-9882e98d4593-ovnkube-config\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms"
Apr 16 16:02:39.779650 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.779082 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a2826faf-f579-4f51-8772-9882e98d4593-ovn-node-metrics-cert\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms"
Apr 16 16:02:39.779650 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.779111 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-host-var-lib-cni-bin\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:39.779650 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.779150 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5148421b-8077-4ac5-bad5-00c661d71b12-socket-dir\") pod \"aws-ebs-csi-driver-node-gcn2w\" (UID: \"5148421b-8077-4ac5-bad5-00c661d71b12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w"
Apr 16 16:02:39.779650 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.779179 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5148421b-8077-4ac5-bad5-00c661d71b12-device-dir\") pod \"aws-ebs-csi-driver-node-gcn2w\" (UID: \"5148421b-8077-4ac5-bad5-00c661d71b12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w"
Apr 16 16:02:39.779650 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.779206 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for
volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5148421b-8077-4ac5-bad5-00c661d71b12-sys-fs\") pod \"aws-ebs-csi-driver-node-gcn2w\" (UID: \"5148421b-8077-4ac5-bad5-00c661d71b12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w" Apr 16 16:02:39.779650 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.779234 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-run\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.780365 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.779262 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-host-cni-bin\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.780365 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.779288 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/024bdd5b-3034-463f-aa2d-3e55d292bbd0-cni-binary-copy\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.780365 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.779319 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-multus-socket-dir-parent\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.780365 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.779348 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-var-lib-kubelet\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.780365 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.779373 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1b804e33-713f-4c6f-a4a1-0181d77254e1-serviceca\") pod \"node-ca-vjpkx\" (UID: \"1b804e33-713f-4c6f-a4a1-0181d77254e1\") " pod="openshift-image-registry/node-ca-vjpkx" Apr 16 16:02:39.780365 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.779401 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrhgl\" (UniqueName: \"kubernetes.io/projected/1b804e33-713f-4c6f-a4a1-0181d77254e1-kube-api-access-xrhgl\") pod \"node-ca-vjpkx\" (UID: \"1b804e33-713f-4c6f-a4a1-0181d77254e1\") " pod="openshift-image-registry/node-ca-vjpkx" Apr 16 16:02:39.780365 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.779431 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs\") pod \"network-metrics-daemon-2b9mp\" (UID: \"64649692-472e-4f06-9640-7e6075d1e84f\") " pod="openshift-multus/network-metrics-daemon-2b9mp" Apr 16 16:02:39.780365 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.779481 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-etc-sysconfig\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " 
pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.780365 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.779908 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-etc-openvswitch\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.780365 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.779949 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-log-socket\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.780365 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.779978 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a2826faf-f579-4f51-8772-9882e98d4593-env-overrides\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.780365 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.780033 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/36fdb312-0ed7-418b-b720-16a3c46fff51-konnectivity-ca\") pod \"konnectivity-agent-dkplw\" (UID: \"36fdb312-0ed7-418b-b720-16a3c46fff51\") " pod="kube-system/konnectivity-agent-dkplw" Apr 16 16:02:39.780365 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.780100 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/5148421b-8077-4ac5-bad5-00c661d71b12-etc-selinux\") pod \"aws-ebs-csi-driver-node-gcn2w\" (UID: \"5148421b-8077-4ac5-bad5-00c661d71b12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w" Apr 16 16:02:39.780365 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.780156 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-sys\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.781296 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.780525 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-systemd-units\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.781383 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.781325 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-var-lib-openvswitch\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.781383 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.781366 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/024bdd5b-3034-463f-aa2d-3e55d292bbd0-multus-daemon-config\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.781929 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.781909 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b804e33-713f-4c6f-a4a1-0181d77254e1-host\") pod \"node-ca-vjpkx\" (UID: \"1b804e33-713f-4c6f-a4a1-0181d77254e1\") " pod="openshift-image-registry/node-ca-vjpkx" Apr 16 16:02:39.782042 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.781952 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jftn9\" (UID: \"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf\") " pod="openshift-multus/multus-additional-cni-plugins-jftn9" Apr 16 16:02:39.782042 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.781989 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-host-kubelet\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.782042 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.782022 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-host-var-lib-kubelet\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.782226 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.782162 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl5bp\" (UniqueName: \"kubernetes.io/projected/5148421b-8077-4ac5-bad5-00c661d71b12-kube-api-access-nl5bp\") pod \"aws-ebs-csi-driver-node-gcn2w\" (UID: \"5148421b-8077-4ac5-bad5-00c661d71b12\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w" Apr 16 16:02:39.782226 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.782203 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-etc-systemd\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.782325 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.782261 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-tmp\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.782325 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.782295 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wqsr\" (UniqueName: \"kubernetes.io/projected/6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf-kube-api-access-9wqsr\") pod \"multus-additional-cni-plugins-jftn9\" (UID: \"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf\") " pod="openshift-multus/multus-additional-cni-plugins-jftn9" Apr 16 16:02:39.782775 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.782359 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h826f\" (UniqueName: \"kubernetes.io/projected/2d37db46-a278-4fec-8cea-0900b1dfb12d-kube-api-access-h826f\") pod \"network-check-target-j96wv\" (UID: \"2d37db46-a278-4fec-8cea-0900b1dfb12d\") " pod="openshift-network-diagnostics/network-check-target-j96wv" Apr 16 16:02:39.782775 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.782760 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-host-slash\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.782916 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.782784 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-run-systemd\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.782916 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.782807 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6stk\" (UniqueName: \"kubernetes.io/projected/a2826faf-f579-4f51-8772-9882e98d4593-kube-api-access-v6stk\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.813658 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.813619 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 15:57:38 +0000 UTC" deadline="2027-11-12 14:28:43.875794114 +0000 UTC" Apr 16 16:02:39.813658 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.813655 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13798h26m4.062142357s" Apr 16 16:02:39.883244 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883142 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-etc-tuned\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.883244 
ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883194 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf-system-cni-dir\") pod \"multus-additional-cni-plugins-jftn9\" (UID: \"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf\") " pod="openshift-multus/multus-additional-cni-plugins-jftn9" Apr 16 16:02:39.883244 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883222 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-host-cni-netd\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.883521 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883280 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-host-cni-netd\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.883521 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883287 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf-system-cni-dir\") pod \"multus-additional-cni-plugins-jftn9\" (UID: \"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf\") " pod="openshift-multus/multus-additional-cni-plugins-jftn9" Apr 16 16:02:39.883521 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883298 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a2826faf-f579-4f51-8772-9882e98d4593-ovnkube-script-lib\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.883521 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883351 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4dz7\" (UniqueName: \"kubernetes.io/projected/0f6a5a36-0127-4a26-a722-43d4a0b49496-kube-api-access-w4dz7\") pod \"iptables-alerter-8bxsq\" (UID: \"0f6a5a36-0127-4a26-a722-43d4a0b49496\") " pod="openshift-network-operator/iptables-alerter-8bxsq" Apr 16 16:02:39.883521 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883381 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-host-run-ovn-kubernetes\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.883521 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883430 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-host-run-ovn-kubernetes\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.883800 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883535 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a2826faf-f579-4f51-8772-9882e98d4593-ovnkube-config\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.883800 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883564 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/a2826faf-f579-4f51-8772-9882e98d4593-ovn-node-metrics-cert\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.883800 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883575 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 16:02:39.883800 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883590 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-host-var-lib-cni-bin\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.883800 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883614 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5148421b-8077-4ac5-bad5-00c661d71b12-socket-dir\") pod \"aws-ebs-csi-driver-node-gcn2w\" (UID: \"5148421b-8077-4ac5-bad5-00c661d71b12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w" Apr 16 16:02:39.883800 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883637 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5148421b-8077-4ac5-bad5-00c661d71b12-device-dir\") pod \"aws-ebs-csi-driver-node-gcn2w\" (UID: \"5148421b-8077-4ac5-bad5-00c661d71b12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w" Apr 16 16:02:39.883800 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883697 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5148421b-8077-4ac5-bad5-00c661d71b12-device-dir\") 
pod \"aws-ebs-csi-driver-node-gcn2w\" (UID: \"5148421b-8077-4ac5-bad5-00c661d71b12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w" Apr 16 16:02:39.883800 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883742 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-host-var-lib-cni-bin\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.884239 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883836 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5148421b-8077-4ac5-bad5-00c661d71b12-socket-dir\") pod \"aws-ebs-csi-driver-node-gcn2w\" (UID: \"5148421b-8077-4ac5-bad5-00c661d71b12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w" Apr 16 16:02:39.884239 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883867 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5148421b-8077-4ac5-bad5-00c661d71b12-sys-fs\") pod \"aws-ebs-csi-driver-node-gcn2w\" (UID: \"5148421b-8077-4ac5-bad5-00c661d71b12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w" Apr 16 16:02:39.884239 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883893 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-run\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.884239 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883917 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-host-cni-bin\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.884239 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883934 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a2826faf-f579-4f51-8772-9882e98d4593-ovnkube-script-lib\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.884239 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883941 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/024bdd5b-3034-463f-aa2d-3e55d292bbd0-cni-binary-copy\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.884239 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883967 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-multus-socket-dir-parent\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.884239 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883976 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5148421b-8077-4ac5-bad5-00c661d71b12-sys-fs\") pod \"aws-ebs-csi-driver-node-gcn2w\" (UID: \"5148421b-8077-4ac5-bad5-00c661d71b12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w" Apr 16 16:02:39.884239 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.883993 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-var-lib-kubelet\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.884239 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884019 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1b804e33-713f-4c6f-a4a1-0181d77254e1-serviceca\") pod \"node-ca-vjpkx\" (UID: \"1b804e33-713f-4c6f-a4a1-0181d77254e1\") " pod="openshift-image-registry/node-ca-vjpkx" Apr 16 16:02:39.884239 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884012 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-host-cni-bin\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.884239 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884068 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrhgl\" (UniqueName: \"kubernetes.io/projected/1b804e33-713f-4c6f-a4a1-0181d77254e1-kube-api-access-xrhgl\") pod \"node-ca-vjpkx\" (UID: \"1b804e33-713f-4c6f-a4a1-0181d77254e1\") " pod="openshift-image-registry/node-ca-vjpkx" Apr 16 16:02:39.884239 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884096 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs\") pod \"network-metrics-daemon-2b9mp\" (UID: \"64649692-472e-4f06-9640-7e6075d1e84f\") " pod="openshift-multus/network-metrics-daemon-2b9mp" Apr 16 16:02:39.884239 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884123 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-etc-sysconfig\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.884239 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884171 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-var-lib-kubelet\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.884239 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884175 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-etc-openvswitch\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.884239 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884199 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-log-socket\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.884239 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884214 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-run\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.885051 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884236 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/a2826faf-f579-4f51-8772-9882e98d4593-env-overrides\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.885051 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884261 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/36fdb312-0ed7-418b-b720-16a3c46fff51-konnectivity-ca\") pod \"konnectivity-agent-dkplw\" (UID: \"36fdb312-0ed7-418b-b720-16a3c46fff51\") " pod="kube-system/konnectivity-agent-dkplw" Apr 16 16:02:39.885051 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884286 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5148421b-8077-4ac5-bad5-00c661d71b12-etc-selinux\") pod \"aws-ebs-csi-driver-node-gcn2w\" (UID: \"5148421b-8077-4ac5-bad5-00c661d71b12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w" Apr 16 16:02:39.885051 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884309 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-sys\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.885051 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884334 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-systemd-units\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.885051 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884358 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-var-lib-openvswitch\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.885051 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884171 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a2826faf-f579-4f51-8772-9882e98d4593-ovnkube-config\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.885051 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884386 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/024bdd5b-3034-463f-aa2d-3e55d292bbd0-multus-daemon-config\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.885051 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884412 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b804e33-713f-4c6f-a4a1-0181d77254e1-host\") pod \"node-ca-vjpkx\" (UID: \"1b804e33-713f-4c6f-a4a1-0181d77254e1\") " pod="openshift-image-registry/node-ca-vjpkx" Apr 16 16:02:39.885051 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884438 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jftn9\" (UID: \"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf\") " pod="openshift-multus/multus-additional-cni-plugins-jftn9" Apr 16 16:02:39.885051 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884465 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-host-kubelet\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.885051 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884492 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-host-var-lib-kubelet\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.885051 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884512 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-etc-sysconfig\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.885051 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884518 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nl5bp\" (UniqueName: \"kubernetes.io/projected/5148421b-8077-4ac5-bad5-00c661d71b12-kube-api-access-nl5bp\") pod \"aws-ebs-csi-driver-node-gcn2w\" (UID: \"5148421b-8077-4ac5-bad5-00c661d71b12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w" Apr 16 16:02:39.885051 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884548 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-etc-systemd\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.885051 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884554 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-etc-openvswitch\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.885051 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884558 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/024bdd5b-3034-463f-aa2d-3e55d292bbd0-cni-binary-copy\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.885835 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884572 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-tmp\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.885835 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884594 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-log-socket\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.885835 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884604 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1b804e33-713f-4c6f-a4a1-0181d77254e1-serviceca\") pod \"node-ca-vjpkx\" (UID: \"1b804e33-713f-4c6f-a4a1-0181d77254e1\") " pod="openshift-image-registry/node-ca-vjpkx" Apr 16 16:02:39.885835 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884630 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/1b804e33-713f-4c6f-a4a1-0181d77254e1-host\") pod \"node-ca-vjpkx\" (UID: \"1b804e33-713f-4c6f-a4a1-0181d77254e1\") " pod="openshift-image-registry/node-ca-vjpkx" Apr 16 16:02:39.885835 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884650 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-host-var-lib-kubelet\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.885835 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884665 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wqsr\" (UniqueName: \"kubernetes.io/projected/6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf-kube-api-access-9wqsr\") pod \"multus-additional-cni-plugins-jftn9\" (UID: \"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf\") " pod="openshift-multus/multus-additional-cni-plugins-jftn9" Apr 16 16:02:39.885835 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884696 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h826f\" (UniqueName: \"kubernetes.io/projected/2d37db46-a278-4fec-8cea-0900b1dfb12d-kube-api-access-h826f\") pod \"network-check-target-j96wv\" (UID: \"2d37db46-a278-4fec-8cea-0900b1dfb12d\") " pod="openshift-network-diagnostics/network-check-target-j96wv" Apr 16 16:02:39.885835 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884723 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-host-slash\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.885835 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884748 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-run-systemd\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.885835 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884773 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v6stk\" (UniqueName: \"kubernetes.io/projected/a2826faf-f579-4f51-8772-9882e98d4593-kube-api-access-v6stk\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.885835 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884809 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-multus-conf-dir\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.885835 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884833 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-host-run-multus-certs\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.885835 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884095 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-multus-socket-dir-parent\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.885835 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884856 2577 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-etc-sysctl-conf\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.885835 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884902 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-node-log\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.885835 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884942 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.885835 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884971 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-etc-modprobe-d\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.886569 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.884997 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-etc-kubernetes\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.886569 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885022 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-etc-sysctl-d\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.886569 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885048 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0f6a5a36-0127-4a26-a722-43d4a0b49496-host-slash\") pod \"iptables-alerter-8bxsq\" (UID: \"0f6a5a36-0127-4a26-a722-43d4a0b49496\") " pod="openshift-network-operator/iptables-alerter-8bxsq" Apr 16 16:02:39.886569 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885070 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jftn9\" (UID: \"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf\") " pod="openshift-multus/multus-additional-cni-plugins-jftn9" Apr 16 16:02:39.886569 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885073 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf-cni-binary-copy\") pod \"multus-additional-cni-plugins-jftn9\" (UID: \"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf\") " pod="openshift-multus/multus-additional-cni-plugins-jftn9" Apr 16 16:02:39.886569 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885117 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-multus-cni-dir\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.886569 
ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885117 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-etc-modprobe-d\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.886569 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885161 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-hostroot\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.886569 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885189 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-node-log\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.886569 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885189 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-567lt\" (UniqueName: \"kubernetes.io/projected/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-kube-api-access-567lt\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.886569 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885231 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-run-ovn\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.886569 ip-10-0-129-182 kubenswrapper[2577]: I0416 
16:02:39.885239 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a2826faf-f579-4f51-8772-9882e98d4593-env-overrides\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.886569 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885260 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-host-var-lib-cni-multus\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.886569 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885283 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5148421b-8077-4ac5-bad5-00c661d71b12-registration-dir\") pod \"aws-ebs-csi-driver-node-gcn2w\" (UID: \"5148421b-8077-4ac5-bad5-00c661d71b12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w" Apr 16 16:02:39.886569 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885305 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-lib-modules\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.886569 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885325 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf-cnibin\") pod \"multus-additional-cni-plugins-jftn9\" (UID: \"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf\") " pod="openshift-multus/multus-additional-cni-plugins-jftn9" Apr 16 16:02:39.886569 
ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885353 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-etc-sysctl-conf\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.887327 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885352 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jftn9\" (UID: \"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf\") " pod="openshift-multus/multus-additional-cni-plugins-jftn9" Apr 16 16:02:39.887327 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885385 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jftn9\" (UID: \"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf\") " pod="openshift-multus/multus-additional-cni-plugins-jftn9" Apr 16 16:02:39.887327 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885415 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-host-run-k8s-cni-cncf-io\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.887327 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885406 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-host-kubelet\") pod \"ovnkube-node-m2jms\" (UID: 
\"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.887327 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:39.884465 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:02:39.887327 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885447 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0f6a5a36-0127-4a26-a722-43d4a0b49496-iptables-alerter-script\") pod \"iptables-alerter-8bxsq\" (UID: \"0f6a5a36-0127-4a26-a722-43d4a0b49496\") " pod="openshift-network-operator/iptables-alerter-8bxsq" Apr 16 16:02:39.887327 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885476 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjrt7\" (UniqueName: \"kubernetes.io/projected/64649692-472e-4f06-9640-7e6075d1e84f-kube-api-access-vjrt7\") pod \"network-metrics-daemon-2b9mp\" (UID: \"64649692-472e-4f06-9640-7e6075d1e84f\") " pod="openshift-multus/network-metrics-daemon-2b9mp" Apr 16 16:02:39.887327 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885513 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf-cni-binary-copy\") pod \"multus-additional-cni-plugins-jftn9\" (UID: \"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf\") " pod="openshift-multus/multus-additional-cni-plugins-jftn9" Apr 16 16:02:39.887327 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:39.885527 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs podName:64649692-472e-4f06-9640-7e6075d1e84f nodeName:}" failed. No retries permitted until 2026-04-16 16:02:40.385492331 +0000 UTC m=+3.149117040 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs") pod "network-metrics-daemon-2b9mp" (UID: "64649692-472e-4f06-9640-7e6075d1e84f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:02:39.887327 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885548 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-system-cni-dir\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.887327 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885571 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-etc-kubernetes\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.887327 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885577 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-os-release\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.887327 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885603 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5148421b-8077-4ac5-bad5-00c661d71b12-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gcn2w\" (UID: \"5148421b-8077-4ac5-bad5-00c661d71b12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w" Apr 16 16:02:39.887327 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885629 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-run-openvswitch\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.887327 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885651 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-etc-sysctl-d\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.887327 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885655 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-cnibin\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.887327 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885681 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-host-run-netns\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.888038 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885706 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-etc-kubernetes\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.888038 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885741 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fpkck\" (UniqueName: \"kubernetes.io/projected/024bdd5b-3034-463f-aa2d-3e55d292bbd0-kube-api-access-fpkck\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.888038 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885765 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5148421b-8077-4ac5-bad5-00c661d71b12-etc-selinux\") pod \"aws-ebs-csi-driver-node-gcn2w\" (UID: \"5148421b-8077-4ac5-bad5-00c661d71b12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w" Apr 16 16:02:39.888038 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885769 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/36fdb312-0ed7-418b-b720-16a3c46fff51-agent-certs\") pod \"konnectivity-agent-dkplw\" (UID: \"36fdb312-0ed7-418b-b720-16a3c46fff51\") " pod="kube-system/konnectivity-agent-dkplw" Apr 16 16:02:39.888038 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885815 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf-os-release\") pod \"multus-additional-cni-plugins-jftn9\" (UID: \"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf\") " pod="openshift-multus/multus-additional-cni-plugins-jftn9" Apr 16 16:02:39.888038 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885845 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-host-run-netns\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.888038 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885873 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-host\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.888038 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885872 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:02:39.888038 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885965 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-sys\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk" Apr 16 16:02:39.888038 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885965 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-host-run-multus-certs\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.888038 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.886002 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-multus-conf-dir\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd" Apr 16 16:02:39.888038 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.886044 2577 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-systemd-units\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms"
Apr 16 16:02:39.888038 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.886080 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-var-lib-openvswitch\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms"
Apr 16 16:02:39.888038 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.886111 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-multus-cni-dir\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:39.888038 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.885706 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0f6a5a36-0127-4a26-a722-43d4a0b49496-host-slash\") pod \"iptables-alerter-8bxsq\" (UID: \"0f6a5a36-0127-4a26-a722-43d4a0b49496\") " pod="openshift-network-operator/iptables-alerter-8bxsq"
Apr 16 16:02:39.888038 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.886433 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-jftn9\" (UID: \"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf\") " pod="openshift-multus/multus-additional-cni-plugins-jftn9"
Apr 16 16:02:39.888038 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.886518 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/024bdd5b-3034-463f-aa2d-3e55d292bbd0-multus-daemon-config\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:39.888038 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.886574 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-hostroot\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:39.888781 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.886609 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-host-run-netns\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms"
Apr 16 16:02:39.888781 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.886608 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf-os-release\") pod \"multus-additional-cni-plugins-jftn9\" (UID: \"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf\") " pod="openshift-multus/multus-additional-cni-plugins-jftn9"
Apr 16 16:02:39.888781 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.886631 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-run-systemd\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms"
Apr 16 16:02:39.888781 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.886643 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-system-cni-dir\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:39.888781 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.886649 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-host-var-lib-cni-multus\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:39.888781 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.886673 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-host\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk"
Apr 16 16:02:39.888781 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.886689 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-os-release\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:39.888781 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.886705 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-host-slash\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms"
Apr 16 16:02:39.888781 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.886728 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5148421b-8077-4ac5-bad5-00c661d71b12-registration-dir\") pod \"aws-ebs-csi-driver-node-gcn2w\" (UID: \"5148421b-8077-4ac5-bad5-00c661d71b12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w"
Apr 16 16:02:39.888781 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.886733 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-lib-modules\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk"
Apr 16 16:02:39.888781 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.886757 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf-cnibin\") pod \"multus-additional-cni-plugins-jftn9\" (UID: \"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf\") " pod="openshift-multus/multus-additional-cni-plugins-jftn9"
Apr 16 16:02:39.888781 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.886761 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5148421b-8077-4ac5-bad5-00c661d71b12-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gcn2w\" (UID: \"5148421b-8077-4ac5-bad5-00c661d71b12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w"
Apr 16 16:02:39.888781 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.886780 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-run-openvswitch\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms"
Apr 16 16:02:39.888781 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.886796 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-etc-systemd\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk"
Apr 16 16:02:39.888781 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.886819 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-cnibin\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:39.888781 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.886853 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-host-run-netns\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:39.888781 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.886891 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-etc-kubernetes\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:39.888781 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.887052 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a2826faf-f579-4f51-8772-9882e98d4593-run-ovn\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms"
Apr 16 16:02:39.889413 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.887092 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/024bdd5b-3034-463f-aa2d-3e55d292bbd0-host-run-k8s-cni-cncf-io\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:39.889413 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.887322 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jftn9\" (UID: \"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf\") " pod="openshift-multus/multus-additional-cni-plugins-jftn9"
Apr 16 16:02:39.889413 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.887372 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/36fdb312-0ed7-418b-b720-16a3c46fff51-konnectivity-ca\") pod \"konnectivity-agent-dkplw\" (UID: \"36fdb312-0ed7-418b-b720-16a3c46fff51\") " pod="kube-system/konnectivity-agent-dkplw"
Apr 16 16:02:39.889413 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.887556 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0f6a5a36-0127-4a26-a722-43d4a0b49496-iptables-alerter-script\") pod \"iptables-alerter-8bxsq\" (UID: \"0f6a5a36-0127-4a26-a722-43d4a0b49496\") " pod="openshift-network-operator/iptables-alerter-8bxsq"
Apr 16 16:02:39.889413 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.888064 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-tmp\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk"
Apr 16 16:02:39.889413 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.888259 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a2826faf-f579-4f51-8772-9882e98d4593-ovn-node-metrics-cert\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms"
Apr 16 16:02:39.889413 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.888599 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-etc-tuned\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk"
Apr 16 16:02:39.889805 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.889787 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/36fdb312-0ed7-418b-b720-16a3c46fff51-agent-certs\") pod \"konnectivity-agent-dkplw\" (UID: \"36fdb312-0ed7-418b-b720-16a3c46fff51\") " pod="kube-system/konnectivity-agent-dkplw"
Apr 16 16:02:39.907110 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:39.907083 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:02:39.907110 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:39.907106 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:02:39.907110 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:39.907116 2577 projected.go:194] Error preparing data for projected volume kube-api-access-h826f for pod openshift-network-diagnostics/network-check-target-j96wv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:02:39.907370 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:39.907195 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d37db46-a278-4fec-8cea-0900b1dfb12d-kube-api-access-h826f podName:2d37db46-a278-4fec-8cea-0900b1dfb12d nodeName:}" failed. No retries permitted until 2026-04-16 16:02:40.407173803 +0000 UTC m=+3.170798515 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-h826f" (UniqueName: "kubernetes.io/projected/2d37db46-a278-4fec-8cea-0900b1dfb12d-kube-api-access-h826f") pod "network-check-target-j96wv" (UID: "2d37db46-a278-4fec-8cea-0900b1dfb12d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:02:39.909611 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.909581 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6stk\" (UniqueName: \"kubernetes.io/projected/a2826faf-f579-4f51-8772-9882e98d4593-kube-api-access-v6stk\") pod \"ovnkube-node-m2jms\" (UID: \"a2826faf-f579-4f51-8772-9882e98d4593\") " pod="openshift-ovn-kubernetes/ovnkube-node-m2jms"
Apr 16 16:02:39.910829 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.910671 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl5bp\" (UniqueName: \"kubernetes.io/projected/5148421b-8077-4ac5-bad5-00c661d71b12-kube-api-access-nl5bp\") pod \"aws-ebs-csi-driver-node-gcn2w\" (UID: \"5148421b-8077-4ac5-bad5-00c661d71b12\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w"
Apr 16 16:02:39.910951 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.910848 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjrt7\" (UniqueName: \"kubernetes.io/projected/64649692-472e-4f06-9640-7e6075d1e84f-kube-api-access-vjrt7\") pod \"network-metrics-daemon-2b9mp\" (UID: \"64649692-472e-4f06-9640-7e6075d1e84f\") " pod="openshift-multus/network-metrics-daemon-2b9mp"
Apr 16 16:02:39.912242 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.912217 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrhgl\" (UniqueName: \"kubernetes.io/projected/1b804e33-713f-4c6f-a4a1-0181d77254e1-kube-api-access-xrhgl\") pod \"node-ca-vjpkx\" (UID: \"1b804e33-713f-4c6f-a4a1-0181d77254e1\") " pod="openshift-image-registry/node-ca-vjpkx"
Apr 16 16:02:39.912544 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.912504 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-567lt\" (UniqueName: \"kubernetes.io/projected/bb2a3f4f-a59a-4bd9-867e-c1fc03475427-kube-api-access-567lt\") pod \"tuned-4mstk\" (UID: \"bb2a3f4f-a59a-4bd9-867e-c1fc03475427\") " pod="openshift-cluster-node-tuning-operator/tuned-4mstk"
Apr 16 16:02:39.912884 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.912861 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wqsr\" (UniqueName: \"kubernetes.io/projected/6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf-kube-api-access-9wqsr\") pod \"multus-additional-cni-plugins-jftn9\" (UID: \"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf\") " pod="openshift-multus/multus-additional-cni-plugins-jftn9"
Apr 16 16:02:39.913061 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.913037 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4dz7\" (UniqueName: \"kubernetes.io/projected/0f6a5a36-0127-4a26-a722-43d4a0b49496-kube-api-access-w4dz7\") pod \"iptables-alerter-8bxsq\" (UID: \"0f6a5a36-0127-4a26-a722-43d4a0b49496\") " pod="openshift-network-operator/iptables-alerter-8bxsq"
Apr 16 16:02:39.913841 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:39.913808 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpkck\" (UniqueName: \"kubernetes.io/projected/024bdd5b-3034-463f-aa2d-3e55d292bbd0-kube-api-access-fpkck\") pod \"multus-wfqhd\" (UID: \"024bdd5b-3034-463f-aa2d-3e55d292bbd0\") " pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:40.069274 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:40.069230 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4mstk"
Apr 16 16:02:40.076092 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:40.076060 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vjpkx"
Apr 16 16:02:40.084790 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:40.084758 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-8bxsq"
Apr 16 16:02:40.091034 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:40.091006 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m2jms"
Apr 16 16:02:40.097672 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:40.097643 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jftn9"
Apr 16 16:02:40.104353 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:40.104328 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-wfqhd"
Apr 16 16:02:40.112041 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:40.112020 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-dkplw"
Apr 16 16:02:40.116665 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:40.116636 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w"
Apr 16 16:02:40.389686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:40.389653 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs\") pod \"network-metrics-daemon-2b9mp\" (UID: \"64649692-472e-4f06-9640-7e6075d1e84f\") " pod="openshift-multus/network-metrics-daemon-2b9mp"
Apr 16 16:02:40.389873 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:40.389821 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:02:40.389929 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:40.389897 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs podName:64649692-472e-4f06-9640-7e6075d1e84f nodeName:}" failed. No retries permitted until 2026-04-16 16:02:41.389875291 +0000 UTC m=+4.153499998 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs") pod "network-metrics-daemon-2b9mp" (UID: "64649692-472e-4f06-9640-7e6075d1e84f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:02:40.490988 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:40.490949 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h826f\" (UniqueName: \"kubernetes.io/projected/2d37db46-a278-4fec-8cea-0900b1dfb12d-kube-api-access-h826f\") pod \"network-check-target-j96wv\" (UID: \"2d37db46-a278-4fec-8cea-0900b1dfb12d\") " pod="openshift-network-diagnostics/network-check-target-j96wv"
Apr 16 16:02:40.491182 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:40.491119 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:02:40.491182 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:40.491157 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:02:40.491182 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:40.491171 2577 projected.go:194] Error preparing data for projected volume kube-api-access-h826f for pod openshift-network-diagnostics/network-check-target-j96wv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:02:40.491350 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:40.491229 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d37db46-a278-4fec-8cea-0900b1dfb12d-kube-api-access-h826f podName:2d37db46-a278-4fec-8cea-0900b1dfb12d nodeName:}" failed. No retries permitted until 2026-04-16 16:02:41.491215377 +0000 UTC m=+4.254840066 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-h826f" (UniqueName: "kubernetes.io/projected/2d37db46-a278-4fec-8cea-0900b1dfb12d-kube-api-access-h826f") pod "network-check-target-j96wv" (UID: "2d37db46-a278-4fec-8cea-0900b1dfb12d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:02:40.559626 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:40.559595 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod024bdd5b_3034_463f_aa2d_3e55d292bbd0.slice/crio-ebf7de9476eec1366185166075e48373741916344997bb3d8363b0bb4f068c03 WatchSource:0}: Error finding container ebf7de9476eec1366185166075e48373741916344997bb3d8363b0bb4f068c03: Status 404 returned error can't find the container with id ebf7de9476eec1366185166075e48373741916344997bb3d8363b0bb4f068c03
Apr 16 16:02:40.560906 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:40.560870 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2826faf_f579_4f51_8772_9882e98d4593.slice/crio-6abbd791621d98c3d23550b9591c6c63e2c517c455ae75e7fb7a0e09c7e13e4b WatchSource:0}: Error finding container 6abbd791621d98c3d23550b9591c6c63e2c517c455ae75e7fb7a0e09c7e13e4b: Status 404 returned error can't find the container with id 6abbd791621d98c3d23550b9591c6c63e2c517c455ae75e7fb7a0e09c7e13e4b
Apr 16 16:02:40.561672 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:40.561634 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5148421b_8077_4ac5_bad5_00c661d71b12.slice/crio-910d0fd9cb6a708d4c6ea2b4961dfec155d3bbe2305c4f5455db55fecccfcdec WatchSource:0}: Error finding container 910d0fd9cb6a708d4c6ea2b4961dfec155d3bbe2305c4f5455db55fecccfcdec: Status 404 returned error can't find the container with id 910d0fd9cb6a708d4c6ea2b4961dfec155d3bbe2305c4f5455db55fecccfcdec
Apr 16 16:02:40.564651 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:40.564616 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fbc43fe_a6ce_49d9_81b1_17e5e07dfacf.slice/crio-6164cd869061fffc0a0cb829855e09a715ce1542ebc2ff515cb4b06a1cc2c7b0 WatchSource:0}: Error finding container 6164cd869061fffc0a0cb829855e09a715ce1542ebc2ff515cb4b06a1cc2c7b0: Status 404 returned error can't find the container with id 6164cd869061fffc0a0cb829855e09a715ce1542ebc2ff515cb4b06a1cc2c7b0
Apr 16 16:02:40.565501 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:40.565464 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b804e33_713f_4c6f_a4a1_0181d77254e1.slice/crio-7df86fea07a5f6a6c119d555b386a8edab8625d610348e0c0e6347423ebc24c9 WatchSource:0}: Error finding container 7df86fea07a5f6a6c119d555b386a8edab8625d610348e0c0e6347423ebc24c9: Status 404 returned error can't find the container with id 7df86fea07a5f6a6c119d555b386a8edab8625d610348e0c0e6347423ebc24c9
Apr 16 16:02:40.566829 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:40.566807 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f6a5a36_0127_4a26_a722_43d4a0b49496.slice/crio-9a8b613bd1a81c916b4d43374c85f3a30125eba93ab0cde6260e30185a55edb9 WatchSource:0}: Error finding container 9a8b613bd1a81c916b4d43374c85f3a30125eba93ab0cde6260e30185a55edb9: Status 404 returned error can't find the container with id 9a8b613bd1a81c916b4d43374c85f3a30125eba93ab0cde6260e30185a55edb9
Apr 16 16:02:40.568499 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:40.568476 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36fdb312_0ed7_418b_b720_16a3c46fff51.slice/crio-fb5af72aaadd8288cf08ed257368c88e9f0387d2ea25f7ff4fd0836a59316c7c WatchSource:0}: Error finding container fb5af72aaadd8288cf08ed257368c88e9f0387d2ea25f7ff4fd0836a59316c7c: Status 404 returned error can't find the container with id fb5af72aaadd8288cf08ed257368c88e9f0387d2ea25f7ff4fd0836a59316c7c
Apr 16 16:02:40.570004 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:02:40.569979 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb2a3f4f_a59a_4bd9_867e_c1fc03475427.slice/crio-6ab1c4a7376f6715a7b469a4ea33ce8f9cc6df8930bc70026c734bb70e72d4f1 WatchSource:0}: Error finding container 6ab1c4a7376f6715a7b469a4ea33ce8f9cc6df8930bc70026c734bb70e72d4f1: Status 404 returned error can't find the container with id 6ab1c4a7376f6715a7b469a4ea33ce8f9cc6df8930bc70026c734bb70e72d4f1
Apr 16 16:02:40.814934 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:40.814741 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 15:57:38 +0000 UTC" deadline="2027-10-17 22:20:42.575540792 +0000 UTC"
Apr 16 16:02:40.814934 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:40.814925 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13182h18m1.760618354s"
Apr 16 16:02:40.922082 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:40.921982 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w" event={"ID":"5148421b-8077-4ac5-bad5-00c661d71b12","Type":"ContainerStarted","Data":"910d0fd9cb6a708d4c6ea2b4961dfec155d3bbe2305c4f5455db55fecccfcdec"}
Apr 16 16:02:40.924422 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:40.924389 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-182.ec2.internal" event={"ID":"f8506f0b2e52254b607e38e68b2f4dae","Type":"ContainerStarted","Data":"852693c460b5abefb32685673181b934a8b4c2f02c0702558345262eed81193e"}
Apr 16 16:02:40.929025 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:40.928999 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4mstk" event={"ID":"bb2a3f4f-a59a-4bd9-867e-c1fc03475427","Type":"ContainerStarted","Data":"6ab1c4a7376f6715a7b469a4ea33ce8f9cc6df8930bc70026c734bb70e72d4f1"}
Apr 16 16:02:40.930836 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:40.930695 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dkplw" event={"ID":"36fdb312-0ed7-418b-b720-16a3c46fff51","Type":"ContainerStarted","Data":"fb5af72aaadd8288cf08ed257368c88e9f0387d2ea25f7ff4fd0836a59316c7c"}
Apr 16 16:02:40.932190 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:40.932164 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" event={"ID":"a2826faf-f579-4f51-8772-9882e98d4593","Type":"ContainerStarted","Data":"6abbd791621d98c3d23550b9591c6c63e2c517c455ae75e7fb7a0e09c7e13e4b"}
Apr 16 16:02:40.934711 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:40.934689 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wfqhd" event={"ID":"024bdd5b-3034-463f-aa2d-3e55d292bbd0","Type":"ContainerStarted","Data":"ebf7de9476eec1366185166075e48373741916344997bb3d8363b0bb4f068c03"}
Apr 16 16:02:40.935860 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:40.935822 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8bxsq" event={"ID":"0f6a5a36-0127-4a26-a722-43d4a0b49496","Type":"ContainerStarted","Data":"9a8b613bd1a81c916b4d43374c85f3a30125eba93ab0cde6260e30185a55edb9"}
Apr 16 16:02:40.937577 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:40.937553 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vjpkx" event={"ID":"1b804e33-713f-4c6f-a4a1-0181d77254e1","Type":"ContainerStarted","Data":"7df86fea07a5f6a6c119d555b386a8edab8625d610348e0c0e6347423ebc24c9"}
Apr 16 16:02:40.939462 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:40.939091 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-182.ec2.internal" podStartSLOduration=2.939075736 podStartE2EDuration="2.939075736s" podCreationTimestamp="2026-04-16 16:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:02:40.938743728 +0000 UTC m=+3.702368440" watchObservedRunningTime="2026-04-16 16:02:40.939075736 +0000 UTC m=+3.702700467"
Apr 16 16:02:40.939462 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:40.939365 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jftn9" event={"ID":"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf","Type":"ContainerStarted","Data":"6164cd869061fffc0a0cb829855e09a715ce1542ebc2ff515cb4b06a1cc2c7b0"}
Apr 16 16:02:41.398951 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:41.398867 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs\") pod \"network-metrics-daemon-2b9mp\" (UID: \"64649692-472e-4f06-9640-7e6075d1e84f\") " pod="openshift-multus/network-metrics-daemon-2b9mp"
Apr 16 16:02:41.399170 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:41.399048 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:02:41.399170 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:41.399114 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs podName:64649692-472e-4f06-9640-7e6075d1e84f nodeName:}" failed. No retries permitted until 2026-04-16 16:02:43.399095872 +0000 UTC m=+6.162720564 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs") pod "network-metrics-daemon-2b9mp" (UID: "64649692-472e-4f06-9640-7e6075d1e84f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:02:41.453755 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:41.453717 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-jt7zj"]
Apr 16 16:02:41.457886 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:41.456991 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jt7zj"
Apr 16 16:02:41.462566 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:41.461918 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 16:02:41.462566 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:41.462204 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 16:02:41.462566 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:41.462401 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-bfcxf\""
Apr 16 16:02:41.499312 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:41.499275 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6463e7fd-1eba-4228-8782-edee4a55c601-hosts-file\") pod \"node-resolver-jt7zj\" (UID: \"6463e7fd-1eba-4228-8782-edee4a55c601\") " pod="openshift-dns/node-resolver-jt7zj"
Apr 16 16:02:41.499489 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:41.499340 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h826f\" (UniqueName: \"kubernetes.io/projected/2d37db46-a278-4fec-8cea-0900b1dfb12d-kube-api-access-h826f\") pod \"network-check-target-j96wv\" (UID: \"2d37db46-a278-4fec-8cea-0900b1dfb12d\") " pod="openshift-network-diagnostics/network-check-target-j96wv"
Apr 16 16:02:41.499489 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:41.499395 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6463e7fd-1eba-4228-8782-edee4a55c601-tmp-dir\") pod \"node-resolver-jt7zj\" (UID: \"6463e7fd-1eba-4228-8782-edee4a55c601\") " pod="openshift-dns/node-resolver-jt7zj"
Apr 16 16:02:41.499489 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:41.499420 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmhfh\" (UniqueName: \"kubernetes.io/projected/6463e7fd-1eba-4228-8782-edee4a55c601-kube-api-access-xmhfh\") pod \"node-resolver-jt7zj\" (UID: \"6463e7fd-1eba-4228-8782-edee4a55c601\") " pod="openshift-dns/node-resolver-jt7zj"
Apr 16 16:02:41.499881 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:41.499588 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:02:41.499881 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:41.499609 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:02:41.499881 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:41.499621 2577 projected.go:194] Error preparing data for projected volume kube-api-access-h826f for pod openshift-network-diagnostics/network-check-target-j96wv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:02:41.499881 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:41.499676 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d37db46-a278-4fec-8cea-0900b1dfb12d-kube-api-access-h826f podName:2d37db46-a278-4fec-8cea-0900b1dfb12d nodeName:}" failed. No retries permitted until 2026-04-16 16:02:43.499658232 +0000 UTC m=+6.263282923 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-h826f" (UniqueName: "kubernetes.io/projected/2d37db46-a278-4fec-8cea-0900b1dfb12d-kube-api-access-h826f") pod "network-check-target-j96wv" (UID: "2d37db46-a278-4fec-8cea-0900b1dfb12d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:02:41.599855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:41.599814 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6463e7fd-1eba-4228-8782-edee4a55c601-tmp-dir\") pod \"node-resolver-jt7zj\" (UID: \"6463e7fd-1eba-4228-8782-edee4a55c601\") " pod="openshift-dns/node-resolver-jt7zj"
Apr 16 16:02:41.600047 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:41.599867 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xmhfh\" (UniqueName: \"kubernetes.io/projected/6463e7fd-1eba-4228-8782-edee4a55c601-kube-api-access-xmhfh\") pod \"node-resolver-jt7zj\" (UID: \"6463e7fd-1eba-4228-8782-edee4a55c601\") " pod="openshift-dns/node-resolver-jt7zj"
Apr 16 16:02:41.600047 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:41.599924 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6463e7fd-1eba-4228-8782-edee4a55c601-hosts-file\") pod \"node-resolver-jt7zj\" (UID: \"6463e7fd-1eba-4228-8782-edee4a55c601\") " pod="openshift-dns/node-resolver-jt7zj"
Apr 16 16:02:41.600201 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:41.600010 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6463e7fd-1eba-4228-8782-edee4a55c601-hosts-file\") pod \"node-resolver-jt7zj\" (UID: \"6463e7fd-1eba-4228-8782-edee4a55c601\") " pod="openshift-dns/node-resolver-jt7zj"
Apr 16 16:02:41.600448 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:41.600427 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6463e7fd-1eba-4228-8782-edee4a55c601-tmp-dir\") pod \"node-resolver-jt7zj\" (UID: \"6463e7fd-1eba-4228-8782-edee4a55c601\") " pod="openshift-dns/node-resolver-jt7zj"
Apr 16 16:02:41.632900 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:41.627755 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmhfh\" (UniqueName: \"kubernetes.io/projected/6463e7fd-1eba-4228-8782-edee4a55c601-kube-api-access-xmhfh\") pod \"node-resolver-jt7zj\" (UID: \"6463e7fd-1eba-4228-8782-edee4a55c601\") " pod="openshift-dns/node-resolver-jt7zj"
Apr 16 16:02:41.772586 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:41.772500 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-jt7zj"
Apr 16 16:02:41.916584 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:41.914662 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2b9mp" Apr 16 16:02:41.916584 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:41.914801 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2b9mp" podUID="64649692-472e-4f06-9640-7e6075d1e84f" Apr 16 16:02:41.916584 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:41.916302 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j96wv" Apr 16 16:02:41.916584 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:41.916537 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j96wv" podUID="2d37db46-a278-4fec-8cea-0900b1dfb12d" Apr 16 16:02:41.963154 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:41.962401 2577 generic.go:358] "Generic (PLEG): container finished" podID="406c6ba715b6247850e61153d48508c5" containerID="045c8bcb2b605c6dcff3bfb77e79a035e823c70d1eefffc3ffd0d59a67a4001f" exitCode=0 Apr 16 16:02:41.963154 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:41.962495 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-182.ec2.internal" event={"ID":"406c6ba715b6247850e61153d48508c5","Type":"ContainerDied","Data":"045c8bcb2b605c6dcff3bfb77e79a035e823c70d1eefffc3ffd0d59a67a4001f"} Apr 16 16:02:41.969889 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:41.969754 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jt7zj" event={"ID":"6463e7fd-1eba-4228-8782-edee4a55c601","Type":"ContainerStarted","Data":"1be9ebb4632424efb4bf637db1af76a19012b55f0d0f6fc91a97fb74cb7d5cd0"} Apr 16 16:02:42.975990 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:42.975936 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-182.ec2.internal" event={"ID":"406c6ba715b6247850e61153d48508c5","Type":"ContainerStarted","Data":"4fade1d6a09f3cbb4e24e06f1509751357d5451b389100d608ccb76997c84b90"} Apr 16 16:02:42.990764 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:42.990147 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-182.ec2.internal" podStartSLOduration=4.990112434 podStartE2EDuration="4.990112434s" podCreationTimestamp="2026-04-16 16:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:02:42.989408836 +0000 UTC 
m=+5.753033776" watchObservedRunningTime="2026-04-16 16:02:42.990112434 +0000 UTC m=+5.753737146" Apr 16 16:02:43.415198 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:43.415159 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs\") pod \"network-metrics-daemon-2b9mp\" (UID: \"64649692-472e-4f06-9640-7e6075d1e84f\") " pod="openshift-multus/network-metrics-daemon-2b9mp" Apr 16 16:02:43.415444 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:43.415335 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:02:43.415444 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:43.415396 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs podName:64649692-472e-4f06-9640-7e6075d1e84f nodeName:}" failed. No retries permitted until 2026-04-16 16:02:47.415377849 +0000 UTC m=+10.179002543 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs") pod "network-metrics-daemon-2b9mp" (UID: "64649692-472e-4f06-9640-7e6075d1e84f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:02:43.515833 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:43.515792 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h826f\" (UniqueName: \"kubernetes.io/projected/2d37db46-a278-4fec-8cea-0900b1dfb12d-kube-api-access-h826f\") pod \"network-check-target-j96wv\" (UID: \"2d37db46-a278-4fec-8cea-0900b1dfb12d\") " pod="openshift-network-diagnostics/network-check-target-j96wv" Apr 16 16:02:43.516003 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:43.515980 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:02:43.516062 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:43.516009 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:02:43.516062 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:43.516023 2577 projected.go:194] Error preparing data for projected volume kube-api-access-h826f for pod openshift-network-diagnostics/network-check-target-j96wv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:02:43.516189 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:43.516091 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d37db46-a278-4fec-8cea-0900b1dfb12d-kube-api-access-h826f podName:2d37db46-a278-4fec-8cea-0900b1dfb12d nodeName:}" failed. 
No retries permitted until 2026-04-16 16:02:47.516071649 +0000 UTC m=+10.279696356 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-h826f" (UniqueName: "kubernetes.io/projected/2d37db46-a278-4fec-8cea-0900b1dfb12d-kube-api-access-h826f") pod "network-check-target-j96wv" (UID: "2d37db46-a278-4fec-8cea-0900b1dfb12d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:02:43.915290 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:43.915153 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2b9mp" Apr 16 16:02:43.915290 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:43.915186 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j96wv" Apr 16 16:02:43.915496 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:43.915302 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2b9mp" podUID="64649692-472e-4f06-9640-7e6075d1e84f" Apr 16 16:02:43.915496 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:43.915387 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j96wv" podUID="2d37db46-a278-4fec-8cea-0900b1dfb12d" Apr 16 16:02:45.914920 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:45.914884 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2b9mp" Apr 16 16:02:45.915454 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:45.915026 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2b9mp" podUID="64649692-472e-4f06-9640-7e6075d1e84f" Apr 16 16:02:45.915522 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:45.915503 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j96wv" Apr 16 16:02:45.915625 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:45.915596 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j96wv" podUID="2d37db46-a278-4fec-8cea-0900b1dfb12d" Apr 16 16:02:47.449876 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:47.449833 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs\") pod \"network-metrics-daemon-2b9mp\" (UID: \"64649692-472e-4f06-9640-7e6075d1e84f\") " pod="openshift-multus/network-metrics-daemon-2b9mp" Apr 16 16:02:47.450355 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:47.450030 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:02:47.450355 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:47.450112 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs podName:64649692-472e-4f06-9640-7e6075d1e84f nodeName:}" failed. No retries permitted until 2026-04-16 16:02:55.450091101 +0000 UTC m=+18.213715795 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs") pod "network-metrics-daemon-2b9mp" (UID: "64649692-472e-4f06-9640-7e6075d1e84f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:02:47.551648 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:47.551185 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h826f\" (UniqueName: \"kubernetes.io/projected/2d37db46-a278-4fec-8cea-0900b1dfb12d-kube-api-access-h826f\") pod \"network-check-target-j96wv\" (UID: \"2d37db46-a278-4fec-8cea-0900b1dfb12d\") " pod="openshift-network-diagnostics/network-check-target-j96wv" Apr 16 16:02:47.551648 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:47.551364 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:02:47.551648 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:47.551385 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:02:47.551648 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:47.551398 2577 projected.go:194] Error preparing data for projected volume kube-api-access-h826f for pod openshift-network-diagnostics/network-check-target-j96wv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:02:47.551648 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:47.551461 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d37db46-a278-4fec-8cea-0900b1dfb12d-kube-api-access-h826f podName:2d37db46-a278-4fec-8cea-0900b1dfb12d nodeName:}" failed. 
No retries permitted until 2026-04-16 16:02:55.551442696 +0000 UTC m=+18.315067386 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-h826f" (UniqueName: "kubernetes.io/projected/2d37db46-a278-4fec-8cea-0900b1dfb12d-kube-api-access-h826f") pod "network-check-target-j96wv" (UID: "2d37db46-a278-4fec-8cea-0900b1dfb12d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:02:47.915979 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:47.915467 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2b9mp" Apr 16 16:02:47.915979 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:47.915599 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2b9mp" podUID="64649692-472e-4f06-9640-7e6075d1e84f" Apr 16 16:02:47.915979 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:47.915791 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j96wv" Apr 16 16:02:47.915979 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:47.915892 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j96wv" podUID="2d37db46-a278-4fec-8cea-0900b1dfb12d" Apr 16 16:02:49.914455 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:49.914358 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j96wv" Apr 16 16:02:49.914455 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:49.914358 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2b9mp" Apr 16 16:02:49.914901 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:49.914478 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j96wv" podUID="2d37db46-a278-4fec-8cea-0900b1dfb12d" Apr 16 16:02:49.914901 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:49.914561 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2b9mp" podUID="64649692-472e-4f06-9640-7e6075d1e84f" Apr 16 16:02:51.914247 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:51.914200 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2b9mp" Apr 16 16:02:51.914673 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:51.914256 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j96wv" Apr 16 16:02:51.914673 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:51.914348 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2b9mp" podUID="64649692-472e-4f06-9640-7e6075d1e84f" Apr 16 16:02:51.914673 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:51.914481 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j96wv" podUID="2d37db46-a278-4fec-8cea-0900b1dfb12d" Apr 16 16:02:52.609801 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:52.609766 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-gtdb4"] Apr 16 16:02:52.611740 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:52.611715 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtdb4" Apr 16 16:02:52.611896 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:52.611803 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-gtdb4" podUID="489cc865-55f8-43eb-9e86-909e0581fa83" Apr 16 16:02:52.691828 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:52.691792 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/489cc865-55f8-43eb-9e86-909e0581fa83-kubelet-config\") pod \"global-pull-secret-syncer-gtdb4\" (UID: \"489cc865-55f8-43eb-9e86-909e0581fa83\") " pod="kube-system/global-pull-secret-syncer-gtdb4" Apr 16 16:02:52.691828 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:52.691842 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/489cc865-55f8-43eb-9e86-909e0581fa83-original-pull-secret\") pod \"global-pull-secret-syncer-gtdb4\" (UID: \"489cc865-55f8-43eb-9e86-909e0581fa83\") " pod="kube-system/global-pull-secret-syncer-gtdb4" Apr 16 16:02:52.692062 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:52.691958 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/489cc865-55f8-43eb-9e86-909e0581fa83-dbus\") pod \"global-pull-secret-syncer-gtdb4\" (UID: \"489cc865-55f8-43eb-9e86-909e0581fa83\") " pod="kube-system/global-pull-secret-syncer-gtdb4" Apr 16 16:02:52.792280 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:52.792249 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/489cc865-55f8-43eb-9e86-909e0581fa83-kubelet-config\") pod \"global-pull-secret-syncer-gtdb4\" (UID: \"489cc865-55f8-43eb-9e86-909e0581fa83\") " pod="kube-system/global-pull-secret-syncer-gtdb4" Apr 16 16:02:52.792280 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:52.792286 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" 
(UniqueName: \"kubernetes.io/secret/489cc865-55f8-43eb-9e86-909e0581fa83-original-pull-secret\") pod \"global-pull-secret-syncer-gtdb4\" (UID: \"489cc865-55f8-43eb-9e86-909e0581fa83\") " pod="kube-system/global-pull-secret-syncer-gtdb4" Apr 16 16:02:52.792527 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:52.792352 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/489cc865-55f8-43eb-9e86-909e0581fa83-dbus\") pod \"global-pull-secret-syncer-gtdb4\" (UID: \"489cc865-55f8-43eb-9e86-909e0581fa83\") " pod="kube-system/global-pull-secret-syncer-gtdb4" Apr 16 16:02:52.792527 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:52.792393 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/489cc865-55f8-43eb-9e86-909e0581fa83-kubelet-config\") pod \"global-pull-secret-syncer-gtdb4\" (UID: \"489cc865-55f8-43eb-9e86-909e0581fa83\") " pod="kube-system/global-pull-secret-syncer-gtdb4" Apr 16 16:02:52.792527 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:52.792498 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/489cc865-55f8-43eb-9e86-909e0581fa83-dbus\") pod \"global-pull-secret-syncer-gtdb4\" (UID: \"489cc865-55f8-43eb-9e86-909e0581fa83\") " pod="kube-system/global-pull-secret-syncer-gtdb4" Apr 16 16:02:52.792527 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:52.792499 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:02:52.792695 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:52.792564 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/489cc865-55f8-43eb-9e86-909e0581fa83-original-pull-secret podName:489cc865-55f8-43eb-9e86-909e0581fa83 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:02:53.292547598 +0000 UTC m=+16.056172290 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/489cc865-55f8-43eb-9e86-909e0581fa83-original-pull-secret") pod "global-pull-secret-syncer-gtdb4" (UID: "489cc865-55f8-43eb-9e86-909e0581fa83") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:02:53.297118 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:53.297080 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/489cc865-55f8-43eb-9e86-909e0581fa83-original-pull-secret\") pod \"global-pull-secret-syncer-gtdb4\" (UID: \"489cc865-55f8-43eb-9e86-909e0581fa83\") " pod="kube-system/global-pull-secret-syncer-gtdb4" Apr 16 16:02:53.297540 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:53.297239 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:02:53.297540 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:53.297313 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/489cc865-55f8-43eb-9e86-909e0581fa83-original-pull-secret podName:489cc865-55f8-43eb-9e86-909e0581fa83 nodeName:}" failed. No retries permitted until 2026-04-16 16:02:54.29729622 +0000 UTC m=+17.060920908 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/489cc865-55f8-43eb-9e86-909e0581fa83-original-pull-secret") pod "global-pull-secret-syncer-gtdb4" (UID: "489cc865-55f8-43eb-9e86-909e0581fa83") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:02:53.914444 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:53.914405 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtdb4" Apr 16 16:02:53.914444 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:53.914429 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j96wv" Apr 16 16:02:53.914828 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:53.914421 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2b9mp" Apr 16 16:02:53.914828 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:53.914533 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtdb4" podUID="489cc865-55f8-43eb-9e86-909e0581fa83" Apr 16 16:02:53.914828 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:53.914615 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2b9mp" podUID="64649692-472e-4f06-9640-7e6075d1e84f" Apr 16 16:02:53.914828 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:53.914718 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j96wv" podUID="2d37db46-a278-4fec-8cea-0900b1dfb12d" Apr 16 16:02:54.304000 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:54.303954 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/489cc865-55f8-43eb-9e86-909e0581fa83-original-pull-secret\") pod \"global-pull-secret-syncer-gtdb4\" (UID: \"489cc865-55f8-43eb-9e86-909e0581fa83\") " pod="kube-system/global-pull-secret-syncer-gtdb4" Apr 16 16:02:54.304500 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:54.304071 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:02:54.304500 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:54.304142 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/489cc865-55f8-43eb-9e86-909e0581fa83-original-pull-secret podName:489cc865-55f8-43eb-9e86-909e0581fa83 nodeName:}" failed. No retries permitted until 2026-04-16 16:02:56.304113564 +0000 UTC m=+19.067738268 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/489cc865-55f8-43eb-9e86-909e0581fa83-original-pull-secret") pod "global-pull-secret-syncer-gtdb4" (UID: "489cc865-55f8-43eb-9e86-909e0581fa83") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:02:55.512817 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:55.512782 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs\") pod \"network-metrics-daemon-2b9mp\" (UID: \"64649692-472e-4f06-9640-7e6075d1e84f\") " pod="openshift-multus/network-metrics-daemon-2b9mp" Apr 16 16:02:55.513228 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:55.512910 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:02:55.513228 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:55.512972 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs podName:64649692-472e-4f06-9640-7e6075d1e84f nodeName:}" failed. No retries permitted until 2026-04-16 16:03:11.512958304 +0000 UTC m=+34.276582993 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs") pod "network-metrics-daemon-2b9mp" (UID: "64649692-472e-4f06-9640-7e6075d1e84f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:02:55.613116 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:55.613079 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h826f\" (UniqueName: \"kubernetes.io/projected/2d37db46-a278-4fec-8cea-0900b1dfb12d-kube-api-access-h826f\") pod \"network-check-target-j96wv\" (UID: \"2d37db46-a278-4fec-8cea-0900b1dfb12d\") " pod="openshift-network-diagnostics/network-check-target-j96wv" Apr 16 16:02:55.613296 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:55.613225 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:02:55.613296 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:55.613242 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:02:55.613296 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:55.613251 2577 projected.go:194] Error preparing data for projected volume kube-api-access-h826f for pod openshift-network-diagnostics/network-check-target-j96wv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:02:55.613430 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:55.613307 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d37db46-a278-4fec-8cea-0900b1dfb12d-kube-api-access-h826f podName:2d37db46-a278-4fec-8cea-0900b1dfb12d nodeName:}" failed. 
No retries permitted until 2026-04-16 16:03:11.613292861 +0000 UTC m=+34.376917550 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-h826f" (UniqueName: "kubernetes.io/projected/2d37db46-a278-4fec-8cea-0900b1dfb12d-kube-api-access-h826f") pod "network-check-target-j96wv" (UID: "2d37db46-a278-4fec-8cea-0900b1dfb12d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:02:55.915051 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:55.914963 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j96wv" Apr 16 16:02:55.915240 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:55.915090 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtdb4" Apr 16 16:02:55.915240 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:55.915142 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2b9mp" Apr 16 16:02:55.915240 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:55.915098 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j96wv" podUID="2d37db46-a278-4fec-8cea-0900b1dfb12d" Apr 16 16:02:55.915240 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:55.915224 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtdb4" podUID="489cc865-55f8-43eb-9e86-909e0581fa83" Apr 16 16:02:55.915424 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:55.915324 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2b9mp" podUID="64649692-472e-4f06-9640-7e6075d1e84f" Apr 16 16:02:56.317872 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:56.317837 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/489cc865-55f8-43eb-9e86-909e0581fa83-original-pull-secret\") pod \"global-pull-secret-syncer-gtdb4\" (UID: \"489cc865-55f8-43eb-9e86-909e0581fa83\") " pod="kube-system/global-pull-secret-syncer-gtdb4" Apr 16 16:02:56.318040 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:56.317951 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:02:56.318040 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:56.318011 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/489cc865-55f8-43eb-9e86-909e0581fa83-original-pull-secret podName:489cc865-55f8-43eb-9e86-909e0581fa83 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:00.317994957 +0000 UTC m=+23.081619647 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/489cc865-55f8-43eb-9e86-909e0581fa83-original-pull-secret") pod "global-pull-secret-syncer-gtdb4" (UID: "489cc865-55f8-43eb-9e86-909e0581fa83") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:02:57.916105 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:57.915681 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtdb4" Apr 16 16:02:57.917617 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:57.917304 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtdb4" podUID="489cc865-55f8-43eb-9e86-909e0581fa83" Apr 16 16:02:57.917617 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:57.915796 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j96wv" Apr 16 16:02:57.917617 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:57.915768 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2b9mp" Apr 16 16:02:57.917617 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:57.917448 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j96wv" podUID="2d37db46-a278-4fec-8cea-0900b1dfb12d" Apr 16 16:02:57.917994 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:57.917947 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2b9mp" podUID="64649692-472e-4f06-9640-7e6075d1e84f" Apr 16 16:02:58.003176 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:58.003121 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vjpkx" event={"ID":"1b804e33-713f-4c6f-a4a1-0181d77254e1","Type":"ContainerStarted","Data":"dee0246ce58b1a040b0bf79a38ba42e8c6a26115d65dec84d458a166ca7a5de9"} Apr 16 16:02:58.004694 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:58.004668 2577 generic.go:358] "Generic (PLEG): container finished" podID="6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf" containerID="f82b0f8a1b8f1ea55fcf608c2b47b83f7c7398c0b32111c8f2da0cf8571e1cd5" exitCode=0 Apr 16 16:02:58.004801 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:58.004747 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jftn9" event={"ID":"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf","Type":"ContainerDied","Data":"f82b0f8a1b8f1ea55fcf608c2b47b83f7c7398c0b32111c8f2da0cf8571e1cd5"} Apr 16 16:02:58.006187 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:58.006165 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w" event={"ID":"5148421b-8077-4ac5-bad5-00c661d71b12","Type":"ContainerStarted","Data":"56800b2cae2050dfb971deb3264e786c3b1910bae39c24daa1c60f6496c61c71"} Apr 16 16:02:58.008545 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:58.008519 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-jt7zj" event={"ID":"6463e7fd-1eba-4228-8782-edee4a55c601","Type":"ContainerStarted","Data":"4b4298ef2806135e0ad27fb3bdf729ca9ef6c852190c464dc523d77683f1ba52"} Apr 16 16:02:58.010619 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:58.010593 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4mstk" event={"ID":"bb2a3f4f-a59a-4bd9-867e-c1fc03475427","Type":"ContainerStarted","Data":"c5ae17964d2af61980f25b04fcf466555c66ff267129eb2928e239b22e09fea8"} Apr 16 16:02:58.012366 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:58.012343 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-dkplw" event={"ID":"36fdb312-0ed7-418b-b720-16a3c46fff51","Type":"ContainerStarted","Data":"5adcf1c5a3f9bd1551da58c1a90defb3f412c29fee1529c980774b8e6c7f43d4"} Apr 16 16:02:58.014886 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:58.014870 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m2jms_a2826faf-f579-4f51-8772-9882e98d4593/ovn-acl-logging/0.log" Apr 16 16:02:58.015178 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:58.015160 2577 generic.go:358] "Generic (PLEG): container finished" podID="a2826faf-f579-4f51-8772-9882e98d4593" containerID="2f7cd34d74b03496af4d5467691f409eaff79d9a0591b63b8ce6efa775884bcc" exitCode=1 Apr 16 16:02:58.015247 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:58.015217 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" event={"ID":"a2826faf-f579-4f51-8772-9882e98d4593","Type":"ContainerStarted","Data":"a2d7aaba5ca0110e96424537e41d65b9537c297cd54fb6ad536a2537a8e4026e"} Apr 16 16:02:58.015247 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:58.015243 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" 
event={"ID":"a2826faf-f579-4f51-8772-9882e98d4593","Type":"ContainerStarted","Data":"958f9f73a0bdf79362af14f8196ef156532608280bcdc1688ad4228df2817947"} Apr 16 16:02:58.015323 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:58.015258 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" event={"ID":"a2826faf-f579-4f51-8772-9882e98d4593","Type":"ContainerStarted","Data":"83c9f29749001f24e239edf7ade08a15cbb9cf5d3c1c54d5a0e9a9fc62bf3ee0"} Apr 16 16:02:58.015323 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:58.015270 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" event={"ID":"a2826faf-f579-4f51-8772-9882e98d4593","Type":"ContainerDied","Data":"2f7cd34d74b03496af4d5467691f409eaff79d9a0591b63b8ce6efa775884bcc"} Apr 16 16:02:58.015323 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:58.015284 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" event={"ID":"a2826faf-f579-4f51-8772-9882e98d4593","Type":"ContainerStarted","Data":"60f054f4dab316af0bc058325603ab310fb8c0e732fb962a90a7d7c9c8edd244"} Apr 16 16:02:58.016266 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:58.016247 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-wfqhd" event={"ID":"024bdd5b-3034-463f-aa2d-3e55d292bbd0","Type":"ContainerStarted","Data":"518a135d7dae18f6f7744bcfe77dad5cf9f0d61ccedcfe5bc95e6c99d9002c02"} Apr 16 16:02:58.017640 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:58.017597 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vjpkx" podStartSLOduration=3.4092419720000002 podStartE2EDuration="20.017584167s" podCreationTimestamp="2026-04-16 16:02:38 +0000 UTC" firstStartedPulling="2026-04-16 16:02:40.567696223 +0000 UTC m=+3.331320923" lastFinishedPulling="2026-04-16 16:02:57.176038231 +0000 UTC m=+19.939663118" 
observedRunningTime="2026-04-16 16:02:58.016596363 +0000 UTC m=+20.780221074" watchObservedRunningTime="2026-04-16 16:02:58.017584167 +0000 UTC m=+20.781208881" Apr 16 16:02:58.030359 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:58.030281 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-jt7zj" podStartSLOduration=1.6494988739999998 podStartE2EDuration="17.030264121s" podCreationTimestamp="2026-04-16 16:02:41 +0000 UTC" firstStartedPulling="2026-04-16 16:02:41.795600255 +0000 UTC m=+4.559224959" lastFinishedPulling="2026-04-16 16:02:57.176365504 +0000 UTC m=+19.939990206" observedRunningTime="2026-04-16 16:02:58.02957158 +0000 UTC m=+20.793196294" watchObservedRunningTime="2026-04-16 16:02:58.030264121 +0000 UTC m=+20.793888833" Apr 16 16:02:58.044443 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:58.044396 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-wfqhd" podStartSLOduration=3.386255701 podStartE2EDuration="20.044382606s" podCreationTimestamp="2026-04-16 16:02:38 +0000 UTC" firstStartedPulling="2026-04-16 16:02:40.562153791 +0000 UTC m=+3.325778483" lastFinishedPulling="2026-04-16 16:02:57.220280694 +0000 UTC m=+19.983905388" observedRunningTime="2026-04-16 16:02:58.043996279 +0000 UTC m=+20.807621001" watchObservedRunningTime="2026-04-16 16:02:58.044382606 +0000 UTC m=+20.808007317" Apr 16 16:02:58.068695 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:58.068637 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-4mstk" podStartSLOduration=4.45815639 podStartE2EDuration="21.068618248s" podCreationTimestamp="2026-04-16 16:02:37 +0000 UTC" firstStartedPulling="2026-04-16 16:02:40.57150673 +0000 UTC m=+3.335131419" lastFinishedPulling="2026-04-16 16:02:57.181968581 +0000 UTC m=+19.945593277" observedRunningTime="2026-04-16 16:02:58.068075013 +0000 UTC m=+20.831699746" 
watchObservedRunningTime="2026-04-16 16:02:58.068618248 +0000 UTC m=+20.832242963" Apr 16 16:02:58.445810 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:58.445781 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 16:02:58.849248 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:58.849125 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T16:02:58.445802619Z","UUID":"ea8b04ad-c20f-4221-a8e4-d7faa3b65797","Handler":null,"Name":"","Endpoint":""} Apr 16 16:02:58.851358 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:58.851332 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 16:02:58.851358 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:58.851361 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 16:02:59.021639 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:59.021611 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m2jms_a2826faf-f579-4f51-8772-9882e98d4593/ovn-acl-logging/0.log" Apr 16 16:02:59.022253 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:59.022226 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" event={"ID":"a2826faf-f579-4f51-8772-9882e98d4593","Type":"ContainerStarted","Data":"de8717b7265c3bf32bca569400c2543418fb13f6df2caa73294e70eafc6bbf56"} Apr 16 16:02:59.023639 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:59.023612 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-8bxsq" 
event={"ID":"0f6a5a36-0127-4a26-a722-43d4a0b49496","Type":"ContainerStarted","Data":"21baedbb6366e8ffa0d2fbe919af86487dc0c75ac175e52a89e80a414391a762"} Apr 16 16:02:59.026661 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:59.026634 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w" event={"ID":"5148421b-8077-4ac5-bad5-00c661d71b12","Type":"ContainerStarted","Data":"bfddc2402b29d52362bea65fea56652cc40a9a131a8de96039d5c6571dfd441e"} Apr 16 16:02:59.038810 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:59.038759 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-dkplw" podStartSLOduration=4.433998495 podStartE2EDuration="21.03874198s" podCreationTimestamp="2026-04-16 16:02:38 +0000 UTC" firstStartedPulling="2026-04-16 16:02:40.571862574 +0000 UTC m=+3.335487267" lastFinishedPulling="2026-04-16 16:02:57.176606025 +0000 UTC m=+19.940230752" observedRunningTime="2026-04-16 16:02:58.1153124 +0000 UTC m=+20.878937112" watchObservedRunningTime="2026-04-16 16:02:59.03874198 +0000 UTC m=+21.802366693" Apr 16 16:02:59.915006 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:59.914822 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtdb4" Apr 16 16:02:59.915237 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:59.914854 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2b9mp" Apr 16 16:02:59.915237 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:59.915094 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-gtdb4" podUID="489cc865-55f8-43eb-9e86-909e0581fa83" Apr 16 16:02:59.915237 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:59.915194 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2b9mp" podUID="64649692-472e-4f06-9640-7e6075d1e84f" Apr 16 16:02:59.915237 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:02:59.914894 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j96wv" Apr 16 16:02:59.915391 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:02:59.915276 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j96wv" podUID="2d37db46-a278-4fec-8cea-0900b1dfb12d" Apr 16 16:03:00.030324 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:00.030290 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w" event={"ID":"5148421b-8077-4ac5-bad5-00c661d71b12","Type":"ContainerStarted","Data":"8ae0bd6d4787a6451aa50f4b7c8fcb26995cbdc23d1b3ef3b73c7ee1ee5f9778"} Apr 16 16:03:00.062470 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:00.062408 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-8bxsq" podStartSLOduration=5.465589808 podStartE2EDuration="22.062387018s" podCreationTimestamp="2026-04-16 16:02:38 +0000 UTC" firstStartedPulling="2026-04-16 16:02:40.570207014 +0000 UTC m=+3.333831710" lastFinishedPulling="2026-04-16 16:02:57.167004227 +0000 UTC m=+19.930628920" observedRunningTime="2026-04-16 16:02:59.038190605 +0000 UTC m=+21.801815327" watchObservedRunningTime="2026-04-16 16:03:00.062387018 +0000 UTC m=+22.826011731" Apr 16 16:03:00.063261 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:00.063220 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gcn2w" podStartSLOduration=2.953191671 podStartE2EDuration="22.063210993s" podCreationTimestamp="2026-04-16 16:02:38 +0000 UTC" firstStartedPulling="2026-04-16 16:02:40.563864015 +0000 UTC m=+3.327488704" lastFinishedPulling="2026-04-16 16:02:59.673883321 +0000 UTC m=+22.437508026" observedRunningTime="2026-04-16 16:03:00.06214711 +0000 UTC m=+22.825771811" watchObservedRunningTime="2026-04-16 16:03:00.063210993 +0000 UTC m=+22.826835704" Apr 16 16:03:00.349385 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:00.349301 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/489cc865-55f8-43eb-9e86-909e0581fa83-original-pull-secret\") pod \"global-pull-secret-syncer-gtdb4\" (UID: \"489cc865-55f8-43eb-9e86-909e0581fa83\") " pod="kube-system/global-pull-secret-syncer-gtdb4" Apr 16 16:03:00.349553 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:00.349431 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:03:00.349553 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:00.349505 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/489cc865-55f8-43eb-9e86-909e0581fa83-original-pull-secret podName:489cc865-55f8-43eb-9e86-909e0581fa83 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:08.349483993 +0000 UTC m=+31.113108683 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/489cc865-55f8-43eb-9e86-909e0581fa83-original-pull-secret") pod "global-pull-secret-syncer-gtdb4" (UID: "489cc865-55f8-43eb-9e86-909e0581fa83") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:03:01.035328 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:01.035297 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m2jms_a2826faf-f579-4f51-8772-9882e98d4593/ovn-acl-logging/0.log" Apr 16 16:03:01.035911 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:01.035696 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" event={"ID":"a2826faf-f579-4f51-8772-9882e98d4593","Type":"ContainerStarted","Data":"a87c9028cfc1a66fe6d45d27b2e5a85eb97b27c6b7d53a334413e69f6471f68e"} Apr 16 16:03:01.532690 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:01.532658 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-dkplw" Apr 16 16:03:01.533532 ip-10-0-129-182 kubenswrapper[2577]: I0416 
16:03:01.533508 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-dkplw" Apr 16 16:03:01.914987 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:01.914884 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtdb4" Apr 16 16:03:01.915175 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:01.914989 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j96wv" Apr 16 16:03:01.915175 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:01.915006 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtdb4" podUID="489cc865-55f8-43eb-9e86-909e0581fa83" Apr 16 16:03:01.915175 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:01.915097 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j96wv" podUID="2d37db46-a278-4fec-8cea-0900b1dfb12d" Apr 16 16:03:01.915175 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:01.915165 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2b9mp" Apr 16 16:03:01.915396 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:01.915255 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2b9mp" podUID="64649692-472e-4f06-9640-7e6075d1e84f" Apr 16 16:03:02.767850 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:02.767816 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-dkplw" Apr 16 16:03:02.768455 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:02.768435 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-dkplw" Apr 16 16:03:03.041768 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:03.041686 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m2jms_a2826faf-f579-4f51-8772-9882e98d4593/ovn-acl-logging/0.log" Apr 16 16:03:03.042054 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:03.042030 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" event={"ID":"a2826faf-f579-4f51-8772-9882e98d4593","Type":"ContainerStarted","Data":"d89a03650a6377e94a6a3654698f6ea5cf36eb202891fb14c8613a8ec4f4b5f2"} Apr 16 16:03:03.042416 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:03.042399 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:03:03.042506 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:03.042491 2577 scope.go:117] "RemoveContainer" containerID="2f7cd34d74b03496af4d5467691f409eaff79d9a0591b63b8ce6efa775884bcc" Apr 16 16:03:03.043963 ip-10-0-129-182 
kubenswrapper[2577]: I0416 16:03:03.043942 2577 generic.go:358] "Generic (PLEG): container finished" podID="6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf" containerID="418cf1e4a93f95cd11776f04d506a21ba0594e7c063d6227bfc9c5b03b0d7f59" exitCode=0 Apr 16 16:03:03.044050 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:03.044031 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jftn9" event={"ID":"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf","Type":"ContainerDied","Data":"418cf1e4a93f95cd11776f04d506a21ba0594e7c063d6227bfc9c5b03b0d7f59"} Apr 16 16:03:03.058881 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:03.058770 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:03:03.914845 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:03.914817 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtdb4" Apr 16 16:03:03.915327 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:03.914850 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j96wv" Apr 16 16:03:03.915327 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:03.914937 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-gtdb4" podUID="489cc865-55f8-43eb-9e86-909e0581fa83" Apr 16 16:03:03.915327 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:03.915012 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j96wv" podUID="2d37db46-a278-4fec-8cea-0900b1dfb12d" Apr 16 16:03:03.915327 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:03.915056 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2b9mp" Apr 16 16:03:03.915327 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:03.915122 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2b9mp" podUID="64649692-472e-4f06-9640-7e6075d1e84f"
Apr 16 16:03:04.049178 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:04.049120 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m2jms_a2826faf-f579-4f51-8772-9882e98d4593/ovn-acl-logging/0.log"
Apr 16 16:03:04.049503 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:04.049472 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" event={"ID":"a2826faf-f579-4f51-8772-9882e98d4593","Type":"ContainerStarted","Data":"7b345ab0ec3a73ad8fac1c4a70aa5ef24a35c7239ba030d928369bc06cbc1b33"}
Apr 16 16:03:04.049808 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:04.049790 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-m2jms"
Apr 16 16:03:04.049944 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:04.049925 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-m2jms"
Apr 16 16:03:04.063575 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:04.063553 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m2jms"
Apr 16 16:03:04.112008 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:04.111970 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" podStartSLOduration=9.31873233 podStartE2EDuration="26.111955481s" podCreationTimestamp="2026-04-16 16:02:38 +0000 UTC" firstStartedPulling="2026-04-16 16:02:40.562891365 +0000 UTC m=+3.326516054" lastFinishedPulling="2026-04-16 16:02:57.356114501 +0000 UTC m=+20.119739205" observedRunningTime="2026-04-16 16:03:04.082685509 +0000 UTC m=+26.846310220" watchObservedRunningTime="2026-04-16 16:03:04.111955481 +0000 UTC m=+26.875580184"
Apr 16 16:03:04.215424 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:04.215395 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2b9mp"]
Apr 16 16:03:04.215572 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:04.215498 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2b9mp"
Apr 16 16:03:04.215615 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:04.215584 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2b9mp" podUID="64649692-472e-4f06-9640-7e6075d1e84f"
Apr 16 16:03:04.223890 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:04.223862 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gtdb4"]
Apr 16 16:03:04.224079 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:04.223959 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtdb4"
Apr 16 16:03:04.224079 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:04.224070 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtdb4" podUID="489cc865-55f8-43eb-9e86-909e0581fa83"
Apr 16 16:03:04.225442 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:04.225408 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-j96wv"]
Apr 16 16:03:04.225699 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:04.225486 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j96wv"
Apr 16 16:03:04.225699 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:04.225566 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j96wv" podUID="2d37db46-a278-4fec-8cea-0900b1dfb12d"
Apr 16 16:03:05.053050 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:05.053022 2577 generic.go:358] "Generic (PLEG): container finished" podID="6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf" containerID="d9df5aa4d79c366d11c796d7ae82a19e8c5bbf6f2b7a8902e484eb4c755bf0f4" exitCode=0
Apr 16 16:03:05.053493 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:05.053104 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jftn9" event={"ID":"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf","Type":"ContainerDied","Data":"d9df5aa4d79c366d11c796d7ae82a19e8c5bbf6f2b7a8902e484eb4c755bf0f4"}
Apr 16 16:03:05.915195 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:05.914961 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j96wv"
Apr 16 16:03:05.915363 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:05.914961 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtdb4"
Apr 16 16:03:05.915363 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:05.915223 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j96wv" podUID="2d37db46-a278-4fec-8cea-0900b1dfb12d"
Apr 16 16:03:05.915363 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:05.914967 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2b9mp"
Apr 16 16:03:05.915363 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:05.915297 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtdb4" podUID="489cc865-55f8-43eb-9e86-909e0581fa83"
Apr 16 16:03:05.915506 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:05.915395 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2b9mp" podUID="64649692-472e-4f06-9640-7e6075d1e84f"
Apr 16 16:03:06.056738 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:06.056703 2577 generic.go:358] "Generic (PLEG): container finished" podID="6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf" containerID="efc288af7abf7d0f554c42115f351e648e51c1e925b52e5d57dcb301c7ddaf92" exitCode=0
Apr 16 16:03:06.057168 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:06.056800 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jftn9" event={"ID":"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf","Type":"ContainerDied","Data":"efc288af7abf7d0f554c42115f351e648e51c1e925b52e5d57dcb301c7ddaf92"}
Apr 16 16:03:07.915098 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:07.915029 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2b9mp"
Apr 16 16:03:07.915098 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:07.915074 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtdb4"
Apr 16 16:03:07.915915 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:07.915029 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j96wv"
Apr 16 16:03:07.915915 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:07.915180 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2b9mp" podUID="64649692-472e-4f06-9640-7e6075d1e84f"
Apr 16 16:03:07.915915 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:07.915248 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j96wv" podUID="2d37db46-a278-4fec-8cea-0900b1dfb12d"
Apr 16 16:03:07.915915 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:07.915317 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtdb4" podUID="489cc865-55f8-43eb-9e86-909e0581fa83"
Apr 16 16:03:08.408094 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:08.408054 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/489cc865-55f8-43eb-9e86-909e0581fa83-original-pull-secret\") pod \"global-pull-secret-syncer-gtdb4\" (UID: \"489cc865-55f8-43eb-9e86-909e0581fa83\") " pod="kube-system/global-pull-secret-syncer-gtdb4"
Apr 16 16:03:08.408283 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:08.408238 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 16:03:08.408349 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:08.408317 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/489cc865-55f8-43eb-9e86-909e0581fa83-original-pull-secret podName:489cc865-55f8-43eb-9e86-909e0581fa83 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:24.408295462 +0000 UTC m=+47.171920153 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/489cc865-55f8-43eb-9e86-909e0581fa83-original-pull-secret") pod "global-pull-secret-syncer-gtdb4" (UID: "489cc865-55f8-43eb-9e86-909e0581fa83") : object "kube-system"/"original-pull-secret" not registered
Apr 16 16:03:09.915072 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:09.915036 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtdb4"
Apr 16 16:03:09.915647 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:09.915036 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j96wv"
Apr 16 16:03:09.915647 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:09.915185 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gtdb4" podUID="489cc865-55f8-43eb-9e86-909e0581fa83"
Apr 16 16:03:09.915647 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:09.915036 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2b9mp"
Apr 16 16:03:09.915647 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:09.915255 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j96wv" podUID="2d37db46-a278-4fec-8cea-0900b1dfb12d"
Apr 16 16:03:09.915647 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:09.915343 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2b9mp" podUID="64649692-472e-4f06-9640-7e6075d1e84f"
Apr 16 16:03:10.559612 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.559585 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-182.ec2.internal" event="NodeReady"
Apr 16 16:03:10.559894 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.559742 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 16:03:10.608618 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.608590 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-f7df78bf6-m4srg"]
Apr 16 16:03:10.637507 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.637475 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-2fk94"]
Apr 16 16:03:10.637668 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.637648 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f7df78bf6-m4srg"
Apr 16 16:03:10.640613 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.640586 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6298j\""
Apr 16 16:03:10.641841 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.641480 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 16:03:10.642904 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.642573 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 16:03:10.642904 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.642647 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 16:03:10.659046 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.658973 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-f7df78bf6-m4srg"]
Apr 16 16:03:10.659046 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.659012 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ckdqq"]
Apr 16 16:03:10.659283 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.659249 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2fk94"
Apr 16 16:03:10.659387 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.659362 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 16:03:10.662195 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.662155 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-v9cqx\""
Apr 16 16:03:10.662392 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.662372 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 16:03:10.662487 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.662409 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 16:03:10.680389 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.680361 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2fk94"]
Apr 16 16:03:10.680389 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.680392 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ckdqq"]
Apr 16 16:03:10.680595 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.680521 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ckdqq"
Apr 16 16:03:10.684277 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.684249 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 16:03:10.684421 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.684328 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wjzs4\""
Apr 16 16:03:10.684482 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.684464 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 16:03:10.684663 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.684643 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 16:03:10.724026 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.723988 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4680da5f-5f61-4016-b39c-64017ebd7fa4-image-registry-private-configuration\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg"
Apr 16 16:03:10.724224 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.724044 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg"
Apr 16 16:03:10.724224 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.724073 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4680da5f-5f61-4016-b39c-64017ebd7fa4-ca-trust-extracted\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg"
Apr 16 16:03:10.724224 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.724094 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-certificates\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg"
Apr 16 16:03:10.724224 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.724116 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4680da5f-5f61-4016-b39c-64017ebd7fa4-installation-pull-secrets\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg"
Apr 16 16:03:10.724224 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.724196 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtmd6\" (UniqueName: \"kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-kube-api-access-dtmd6\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg"
Apr 16 16:03:10.724454 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.724278 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-bound-sa-token\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg"
Apr 16 16:03:10.724454 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.724302 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4680da5f-5f61-4016-b39c-64017ebd7fa4-trusted-ca\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg"
Apr 16 16:03:10.825409 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.825319 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98c6de58-d4f8-4f67-ab26-d582517a2717-config-volume\") pod \"dns-default-2fk94\" (UID: \"98c6de58-d4f8-4f67-ab26-d582517a2717\") " pod="openshift-dns/dns-default-2fk94"
Apr 16 16:03:10.825409 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.825367 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncqg7\" (UniqueName: \"kubernetes.io/projected/98c6de58-d4f8-4f67-ab26-d582517a2717-kube-api-access-ncqg7\") pod \"dns-default-2fk94\" (UID: \"98c6de58-d4f8-4f67-ab26-d582517a2717\") " pod="openshift-dns/dns-default-2fk94"
Apr 16 16:03:10.825409 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.825392 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg6cz\" (UniqueName: \"kubernetes.io/projected/24ed6202-e738-4abc-b26d-eec84a76b75b-kube-api-access-xg6cz\") pod \"ingress-canary-ckdqq\" (UID: \"24ed6202-e738-4abc-b26d-eec84a76b75b\") " pod="openshift-ingress-canary/ingress-canary-ckdqq"
Apr 16 16:03:10.825657 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.825488 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-bound-sa-token\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg"
Apr 16 16:03:10.825657 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.825516 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4680da5f-5f61-4016-b39c-64017ebd7fa4-trusted-ca\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg"
Apr 16 16:03:10.825657 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.825540 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/98c6de58-d4f8-4f67-ab26-d582517a2717-tmp-dir\") pod \"dns-default-2fk94\" (UID: \"98c6de58-d4f8-4f67-ab26-d582517a2717\") " pod="openshift-dns/dns-default-2fk94"
Apr 16 16:03:10.825799 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.825664 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert\") pod \"ingress-canary-ckdqq\" (UID: \"24ed6202-e738-4abc-b26d-eec84a76b75b\") " pod="openshift-ingress-canary/ingress-canary-ckdqq"
Apr 16 16:03:10.825799 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.825752 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4680da5f-5f61-4016-b39c-64017ebd7fa4-image-registry-private-configuration\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg"
Apr 16 16:03:10.825895 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.825798 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg"
Apr 16 16:03:10.825895 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.825828 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4680da5f-5f61-4016-b39c-64017ebd7fa4-ca-trust-extracted\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg"
Apr 16 16:03:10.825988 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:10.825915 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 16:03:10.825988 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:10.825934 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f7df78bf6-m4srg: secret "image-registry-tls" not found
Apr 16 16:03:10.825988 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.825945 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-certificates\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg"
Apr 16 16:03:10.826089 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.825986 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4680da5f-5f61-4016-b39c-64017ebd7fa4-installation-pull-secrets\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg"
Apr 16 16:03:10.826089 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:10.826003 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls podName:4680da5f-5f61-4016-b39c-64017ebd7fa4 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:11.325981009 +0000 UTC m=+34.089605721 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls") pod "image-registry-f7df78bf6-m4srg" (UID: "4680da5f-5f61-4016-b39c-64017ebd7fa4") : secret "image-registry-tls" not found
Apr 16 16:03:10.826089 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.826040 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dtmd6\" (UniqueName: \"kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-kube-api-access-dtmd6\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg"
Apr 16 16:03:10.826089 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.826070 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls\") pod \"dns-default-2fk94\" (UID: \"98c6de58-d4f8-4f67-ab26-d582517a2717\") " pod="openshift-dns/dns-default-2fk94"
Apr 16 16:03:10.826301 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.826185 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4680da5f-5f61-4016-b39c-64017ebd7fa4-ca-trust-extracted\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg"
Apr 16 16:03:10.826629 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.826599 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4680da5f-5f61-4016-b39c-64017ebd7fa4-trusted-ca\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg"
Apr 16 16:03:10.826754 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.826735 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-certificates\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg"
Apr 16 16:03:10.830605 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.830577 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4680da5f-5f61-4016-b39c-64017ebd7fa4-installation-pull-secrets\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg"
Apr 16 16:03:10.830605 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.830592 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4680da5f-5f61-4016-b39c-64017ebd7fa4-image-registry-private-configuration\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg"
Apr 16 16:03:10.837432 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.837403 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-bound-sa-token\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg"
Apr 16 16:03:10.837815 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.837788 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtmd6\" (UniqueName: \"kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-kube-api-access-dtmd6\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg"
Apr 16 16:03:10.927039 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.926987 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xg6cz\" (UniqueName: \"kubernetes.io/projected/24ed6202-e738-4abc-b26d-eec84a76b75b-kube-api-access-xg6cz\") pod \"ingress-canary-ckdqq\" (UID: \"24ed6202-e738-4abc-b26d-eec84a76b75b\") " pod="openshift-ingress-canary/ingress-canary-ckdqq"
Apr 16 16:03:10.927516 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.927051 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/98c6de58-d4f8-4f67-ab26-d582517a2717-tmp-dir\") pod \"dns-default-2fk94\" (UID: \"98c6de58-d4f8-4f67-ab26-d582517a2717\") " pod="openshift-dns/dns-default-2fk94"
Apr 16 16:03:10.927516 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.927075 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert\") pod \"ingress-canary-ckdqq\" (UID: \"24ed6202-e738-4abc-b26d-eec84a76b75b\") " pod="openshift-ingress-canary/ingress-canary-ckdqq"
Apr 16 16:03:10.927516 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:10.927189 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:03:10.927516 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:10.927257 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert podName:24ed6202-e738-4abc-b26d-eec84a76b75b nodeName:}" failed. No retries permitted until 2026-04-16 16:03:11.427235942 +0000 UTC m=+34.190860638 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert") pod "ingress-canary-ckdqq" (UID: "24ed6202-e738-4abc-b26d-eec84a76b75b") : secret "canary-serving-cert" not found
Apr 16 16:03:10.927516 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.927407 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls\") pod \"dns-default-2fk94\" (UID: \"98c6de58-d4f8-4f67-ab26-d582517a2717\") " pod="openshift-dns/dns-default-2fk94"
Apr 16 16:03:10.927516 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.927479 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98c6de58-d4f8-4f67-ab26-d582517a2717-config-volume\") pod \"dns-default-2fk94\" (UID: \"98c6de58-d4f8-4f67-ab26-d582517a2717\") " pod="openshift-dns/dns-default-2fk94"
Apr 16 16:03:10.927516 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.927506 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ncqg7\" (UniqueName: \"kubernetes.io/projected/98c6de58-d4f8-4f67-ab26-d582517a2717-kube-api-access-ncqg7\") pod \"dns-default-2fk94\" (UID: \"98c6de58-d4f8-4f67-ab26-d582517a2717\") " pod="openshift-dns/dns-default-2fk94"
Apr 16 16:03:10.927877 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:10.927531 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:03:10.927877 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:10.927594 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls podName:98c6de58-d4f8-4f67-ab26-d582517a2717 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:11.427580467 +0000 UTC m=+34.191205156 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls") pod "dns-default-2fk94" (UID: "98c6de58-d4f8-4f67-ab26-d582517a2717") : secret "dns-default-metrics-tls" not found
Apr 16 16:03:10.927877 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.927608 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/98c6de58-d4f8-4f67-ab26-d582517a2717-tmp-dir\") pod \"dns-default-2fk94\" (UID: \"98c6de58-d4f8-4f67-ab26-d582517a2717\") " pod="openshift-dns/dns-default-2fk94"
Apr 16 16:03:10.928089 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.928067 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98c6de58-d4f8-4f67-ab26-d582517a2717-config-volume\") pod \"dns-default-2fk94\" (UID: \"98c6de58-d4f8-4f67-ab26-d582517a2717\") " pod="openshift-dns/dns-default-2fk94"
Apr 16 16:03:10.935493 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.935468 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncqg7\" (UniqueName: \"kubernetes.io/projected/98c6de58-d4f8-4f67-ab26-d582517a2717-kube-api-access-ncqg7\") pod \"dns-default-2fk94\" (UID: \"98c6de58-d4f8-4f67-ab26-d582517a2717\") " pod="openshift-dns/dns-default-2fk94"
Apr 16 16:03:10.935687 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:10.935661 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg6cz\" (UniqueName: \"kubernetes.io/projected/24ed6202-e738-4abc-b26d-eec84a76b75b-kube-api-access-xg6cz\") pod \"ingress-canary-ckdqq\" (UID: \"24ed6202-e738-4abc-b26d-eec84a76b75b\") " pod="openshift-ingress-canary/ingress-canary-ckdqq"
Apr 16 16:03:11.330422 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:11.330380 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg"
Apr 16 16:03:11.330684 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:11.330554 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 16:03:11.330684 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:11.330574 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f7df78bf6-m4srg: secret "image-registry-tls" not found
Apr 16 16:03:11.330684 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:11.330665 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls podName:4680da5f-5f61-4016-b39c-64017ebd7fa4 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:12.330632704 +0000 UTC m=+35.094257429 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls") pod "image-registry-f7df78bf6-m4srg" (UID: "4680da5f-5f61-4016-b39c-64017ebd7fa4") : secret "image-registry-tls" not found
Apr 16 16:03:11.431245 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:11.431203 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls\") pod \"dns-default-2fk94\" (UID: \"98c6de58-d4f8-4f67-ab26-d582517a2717\") " pod="openshift-dns/dns-default-2fk94"
Apr 16 16:03:11.431438 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:11.431303 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert\") pod \"ingress-canary-ckdqq\" (UID: \"24ed6202-e738-4abc-b26d-eec84a76b75b\") " pod="openshift-ingress-canary/ingress-canary-ckdqq"
Apr 16 16:03:11.431438 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:11.431404 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:03:11.431438 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:11.431430 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:03:11.431588 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:11.431487 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls podName:98c6de58-d4f8-4f67-ab26-d582517a2717 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:12.431465055 +0000 UTC m=+35.195089771 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls") pod "dns-default-2fk94" (UID: "98c6de58-d4f8-4f67-ab26-d582517a2717") : secret "dns-default-metrics-tls" not found Apr 16 16:03:11.431588 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:11.431507 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert podName:24ed6202-e738-4abc-b26d-eec84a76b75b nodeName:}" failed. No retries permitted until 2026-04-16 16:03:12.431498825 +0000 UTC m=+35.195123519 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert") pod "ingress-canary-ckdqq" (UID: "24ed6202-e738-4abc-b26d-eec84a76b75b") : secret "canary-serving-cert" not found Apr 16 16:03:11.532666 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:11.532619 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs\") pod \"network-metrics-daemon-2b9mp\" (UID: \"64649692-472e-4f06-9640-7e6075d1e84f\") " pod="openshift-multus/network-metrics-daemon-2b9mp" Apr 16 16:03:11.532830 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:11.532771 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:11.532898 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:11.532853 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs podName:64649692-472e-4f06-9640-7e6075d1e84f nodeName:}" failed. No retries permitted until 2026-04-16 16:03:43.532833212 +0000 UTC m=+66.296457918 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs") pod "network-metrics-daemon-2b9mp" (UID: "64649692-472e-4f06-9640-7e6075d1e84f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:11.634097 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:11.634017 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h826f\" (UniqueName: \"kubernetes.io/projected/2d37db46-a278-4fec-8cea-0900b1dfb12d-kube-api-access-h826f\") pod \"network-check-target-j96wv\" (UID: \"2d37db46-a278-4fec-8cea-0900b1dfb12d\") " pod="openshift-network-diagnostics/network-check-target-j96wv" Apr 16 16:03:11.634281 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:11.634244 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:03:11.634281 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:11.634268 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:03:11.634281 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:11.634282 2577 projected.go:194] Error preparing data for projected volume kube-api-access-h826f for pod openshift-network-diagnostics/network-check-target-j96wv: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:11.634433 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:11.634359 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d37db46-a278-4fec-8cea-0900b1dfb12d-kube-api-access-h826f podName:2d37db46-a278-4fec-8cea-0900b1dfb12d nodeName:}" failed. 
No retries permitted until 2026-04-16 16:03:43.634329143 +0000 UTC m=+66.397953838 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-h826f" (UniqueName: "kubernetes.io/projected/2d37db46-a278-4fec-8cea-0900b1dfb12d-kube-api-access-h826f") pod "network-check-target-j96wv" (UID: "2d37db46-a278-4fec-8cea-0900b1dfb12d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:11.914997 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:11.914921 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtdb4" Apr 16 16:03:11.914997 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:11.914966 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j96wv" Apr 16 16:03:11.915218 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:11.915002 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2b9mp" Apr 16 16:03:11.917625 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:11.917605 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 16:03:11.918540 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:11.918523 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vp75c\"" Apr 16 16:03:11.918658 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:11.918537 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 16:03:11.918658 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:11.918541 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 16:03:11.918857 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:11.918843 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 16:03:11.918926 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:11.918907 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x7n6v\"" Apr 16 16:03:12.339194 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:12.338986 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg" Apr 16 16:03:12.339539 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:12.339146 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" 
not found Apr 16 16:03:12.339539 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:12.339241 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f7df78bf6-m4srg: secret "image-registry-tls" not found Apr 16 16:03:12.339539 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:12.339298 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls podName:4680da5f-5f61-4016-b39c-64017ebd7fa4 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:14.339284415 +0000 UTC m=+37.102909103 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls") pod "image-registry-f7df78bf6-m4srg" (UID: "4680da5f-5f61-4016-b39c-64017ebd7fa4") : secret "image-registry-tls" not found Apr 16 16:03:12.439695 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:12.439667 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert\") pod \"ingress-canary-ckdqq\" (UID: \"24ed6202-e738-4abc-b26d-eec84a76b75b\") " pod="openshift-ingress-canary/ingress-canary-ckdqq" Apr 16 16:03:12.439831 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:12.439751 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls\") pod \"dns-default-2fk94\" (UID: \"98c6de58-d4f8-4f67-ab26-d582517a2717\") " pod="openshift-dns/dns-default-2fk94" Apr 16 16:03:12.439831 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:12.439816 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:03:12.439907 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:12.439878 2577 
secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:03:12.439945 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:12.439879 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert podName:24ed6202-e738-4abc-b26d-eec84a76b75b nodeName:}" failed. No retries permitted until 2026-04-16 16:03:14.439863639 +0000 UTC m=+37.203488327 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert") pod "ingress-canary-ckdqq" (UID: "24ed6202-e738-4abc-b26d-eec84a76b75b") : secret "canary-serving-cert" not found Apr 16 16:03:12.439945 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:12.439930 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls podName:98c6de58-d4f8-4f67-ab26-d582517a2717 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:14.43991842 +0000 UTC m=+37.203543112 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls") pod "dns-default-2fk94" (UID: "98c6de58-d4f8-4f67-ab26-d582517a2717") : secret "dns-default-metrics-tls" not found Apr 16 16:03:13.073239 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:13.073208 2577 generic.go:358] "Generic (PLEG): container finished" podID="6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf" containerID="bd60332499c9dd124a9dfdac45549c206a34b90c27c389f0faaad15c0a72eceb" exitCode=0 Apr 16 16:03:13.073432 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:13.073276 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jftn9" event={"ID":"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf","Type":"ContainerDied","Data":"bd60332499c9dd124a9dfdac45549c206a34b90c27c389f0faaad15c0a72eceb"} Apr 16 16:03:14.078076 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:14.077991 2577 generic.go:358] "Generic (PLEG): container finished" podID="6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf" containerID="e775fec6b491fb5ca6a6f7469716140c388ae71419b5d72589dd0d34c8b6afc0" exitCode=0 Apr 16 16:03:14.078076 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:14.078036 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jftn9" event={"ID":"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf","Type":"ContainerDied","Data":"e775fec6b491fb5ca6a6f7469716140c388ae71419b5d72589dd0d34c8b6afc0"} Apr 16 16:03:14.357849 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:14.357763 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg" Apr 16 16:03:14.357987 ip-10-0-129-182 kubenswrapper[2577]: E0416 
16:03:14.357887 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:03:14.357987 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:14.357902 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f7df78bf6-m4srg: secret "image-registry-tls" not found Apr 16 16:03:14.357987 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:14.357957 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls podName:4680da5f-5f61-4016-b39c-64017ebd7fa4 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:18.357939281 +0000 UTC m=+41.121563991 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls") pod "image-registry-f7df78bf6-m4srg" (UID: "4680da5f-5f61-4016-b39c-64017ebd7fa4") : secret "image-registry-tls" not found Apr 16 16:03:14.458144 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:14.458098 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert\") pod \"ingress-canary-ckdqq\" (UID: \"24ed6202-e738-4abc-b26d-eec84a76b75b\") " pod="openshift-ingress-canary/ingress-canary-ckdqq" Apr 16 16:03:14.458319 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:14.458203 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls\") pod \"dns-default-2fk94\" (UID: \"98c6de58-d4f8-4f67-ab26-d582517a2717\") " pod="openshift-dns/dns-default-2fk94" Apr 16 16:03:14.458319 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:14.458242 2577 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:03:14.458319 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:14.458300 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:03:14.458319 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:14.458308 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert podName:24ed6202-e738-4abc-b26d-eec84a76b75b nodeName:}" failed. No retries permitted until 2026-04-16 16:03:18.458294085 +0000 UTC m=+41.221918774 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert") pod "ingress-canary-ckdqq" (UID: "24ed6202-e738-4abc-b26d-eec84a76b75b") : secret "canary-serving-cert" not found Apr 16 16:03:14.458517 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:14.458336 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls podName:98c6de58-d4f8-4f67-ab26-d582517a2717 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:18.458325269 +0000 UTC m=+41.221949958 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls") pod "dns-default-2fk94" (UID: "98c6de58-d4f8-4f67-ab26-d582517a2717") : secret "dns-default-metrics-tls" not found Apr 16 16:03:15.082259 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:15.082220 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jftn9" event={"ID":"6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf","Type":"ContainerStarted","Data":"dca4344c1879689f24b99a9ba7b9413f14fa0b899c5d2161255a47dd14b000b8"} Apr 16 16:03:15.108079 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:15.107995 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jftn9" podStartSLOduration=5.516602992 podStartE2EDuration="37.107979622s" podCreationTimestamp="2026-04-16 16:02:38 +0000 UTC" firstStartedPulling="2026-04-16 16:02:40.56680099 +0000 UTC m=+3.330425680" lastFinishedPulling="2026-04-16 16:03:12.158177618 +0000 UTC m=+34.921802310" observedRunningTime="2026-04-16 16:03:15.106503443 +0000 UTC m=+37.870128154" watchObservedRunningTime="2026-04-16 16:03:15.107979622 +0000 UTC m=+37.871604332" Apr 16 16:03:18.389653 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:18.389614 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg" Apr 16 16:03:18.390056 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:18.389739 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:03:18.390056 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:18.389752 2577 projected.go:194] Error 
preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f7df78bf6-m4srg: secret "image-registry-tls" not found Apr 16 16:03:18.390056 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:18.389803 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls podName:4680da5f-5f61-4016-b39c-64017ebd7fa4 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:26.389788632 +0000 UTC m=+49.153413320 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls") pod "image-registry-f7df78bf6-m4srg" (UID: "4680da5f-5f61-4016-b39c-64017ebd7fa4") : secret "image-registry-tls" not found Apr 16 16:03:18.490979 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:18.490939 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls\") pod \"dns-default-2fk94\" (UID: \"98c6de58-d4f8-4f67-ab26-d582517a2717\") " pod="openshift-dns/dns-default-2fk94" Apr 16 16:03:18.491151 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:18.491008 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert\") pod \"ingress-canary-ckdqq\" (UID: \"24ed6202-e738-4abc-b26d-eec84a76b75b\") " pod="openshift-ingress-canary/ingress-canary-ckdqq" Apr 16 16:03:18.491151 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:18.491088 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:03:18.491253 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:18.491165 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert 
podName:24ed6202-e738-4abc-b26d-eec84a76b75b nodeName:}" failed. No retries permitted until 2026-04-16 16:03:26.491146422 +0000 UTC m=+49.254771124 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert") pod "ingress-canary-ckdqq" (UID: "24ed6202-e738-4abc-b26d-eec84a76b75b") : secret "canary-serving-cert" not found Apr 16 16:03:18.491253 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:18.491089 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:03:18.491253 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:18.491230 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls podName:98c6de58-d4f8-4f67-ab26-d582517a2717 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:26.491217083 +0000 UTC m=+49.254841778 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls") pod "dns-default-2fk94" (UID: "98c6de58-d4f8-4f67-ab26-d582517a2717") : secret "dns-default-metrics-tls" not found Apr 16 16:03:24.435846 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:24.435809 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/489cc865-55f8-43eb-9e86-909e0581fa83-original-pull-secret\") pod \"global-pull-secret-syncer-gtdb4\" (UID: \"489cc865-55f8-43eb-9e86-909e0581fa83\") " pod="kube-system/global-pull-secret-syncer-gtdb4" Apr 16 16:03:24.439069 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:24.439047 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/489cc865-55f8-43eb-9e86-909e0581fa83-original-pull-secret\") pod \"global-pull-secret-syncer-gtdb4\" (UID: \"489cc865-55f8-43eb-9e86-909e0581fa83\") " pod="kube-system/global-pull-secret-syncer-gtdb4" Apr 16 16:03:24.535042 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:24.535011 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gtdb4" Apr 16 16:03:24.737097 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:24.737015 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gtdb4"] Apr 16 16:03:25.103284 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:25.103205 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gtdb4" event={"ID":"489cc865-55f8-43eb-9e86-909e0581fa83","Type":"ContainerStarted","Data":"81a85610e06c2566ac3cbd45dd34cfd7cd26ba842287c70cb659464ddb164407"} Apr 16 16:03:26.450992 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:26.450957 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg" Apr 16 16:03:26.451481 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:26.451097 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:03:26.451481 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:26.451111 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f7df78bf6-m4srg: secret "image-registry-tls" not found Apr 16 16:03:26.451481 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:26.451180 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls podName:4680da5f-5f61-4016-b39c-64017ebd7fa4 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:42.451161771 +0000 UTC m=+65.214786477 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls") pod "image-registry-f7df78bf6-m4srg" (UID: "4680da5f-5f61-4016-b39c-64017ebd7fa4") : secret "image-registry-tls" not found Apr 16 16:03:26.551807 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:26.551773 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls\") pod \"dns-default-2fk94\" (UID: \"98c6de58-d4f8-4f67-ab26-d582517a2717\") " pod="openshift-dns/dns-default-2fk94" Apr 16 16:03:26.551980 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:26.551838 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert\") pod \"ingress-canary-ckdqq\" (UID: \"24ed6202-e738-4abc-b26d-eec84a76b75b\") " pod="openshift-ingress-canary/ingress-canary-ckdqq" Apr 16 16:03:26.551980 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:26.551932 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:03:26.552052 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:26.551932 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:03:26.552052 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:26.552044 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert podName:24ed6202-e738-4abc-b26d-eec84a76b75b nodeName:}" failed. No retries permitted until 2026-04-16 16:03:42.552023782 +0000 UTC m=+65.315648492 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert") pod "ingress-canary-ckdqq" (UID: "24ed6202-e738-4abc-b26d-eec84a76b75b") : secret "canary-serving-cert" not found Apr 16 16:03:26.552163 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:26.552063 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls podName:98c6de58-d4f8-4f67-ab26-d582517a2717 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:42.552052219 +0000 UTC m=+65.315676919 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls") pod "dns-default-2fk94" (UID: "98c6de58-d4f8-4f67-ab26-d582517a2717") : secret "dns-default-metrics-tls" not found Apr 16 16:03:30.114443 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:30.114404 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gtdb4" event={"ID":"489cc865-55f8-43eb-9e86-909e0581fa83","Type":"ContainerStarted","Data":"c9430fb49106a8cc5c8ba596a6a96e6dd35389fb7f49dcba77927111ef4a3c67"} Apr 16 16:03:30.132520 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:30.132462 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-gtdb4" podStartSLOduration=33.363467971 podStartE2EDuration="38.132443162s" podCreationTimestamp="2026-04-16 16:02:52 +0000 UTC" firstStartedPulling="2026-04-16 16:03:24.74416563 +0000 UTC m=+47.507790330" lastFinishedPulling="2026-04-16 16:03:29.513140816 +0000 UTC m=+52.276765521" observedRunningTime="2026-04-16 16:03:30.131921338 +0000 UTC m=+52.895546048" watchObservedRunningTime="2026-04-16 16:03:30.132443162 +0000 UTC m=+52.896067876" Apr 16 16:03:36.067008 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:36.066979 2577 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m2jms" Apr 16 16:03:42.469064 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:42.469027 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg" Apr 16 16:03:42.469469 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:42.469166 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:03:42.469469 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:42.469179 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f7df78bf6-m4srg: secret "image-registry-tls" not found Apr 16 16:03:42.469469 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:42.469263 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls podName:4680da5f-5f61-4016-b39c-64017ebd7fa4 nodeName:}" failed. No retries permitted until 2026-04-16 16:04:14.469241846 +0000 UTC m=+97.232866555 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls") pod "image-registry-f7df78bf6-m4srg" (UID: "4680da5f-5f61-4016-b39c-64017ebd7fa4") : secret "image-registry-tls" not found Apr 16 16:03:42.569571 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:42.569523 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert\") pod \"ingress-canary-ckdqq\" (UID: \"24ed6202-e738-4abc-b26d-eec84a76b75b\") " pod="openshift-ingress-canary/ingress-canary-ckdqq" Apr 16 16:03:42.569683 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:42.569629 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls\") pod \"dns-default-2fk94\" (UID: \"98c6de58-d4f8-4f67-ab26-d582517a2717\") " pod="openshift-dns/dns-default-2fk94" Apr 16 16:03:42.569729 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:42.569672 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:03:42.569764 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:42.569729 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:03:42.569764 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:42.569754 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert podName:24ed6202-e738-4abc-b26d-eec84a76b75b nodeName:}" failed. No retries permitted until 2026-04-16 16:04:14.569739191 +0000 UTC m=+97.333363880 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert") pod "ingress-canary-ckdqq" (UID: "24ed6202-e738-4abc-b26d-eec84a76b75b") : secret "canary-serving-cert" not found Apr 16 16:03:42.569830 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:42.569795 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls podName:98c6de58-d4f8-4f67-ab26-d582517a2717 nodeName:}" failed. No retries permitted until 2026-04-16 16:04:14.569781554 +0000 UTC m=+97.333406242 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls") pod "dns-default-2fk94" (UID: "98c6de58-d4f8-4f67-ab26-d582517a2717") : secret "dns-default-metrics-tls" not found Apr 16 16:03:43.576058 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:43.576024 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs\") pod \"network-metrics-daemon-2b9mp\" (UID: \"64649692-472e-4f06-9640-7e6075d1e84f\") " pod="openshift-multus/network-metrics-daemon-2b9mp" Apr 16 16:03:43.578665 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:43.578645 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 16:03:43.586693 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:43.586669 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 16:03:43.586748 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:03:43.586740 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs podName:64649692-472e-4f06-9640-7e6075d1e84f nodeName:}" failed. 
No retries permitted until 2026-04-16 16:04:47.586720165 +0000 UTC m=+130.350344853 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs") pod "network-metrics-daemon-2b9mp" (UID: "64649692-472e-4f06-9640-7e6075d1e84f") : secret "metrics-daemon-secret" not found Apr 16 16:03:43.676851 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:43.676805 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h826f\" (UniqueName: \"kubernetes.io/projected/2d37db46-a278-4fec-8cea-0900b1dfb12d-kube-api-access-h826f\") pod \"network-check-target-j96wv\" (UID: \"2d37db46-a278-4fec-8cea-0900b1dfb12d\") " pod="openshift-network-diagnostics/network-check-target-j96wv" Apr 16 16:03:43.679595 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:43.679574 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 16:03:43.690314 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:43.690289 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 16:03:43.700636 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:43.700617 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h826f\" (UniqueName: \"kubernetes.io/projected/2d37db46-a278-4fec-8cea-0900b1dfb12d-kube-api-access-h826f\") pod \"network-check-target-j96wv\" (UID: \"2d37db46-a278-4fec-8cea-0900b1dfb12d\") " pod="openshift-network-diagnostics/network-check-target-j96wv" Apr 16 16:03:43.733634 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:43.733605 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x7n6v\"" Apr 16 16:03:43.740869 ip-10-0-129-182 kubenswrapper[2577]: I0416 
16:03:43.740849 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j96wv" Apr 16 16:03:43.853178 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:43.853078 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-j96wv"] Apr 16 16:03:43.856005 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:03:43.855978 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d37db46_a278_4fec_8cea_0900b1dfb12d.slice/crio-1a9ba4b318a056e20006aea7e011112cad11cd25241485b5a3134833f6ba2810 WatchSource:0}: Error finding container 1a9ba4b318a056e20006aea7e011112cad11cd25241485b5a3134833f6ba2810: Status 404 returned error can't find the container with id 1a9ba4b318a056e20006aea7e011112cad11cd25241485b5a3134833f6ba2810 Apr 16 16:03:44.140742 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:44.140664 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-j96wv" event={"ID":"2d37db46-a278-4fec-8cea-0900b1dfb12d","Type":"ContainerStarted","Data":"1a9ba4b318a056e20006aea7e011112cad11cd25241485b5a3134833f6ba2810"} Apr 16 16:03:47.147434 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:47.147399 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-j96wv" event={"ID":"2d37db46-a278-4fec-8cea-0900b1dfb12d","Type":"ContainerStarted","Data":"6fda2af19df62871911da1bab4a860b408f7265cb0263002974b11fca435466d"} Apr 16 16:03:47.147831 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:03:47.147525 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-j96wv" Apr 16 16:04:14.495518 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:04:14.495477 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg" Apr 16 16:04:14.495989 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:04:14.495633 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:04:14.495989 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:04:14.495652 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f7df78bf6-m4srg: secret "image-registry-tls" not found Apr 16 16:04:14.495989 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:04:14.495711 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls podName:4680da5f-5f61-4016-b39c-64017ebd7fa4 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:18.495691713 +0000 UTC m=+161.259316427 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls") pod "image-registry-f7df78bf6-m4srg" (UID: "4680da5f-5f61-4016-b39c-64017ebd7fa4") : secret "image-registry-tls" not found Apr 16 16:04:14.596171 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:04:14.596113 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls\") pod \"dns-default-2fk94\" (UID: \"98c6de58-d4f8-4f67-ab26-d582517a2717\") " pod="openshift-dns/dns-default-2fk94" Apr 16 16:04:14.596326 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:04:14.596197 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert\") pod \"ingress-canary-ckdqq\" (UID: \"24ed6202-e738-4abc-b26d-eec84a76b75b\") " pod="openshift-ingress-canary/ingress-canary-ckdqq" Apr 16 16:04:14.596326 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:04:14.596278 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:04:14.596400 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:04:14.596348 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls podName:98c6de58-d4f8-4f67-ab26-d582517a2717 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:18.596334979 +0000 UTC m=+161.359959673 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls") pod "dns-default-2fk94" (UID: "98c6de58-d4f8-4f67-ab26-d582517a2717") : secret "dns-default-metrics-tls" not found Apr 16 16:04:14.596400 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:04:14.596284 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:04:14.596468 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:04:14.596419 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert podName:24ed6202-e738-4abc-b26d-eec84a76b75b nodeName:}" failed. No retries permitted until 2026-04-16 16:05:18.596407697 +0000 UTC m=+161.360032386 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert") pod "ingress-canary-ckdqq" (UID: "24ed6202-e738-4abc-b26d-eec84a76b75b") : secret "canary-serving-cert" not found Apr 16 16:04:18.152323 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:04:18.152295 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-j96wv" Apr 16 16:04:18.194554 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:04:18.194499 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-j96wv" podStartSLOduration=97.120559471 podStartE2EDuration="1m40.194483989s" podCreationTimestamp="2026-04-16 16:02:38 +0000 UTC" firstStartedPulling="2026-04-16 16:03:43.858493061 +0000 UTC m=+66.622117756" lastFinishedPulling="2026-04-16 16:03:46.932417569 +0000 UTC m=+69.696042274" observedRunningTime="2026-04-16 16:03:47.177924464 +0000 UTC m=+69.941549175" watchObservedRunningTime="2026-04-16 16:04:18.194483989 +0000 UTC m=+100.958108701" Apr 16 16:04:47.620974 
ip-10-0-129-182 kubenswrapper[2577]: I0416 16:04:47.620919 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs\") pod \"network-metrics-daemon-2b9mp\" (UID: \"64649692-472e-4f06-9640-7e6075d1e84f\") " pod="openshift-multus/network-metrics-daemon-2b9mp" Apr 16 16:04:47.621527 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:04:47.621069 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 16:04:47.621527 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:04:47.621178 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs podName:64649692-472e-4f06-9640-7e6075d1e84f nodeName:}" failed. No retries permitted until 2026-04-16 16:06:49.621161356 +0000 UTC m=+252.384786057 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs") pod "network-metrics-daemon-2b9mp" (UID: "64649692-472e-4f06-9640-7e6075d1e84f") : secret "metrics-daemon-secret" not found Apr 16 16:05:13.654498 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:13.654443 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-f7df78bf6-m4srg" podUID="4680da5f-5f61-4016-b39c-64017ebd7fa4" Apr 16 16:05:13.669650 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:13.669613 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-2fk94" podUID="98c6de58-d4f8-4f67-ab26-d582517a2717" Apr 16 16:05:13.689973 ip-10-0-129-182 
kubenswrapper[2577]: E0416 16:05:13.689936 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-ckdqq" podUID="24ed6202-e738-4abc-b26d-eec84a76b75b" Apr 16 16:05:14.311867 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:14.311840 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2fk94" Apr 16 16:05:14.312037 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:14.311840 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f7df78bf6-m4srg" Apr 16 16:05:14.924691 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:14.924641 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-2b9mp" podUID="64649692-472e-4f06-9640-7e6075d1e84f" Apr 16 16:05:17.787301 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.787266 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wjq4m"] Apr 16 16:05:17.789019 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.789002 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wjq4m" Apr 16 16:05:17.791337 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.791312 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 16:05:17.791619 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.791603 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:05:17.794003 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.793980 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-mnh8f\"" Apr 16 16:05:17.797804 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.797781 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-czs75"] Apr 16 16:05:17.799542 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.799522 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-2894b"] Apr 16 16:05:17.799694 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.799678 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-czs75" Apr 16 16:05:17.803338 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.803312 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-8n252"] Apr 16 16:05:17.803792 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.803773 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" Apr 16 16:05:17.804144 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.803835 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-8ldv9\"" Apr 16 16:05:17.805816 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.805761 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-8n252" Apr 16 16:05:17.806026 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.806005 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:05:17.807145 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.807110 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 16:05:17.807240 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.807119 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 16:05:17.807399 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.807379 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wjq4m"] Apr 16 16:05:17.811924 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.811899 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 16:05:17.812034 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.811907 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 16:05:17.812034 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.811910 2577 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 16:05:17.812170 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.812120 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:05:17.812170 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.812154 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 16:05:17.812484 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.812469 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 16:05:17.813216 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.813192 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 16:05:17.813315 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.813194 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-nxrtv\"" Apr 16 16:05:17.813315 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.813283 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-vstnw\"" Apr 16 16:05:17.813498 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.813481 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 16:05:17.819609 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.819584 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-2894b"] Apr 16 16:05:17.823593 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.823555 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 16:05:17.824190 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.824171 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 16:05:17.824605 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.824587 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-8n252"] Apr 16 16:05:17.828725 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.828701 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-czs75"] Apr 16 16:05:17.841066 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.841039 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67kh2\" (UniqueName: \"kubernetes.io/projected/0d70c29e-9509-41d0-b800-db4a02a3db76-kube-api-access-67kh2\") pod \"insights-operator-5785d4fcdd-8n252\" (UID: \"0d70c29e-9509-41d0-b800-db4a02a3db76\") " pod="openshift-insights/insights-operator-5785d4fcdd-8n252" Apr 16 16:05:17.841241 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.841083 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d70c29e-9509-41d0-b800-db4a02a3db76-serving-cert\") pod \"insights-operator-5785d4fcdd-8n252\" (UID: \"0d70c29e-9509-41d0-b800-db4a02a3db76\") " pod="openshift-insights/insights-operator-5785d4fcdd-8n252" Apr 16 16:05:17.841241 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.841101 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff7fcfa4-9774-4831-b686-678a7f92a456-config\") pod \"console-operator-d87b8d5fc-2894b\" (UID: \"ff7fcfa4-9774-4831-b686-678a7f92a456\") " 
pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" Apr 16 16:05:17.841241 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.841119 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d70c29e-9509-41d0-b800-db4a02a3db76-tmp\") pod \"insights-operator-5785d4fcdd-8n252\" (UID: \"0d70c29e-9509-41d0-b800-db4a02a3db76\") " pod="openshift-insights/insights-operator-5785d4fcdd-8n252" Apr 16 16:05:17.841241 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.841200 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr64b\" (UniqueName: \"kubernetes.io/projected/ff7fcfa4-9774-4831-b686-678a7f92a456-kube-api-access-rr64b\") pod \"console-operator-d87b8d5fc-2894b\" (UID: \"ff7fcfa4-9774-4831-b686-678a7f92a456\") " pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" Apr 16 16:05:17.841405 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.841265 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf4j7\" (UniqueName: \"kubernetes.io/projected/3520921b-a85a-4467-a9fe-2c2eb4308569-kube-api-access-zf4j7\") pod \"cluster-samples-operator-667775844f-czs75\" (UID: \"3520921b-a85a-4467-a9fe-2c2eb4308569\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-czs75" Apr 16 16:05:17.841405 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.841298 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff7fcfa4-9774-4831-b686-678a7f92a456-trusted-ca\") pod \"console-operator-d87b8d5fc-2894b\" (UID: \"ff7fcfa4-9774-4831-b686-678a7f92a456\") " pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" Apr 16 16:05:17.841405 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.841319 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0d70c29e-9509-41d0-b800-db4a02a3db76-snapshots\") pod \"insights-operator-5785d4fcdd-8n252\" (UID: \"0d70c29e-9509-41d0-b800-db4a02a3db76\") " pod="openshift-insights/insights-operator-5785d4fcdd-8n252" Apr 16 16:05:17.841405 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.841337 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5ld4\" (UniqueName: \"kubernetes.io/projected/e55b08d2-3aff-471f-9d16-c3b3eb293f68-kube-api-access-z5ld4\") pod \"volume-data-source-validator-7d955d5dd4-wjq4m\" (UID: \"e55b08d2-3aff-471f-9d16-c3b3eb293f68\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wjq4m" Apr 16 16:05:17.841405 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.841359 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d70c29e-9509-41d0-b800-db4a02a3db76-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-8n252\" (UID: \"0d70c29e-9509-41d0-b800-db4a02a3db76\") " pod="openshift-insights/insights-operator-5785d4fcdd-8n252" Apr 16 16:05:17.841405 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.841395 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3520921b-a85a-4467-a9fe-2c2eb4308569-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-czs75\" (UID: \"3520921b-a85a-4467-a9fe-2c2eb4308569\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-czs75" Apr 16 16:05:17.841610 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.841417 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/0d70c29e-9509-41d0-b800-db4a02a3db76-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-8n252\" (UID: \"0d70c29e-9509-41d0-b800-db4a02a3db76\") " pod="openshift-insights/insights-operator-5785d4fcdd-8n252" Apr 16 16:05:17.841610 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.841463 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff7fcfa4-9774-4831-b686-678a7f92a456-serving-cert\") pod \"console-operator-d87b8d5fc-2894b\" (UID: \"ff7fcfa4-9774-4831-b686-678a7f92a456\") " pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" Apr 16 16:05:17.887032 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.887002 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vfg5m"] Apr 16 16:05:17.888859 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.888843 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vfg5m" Apr 16 16:05:17.891788 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.891767 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 16:05:17.891788 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.891780 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-tncz8\"" Apr 16 16:05:17.892252 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.892234 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 16:05:17.892305 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.892262 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:05:17.892809 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.892796 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 16:05:17.903928 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.903898 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vfg5m"] Apr 16 16:05:17.942612 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.942579 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff7fcfa4-9774-4831-b686-678a7f92a456-trusted-ca\") pod \"console-operator-d87b8d5fc-2894b\" (UID: \"ff7fcfa4-9774-4831-b686-678a7f92a456\") " 
pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" Apr 16 16:05:17.942612 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.942618 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0d70c29e-9509-41d0-b800-db4a02a3db76-snapshots\") pod \"insights-operator-5785d4fcdd-8n252\" (UID: \"0d70c29e-9509-41d0-b800-db4a02a3db76\") " pod="openshift-insights/insights-operator-5785d4fcdd-8n252" Apr 16 16:05:17.942883 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.942646 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z5ld4\" (UniqueName: \"kubernetes.io/projected/e55b08d2-3aff-471f-9d16-c3b3eb293f68-kube-api-access-z5ld4\") pod \"volume-data-source-validator-7d955d5dd4-wjq4m\" (UID: \"e55b08d2-3aff-471f-9d16-c3b3eb293f68\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wjq4m" Apr 16 16:05:17.942883 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.942664 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d70c29e-9509-41d0-b800-db4a02a3db76-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-8n252\" (UID: \"0d70c29e-9509-41d0-b800-db4a02a3db76\") " pod="openshift-insights/insights-operator-5785d4fcdd-8n252" Apr 16 16:05:17.942883 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.942684 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdrbf\" (UniqueName: \"kubernetes.io/projected/106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb-kube-api-access-pdrbf\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vfg5m\" (UID: \"106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vfg5m" Apr 16 16:05:17.942883 ip-10-0-129-182 kubenswrapper[2577]: 
I0416 16:05:17.942718 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3520921b-a85a-4467-a9fe-2c2eb4308569-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-czs75\" (UID: \"3520921b-a85a-4467-a9fe-2c2eb4308569\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-czs75" Apr 16 16:05:17.942883 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.942748 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vfg5m\" (UID: \"106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vfg5m" Apr 16 16:05:17.942883 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.942814 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d70c29e-9509-41d0-b800-db4a02a3db76-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-8n252\" (UID: \"0d70c29e-9509-41d0-b800-db4a02a3db76\") " pod="openshift-insights/insights-operator-5785d4fcdd-8n252" Apr 16 16:05:17.942883 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:17.942842 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 16:05:17.942883 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.942872 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff7fcfa4-9774-4831-b686-678a7f92a456-serving-cert\") pod \"console-operator-d87b8d5fc-2894b\" (UID: \"ff7fcfa4-9774-4831-b686-678a7f92a456\") " 
pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" Apr 16 16:05:17.943308 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:17.942905 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3520921b-a85a-4467-a9fe-2c2eb4308569-samples-operator-tls podName:3520921b-a85a-4467-a9fe-2c2eb4308569 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:18.442886609 +0000 UTC m=+161.206511307 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3520921b-a85a-4467-a9fe-2c2eb4308569-samples-operator-tls") pod "cluster-samples-operator-667775844f-czs75" (UID: "3520921b-a85a-4467-a9fe-2c2eb4308569") : secret "samples-operator-tls" not found Apr 16 16:05:17.943308 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.943028 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vfg5m\" (UID: \"106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vfg5m" Apr 16 16:05:17.943308 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.943080 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67kh2\" (UniqueName: \"kubernetes.io/projected/0d70c29e-9509-41d0-b800-db4a02a3db76-kube-api-access-67kh2\") pod \"insights-operator-5785d4fcdd-8n252\" (UID: \"0d70c29e-9509-41d0-b800-db4a02a3db76\") " pod="openshift-insights/insights-operator-5785d4fcdd-8n252" Apr 16 16:05:17.943308 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.943144 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d70c29e-9509-41d0-b800-db4a02a3db76-serving-cert\") pod 
\"insights-operator-5785d4fcdd-8n252\" (UID: \"0d70c29e-9509-41d0-b800-db4a02a3db76\") " pod="openshift-insights/insights-operator-5785d4fcdd-8n252" Apr 16 16:05:17.943308 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.943171 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff7fcfa4-9774-4831-b686-678a7f92a456-config\") pod \"console-operator-d87b8d5fc-2894b\" (UID: \"ff7fcfa4-9774-4831-b686-678a7f92a456\") " pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" Apr 16 16:05:17.943308 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.943197 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d70c29e-9509-41d0-b800-db4a02a3db76-tmp\") pod \"insights-operator-5785d4fcdd-8n252\" (UID: \"0d70c29e-9509-41d0-b800-db4a02a3db76\") " pod="openshift-insights/insights-operator-5785d4fcdd-8n252" Apr 16 16:05:17.943308 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.943232 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rr64b\" (UniqueName: \"kubernetes.io/projected/ff7fcfa4-9774-4831-b686-678a7f92a456-kube-api-access-rr64b\") pod \"console-operator-d87b8d5fc-2894b\" (UID: \"ff7fcfa4-9774-4831-b686-678a7f92a456\") " pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" Apr 16 16:05:17.943308 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.943301 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zf4j7\" (UniqueName: \"kubernetes.io/projected/3520921b-a85a-4467-a9fe-2c2eb4308569-kube-api-access-zf4j7\") pod \"cluster-samples-operator-667775844f-czs75\" (UID: \"3520921b-a85a-4467-a9fe-2c2eb4308569\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-czs75" Apr 16 16:05:17.943682 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.943401 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d70c29e-9509-41d0-b800-db4a02a3db76-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-8n252\" (UID: \"0d70c29e-9509-41d0-b800-db4a02a3db76\") " pod="openshift-insights/insights-operator-5785d4fcdd-8n252" Apr 16 16:05:17.943682 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.943480 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/0d70c29e-9509-41d0-b800-db4a02a3db76-snapshots\") pod \"insights-operator-5785d4fcdd-8n252\" (UID: \"0d70c29e-9509-41d0-b800-db4a02a3db76\") " pod="openshift-insights/insights-operator-5785d4fcdd-8n252" Apr 16 16:05:17.943782 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.943753 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d70c29e-9509-41d0-b800-db4a02a3db76-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-8n252\" (UID: \"0d70c29e-9509-41d0-b800-db4a02a3db76\") " pod="openshift-insights/insights-operator-5785d4fcdd-8n252" Apr 16 16:05:17.943832 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.943788 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d70c29e-9509-41d0-b800-db4a02a3db76-tmp\") pod \"insights-operator-5785d4fcdd-8n252\" (UID: \"0d70c29e-9509-41d0-b800-db4a02a3db76\") " pod="openshift-insights/insights-operator-5785d4fcdd-8n252" Apr 16 16:05:17.943832 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.943791 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff7fcfa4-9774-4831-b686-678a7f92a456-trusted-ca\") pod \"console-operator-d87b8d5fc-2894b\" (UID: \"ff7fcfa4-9774-4831-b686-678a7f92a456\") " pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" Apr 
16 16:05:17.943973 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.943952 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff7fcfa4-9774-4831-b686-678a7f92a456-config\") pod \"console-operator-d87b8d5fc-2894b\" (UID: \"ff7fcfa4-9774-4831-b686-678a7f92a456\") " pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" Apr 16 16:05:17.945271 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.945249 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff7fcfa4-9774-4831-b686-678a7f92a456-serving-cert\") pod \"console-operator-d87b8d5fc-2894b\" (UID: \"ff7fcfa4-9774-4831-b686-678a7f92a456\") " pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" Apr 16 16:05:17.945653 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.945633 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d70c29e-9509-41d0-b800-db4a02a3db76-serving-cert\") pod \"insights-operator-5785d4fcdd-8n252\" (UID: \"0d70c29e-9509-41d0-b800-db4a02a3db76\") " pod="openshift-insights/insights-operator-5785d4fcdd-8n252" Apr 16 16:05:17.952330 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.952302 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf4j7\" (UniqueName: \"kubernetes.io/projected/3520921b-a85a-4467-a9fe-2c2eb4308569-kube-api-access-zf4j7\") pod \"cluster-samples-operator-667775844f-czs75\" (UID: \"3520921b-a85a-4467-a9fe-2c2eb4308569\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-czs75" Apr 16 16:05:17.952424 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.952401 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5ld4\" (UniqueName: \"kubernetes.io/projected/e55b08d2-3aff-471f-9d16-c3b3eb293f68-kube-api-access-z5ld4\") pod 
\"volume-data-source-validator-7d955d5dd4-wjq4m\" (UID: \"e55b08d2-3aff-471f-9d16-c3b3eb293f68\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wjq4m" Apr 16 16:05:17.954677 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.954651 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr64b\" (UniqueName: \"kubernetes.io/projected/ff7fcfa4-9774-4831-b686-678a7f92a456-kube-api-access-rr64b\") pod \"console-operator-d87b8d5fc-2894b\" (UID: \"ff7fcfa4-9774-4831-b686-678a7f92a456\") " pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" Apr 16 16:05:17.954909 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.954890 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-67kh2\" (UniqueName: \"kubernetes.io/projected/0d70c29e-9509-41d0-b800-db4a02a3db76-kube-api-access-67kh2\") pod \"insights-operator-5785d4fcdd-8n252\" (UID: \"0d70c29e-9509-41d0-b800-db4a02a3db76\") " pod="openshift-insights/insights-operator-5785d4fcdd-8n252" Apr 16 16:05:17.996122 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.996083 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-c7z5l"] Apr 16 16:05:17.998556 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:17.998536 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-c7z5l" Apr 16 16:05:18.001464 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.001445 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 16:05:18.001807 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.001785 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 16:05:18.002622 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.002604 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-ncprn\"" Apr 16 16:05:18.002998 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.002945 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 16:05:18.003109 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.002995 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 16:05:18.011103 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.011074 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-c7z5l"] Apr 16 16:05:18.044515 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.044430 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pdrbf\" (UniqueName: \"kubernetes.io/projected/106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb-kube-api-access-pdrbf\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vfg5m\" (UID: \"106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vfg5m" Apr 16 16:05:18.044515 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.044486 
2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vfg5m\" (UID: \"106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vfg5m" Apr 16 16:05:18.044711 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.044531 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vfg5m\" (UID: \"106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vfg5m" Apr 16 16:05:18.044711 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.044589 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a2e5aa5-e21e-48f0-b556-613a38c7c168-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-c7z5l\" (UID: \"5a2e5aa5-e21e-48f0-b556-613a38c7c168\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-c7z5l" Apr 16 16:05:18.044711 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.044618 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h8ld\" (UniqueName: \"kubernetes.io/projected/5a2e5aa5-e21e-48f0-b556-613a38c7c168-kube-api-access-7h8ld\") pod \"cluster-monitoring-operator-6667474d89-c7z5l\" (UID: \"5a2e5aa5-e21e-48f0-b556-613a38c7c168\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-c7z5l" Apr 16 16:05:18.044711 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.044677 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/5a2e5aa5-e21e-48f0-b556-613a38c7c168-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-c7z5l\" (UID: \"5a2e5aa5-e21e-48f0-b556-613a38c7c168\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-c7z5l" Apr 16 16:05:18.045091 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.045058 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vfg5m\" (UID: \"106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vfg5m" Apr 16 16:05:18.046721 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.046703 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vfg5m\" (UID: \"106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vfg5m" Apr 16 16:05:18.055067 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.055040 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdrbf\" (UniqueName: \"kubernetes.io/projected/106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb-kube-api-access-pdrbf\") pod \"kube-storage-version-migrator-operator-756bb7d76f-vfg5m\" (UID: \"106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vfg5m" Apr 16 16:05:18.098030 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.097993 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wjq4m" Apr 16 16:05:18.119974 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.119942 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" Apr 16 16:05:18.126713 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.126690 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-8n252" Apr 16 16:05:18.145335 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.145297 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7h8ld\" (UniqueName: \"kubernetes.io/projected/5a2e5aa5-e21e-48f0-b556-613a38c7c168-kube-api-access-7h8ld\") pod \"cluster-monitoring-operator-6667474d89-c7z5l\" (UID: \"5a2e5aa5-e21e-48f0-b556-613a38c7c168\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-c7z5l" Apr 16 16:05:18.145472 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.145377 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/5a2e5aa5-e21e-48f0-b556-613a38c7c168-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-c7z5l\" (UID: \"5a2e5aa5-e21e-48f0-b556-613a38c7c168\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-c7z5l" Apr 16 16:05:18.145547 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.145531 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a2e5aa5-e21e-48f0-b556-613a38c7c168-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-c7z5l\" (UID: \"5a2e5aa5-e21e-48f0-b556-613a38c7c168\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-c7z5l" Apr 16 16:05:18.145672 ip-10-0-129-182 
kubenswrapper[2577]: E0416 16:05:18.145652 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 16:05:18.145748 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:18.145736 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a2e5aa5-e21e-48f0-b556-613a38c7c168-cluster-monitoring-operator-tls podName:5a2e5aa5-e21e-48f0-b556-613a38c7c168 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:18.645715642 +0000 UTC m=+161.409340352 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5a2e5aa5-e21e-48f0-b556-613a38c7c168-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-c7z5l" (UID: "5a2e5aa5-e21e-48f0-b556-613a38c7c168") : secret "cluster-monitoring-operator-tls" not found Apr 16 16:05:18.146626 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.146594 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/5a2e5aa5-e21e-48f0-b556-613a38c7c168-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-c7z5l\" (UID: \"5a2e5aa5-e21e-48f0-b556-613a38c7c168\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-c7z5l" Apr 16 16:05:18.158096 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.158062 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h8ld\" (UniqueName: \"kubernetes.io/projected/5a2e5aa5-e21e-48f0-b556-613a38c7c168-kube-api-access-7h8ld\") pod \"cluster-monitoring-operator-6667474d89-c7z5l\" (UID: \"5a2e5aa5-e21e-48f0-b556-613a38c7c168\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-c7z5l" Apr 16 16:05:18.198446 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.198264 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vfg5m" Apr 16 16:05:18.233286 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.233249 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wjq4m"] Apr 16 16:05:18.268531 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.266197 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-2894b"] Apr 16 16:05:18.270833 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:05:18.270798 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff7fcfa4_9774_4831_b686_678a7f92a456.slice/crio-98fb8afa89124661f3d701a2a8bdb1fd7db168d750a6ac0cafe56909e95d52fb WatchSource:0}: Error finding container 98fb8afa89124661f3d701a2a8bdb1fd7db168d750a6ac0cafe56909e95d52fb: Status 404 returned error can't find the container with id 98fb8afa89124661f3d701a2a8bdb1fd7db168d750a6ac0cafe56909e95d52fb Apr 16 16:05:18.289348 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.289318 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-8n252"] Apr 16 16:05:18.292177 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:05:18.292120 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d70c29e_9509_41d0_b800_db4a02a3db76.slice/crio-c788ddadd657e161ff7b6e2f93dcfccf8b0c84a7ad5f3c76142e4d741f632509 WatchSource:0}: Error finding container c788ddadd657e161ff7b6e2f93dcfccf8b0c84a7ad5f3c76142e4d741f632509: Status 404 returned error can't find the container with id c788ddadd657e161ff7b6e2f93dcfccf8b0c84a7ad5f3c76142e4d741f632509 Apr 16 16:05:18.321300 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.321267 2577 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" event={"ID":"ff7fcfa4-9774-4831-b686-678a7f92a456","Type":"ContainerStarted","Data":"98fb8afa89124661f3d701a2a8bdb1fd7db168d750a6ac0cafe56909e95d52fb"} Apr 16 16:05:18.322096 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.322071 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wjq4m" event={"ID":"e55b08d2-3aff-471f-9d16-c3b3eb293f68","Type":"ContainerStarted","Data":"bf7e4c97502fd28128ce95ab2be8d3ffd4c9a3a8fb81f0ed3cae49931c1c914f"} Apr 16 16:05:18.322937 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.322917 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-8n252" event={"ID":"0d70c29e-9509-41d0-b800-db4a02a3db76","Type":"ContainerStarted","Data":"c788ddadd657e161ff7b6e2f93dcfccf8b0c84a7ad5f3c76142e4d741f632509"} Apr 16 16:05:18.331979 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.331957 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vfg5m"] Apr 16 16:05:18.334943 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:05:18.334919 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod106f6ad6_d8e6_4fc6_9a3a_e824bb4270cb.slice/crio-1f825c66de3347aa05468354acd92e91af0d7c8d28a2f81a9871107e9bdf8748 WatchSource:0}: Error finding container 1f825c66de3347aa05468354acd92e91af0d7c8d28a2f81a9871107e9bdf8748: Status 404 returned error can't find the container with id 1f825c66de3347aa05468354acd92e91af0d7c8d28a2f81a9871107e9bdf8748 Apr 16 16:05:18.448633 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.448598 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/3520921b-a85a-4467-a9fe-2c2eb4308569-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-czs75\" (UID: \"3520921b-a85a-4467-a9fe-2c2eb4308569\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-czs75" Apr 16 16:05:18.448877 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:18.448751 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 16:05:18.448877 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:18.448815 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3520921b-a85a-4467-a9fe-2c2eb4308569-samples-operator-tls podName:3520921b-a85a-4467-a9fe-2c2eb4308569 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:19.448800854 +0000 UTC m=+162.212425548 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3520921b-a85a-4467-a9fe-2c2eb4308569-samples-operator-tls") pod "cluster-samples-operator-667775844f-czs75" (UID: "3520921b-a85a-4467-a9fe-2c2eb4308569") : secret "samples-operator-tls" not found Apr 16 16:05:18.549581 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.549483 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls\") pod \"image-registry-f7df78bf6-m4srg\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " pod="openshift-image-registry/image-registry-f7df78bf6-m4srg" Apr 16 16:05:18.549727 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:18.549640 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:05:18.549727 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:18.549660 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-f7df78bf6-m4srg: secret "image-registry-tls" not found Apr 16 16:05:18.549727 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:18.549712 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls podName:4680da5f-5f61-4016-b39c-64017ebd7fa4 nodeName:}" failed. No retries permitted until 2026-04-16 16:07:20.549697945 +0000 UTC m=+283.313322639 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls") pod "image-registry-f7df78bf6-m4srg" (UID: "4680da5f-5f61-4016-b39c-64017ebd7fa4") : secret "image-registry-tls" not found Apr 16 16:05:18.650463 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.650422 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert\") pod \"ingress-canary-ckdqq\" (UID: \"24ed6202-e738-4abc-b26d-eec84a76b75b\") " pod="openshift-ingress-canary/ingress-canary-ckdqq" Apr 16 16:05:18.650628 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.650492 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a2e5aa5-e21e-48f0-b556-613a38c7c168-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-c7z5l\" (UID: \"5a2e5aa5-e21e-48f0-b556-613a38c7c168\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-c7z5l" Apr 16 16:05:18.650628 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:18.650556 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls\") pod \"dns-default-2fk94\" (UID: \"98c6de58-d4f8-4f67-ab26-d582517a2717\") " 
pod="openshift-dns/dns-default-2fk94"
Apr 16 16:05:18.650628 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:18.650577 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:05:18.650769 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:18.650641 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert podName:24ed6202-e738-4abc-b26d-eec84a76b75b nodeName:}" failed. No retries permitted until 2026-04-16 16:07:20.650620611 +0000 UTC m=+283.414245325 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert") pod "ingress-canary-ckdqq" (UID: "24ed6202-e738-4abc-b26d-eec84a76b75b") : secret "canary-serving-cert" not found
Apr 16 16:05:18.650769 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:18.650641 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 16:05:18.650769 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:18.650652 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:05:18.650769 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:18.650686 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls podName:98c6de58-d4f8-4f67-ab26-d582517a2717 nodeName:}" failed. No retries permitted until 2026-04-16 16:07:20.650676495 +0000 UTC m=+283.414301185 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls") pod "dns-default-2fk94" (UID: "98c6de58-d4f8-4f67-ab26-d582517a2717") : secret "dns-default-metrics-tls" not found
Apr 16 16:05:18.650769 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:18.650702 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a2e5aa5-e21e-48f0-b556-613a38c7c168-cluster-monitoring-operator-tls podName:5a2e5aa5-e21e-48f0-b556-613a38c7c168 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:19.65069301 +0000 UTC m=+162.414317700 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5a2e5aa5-e21e-48f0-b556-613a38c7c168-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-c7z5l" (UID: "5a2e5aa5-e21e-48f0-b556-613a38c7c168") : secret "cluster-monitoring-operator-tls" not found
Apr 16 16:05:19.326810 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:19.326748 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vfg5m" event={"ID":"106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb","Type":"ContainerStarted","Data":"1f825c66de3347aa05468354acd92e91af0d7c8d28a2f81a9871107e9bdf8748"}
Apr 16 16:05:19.458475 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:19.458434 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3520921b-a85a-4467-a9fe-2c2eb4308569-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-czs75\" (UID: \"3520921b-a85a-4467-a9fe-2c2eb4308569\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-czs75"
Apr 16 16:05:19.458662 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:19.458594 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 16:05:19.458662 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:19.458661 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3520921b-a85a-4467-a9fe-2c2eb4308569-samples-operator-tls podName:3520921b-a85a-4467-a9fe-2c2eb4308569 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:21.458641587 +0000 UTC m=+164.222266279 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3520921b-a85a-4467-a9fe-2c2eb4308569-samples-operator-tls") pod "cluster-samples-operator-667775844f-czs75" (UID: "3520921b-a85a-4467-a9fe-2c2eb4308569") : secret "samples-operator-tls" not found
Apr 16 16:05:19.659988 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:19.659825 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a2e5aa5-e21e-48f0-b556-613a38c7c168-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-c7z5l\" (UID: \"5a2e5aa5-e21e-48f0-b556-613a38c7c168\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-c7z5l"
Apr 16 16:05:19.660224 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:19.660049 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 16:05:19.660224 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:19.660115 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a2e5aa5-e21e-48f0-b556-613a38c7c168-cluster-monitoring-operator-tls podName:5a2e5aa5-e21e-48f0-b556-613a38c7c168 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:21.660095796 +0000 UTC m=+164.423720490 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5a2e5aa5-e21e-48f0-b556-613a38c7c168-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-c7z5l" (UID: "5a2e5aa5-e21e-48f0-b556-613a38c7c168") : secret "cluster-monitoring-operator-tls" not found
Apr 16 16:05:21.474611 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:21.474581 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3520921b-a85a-4467-a9fe-2c2eb4308569-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-czs75\" (UID: \"3520921b-a85a-4467-a9fe-2c2eb4308569\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-czs75"
Apr 16 16:05:21.474988 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:21.474752 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 16:05:21.474988 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:21.474836 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3520921b-a85a-4467-a9fe-2c2eb4308569-samples-operator-tls podName:3520921b-a85a-4467-a9fe-2c2eb4308569 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:25.474814701 +0000 UTC m=+168.238439392 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3520921b-a85a-4467-a9fe-2c2eb4308569-samples-operator-tls") pod "cluster-samples-operator-667775844f-czs75" (UID: "3520921b-a85a-4467-a9fe-2c2eb4308569") : secret "samples-operator-tls" not found
Apr 16 16:05:21.675660 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:21.675628 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a2e5aa5-e21e-48f0-b556-613a38c7c168-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-c7z5l\" (UID: \"5a2e5aa5-e21e-48f0-b556-613a38c7c168\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-c7z5l"
Apr 16 16:05:21.675825 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:21.675803 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 16:05:21.675898 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:21.675887 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a2e5aa5-e21e-48f0-b556-613a38c7c168-cluster-monitoring-operator-tls podName:5a2e5aa5-e21e-48f0-b556-613a38c7c168 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:25.675866808 +0000 UTC m=+168.439491503 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5a2e5aa5-e21e-48f0-b556-613a38c7c168-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-c7z5l" (UID: "5a2e5aa5-e21e-48f0-b556-613a38c7c168") : secret "cluster-monitoring-operator-tls" not found
Apr 16 16:05:22.107517 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:22.107437 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-wnvth"]
Apr 16 16:05:22.109547 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:22.109513 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-wnvth"
Apr 16 16:05:22.111904 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:22.111882 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 16 16:05:22.112028 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:22.111884 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-qk66t\""
Apr 16 16:05:22.112991 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:22.112976 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 16 16:05:22.120028 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:22.120005 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-wnvth"]
Apr 16 16:05:22.180089 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:22.180047 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqzjp\" (UniqueName: \"kubernetes.io/projected/f1e35f44-03e0-4625-857c-c50176b17d74-kube-api-access-hqzjp\") pod \"migrator-64d4d94569-wnvth\" (UID: \"f1e35f44-03e0-4625-857c-c50176b17d74\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-wnvth"
Apr 16 16:05:22.280908 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:22.280874 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqzjp\" (UniqueName: \"kubernetes.io/projected/f1e35f44-03e0-4625-857c-c50176b17d74-kube-api-access-hqzjp\") pod \"migrator-64d4d94569-wnvth\" (UID: \"f1e35f44-03e0-4625-857c-c50176b17d74\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-wnvth"
Apr 16 16:05:22.289593 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:22.289569 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqzjp\" (UniqueName: \"kubernetes.io/projected/f1e35f44-03e0-4625-857c-c50176b17d74-kube-api-access-hqzjp\") pod \"migrator-64d4d94569-wnvth\" (UID: \"f1e35f44-03e0-4625-857c-c50176b17d74\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-wnvth"
Apr 16 16:05:22.334789 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:22.334752 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vfg5m" event={"ID":"106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb","Type":"ContainerStarted","Data":"0257749a7b8925668df6e1d6012a4a73795165ffa5fe790e1393491a2f3e80a4"}
Apr 16 16:05:22.336451 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:22.336422 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2894b_ff7fcfa4-9774-4831-b686-678a7f92a456/console-operator/0.log"
Apr 16 16:05:22.336579 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:22.336471 2577 generic.go:358] "Generic (PLEG): container finished" podID="ff7fcfa4-9774-4831-b686-678a7f92a456" containerID="22f1011d317b9944e6159fb786639dfc35261ee820bbe94073548f8579ceedfc" exitCode=255
Apr 16 16:05:22.336635 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:22.336565 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" event={"ID":"ff7fcfa4-9774-4831-b686-678a7f92a456","Type":"ContainerDied","Data":"22f1011d317b9944e6159fb786639dfc35261ee820bbe94073548f8579ceedfc"}
Apr 16 16:05:22.336763 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:22.336745 2577 scope.go:117] "RemoveContainer" containerID="22f1011d317b9944e6159fb786639dfc35261ee820bbe94073548f8579ceedfc"
Apr 16 16:05:22.338023 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:22.337989 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wjq4m" event={"ID":"e55b08d2-3aff-471f-9d16-c3b3eb293f68","Type":"ContainerStarted","Data":"01e0753e114777aeb0da3474029ec250ab9ec7ef314119ec2bae2765e895adc3"}
Apr 16 16:05:22.339387 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:22.339366 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-8n252" event={"ID":"0d70c29e-9509-41d0-b800-db4a02a3db76","Type":"ContainerStarted","Data":"6a99697eac19f46cfe942820d7951881d331dc16d5ada846c0821eac0bf7c2fa"}
Apr 16 16:05:22.352450 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:22.352394 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vfg5m" podStartSLOduration=2.372934834 podStartE2EDuration="5.352371424s" podCreationTimestamp="2026-04-16 16:05:17 +0000 UTC" firstStartedPulling="2026-04-16 16:05:18.336678015 +0000 UTC m=+161.100302704" lastFinishedPulling="2026-04-16 16:05:21.316114602 +0000 UTC m=+164.079739294" observedRunningTime="2026-04-16 16:05:22.351389858 +0000 UTC m=+165.115014570" watchObservedRunningTime="2026-04-16 16:05:22.352371424 +0000 UTC m=+165.115996135"
Apr 16 16:05:22.378403 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:22.378359 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-8n252" podStartSLOduration=2.357975571 podStartE2EDuration="5.378342758s" podCreationTimestamp="2026-04-16 16:05:17 +0000 UTC" firstStartedPulling="2026-04-16 16:05:18.294076152 +0000 UTC m=+161.057700841" lastFinishedPulling="2026-04-16 16:05:21.314443338 +0000 UTC m=+164.078068028" observedRunningTime="2026-04-16 16:05:22.377455547 +0000 UTC m=+165.141080270" watchObservedRunningTime="2026-04-16 16:05:22.378342758 +0000 UTC m=+165.141967464"
Apr 16 16:05:22.416746 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:22.416689 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-wjq4m" podStartSLOduration=2.349582189 podStartE2EDuration="5.416673617s" podCreationTimestamp="2026-04-16 16:05:17 +0000 UTC" firstStartedPulling="2026-04-16 16:05:18.243242131 +0000 UTC m=+161.006866839" lastFinishedPulling="2026-04-16 16:05:21.310333577 +0000 UTC m=+164.073958267" observedRunningTime="2026-04-16 16:05:22.39443257 +0000 UTC m=+165.158057281" watchObservedRunningTime="2026-04-16 16:05:22.416673617 +0000 UTC m=+165.180298330"
Apr 16 16:05:22.419171 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:22.419143 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-wnvth"
Apr 16 16:05:22.540841 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:22.540808 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-wnvth"]
Apr 16 16:05:22.544671 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:05:22.544643 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1e35f44_03e0_4625_857c_c50176b17d74.slice/crio-e9e69699025f081e741c348aa298137c40317679bf2cf01eee23e53b5ce96fcc WatchSource:0}: Error finding container e9e69699025f081e741c348aa298137c40317679bf2cf01eee23e53b5ce96fcc: Status 404 returned error can't find the container with id e9e69699025f081e741c348aa298137c40317679bf2cf01eee23e53b5ce96fcc
Apr 16 16:05:23.343662 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:23.343625 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-wnvth" event={"ID":"f1e35f44-03e0-4625-857c-c50176b17d74","Type":"ContainerStarted","Data":"e9e69699025f081e741c348aa298137c40317679bf2cf01eee23e53b5ce96fcc"}
Apr 16 16:05:23.345094 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:23.345072 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2894b_ff7fcfa4-9774-4831-b686-678a7f92a456/console-operator/1.log"
Apr 16 16:05:23.345488 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:23.345471 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2894b_ff7fcfa4-9774-4831-b686-678a7f92a456/console-operator/0.log"
Apr 16 16:05:23.345592 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:23.345512 2577 generic.go:358] "Generic (PLEG): container finished" podID="ff7fcfa4-9774-4831-b686-678a7f92a456" containerID="aef4f37eff20c4d598c55a2c808fb64b015eea60bd5bc8b25e3025af55fd0659" exitCode=255
Apr 16 16:05:23.345648 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:23.345626 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" event={"ID":"ff7fcfa4-9774-4831-b686-678a7f92a456","Type":"ContainerDied","Data":"aef4f37eff20c4d598c55a2c808fb64b015eea60bd5bc8b25e3025af55fd0659"}
Apr 16 16:05:23.345706 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:23.345675 2577 scope.go:117] "RemoveContainer" containerID="22f1011d317b9944e6159fb786639dfc35261ee820bbe94073548f8579ceedfc"
Apr 16 16:05:23.345995 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:23.345978 2577 scope.go:117] "RemoveContainer" containerID="aef4f37eff20c4d598c55a2c808fb64b015eea60bd5bc8b25e3025af55fd0659"
Apr 16 16:05:23.346217 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:23.346190 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-2894b_openshift-console-operator(ff7fcfa4-9774-4831-b686-678a7f92a456)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" podUID="ff7fcfa4-9774-4831-b686-678a7f92a456"
Apr 16 16:05:24.107047 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:24.107014 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-rsgjn"]
Apr 16 16:05:24.108929 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:24.108914 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-rsgjn"
Apr 16 16:05:24.111356 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:24.111332 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-vrt4g\""
Apr 16 16:05:24.111477 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:24.111452 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 16 16:05:24.112048 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:24.112031 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 16 16:05:24.112149 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:24.112065 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 16 16:05:24.112149 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:24.112082 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 16 16:05:24.119735 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:24.119714 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-rsgjn"]
Apr 16 16:05:24.198048 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:24.198007 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7473c9fe-8894-4e04-812e-e7456f23a988-signing-cabundle\") pod \"service-ca-bfc587fb7-rsgjn\" (UID: \"7473c9fe-8894-4e04-812e-e7456f23a988\") " pod="openshift-service-ca/service-ca-bfc587fb7-rsgjn"
Apr 16 16:05:24.198242 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:24.198091 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxm87\" (UniqueName: \"kubernetes.io/projected/7473c9fe-8894-4e04-812e-e7456f23a988-kube-api-access-wxm87\") pod \"service-ca-bfc587fb7-rsgjn\" (UID: \"7473c9fe-8894-4e04-812e-e7456f23a988\") " pod="openshift-service-ca/service-ca-bfc587fb7-rsgjn"
Apr 16 16:05:24.198242 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:24.198122 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7473c9fe-8894-4e04-812e-e7456f23a988-signing-key\") pod \"service-ca-bfc587fb7-rsgjn\" (UID: \"7473c9fe-8894-4e04-812e-e7456f23a988\") " pod="openshift-service-ca/service-ca-bfc587fb7-rsgjn"
Apr 16 16:05:24.298685 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:24.298648 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7473c9fe-8894-4e04-812e-e7456f23a988-signing-cabundle\") pod \"service-ca-bfc587fb7-rsgjn\" (UID: \"7473c9fe-8894-4e04-812e-e7456f23a988\") " pod="openshift-service-ca/service-ca-bfc587fb7-rsgjn"
Apr 16 16:05:24.298855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:24.298725 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxm87\" (UniqueName: \"kubernetes.io/projected/7473c9fe-8894-4e04-812e-e7456f23a988-kube-api-access-wxm87\") pod \"service-ca-bfc587fb7-rsgjn\" (UID: \"7473c9fe-8894-4e04-812e-e7456f23a988\") " pod="openshift-service-ca/service-ca-bfc587fb7-rsgjn"
Apr 16 16:05:24.298855 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:24.298818 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7473c9fe-8894-4e04-812e-e7456f23a988-signing-key\") pod \"service-ca-bfc587fb7-rsgjn\" (UID: \"7473c9fe-8894-4e04-812e-e7456f23a988\") " pod="openshift-service-ca/service-ca-bfc587fb7-rsgjn"
Apr 16 16:05:24.299294 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:24.299276 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7473c9fe-8894-4e04-812e-e7456f23a988-signing-cabundle\") pod \"service-ca-bfc587fb7-rsgjn\" (UID: \"7473c9fe-8894-4e04-812e-e7456f23a988\") " pod="openshift-service-ca/service-ca-bfc587fb7-rsgjn"
Apr 16 16:05:24.301125 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:24.301108 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7473c9fe-8894-4e04-812e-e7456f23a988-signing-key\") pod \"service-ca-bfc587fb7-rsgjn\" (UID: \"7473c9fe-8894-4e04-812e-e7456f23a988\") " pod="openshift-service-ca/service-ca-bfc587fb7-rsgjn"
Apr 16 16:05:24.309967 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:24.309942 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxm87\" (UniqueName: \"kubernetes.io/projected/7473c9fe-8894-4e04-812e-e7456f23a988-kube-api-access-wxm87\") pod \"service-ca-bfc587fb7-rsgjn\" (UID: \"7473c9fe-8894-4e04-812e-e7456f23a988\") " pod="openshift-service-ca/service-ca-bfc587fb7-rsgjn"
Apr 16 16:05:24.349197 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:24.349165 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-wnvth" event={"ID":"f1e35f44-03e0-4625-857c-c50176b17d74","Type":"ContainerStarted","Data":"9e0de9fc2e4af0def2ce76b3f49c437e3e10721fdecac6145ce0af92efd0a15d"}
Apr 16 16:05:24.349197 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:24.349204 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-wnvth" event={"ID":"f1e35f44-03e0-4625-857c-c50176b17d74","Type":"ContainerStarted","Data":"67017d68cd9b5780251ff273a4f66a9bbcbf5b41f76a05c8ff9aadef28e0a6fa"}
Apr 16 16:05:24.350387 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:24.350369 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2894b_ff7fcfa4-9774-4831-b686-678a7f92a456/console-operator/1.log"
Apr 16 16:05:24.350657 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:24.350645 2577 scope.go:117] "RemoveContainer" containerID="aef4f37eff20c4d598c55a2c808fb64b015eea60bd5bc8b25e3025af55fd0659"
Apr 16 16:05:24.350792 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:24.350777 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-2894b_openshift-console-operator(ff7fcfa4-9774-4831-b686-678a7f92a456)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" podUID="ff7fcfa4-9774-4831-b686-678a7f92a456"
Apr 16 16:05:24.367969 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:24.367867 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-wnvth" podStartSLOduration=1.290418747 podStartE2EDuration="2.367855793s" podCreationTimestamp="2026-04-16 16:05:22 +0000 UTC" firstStartedPulling="2026-04-16 16:05:22.546598068 +0000 UTC m=+165.310222757" lastFinishedPulling="2026-04-16 16:05:23.624035112 +0000 UTC m=+166.387659803" observedRunningTime="2026-04-16 16:05:24.366457961 +0000 UTC m=+167.130082671" watchObservedRunningTime="2026-04-16 16:05:24.367855793 +0000 UTC m=+167.131480505"
Apr 16 16:05:24.417340 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:24.417305 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-rsgjn"
Apr 16 16:05:24.554621 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:24.554592 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-rsgjn"]
Apr 16 16:05:24.557313 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:05:24.557288 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7473c9fe_8894_4e04_812e_e7456f23a988.slice/crio-8b5c4e2fe98210a2bf1bd40b58efcdd4dfff088ef8a9e89a29d59c4b6170f1be WatchSource:0}: Error finding container 8b5c4e2fe98210a2bf1bd40b58efcdd4dfff088ef8a9e89a29d59c4b6170f1be: Status 404 returned error can't find the container with id 8b5c4e2fe98210a2bf1bd40b58efcdd4dfff088ef8a9e89a29d59c4b6170f1be
Apr 16 16:05:24.851945 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:24.851917 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jt7zj_6463e7fd-1eba-4228-8782-edee4a55c601/dns-node-resolver/0.log"
Apr 16 16:05:25.354813 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:25.354778 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-rsgjn" event={"ID":"7473c9fe-8894-4e04-812e-e7456f23a988","Type":"ContainerStarted","Data":"8b5c4e2fe98210a2bf1bd40b58efcdd4dfff088ef8a9e89a29d59c4b6170f1be"}
Apr 16 16:05:25.509299 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:25.509264 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3520921b-a85a-4467-a9fe-2c2eb4308569-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-czs75\" (UID: \"3520921b-a85a-4467-a9fe-2c2eb4308569\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-czs75"
Apr 16 16:05:25.509490 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:25.509420 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 16:05:25.509552 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:25.509491 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3520921b-a85a-4467-a9fe-2c2eb4308569-samples-operator-tls podName:3520921b-a85a-4467-a9fe-2c2eb4308569 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:33.509473358 +0000 UTC m=+176.273098055 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3520921b-a85a-4467-a9fe-2c2eb4308569-samples-operator-tls") pod "cluster-samples-operator-667775844f-czs75" (UID: "3520921b-a85a-4467-a9fe-2c2eb4308569") : secret "samples-operator-tls" not found
Apr 16 16:05:25.711222 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:25.711177 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a2e5aa5-e21e-48f0-b556-613a38c7c168-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-c7z5l\" (UID: \"5a2e5aa5-e21e-48f0-b556-613a38c7c168\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-c7z5l"
Apr 16 16:05:25.711385 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:25.711345 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 16:05:25.711448 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:25.711432 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a2e5aa5-e21e-48f0-b556-613a38c7c168-cluster-monitoring-operator-tls podName:5a2e5aa5-e21e-48f0-b556-613a38c7c168 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:33.711411391 +0000 UTC m=+176.475036103 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5a2e5aa5-e21e-48f0-b556-613a38c7c168-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-c7z5l" (UID: "5a2e5aa5-e21e-48f0-b556-613a38c7c168") : secret "cluster-monitoring-operator-tls" not found
Apr 16 16:05:26.249200 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:26.249115 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vjpkx_1b804e33-713f-4c6f-a4a1-0181d77254e1/node-ca/0.log"
Apr 16 16:05:26.358749 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:26.358660 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-rsgjn" event={"ID":"7473c9fe-8894-4e04-812e-e7456f23a988","Type":"ContainerStarted","Data":"d3b82d2da928e8669468857ecbbd413fb385ed4e566e300467a18e872851eae0"}
Apr 16 16:05:26.376392 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:26.376331 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-rsgjn" podStartSLOduration=0.892859308 podStartE2EDuration="2.376315252s" podCreationTimestamp="2026-04-16 16:05:24 +0000 UTC" firstStartedPulling="2026-04-16 16:05:24.559099369 +0000 UTC m=+167.322724058" lastFinishedPulling="2026-04-16 16:05:26.0425553 +0000 UTC m=+168.806180002" observedRunningTime="2026-04-16 16:05:26.376161703 +0000 UTC m=+169.139786415" watchObservedRunningTime="2026-04-16 16:05:26.376315252 +0000 UTC m=+169.139939961"
Apr 16 16:05:26.914770 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:26.914735 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ckdqq"
Apr 16 16:05:26.914950 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:26.914791 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2b9mp"
Apr 16 16:05:27.252223 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:27.252198 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-wnvth_f1e35f44-03e0-4625-857c-c50176b17d74/migrator/0.log"
Apr 16 16:05:27.450690 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:27.450659 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-wnvth_f1e35f44-03e0-4625-857c-c50176b17d74/graceful-termination/0.log"
Apr 16 16:05:27.668387 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:27.668300 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-vfg5m_106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb/kube-storage-version-migrator-operator/0.log"
Apr 16 16:05:28.120755 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:28.120712 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-2894b"
Apr 16 16:05:28.120755 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:28.120756 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-2894b"
Apr 16 16:05:28.121160 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:28.121117 2577 scope.go:117] "RemoveContainer" containerID="aef4f37eff20c4d598c55a2c808fb64b015eea60bd5bc8b25e3025af55fd0659"
Apr 16 16:05:28.121339 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:28.121321 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-2894b_openshift-console-operator(ff7fcfa4-9774-4831-b686-678a7f92a456)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" podUID="ff7fcfa4-9774-4831-b686-678a7f92a456"
Apr 16 16:05:33.581802 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:33.581764 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3520921b-a85a-4467-a9fe-2c2eb4308569-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-czs75\" (UID: \"3520921b-a85a-4467-a9fe-2c2eb4308569\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-czs75"
Apr 16 16:05:33.584180 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:33.584155 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3520921b-a85a-4467-a9fe-2c2eb4308569-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-czs75\" (UID: \"3520921b-a85a-4467-a9fe-2c2eb4308569\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-czs75"
Apr 16 16:05:33.713642 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:33.713604 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-czs75"
Apr 16 16:05:33.785055 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:33.784438 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a2e5aa5-e21e-48f0-b556-613a38c7c168-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-c7z5l\" (UID: \"5a2e5aa5-e21e-48f0-b556-613a38c7c168\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-c7z5l"
Apr 16 16:05:33.785055 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:33.784599 2577 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 16:05:33.785055 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:33.784668 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a2e5aa5-e21e-48f0-b556-613a38c7c168-cluster-monitoring-operator-tls podName:5a2e5aa5-e21e-48f0-b556-613a38c7c168 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:49.784647528 +0000 UTC m=+192.548272223 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/5a2e5aa5-e21e-48f0-b556-613a38c7c168-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-c7z5l" (UID: "5a2e5aa5-e21e-48f0-b556-613a38c7c168") : secret "cluster-monitoring-operator-tls" not found
Apr 16 16:05:33.857924 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:33.857829 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-czs75"]
Apr 16 16:05:34.389951 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:34.389917 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-czs75" event={"ID":"3520921b-a85a-4467-a9fe-2c2eb4308569","Type":"ContainerStarted","Data":"97066f7d5d9bcf252f37d32ed61302e76077c14f27239628d7349700b340adff"}
Apr 16 16:05:36.397264 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:36.397224 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-czs75" event={"ID":"3520921b-a85a-4467-a9fe-2c2eb4308569","Type":"ContainerStarted","Data":"d04dff66170f289e5850d301493180e35bc88e1abf286ee2a9d36d1813cb2e9a"}
Apr 16 16:05:36.397264 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:36.397269 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-czs75" event={"ID":"3520921b-a85a-4467-a9fe-2c2eb4308569","Type":"ContainerStarted","Data":"f961beb4acbd8048dc45968d2c740e2a3ffecd950ddf6cca05d7efdb7a15d49c"}
Apr 16 16:05:36.419864 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:36.419817 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-czs75" podStartSLOduration=17.770557374 podStartE2EDuration="19.419800538s"
podCreationTimestamp="2026-04-16 16:05:17 +0000 UTC" firstStartedPulling="2026-04-16 16:05:33.903253013 +0000 UTC m=+176.666877705" lastFinishedPulling="2026-04-16 16:05:35.552496168 +0000 UTC m=+178.316120869" observedRunningTime="2026-04-16 16:05:36.419659444 +0000 UTC m=+179.183284169" watchObservedRunningTime="2026-04-16 16:05:36.419800538 +0000 UTC m=+179.183425250" Apr 16 16:05:42.914530 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:42.914493 2577 scope.go:117] "RemoveContainer" containerID="aef4f37eff20c4d598c55a2c808fb64b015eea60bd5bc8b25e3025af55fd0659" Apr 16 16:05:43.415858 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:43.415834 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2894b_ff7fcfa4-9774-4831-b686-678a7f92a456/console-operator/2.log" Apr 16 16:05:43.416258 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:43.416242 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2894b_ff7fcfa4-9774-4831-b686-678a7f92a456/console-operator/1.log" Apr 16 16:05:43.416328 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:43.416275 2577 generic.go:358] "Generic (PLEG): container finished" podID="ff7fcfa4-9774-4831-b686-678a7f92a456" containerID="de5dd689700975a9c20b23c23d6a5f6a38f26c309e6d3f597070b663aa349bd8" exitCode=255 Apr 16 16:05:43.416370 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:43.416333 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" event={"ID":"ff7fcfa4-9774-4831-b686-678a7f92a456","Type":"ContainerDied","Data":"de5dd689700975a9c20b23c23d6a5f6a38f26c309e6d3f597070b663aa349bd8"} Apr 16 16:05:43.416370 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:43.416365 2577 scope.go:117] "RemoveContainer" containerID="aef4f37eff20c4d598c55a2c808fb64b015eea60bd5bc8b25e3025af55fd0659" Apr 16 16:05:43.416659 ip-10-0-129-182 kubenswrapper[2577]: I0416 
16:05:43.416631 2577 scope.go:117] "RemoveContainer" containerID="de5dd689700975a9c20b23c23d6a5f6a38f26c309e6d3f597070b663aa349bd8" Apr 16 16:05:43.416822 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:43.416801 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-2894b_openshift-console-operator(ff7fcfa4-9774-4831-b686-678a7f92a456)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" podUID="ff7fcfa4-9774-4831-b686-678a7f92a456" Apr 16 16:05:44.421008 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:44.420978 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2894b_ff7fcfa4-9774-4831-b686-678a7f92a456/console-operator/2.log" Apr 16 16:05:48.120613 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:48.120577 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" Apr 16 16:05:48.120613 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:48.120611 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" Apr 16 16:05:48.121031 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:48.120911 2577 scope.go:117] "RemoveContainer" containerID="de5dd689700975a9c20b23c23d6a5f6a38f26c309e6d3f597070b663aa349bd8" Apr 16 16:05:48.121084 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:48.121068 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-2894b_openshift-console-operator(ff7fcfa4-9774-4831-b686-678a7f92a456)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" 
podUID="ff7fcfa4-9774-4831-b686-678a7f92a456" Apr 16 16:05:48.564239 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:48.564206 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-pxhmv"] Apr 16 16:05:48.566795 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:48.566770 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pxhmv" Apr 16 16:05:48.571531 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:48.571506 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 16:05:48.572315 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:48.572276 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 16:05:48.572732 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:48.572713 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-p4mnm\"" Apr 16 16:05:48.584556 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:48.584528 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-pxhmv"] Apr 16 16:05:48.707926 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:48.707891 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f190f975-83f3-4c5e-977d-1e7688992f0e-data-volume\") pod \"insights-runtime-extractor-pxhmv\" (UID: \"f190f975-83f3-4c5e-977d-1e7688992f0e\") " pod="openshift-insights/insights-runtime-extractor-pxhmv" Apr 16 16:05:48.708097 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:48.707934 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/f190f975-83f3-4c5e-977d-1e7688992f0e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pxhmv\" (UID: \"f190f975-83f3-4c5e-977d-1e7688992f0e\") " pod="openshift-insights/insights-runtime-extractor-pxhmv" Apr 16 16:05:48.708097 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:48.707982 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f190f975-83f3-4c5e-977d-1e7688992f0e-crio-socket\") pod \"insights-runtime-extractor-pxhmv\" (UID: \"f190f975-83f3-4c5e-977d-1e7688992f0e\") " pod="openshift-insights/insights-runtime-extractor-pxhmv" Apr 16 16:05:48.708097 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:48.708018 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pprk\" (UniqueName: \"kubernetes.io/projected/f190f975-83f3-4c5e-977d-1e7688992f0e-kube-api-access-9pprk\") pod \"insights-runtime-extractor-pxhmv\" (UID: \"f190f975-83f3-4c5e-977d-1e7688992f0e\") " pod="openshift-insights/insights-runtime-extractor-pxhmv" Apr 16 16:05:48.708097 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:48.708060 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f190f975-83f3-4c5e-977d-1e7688992f0e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pxhmv\" (UID: \"f190f975-83f3-4c5e-977d-1e7688992f0e\") " pod="openshift-insights/insights-runtime-extractor-pxhmv" Apr 16 16:05:48.809033 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:48.808996 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f190f975-83f3-4c5e-977d-1e7688992f0e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pxhmv\" (UID: \"f190f975-83f3-4c5e-977d-1e7688992f0e\") " 
pod="openshift-insights/insights-runtime-extractor-pxhmv" Apr 16 16:05:48.809259 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:48.809060 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f190f975-83f3-4c5e-977d-1e7688992f0e-crio-socket\") pod \"insights-runtime-extractor-pxhmv\" (UID: \"f190f975-83f3-4c5e-977d-1e7688992f0e\") " pod="openshift-insights/insights-runtime-extractor-pxhmv" Apr 16 16:05:48.809259 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:48.809118 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9pprk\" (UniqueName: \"kubernetes.io/projected/f190f975-83f3-4c5e-977d-1e7688992f0e-kube-api-access-9pprk\") pod \"insights-runtime-extractor-pxhmv\" (UID: \"f190f975-83f3-4c5e-977d-1e7688992f0e\") " pod="openshift-insights/insights-runtime-extractor-pxhmv" Apr 16 16:05:48.809259 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:48.809209 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f190f975-83f3-4c5e-977d-1e7688992f0e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pxhmv\" (UID: \"f190f975-83f3-4c5e-977d-1e7688992f0e\") " pod="openshift-insights/insights-runtime-extractor-pxhmv" Apr 16 16:05:48.809259 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:48.809210 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f190f975-83f3-4c5e-977d-1e7688992f0e-crio-socket\") pod \"insights-runtime-extractor-pxhmv\" (UID: \"f190f975-83f3-4c5e-977d-1e7688992f0e\") " pod="openshift-insights/insights-runtime-extractor-pxhmv" Apr 16 16:05:48.809477 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:48.809250 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/f190f975-83f3-4c5e-977d-1e7688992f0e-data-volume\") pod \"insights-runtime-extractor-pxhmv\" (UID: \"f190f975-83f3-4c5e-977d-1e7688992f0e\") " pod="openshift-insights/insights-runtime-extractor-pxhmv" Apr 16 16:05:48.809592 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:48.809573 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f190f975-83f3-4c5e-977d-1e7688992f0e-data-volume\") pod \"insights-runtime-extractor-pxhmv\" (UID: \"f190f975-83f3-4c5e-977d-1e7688992f0e\") " pod="openshift-insights/insights-runtime-extractor-pxhmv" Apr 16 16:05:48.809721 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:48.809704 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f190f975-83f3-4c5e-977d-1e7688992f0e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-pxhmv\" (UID: \"f190f975-83f3-4c5e-977d-1e7688992f0e\") " pod="openshift-insights/insights-runtime-extractor-pxhmv" Apr 16 16:05:48.811482 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:48.811466 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f190f975-83f3-4c5e-977d-1e7688992f0e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-pxhmv\" (UID: \"f190f975-83f3-4c5e-977d-1e7688992f0e\") " pod="openshift-insights/insights-runtime-extractor-pxhmv" Apr 16 16:05:48.827853 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:48.827792 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pprk\" (UniqueName: \"kubernetes.io/projected/f190f975-83f3-4c5e-977d-1e7688992f0e-kube-api-access-9pprk\") pod \"insights-runtime-extractor-pxhmv\" (UID: \"f190f975-83f3-4c5e-977d-1e7688992f0e\") " pod="openshift-insights/insights-runtime-extractor-pxhmv" Apr 16 16:05:48.876312 ip-10-0-129-182 kubenswrapper[2577]: 
I0416 16:05:48.876279 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-pxhmv" Apr 16 16:05:49.003879 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:49.000413 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-pxhmv"] Apr 16 16:05:49.004896 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:05:49.004860 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf190f975_83f3_4c5e_977d_1e7688992f0e.slice/crio-9fd933d91e2177d46375aee702ec0175e5d710359f4fc30aa8e405a5174431ea WatchSource:0}: Error finding container 9fd933d91e2177d46375aee702ec0175e5d710359f4fc30aa8e405a5174431ea: Status 404 returned error can't find the container with id 9fd933d91e2177d46375aee702ec0175e5d710359f4fc30aa8e405a5174431ea Apr 16 16:05:49.433711 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:49.433679 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pxhmv" event={"ID":"f190f975-83f3-4c5e-977d-1e7688992f0e","Type":"ContainerStarted","Data":"4503045580fe6b48b30a7074f0790e0ab69621ab0e42fa6cdeb10c0a3608a61f"} Apr 16 16:05:49.433711 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:49.433713 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pxhmv" event={"ID":"f190f975-83f3-4c5e-977d-1e7688992f0e","Type":"ContainerStarted","Data":"9fd933d91e2177d46375aee702ec0175e5d710359f4fc30aa8e405a5174431ea"} Apr 16 16:05:49.817614 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:49.817529 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a2e5aa5-e21e-48f0-b556-613a38c7c168-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-c7z5l\" (UID: 
\"5a2e5aa5-e21e-48f0-b556-613a38c7c168\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-c7z5l" Apr 16 16:05:49.819957 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:49.819924 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/5a2e5aa5-e21e-48f0-b556-613a38c7c168-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-c7z5l\" (UID: \"5a2e5aa5-e21e-48f0-b556-613a38c7c168\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-c7z5l" Apr 16 16:05:50.111333 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:50.111239 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-ncprn\"" Apr 16 16:05:50.119329 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:50.119292 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-c7z5l" Apr 16 16:05:50.258177 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:50.258147 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-c7z5l"] Apr 16 16:05:50.261541 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:05:50.261509 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a2e5aa5_e21e_48f0_b556_613a38c7c168.slice/crio-7d9a0da7de338d327bb8082e6567f74b435e45139985251d8dfa8cd8013c2bc5 WatchSource:0}: Error finding container 7d9a0da7de338d327bb8082e6567f74b435e45139985251d8dfa8cd8013c2bc5: Status 404 returned error can't find the container with id 7d9a0da7de338d327bb8082e6567f74b435e45139985251d8dfa8cd8013c2bc5 Apr 16 16:05:50.438702 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:50.438662 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-c7z5l" event={"ID":"5a2e5aa5-e21e-48f0-b556-613a38c7c168","Type":"ContainerStarted","Data":"7d9a0da7de338d327bb8082e6567f74b435e45139985251d8dfa8cd8013c2bc5"} Apr 16 16:05:50.441316 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:50.441286 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pxhmv" event={"ID":"f190f975-83f3-4c5e-977d-1e7688992f0e","Type":"ContainerStarted","Data":"f0b1d8f9862c9d47ff16f9b61160449e3882396282582912d9b78b627a7765e1"} Apr 16 16:05:51.445610 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:51.445577 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-pxhmv" event={"ID":"f190f975-83f3-4c5e-977d-1e7688992f0e","Type":"ContainerStarted","Data":"5983b01f05fe4953df85c2594c6702a5a8ccbaeecd9347e3b7ee613cb1544e1c"} Apr 16 16:05:51.464578 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:51.464529 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-pxhmv" podStartSLOduration=1.632203256 podStartE2EDuration="3.46451579s" podCreationTimestamp="2026-04-16 16:05:48 +0000 UTC" firstStartedPulling="2026-04-16 16:05:49.05586732 +0000 UTC m=+191.819492013" lastFinishedPulling="2026-04-16 16:05:50.888179842 +0000 UTC m=+193.651804547" observedRunningTime="2026-04-16 16:05:51.463394388 +0000 UTC m=+194.227019102" watchObservedRunningTime="2026-04-16 16:05:51.46451579 +0000 UTC m=+194.228140503" Apr 16 16:05:53.452662 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:53.452627 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-c7z5l" event={"ID":"5a2e5aa5-e21e-48f0-b556-613a38c7c168","Type":"ContainerStarted","Data":"64fd868b3530f0c573bda3df70e5f7bb139eb30aed62164d97480265e7bfb3ef"} Apr 16 16:05:53.472755 ip-10-0-129-182 kubenswrapper[2577]: I0416 
16:05:53.472706 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-c7z5l" podStartSLOduration=34.293607778 podStartE2EDuration="36.472692461s" podCreationTimestamp="2026-04-16 16:05:17 +0000 UTC" firstStartedPulling="2026-04-16 16:05:50.264164464 +0000 UTC m=+193.027789157" lastFinishedPulling="2026-04-16 16:05:52.443249133 +0000 UTC m=+195.206873840" observedRunningTime="2026-04-16 16:05:53.471590031 +0000 UTC m=+196.235214742" watchObservedRunningTime="2026-04-16 16:05:53.472692461 +0000 UTC m=+196.236317171" Apr 16 16:05:55.070866 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:55.070832 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-95j9v"] Apr 16 16:05:55.110101 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:55.110068 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-95j9v"] Apr 16 16:05:55.110279 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:55.110106 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-95j9v" Apr 16 16:05:55.114059 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:55.114026 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 16:05:55.114220 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:55.114071 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 16:05:55.114220 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:55.114089 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-jl8n4\"" Apr 16 16:05:55.114220 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:55.114071 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 16:05:55.261984 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:55.261940 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7e7187f1-cf40-463f-b670-90c9d4fc7c3d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-95j9v\" (UID: \"7e7187f1-cf40-463f-b670-90c9d4fc7c3d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-95j9v" Apr 16 16:05:55.262187 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:55.262000 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e7187f1-cf40-463f-b670-90c9d4fc7c3d-metrics-client-ca\") pod \"prometheus-operator-78f957474d-95j9v\" (UID: \"7e7187f1-cf40-463f-b670-90c9d4fc7c3d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-95j9v" Apr 16 16:05:55.262187 ip-10-0-129-182 kubenswrapper[2577]: 
I0416 16:05:55.262067 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7e7187f1-cf40-463f-b670-90c9d4fc7c3d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-95j9v\" (UID: \"7e7187f1-cf40-463f-b670-90c9d4fc7c3d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-95j9v" Apr 16 16:05:55.262286 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:55.262204 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdzxh\" (UniqueName: \"kubernetes.io/projected/7e7187f1-cf40-463f-b670-90c9d4fc7c3d-kube-api-access-gdzxh\") pod \"prometheus-operator-78f957474d-95j9v\" (UID: \"7e7187f1-cf40-463f-b670-90c9d4fc7c3d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-95j9v" Apr 16 16:05:55.362932 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:55.362844 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7e7187f1-cf40-463f-b670-90c9d4fc7c3d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-95j9v\" (UID: \"7e7187f1-cf40-463f-b670-90c9d4fc7c3d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-95j9v" Apr 16 16:05:55.362932 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:55.362907 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e7187f1-cf40-463f-b670-90c9d4fc7c3d-metrics-client-ca\") pod \"prometheus-operator-78f957474d-95j9v\" (UID: \"7e7187f1-cf40-463f-b670-90c9d4fc7c3d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-95j9v" Apr 16 16:05:55.363105 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:55.362943 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/7e7187f1-cf40-463f-b670-90c9d4fc7c3d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-95j9v\" (UID: \"7e7187f1-cf40-463f-b670-90c9d4fc7c3d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-95j9v" Apr 16 16:05:55.363105 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:55.363020 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdzxh\" (UniqueName: \"kubernetes.io/projected/7e7187f1-cf40-463f-b670-90c9d4fc7c3d-kube-api-access-gdzxh\") pod \"prometheus-operator-78f957474d-95j9v\" (UID: \"7e7187f1-cf40-463f-b670-90c9d4fc7c3d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-95j9v" Apr 16 16:05:55.363210 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:55.363165 2577 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 16 16:05:55.363244 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:05:55.363229 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e7187f1-cf40-463f-b670-90c9d4fc7c3d-prometheus-operator-tls podName:7e7187f1-cf40-463f-b670-90c9d4fc7c3d nodeName:}" failed. No retries permitted until 2026-04-16 16:05:55.863214158 +0000 UTC m=+198.626838853 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/7e7187f1-cf40-463f-b670-90c9d4fc7c3d-prometheus-operator-tls") pod "prometheus-operator-78f957474d-95j9v" (UID: "7e7187f1-cf40-463f-b670-90c9d4fc7c3d") : secret "prometheus-operator-tls" not found Apr 16 16:05:55.363570 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:55.363554 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e7187f1-cf40-463f-b670-90c9d4fc7c3d-metrics-client-ca\") pod \"prometheus-operator-78f957474d-95j9v\" (UID: \"7e7187f1-cf40-463f-b670-90c9d4fc7c3d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-95j9v" Apr 16 16:05:55.374833 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:55.374806 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdzxh\" (UniqueName: \"kubernetes.io/projected/7e7187f1-cf40-463f-b670-90c9d4fc7c3d-kube-api-access-gdzxh\") pod \"prometheus-operator-78f957474d-95j9v\" (UID: \"7e7187f1-cf40-463f-b670-90c9d4fc7c3d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-95j9v" Apr 16 16:05:55.375916 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:55.375891 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7e7187f1-cf40-463f-b670-90c9d4fc7c3d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-95j9v\" (UID: \"7e7187f1-cf40-463f-b670-90c9d4fc7c3d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-95j9v" Apr 16 16:05:55.866929 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:55.866894 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7e7187f1-cf40-463f-b670-90c9d4fc7c3d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-95j9v\" (UID: 
\"7e7187f1-cf40-463f-b670-90c9d4fc7c3d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-95j9v"
Apr 16 16:05:55.869467 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:55.869443 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7e7187f1-cf40-463f-b670-90c9d4fc7c3d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-95j9v\" (UID: \"7e7187f1-cf40-463f-b670-90c9d4fc7c3d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-95j9v"
Apr 16 16:05:56.019792 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:56.019755 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-95j9v"
Apr 16 16:05:56.156034 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:56.155954 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-95j9v"]
Apr 16 16:05:56.159927 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:05:56.159887 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e7187f1_cf40_463f_b670_90c9d4fc7c3d.slice/crio-936933b040add8e6f805f3f44d707865af638f8224408bbe111f7f4e07f585f1 WatchSource:0}: Error finding container 936933b040add8e6f805f3f44d707865af638f8224408bbe111f7f4e07f585f1: Status 404 returned error can't find the container with id 936933b040add8e6f805f3f44d707865af638f8224408bbe111f7f4e07f585f1
Apr 16 16:05:56.461661 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:56.461626 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-95j9v" event={"ID":"7e7187f1-cf40-463f-b670-90c9d4fc7c3d","Type":"ContainerStarted","Data":"936933b040add8e6f805f3f44d707865af638f8224408bbe111f7f4e07f585f1"}
Apr 16 16:05:58.468382 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:58.468349 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-95j9v" event={"ID":"7e7187f1-cf40-463f-b670-90c9d4fc7c3d","Type":"ContainerStarted","Data":"a768687d669fb2f3545d507f19e400f75cd07c7e2232305df915fc5b69ef52bb"}
Apr 16 16:05:59.473255 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:59.473212 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-95j9v" event={"ID":"7e7187f1-cf40-463f-b670-90c9d4fc7c3d","Type":"ContainerStarted","Data":"de61b9102a4d58765fb5660276d8dcb7749d35ec3977ea352858efe398524f07"}
Apr 16 16:05:59.491070 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:05:59.491023 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-95j9v" podStartSLOduration=2.258816743 podStartE2EDuration="4.491010028s" podCreationTimestamp="2026-04-16 16:05:55 +0000 UTC" firstStartedPulling="2026-04-16 16:05:56.161926538 +0000 UTC m=+198.925551227" lastFinishedPulling="2026-04-16 16:05:58.39411981 +0000 UTC m=+201.157744512" observedRunningTime="2026-04-16 16:05:59.48962632 +0000 UTC m=+202.253251031" watchObservedRunningTime="2026-04-16 16:05:59.491010028 +0000 UTC m=+202.254634739"
Apr 16 16:06:00.914567 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:00.914535 2577 scope.go:117] "RemoveContainer" containerID="de5dd689700975a9c20b23c23d6a5f6a38f26c309e6d3f597070b663aa349bd8"
Apr 16 16:06:00.915009 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:06:00.914711 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-2894b_openshift-console-operator(ff7fcfa4-9774-4831-b686-678a7f92a456)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" podUID="ff7fcfa4-9774-4831-b686-678a7f92a456"
Apr 16 16:06:01.469565 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.469527 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-rfpjk"]
Apr 16 16:06:01.472858 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.472835 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-rfpjk"
Apr 16 16:06:01.477316 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.477275 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-mrmjx\""
Apr 16 16:06:01.477634 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.477614 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 16 16:06:01.477805 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.477343 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 16 16:06:01.477964 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.477379 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 16 16:06:01.485348 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.485325 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-rfpjk"]
Apr 16 16:06:01.489608 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.489581 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-44r2t"]
Apr 16 16:06:01.493117 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.493098 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.495480 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.495462 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 16:06:01.496320 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.496293 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-87tn6\""
Apr 16 16:06:01.496414 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.496391 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 16:06:01.496510 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.496493 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 16:06:01.504037 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.504009 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/387cec09-a424-4de4-8906-da52b3743df9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.504174 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.504058 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7gk2\" (UniqueName: \"kubernetes.io/projected/387cec09-a424-4de4-8906-da52b3743df9-kube-api-access-t7gk2\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.504174 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.504086 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62t5v\" (UniqueName: \"kubernetes.io/projected/152001c5-8370-4271-b714-21043a493948-kube-api-access-62t5v\") pod \"kube-state-metrics-7479c89684-rfpjk\" (UID: \"152001c5-8370-4271-b714-21043a493948\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rfpjk"
Apr 16 16:06:01.504174 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.504112 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/387cec09-a424-4de4-8906-da52b3743df9-sys\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.504174 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.504152 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/152001c5-8370-4271-b714-21043a493948-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-rfpjk\" (UID: \"152001c5-8370-4271-b714-21043a493948\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rfpjk"
Apr 16 16:06:01.504438 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.504180 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/387cec09-a424-4de4-8906-da52b3743df9-node-exporter-tls\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.504438 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.504206 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/387cec09-a424-4de4-8906-da52b3743df9-metrics-client-ca\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.504438 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.504249 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/387cec09-a424-4de4-8906-da52b3743df9-node-exporter-accelerators-collector-config\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.504613 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.504566 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/152001c5-8370-4271-b714-21043a493948-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-rfpjk\" (UID: \"152001c5-8370-4271-b714-21043a493948\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rfpjk"
Apr 16 16:06:01.504613 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.504600 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/152001c5-8370-4271-b714-21043a493948-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-rfpjk\" (UID: \"152001c5-8370-4271-b714-21043a493948\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rfpjk"
Apr 16 16:06:01.504709 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.504669 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/152001c5-8370-4271-b714-21043a493948-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-rfpjk\" (UID: \"152001c5-8370-4271-b714-21043a493948\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rfpjk"
Apr 16 16:06:01.504760 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.504712 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/387cec09-a424-4de4-8906-da52b3743df9-root\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.504760 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.504740 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/152001c5-8370-4271-b714-21043a493948-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-rfpjk\" (UID: \"152001c5-8370-4271-b714-21043a493948\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rfpjk"
Apr 16 16:06:01.504860 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.504765 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/387cec09-a424-4de4-8906-da52b3743df9-node-exporter-wtmp\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.504860 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.504805 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/387cec09-a424-4de4-8906-da52b3743df9-node-exporter-textfile\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.606058 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.606018 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/387cec09-a424-4de4-8906-da52b3743df9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.606253 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.606065 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7gk2\" (UniqueName: \"kubernetes.io/projected/387cec09-a424-4de4-8906-da52b3743df9-kube-api-access-t7gk2\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.606314 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.606258 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62t5v\" (UniqueName: \"kubernetes.io/projected/152001c5-8370-4271-b714-21043a493948-kube-api-access-62t5v\") pod \"kube-state-metrics-7479c89684-rfpjk\" (UID: \"152001c5-8370-4271-b714-21043a493948\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rfpjk"
Apr 16 16:06:01.606314 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.606295 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/387cec09-a424-4de4-8906-da52b3743df9-sys\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.606478 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.606322 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/152001c5-8370-4271-b714-21043a493948-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-rfpjk\" (UID: \"152001c5-8370-4271-b714-21043a493948\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rfpjk"
Apr 16 16:06:01.606478 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.606346 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/387cec09-a424-4de4-8906-da52b3743df9-node-exporter-tls\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.606478 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.606377 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/387cec09-a424-4de4-8906-da52b3743df9-metrics-client-ca\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.606478 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.606395 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/387cec09-a424-4de4-8906-da52b3743df9-sys\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.606478 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.606403 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/387cec09-a424-4de4-8906-da52b3743df9-node-exporter-accelerators-collector-config\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.606478 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.606462 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/152001c5-8370-4271-b714-21043a493948-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-rfpjk\" (UID: \"152001c5-8370-4271-b714-21043a493948\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rfpjk"
Apr 16 16:06:01.606764 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.606492 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/152001c5-8370-4271-b714-21043a493948-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-rfpjk\" (UID: \"152001c5-8370-4271-b714-21043a493948\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rfpjk"
Apr 16 16:06:01.606764 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.606548 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/152001c5-8370-4271-b714-21043a493948-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-rfpjk\" (UID: \"152001c5-8370-4271-b714-21043a493948\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rfpjk"
Apr 16 16:06:01.606764 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.606599 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/387cec09-a424-4de4-8906-da52b3743df9-root\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.606764 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.606629 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/152001c5-8370-4271-b714-21043a493948-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-rfpjk\" (UID: \"152001c5-8370-4271-b714-21043a493948\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rfpjk"
Apr 16 16:06:01.606764 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.606655 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/387cec09-a424-4de4-8906-da52b3743df9-node-exporter-wtmp\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.606764 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.606698 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/387cec09-a424-4de4-8906-da52b3743df9-node-exporter-textfile\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.606764 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.606762 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/152001c5-8370-4271-b714-21043a493948-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-rfpjk\" (UID: \"152001c5-8370-4271-b714-21043a493948\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rfpjk"
Apr 16 16:06:01.607122 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:06:01.606492 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 16:06:01.607122 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:06:01.606876 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/387cec09-a424-4de4-8906-da52b3743df9-node-exporter-tls podName:387cec09-a424-4de4-8906-da52b3743df9 nodeName:}" failed. No retries permitted until 2026-04-16 16:06:02.106857038 +0000 UTC m=+204.870481727 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/387cec09-a424-4de4-8906-da52b3743df9-node-exporter-tls") pod "node-exporter-44r2t" (UID: "387cec09-a424-4de4-8906-da52b3743df9") : secret "node-exporter-tls" not found
Apr 16 16:06:01.607122 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.606988 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/387cec09-a424-4de4-8906-da52b3743df9-metrics-client-ca\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.607358 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.607120 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/387cec09-a424-4de4-8906-da52b3743df9-node-exporter-accelerators-collector-config\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.607358 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.607171 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/387cec09-a424-4de4-8906-da52b3743df9-node-exporter-textfile\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.607358 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.607206 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/387cec09-a424-4de4-8906-da52b3743df9-root\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.607358 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.607266 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/387cec09-a424-4de4-8906-da52b3743df9-node-exporter-wtmp\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.607358 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.607289 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/152001c5-8370-4271-b714-21043a493948-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-rfpjk\" (UID: \"152001c5-8370-4271-b714-21043a493948\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rfpjk"
Apr 16 16:06:01.607653 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.607630 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/152001c5-8370-4271-b714-21043a493948-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-rfpjk\" (UID: \"152001c5-8370-4271-b714-21043a493948\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rfpjk"
Apr 16 16:06:01.608701 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.608663 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/387cec09-a424-4de4-8906-da52b3743df9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.609443 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.609424 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/152001c5-8370-4271-b714-21043a493948-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-rfpjk\" (UID: \"152001c5-8370-4271-b714-21043a493948\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rfpjk"
Apr 16 16:06:01.609625 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.609600 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/152001c5-8370-4271-b714-21043a493948-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-rfpjk\" (UID: \"152001c5-8370-4271-b714-21043a493948\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rfpjk"
Apr 16 16:06:01.615519 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.615494 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7gk2\" (UniqueName: \"kubernetes.io/projected/387cec09-a424-4de4-8906-da52b3743df9-kube-api-access-t7gk2\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:01.616359 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.616334 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62t5v\" (UniqueName: \"kubernetes.io/projected/152001c5-8370-4271-b714-21043a493948-kube-api-access-62t5v\") pod \"kube-state-metrics-7479c89684-rfpjk\" (UID: \"152001c5-8370-4271-b714-21043a493948\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rfpjk"
Apr 16 16:06:01.783584 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.783511 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-rfpjk"
Apr 16 16:06:01.919519 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:01.919495 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-rfpjk"]
Apr 16 16:06:01.919519 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:06:01.919496 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod152001c5_8370_4271_b714_21043a493948.slice/crio-a40b9969650542a89e5e4a03886d06362a7e720c38fa38b9aa5fb6cbb950ce32 WatchSource:0}: Error finding container a40b9969650542a89e5e4a03886d06362a7e720c38fa38b9aa5fb6cbb950ce32: Status 404 returned error can't find the container with id a40b9969650542a89e5e4a03886d06362a7e720c38fa38b9aa5fb6cbb950ce32
Apr 16 16:06:02.109591 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.109495 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/387cec09-a424-4de4-8906-da52b3743df9-node-exporter-tls\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:02.111744 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.111729 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/387cec09-a424-4de4-8906-da52b3743df9-node-exporter-tls\") pod \"node-exporter-44r2t\" (UID: \"387cec09-a424-4de4-8906-da52b3743df9\") " pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:02.403067 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.402975 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-44r2t"
Apr 16 16:06:02.412163 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:06:02.412104 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod387cec09_a424_4de4_8906_da52b3743df9.slice/crio-7023d38d28687ab181bda2e67045a1d522bc09c3d710975f0461cd22c1298861 WatchSource:0}: Error finding container 7023d38d28687ab181bda2e67045a1d522bc09c3d710975f0461cd22c1298861: Status 404 returned error can't find the container with id 7023d38d28687ab181bda2e67045a1d522bc09c3d710975f0461cd22c1298861
Apr 16 16:06:02.491148 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.491080 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-44r2t" event={"ID":"387cec09-a424-4de4-8906-da52b3743df9","Type":"ContainerStarted","Data":"7023d38d28687ab181bda2e67045a1d522bc09c3d710975f0461cd22c1298861"}
Apr 16 16:06:02.492277 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.492251 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-rfpjk" event={"ID":"152001c5-8370-4271-b714-21043a493948","Type":"ContainerStarted","Data":"a40b9969650542a89e5e4a03886d06362a7e720c38fa38b9aa5fb6cbb950ce32"}
Apr 16 16:06:02.589775 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.589741 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 16:06:02.595612 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.595295 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:06:02.598541 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.598376 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 16:06:02.598541 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.598399 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 16:06:02.598541 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.598376 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 16:06:02.598973 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.598701 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 16:06:02.598973 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.598717 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 16:06:02.598973 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.598928 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 16:06:02.598973 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.598932 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 16:06:02.599278 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.599168 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 16:06:02.599649 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.599399 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-gdl9n\""
Apr 16 16:06:02.599649 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.599455 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 16:06:02.609899 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.609836 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 16:06:02.613840 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.613310 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:06:02.613840 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.613460 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:06:02.613840 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.613510 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-web-config\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:06:02.613840 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.613630 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:06:02.613840 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.613685 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:06:02.613840 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.613719 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:06:02.613840 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.613746 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:06:02.614290 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.614013 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwrfd\" (UniqueName: \"kubernetes.io/projected/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-kube-api-access-xwrfd\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:06:02.614290 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.614047 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:06:02.614290 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.614093 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-config-volume\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:06:02.614290 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.614159 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:06:02.614290 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.614210 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-config-out\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:06:02.614290 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.614235 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:06:02.714635 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.714602 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:06:02.714635 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.714646 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:06:02.714878 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.714692 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-web-config\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:06:02.714878 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.714719 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:06:02.714878 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.714768 2577 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:06:02.714878 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.714796 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:06:02.714878 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.714823 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:06:02.714878 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.714855 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwrfd\" (UniqueName: \"kubernetes.io/projected/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-kube-api-access-xwrfd\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:06:02.715177 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.714885 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:06:02.715177 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.714925 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-config-volume\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:06:02.715177 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.714965 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:06:02.715177 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.715005 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-config-out\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:06:02.715177 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.715030 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:06:02.715177 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.715147 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: 
\"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:06:02.715455 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:06:02.715440 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-alertmanager-trusted-ca-bundle podName:c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf nodeName:}" failed. No retries permitted until 2026-04-16 16:06:03.215417292 +0000 UTC m=+205.979041994 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf") : configmap references non-existent config key: ca-bundle.crt Apr 16 16:06:02.718184 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.718159 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:06:02.718662 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.718635 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:06:02.719642 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.719613 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: 
\"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:06:02.720657 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.720615 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-config-volume\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:06:02.720834 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.720787 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:06:02.721530 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.721468 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:06:02.721653 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.721638 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-web-config\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:06:02.722017 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.721902 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:06:02.722017 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.721976 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-config-out\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:06:02.723366 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.723325 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:06:02.731718 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:02.731685 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwrfd\" (UniqueName: \"kubernetes.io/projected/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-kube-api-access-xwrfd\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:06:03.220457 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:03.220422 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:06:03.221505 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:03.221478 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:06:03.496877 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:03.496839 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-rfpjk" event={"ID":"152001c5-8370-4271-b714-21043a493948","Type":"ContainerStarted","Data":"2d01364deebcc2b9d9069ed1afac6fbf6cf17c6ed9164240aca511ff77d9089e"} Apr 16 16:06:03.498352 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:03.498325 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-44r2t" event={"ID":"387cec09-a424-4de4-8906-da52b3743df9","Type":"ContainerStarted","Data":"aa74de17d30e19dfba56aa2db4efc8b7577a4fd3ff287ebf3247e1d15d38a956"} Apr 16 16:06:03.513720 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:03.513331 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:06:03.699696 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:03.699661 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:06:03.706481 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:06:03.706438 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5b1fbcf_7b9e_4bda_9ce7_1257d39791cf.slice/crio-67174a4ca56da03371edd9ec8e45f6649cab6ccd926f61d94a05baad721ae3f5 WatchSource:0}: Error finding container 67174a4ca56da03371edd9ec8e45f6649cab6ccd926f61d94a05baad721ae3f5: Status 404 returned error can't find the container with id 67174a4ca56da03371edd9ec8e45f6649cab6ccd926f61d94a05baad721ae3f5 Apr 16 16:06:04.502426 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:04.502375 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf","Type":"ContainerStarted","Data":"67174a4ca56da03371edd9ec8e45f6649cab6ccd926f61d94a05baad721ae3f5"} Apr 16 16:06:04.503874 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:04.503844 2577 generic.go:358] "Generic (PLEG): container finished" podID="387cec09-a424-4de4-8906-da52b3743df9" containerID="aa74de17d30e19dfba56aa2db4efc8b7577a4fd3ff287ebf3247e1d15d38a956" exitCode=0 Apr 16 16:06:04.503992 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:04.503964 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-44r2t" event={"ID":"387cec09-a424-4de4-8906-da52b3743df9","Type":"ContainerDied","Data":"aa74de17d30e19dfba56aa2db4efc8b7577a4fd3ff287ebf3247e1d15d38a956"} Apr 16 16:06:04.506393 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:04.506372 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-rfpjk" 
event={"ID":"152001c5-8370-4271-b714-21043a493948","Type":"ContainerStarted","Data":"9db042a2de04c9abd9b0a41d2e79ba24fc8e68e432a65bee91b0407e02ed8a5a"} Apr 16 16:06:04.506481 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:04.506401 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-rfpjk" event={"ID":"152001c5-8370-4271-b714-21043a493948","Type":"ContainerStarted","Data":"378633fda07c936d88fdc39ca78988e74ca2f194cf1a5b2e5a6d3a06eccabf21"} Apr 16 16:06:04.561828 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:04.561762 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-rfpjk" podStartSLOduration=2.113425007 podStartE2EDuration="3.561742927s" podCreationTimestamp="2026-04-16 16:06:01 +0000 UTC" firstStartedPulling="2026-04-16 16:06:01.921550877 +0000 UTC m=+204.685175571" lastFinishedPulling="2026-04-16 16:06:03.369868795 +0000 UTC m=+206.133493491" observedRunningTime="2026-04-16 16:06:04.559160175 +0000 UTC m=+207.322784887" watchObservedRunningTime="2026-04-16 16:06:04.561742927 +0000 UTC m=+207.325367639" Apr 16 16:06:05.511140 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:05.511089 2577 generic.go:358] "Generic (PLEG): container finished" podID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerID="901c9298e782c74b6b32419e0d8ee3b69246f8ae5fa1e62e3283c434f889d01b" exitCode=0 Apr 16 16:06:05.511564 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:05.511181 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf","Type":"ContainerDied","Data":"901c9298e782c74b6b32419e0d8ee3b69246f8ae5fa1e62e3283c434f889d01b"} Apr 16 16:06:05.513417 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:05.513389 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-44r2t" 
event={"ID":"387cec09-a424-4de4-8906-da52b3743df9","Type":"ContainerStarted","Data":"6154b470fa807d4382baf7f05f0921fb8741f7f5d5ac5067dbe5e428e73ed4fc"} Apr 16 16:06:05.513508 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:05.513429 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-44r2t" event={"ID":"387cec09-a424-4de4-8906-da52b3743df9","Type":"ContainerStarted","Data":"6311d8805f616e20999f0ac8e4f9cadb3f49bbb2d0f31c9438eda95a51750cd0"} Apr 16 16:06:06.202229 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:06.202169 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-44r2t" podStartSLOduration=4.24390966 podStartE2EDuration="5.202150653s" podCreationTimestamp="2026-04-16 16:06:01 +0000 UTC" firstStartedPulling="2026-04-16 16:06:02.413837417 +0000 UTC m=+205.177462113" lastFinishedPulling="2026-04-16 16:06:03.372078414 +0000 UTC m=+206.135703106" observedRunningTime="2026-04-16 16:06:05.603249603 +0000 UTC m=+208.366874334" watchObservedRunningTime="2026-04-16 16:06:06.202150653 +0000 UTC m=+208.965775362" Apr 16 16:06:06.202743 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:06.202718 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-q4vvz"] Apr 16 16:06:06.207779 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:06.207760 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-q4vvz" Apr 16 16:06:06.210100 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:06.209878 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 16:06:06.210100 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:06.209879 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-pdhcc\"" Apr 16 16:06:06.213035 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:06.212997 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-q4vvz"] Apr 16 16:06:06.249363 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:06.249323 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/725494dc-ff16-4479-a274-23d904bd29ca-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-q4vvz\" (UID: \"725494dc-ff16-4479-a274-23d904bd29ca\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-q4vvz" Apr 16 16:06:06.350520 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:06.350484 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/725494dc-ff16-4479-a274-23d904bd29ca-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-q4vvz\" (UID: \"725494dc-ff16-4479-a274-23d904bd29ca\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-q4vvz" Apr 16 16:06:06.353438 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:06.353410 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/725494dc-ff16-4479-a274-23d904bd29ca-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-q4vvz\" (UID: \"725494dc-ff16-4479-a274-23d904bd29ca\") " 
pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-q4vvz" Apr 16 16:06:06.524007 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:06.523846 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-q4vvz" Apr 16 16:06:06.947332 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:06.947311 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-q4vvz"] Apr 16 16:06:06.949155 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:06:06.949116 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod725494dc_ff16_4479_a274_23d904bd29ca.slice/crio-d3945400e362abf0e5c0bb62da32396d50d8879e2a44fc6560c39e468202e8de WatchSource:0}: Error finding container d3945400e362abf0e5c0bb62da32396d50d8879e2a44fc6560c39e468202e8de: Status 404 returned error can't find the container with id d3945400e362abf0e5c0bb62da32396d50d8879e2a44fc6560c39e468202e8de Apr 16 16:06:07.523517 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:07.523481 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf","Type":"ContainerStarted","Data":"1cbff76073ebdde51e4020d3d9d7b84cddb391d5fa61f7ea821f307ddfeedc83"} Apr 16 16:06:07.523761 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:07.523730 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf","Type":"ContainerStarted","Data":"297635f2622ddb88e58eb5994bf0040ca32369fe1fabe57674300bc875c11a60"} Apr 16 16:06:07.523761 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:07.523758 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf","Type":"ContainerStarted","Data":"5c20dbe095e5bbdcde9d530f925a94e19cb008328efb6861c7b4ac532848ab9e"} Apr 16 16:06:07.523942 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:07.523771 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf","Type":"ContainerStarted","Data":"38ea4a7b88b5f9a4f4a36d234976953dfebf4631b85da3b4e9d6958a810ce372"} Apr 16 16:06:07.523942 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:07.523786 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf","Type":"ContainerStarted","Data":"9216f21e9b713ce330de57220677d68a7458e4456d308998dab2d2f42a68166a"} Apr 16 16:06:07.524718 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:07.524689 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-q4vvz" event={"ID":"725494dc-ff16-4479-a274-23d904bd29ca","Type":"ContainerStarted","Data":"d3945400e362abf0e5c0bb62da32396d50d8879e2a44fc6560c39e468202e8de"} Apr 16 16:06:09.535889 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:09.535850 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf","Type":"ContainerStarted","Data":"4d2ab7c62090eda8d3d8c372a46518440acd13fdf4e4fab77f09ad6ea23dd9fe"} Apr 16 16:06:09.537119 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:09.537095 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-q4vvz" event={"ID":"725494dc-ff16-4479-a274-23d904bd29ca","Type":"ContainerStarted","Data":"fd42250b8d50164a64405e919aa86ef818119b8e6dcabbdafceae856a6341d23"} Apr 16 16:06:09.537339 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:09.537323 2577 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-q4vvz" Apr 16 16:06:09.542115 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:09.542088 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-q4vvz" Apr 16 16:06:09.566660 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:09.566602 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.81917688 podStartE2EDuration="7.566583229s" podCreationTimestamp="2026-04-16 16:06:02 +0000 UTC" firstStartedPulling="2026-04-16 16:06:03.710032446 +0000 UTC m=+206.473657142" lastFinishedPulling="2026-04-16 16:06:08.457438786 +0000 UTC m=+211.221063491" observedRunningTime="2026-04-16 16:06:09.564922345 +0000 UTC m=+212.328547057" watchObservedRunningTime="2026-04-16 16:06:09.566583229 +0000 UTC m=+212.330207941" Apr 16 16:06:09.581124 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:09.581064 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-q4vvz" podStartSLOduration=2.072328043 podStartE2EDuration="3.581048911s" podCreationTimestamp="2026-04-16 16:06:06 +0000 UTC" firstStartedPulling="2026-04-16 16:06:06.950899137 +0000 UTC m=+209.714523827" lastFinishedPulling="2026-04-16 16:06:08.459620003 +0000 UTC m=+211.223244695" observedRunningTime="2026-04-16 16:06:09.579853681 +0000 UTC m=+212.343478392" watchObservedRunningTime="2026-04-16 16:06:09.581048911 +0000 UTC m=+212.344673621" Apr 16 16:06:10.839015 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:10.838980 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-f7df78bf6-m4srg"] Apr 16 16:06:10.839412 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:06:10.839230 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], 
unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-f7df78bf6-m4srg" podUID="4680da5f-5f61-4016-b39c-64017ebd7fa4" Apr 16 16:06:11.543997 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:11.543964 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f7df78bf6-m4srg" Apr 16 16:06:11.552885 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:11.552859 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f7df78bf6-m4srg" Apr 16 16:06:11.602043 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:11.602006 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4680da5f-5f61-4016-b39c-64017ebd7fa4-installation-pull-secrets\") pod \"4680da5f-5f61-4016-b39c-64017ebd7fa4\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " Apr 16 16:06:11.602223 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:11.602050 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4680da5f-5f61-4016-b39c-64017ebd7fa4-trusted-ca\") pod \"4680da5f-5f61-4016-b39c-64017ebd7fa4\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " Apr 16 16:06:11.602223 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:11.602091 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-certificates\") pod \"4680da5f-5f61-4016-b39c-64017ebd7fa4\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " Apr 16 16:06:11.602223 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:11.602119 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/4680da5f-5f61-4016-b39c-64017ebd7fa4-ca-trust-extracted\") pod \"4680da5f-5f61-4016-b39c-64017ebd7fa4\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " Apr 16 16:06:11.602223 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:11.602178 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-bound-sa-token\") pod \"4680da5f-5f61-4016-b39c-64017ebd7fa4\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " Apr 16 16:06:11.602223 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:11.602218 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtmd6\" (UniqueName: \"kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-kube-api-access-dtmd6\") pod \"4680da5f-5f61-4016-b39c-64017ebd7fa4\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " Apr 16 16:06:11.602471 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:11.602282 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4680da5f-5f61-4016-b39c-64017ebd7fa4-image-registry-private-configuration\") pod \"4680da5f-5f61-4016-b39c-64017ebd7fa4\" (UID: \"4680da5f-5f61-4016-b39c-64017ebd7fa4\") " Apr 16 16:06:11.602516 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:11.602468 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4680da5f-5f61-4016-b39c-64017ebd7fa4-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4680da5f-5f61-4016-b39c-64017ebd7fa4" (UID: "4680da5f-5f61-4016-b39c-64017ebd7fa4"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:06:11.602573 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:11.602539 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4680da5f-5f61-4016-b39c-64017ebd7fa4" (UID: "4680da5f-5f61-4016-b39c-64017ebd7fa4"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:06:11.602624 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:11.602579 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4680da5f-5f61-4016-b39c-64017ebd7fa4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4680da5f-5f61-4016-b39c-64017ebd7fa4" (UID: "4680da5f-5f61-4016-b39c-64017ebd7fa4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:06:11.602679 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:11.602666 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-certificates\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\"" Apr 16 16:06:11.602730 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:11.602687 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4680da5f-5f61-4016-b39c-64017ebd7fa4-ca-trust-extracted\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\"" Apr 16 16:06:11.602730 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:11.602703 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4680da5f-5f61-4016-b39c-64017ebd7fa4-trusted-ca\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\"" Apr 16 16:06:11.604684 ip-10-0-129-182 kubenswrapper[2577]: 
I0416 16:06:11.604642 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4680da5f-5f61-4016-b39c-64017ebd7fa4-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4680da5f-5f61-4016-b39c-64017ebd7fa4" (UID: "4680da5f-5f61-4016-b39c-64017ebd7fa4"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:06:11.604804 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:11.604763 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4680da5f-5f61-4016-b39c-64017ebd7fa4-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "4680da5f-5f61-4016-b39c-64017ebd7fa4" (UID: "4680da5f-5f61-4016-b39c-64017ebd7fa4"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:06:11.604871 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:11.604819 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-kube-api-access-dtmd6" (OuterVolumeSpecName: "kube-api-access-dtmd6") pod "4680da5f-5f61-4016-b39c-64017ebd7fa4" (UID: "4680da5f-5f61-4016-b39c-64017ebd7fa4"). InnerVolumeSpecName "kube-api-access-dtmd6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:06:11.604908 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:11.604883 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4680da5f-5f61-4016-b39c-64017ebd7fa4" (UID: "4680da5f-5f61-4016-b39c-64017ebd7fa4"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:06:11.703552 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:11.703506 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-bound-sa-token\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\"" Apr 16 16:06:11.703552 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:11.703543 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dtmd6\" (UniqueName: \"kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-kube-api-access-dtmd6\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\"" Apr 16 16:06:11.703552 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:11.703557 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4680da5f-5f61-4016-b39c-64017ebd7fa4-image-registry-private-configuration\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\"" Apr 16 16:06:11.703789 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:11.703570 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4680da5f-5f61-4016-b39c-64017ebd7fa4-installation-pull-secrets\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\"" Apr 16 16:06:12.546673 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:12.546642 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-f7df78bf6-m4srg" Apr 16 16:06:12.591667 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:12.591627 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-f7df78bf6-m4srg"] Apr 16 16:06:12.593936 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:12.593878 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-f7df78bf6-m4srg"] Apr 16 16:06:12.712714 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:12.712675 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4680da5f-5f61-4016-b39c-64017ebd7fa4-registry-tls\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\"" Apr 16 16:06:13.918773 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:13.918743 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4680da5f-5f61-4016-b39c-64017ebd7fa4" path="/var/lib/kubelet/pods/4680da5f-5f61-4016-b39c-64017ebd7fa4/volumes" Apr 16 16:06:14.914532 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:14.914506 2577 scope.go:117] "RemoveContainer" containerID="de5dd689700975a9c20b23c23d6a5f6a38f26c309e6d3f597070b663aa349bd8" Apr 16 16:06:15.556125 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:15.556097 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2894b_ff7fcfa4-9774-4831-b686-678a7f92a456/console-operator/2.log" Apr 16 16:06:15.556614 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:15.556159 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" event={"ID":"ff7fcfa4-9774-4831-b686-678a7f92a456","Type":"ContainerStarted","Data":"94184467e3f9ed2b2d90351a355b4f3373ce7eabd537889c879d6ec5f7650f99"} Apr 16 16:06:15.556614 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:15.556416 2577 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" Apr 16 16:06:15.574551 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:15.574497 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" podStartSLOduration=55.534955649 podStartE2EDuration="58.574478438s" podCreationTimestamp="2026-04-16 16:05:17 +0000 UTC" firstStartedPulling="2026-04-16 16:05:18.273253189 +0000 UTC m=+161.036877878" lastFinishedPulling="2026-04-16 16:05:21.312775965 +0000 UTC m=+164.076400667" observedRunningTime="2026-04-16 16:06:15.57447028 +0000 UTC m=+218.338095011" watchObservedRunningTime="2026-04-16 16:06:15.574478438 +0000 UTC m=+218.338103154" Apr 16 16:06:15.973949 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:15.973920 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-2894b" Apr 16 16:06:16.171476 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:16.171440 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-8pkwl"] Apr 16 16:06:16.175027 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:16.175007 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-586b57c7b4-8pkwl" Apr 16 16:06:16.177540 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:16.177515 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 16:06:16.178390 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:16.178367 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 16:06:16.178599 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:16.178578 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-bn9sz\"" Apr 16 16:06:16.190602 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:16.190571 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-8pkwl"] Apr 16 16:06:16.243246 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:16.243112 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzp9h\" (UniqueName: \"kubernetes.io/projected/5d436e24-a61c-48a2-8a51-1d4972aa2081-kube-api-access-qzp9h\") pod \"downloads-586b57c7b4-8pkwl\" (UID: \"5d436e24-a61c-48a2-8a51-1d4972aa2081\") " pod="openshift-console/downloads-586b57c7b4-8pkwl" Apr 16 16:06:16.343980 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:16.343934 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzp9h\" (UniqueName: \"kubernetes.io/projected/5d436e24-a61c-48a2-8a51-1d4972aa2081-kube-api-access-qzp9h\") pod \"downloads-586b57c7b4-8pkwl\" (UID: \"5d436e24-a61c-48a2-8a51-1d4972aa2081\") " pod="openshift-console/downloads-586b57c7b4-8pkwl" Apr 16 16:06:16.352980 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:16.352946 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzp9h\" (UniqueName: 
\"kubernetes.io/projected/5d436e24-a61c-48a2-8a51-1d4972aa2081-kube-api-access-qzp9h\") pod \"downloads-586b57c7b4-8pkwl\" (UID: \"5d436e24-a61c-48a2-8a51-1d4972aa2081\") " pod="openshift-console/downloads-586b57c7b4-8pkwl" Apr 16 16:06:16.484968 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:16.484928 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-8pkwl" Apr 16 16:06:16.611173 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:16.611115 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-8pkwl"] Apr 16 16:06:16.613300 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:06:16.613277 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d436e24_a61c_48a2_8a51_1d4972aa2081.slice/crio-fbee8edf64474090707042dcaeba27ca67fc941348031d99fcff117eac5b8ecd WatchSource:0}: Error finding container fbee8edf64474090707042dcaeba27ca67fc941348031d99fcff117eac5b8ecd: Status 404 returned error can't find the container with id fbee8edf64474090707042dcaeba27ca67fc941348031d99fcff117eac5b8ecd Apr 16 16:06:17.564063 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:17.564016 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-8pkwl" event={"ID":"5d436e24-a61c-48a2-8a51-1d4972aa2081","Type":"ContainerStarted","Data":"fbee8edf64474090707042dcaeba27ca67fc941348031d99fcff117eac5b8ecd"} Apr 16 16:06:21.697042 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.697002 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-685b94fb64-j9d9j"] Apr 16 16:06:21.700875 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.700848 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-685b94fb64-j9d9j" Apr 16 16:06:21.705382 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.705352 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-r88v9\"" Apr 16 16:06:21.705618 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.705600 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 16:06:21.706215 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.706106 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 16:06:21.706215 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.706115 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 16:06:21.706215 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.706114 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 16:06:21.706426 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.706411 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 16:06:21.715802 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.715781 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-685b94fb64-j9d9j"] Apr 16 16:06:21.801408 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.801374 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd38406a-313a-4e05-beac-f73b4f576523-service-ca\") pod \"console-685b94fb64-j9d9j\" (UID: \"cd38406a-313a-4e05-beac-f73b4f576523\") " pod="openshift-console/console-685b94fb64-j9d9j" Apr 16 16:06:21.801588 ip-10-0-129-182 kubenswrapper[2577]: I0416 
16:06:21.801512 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd38406a-313a-4e05-beac-f73b4f576523-console-config\") pod \"console-685b94fb64-j9d9j\" (UID: \"cd38406a-313a-4e05-beac-f73b4f576523\") " pod="openshift-console/console-685b94fb64-j9d9j" Apr 16 16:06:21.801588 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.801541 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pmkx\" (UniqueName: \"kubernetes.io/projected/cd38406a-313a-4e05-beac-f73b4f576523-kube-api-access-9pmkx\") pod \"console-685b94fb64-j9d9j\" (UID: \"cd38406a-313a-4e05-beac-f73b4f576523\") " pod="openshift-console/console-685b94fb64-j9d9j" Apr 16 16:06:21.801588 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.801573 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd38406a-313a-4e05-beac-f73b4f576523-oauth-serving-cert\") pod \"console-685b94fb64-j9d9j\" (UID: \"cd38406a-313a-4e05-beac-f73b4f576523\") " pod="openshift-console/console-685b94fb64-j9d9j" Apr 16 16:06:21.801720 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.801639 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd38406a-313a-4e05-beac-f73b4f576523-console-oauth-config\") pod \"console-685b94fb64-j9d9j\" (UID: \"cd38406a-313a-4e05-beac-f73b4f576523\") " pod="openshift-console/console-685b94fb64-j9d9j" Apr 16 16:06:21.801720 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.801668 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd38406a-313a-4e05-beac-f73b4f576523-console-serving-cert\") pod 
\"console-685b94fb64-j9d9j\" (UID: \"cd38406a-313a-4e05-beac-f73b4f576523\") " pod="openshift-console/console-685b94fb64-j9d9j" Apr 16 16:06:21.903006 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.902971 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd38406a-313a-4e05-beac-f73b4f576523-service-ca\") pod \"console-685b94fb64-j9d9j\" (UID: \"cd38406a-313a-4e05-beac-f73b4f576523\") " pod="openshift-console/console-685b94fb64-j9d9j" Apr 16 16:06:21.903216 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.903035 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd38406a-313a-4e05-beac-f73b4f576523-console-config\") pod \"console-685b94fb64-j9d9j\" (UID: \"cd38406a-313a-4e05-beac-f73b4f576523\") " pod="openshift-console/console-685b94fb64-j9d9j" Apr 16 16:06:21.903216 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.903066 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9pmkx\" (UniqueName: \"kubernetes.io/projected/cd38406a-313a-4e05-beac-f73b4f576523-kube-api-access-9pmkx\") pod \"console-685b94fb64-j9d9j\" (UID: \"cd38406a-313a-4e05-beac-f73b4f576523\") " pod="openshift-console/console-685b94fb64-j9d9j" Apr 16 16:06:21.903216 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.903098 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd38406a-313a-4e05-beac-f73b4f576523-oauth-serving-cert\") pod \"console-685b94fb64-j9d9j\" (UID: \"cd38406a-313a-4e05-beac-f73b4f576523\") " pod="openshift-console/console-685b94fb64-j9d9j" Apr 16 16:06:21.903216 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.903121 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/cd38406a-313a-4e05-beac-f73b4f576523-console-oauth-config\") pod \"console-685b94fb64-j9d9j\" (UID: \"cd38406a-313a-4e05-beac-f73b4f576523\") " pod="openshift-console/console-685b94fb64-j9d9j" Apr 16 16:06:21.903533 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.903504 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd38406a-313a-4e05-beac-f73b4f576523-console-serving-cert\") pod \"console-685b94fb64-j9d9j\" (UID: \"cd38406a-313a-4e05-beac-f73b4f576523\") " pod="openshift-console/console-685b94fb64-j9d9j" Apr 16 16:06:21.903862 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.903827 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd38406a-313a-4e05-beac-f73b4f576523-oauth-serving-cert\") pod \"console-685b94fb64-j9d9j\" (UID: \"cd38406a-313a-4e05-beac-f73b4f576523\") " pod="openshift-console/console-685b94fb64-j9d9j" Apr 16 16:06:21.903862 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.903827 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd38406a-313a-4e05-beac-f73b4f576523-service-ca\") pod \"console-685b94fb64-j9d9j\" (UID: \"cd38406a-313a-4e05-beac-f73b4f576523\") " pod="openshift-console/console-685b94fb64-j9d9j" Apr 16 16:06:21.904023 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.903895 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd38406a-313a-4e05-beac-f73b4f576523-console-config\") pod \"console-685b94fb64-j9d9j\" (UID: \"cd38406a-313a-4e05-beac-f73b4f576523\") " pod="openshift-console/console-685b94fb64-j9d9j" Apr 16 16:06:21.906079 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.906057 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd38406a-313a-4e05-beac-f73b4f576523-console-oauth-config\") pod \"console-685b94fb64-j9d9j\" (UID: \"cd38406a-313a-4e05-beac-f73b4f576523\") " pod="openshift-console/console-685b94fb64-j9d9j" Apr 16 16:06:21.906206 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.906177 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd38406a-313a-4e05-beac-f73b4f576523-console-serving-cert\") pod \"console-685b94fb64-j9d9j\" (UID: \"cd38406a-313a-4e05-beac-f73b4f576523\") " pod="openshift-console/console-685b94fb64-j9d9j" Apr 16 16:06:21.914590 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:21.914542 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pmkx\" (UniqueName: \"kubernetes.io/projected/cd38406a-313a-4e05-beac-f73b4f576523-kube-api-access-9pmkx\") pod \"console-685b94fb64-j9d9j\" (UID: \"cd38406a-313a-4e05-beac-f73b4f576523\") " pod="openshift-console/console-685b94fb64-j9d9j" Apr 16 16:06:22.012411 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:22.012309 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-685b94fb64-j9d9j" Apr 16 16:06:22.153709 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:22.153680 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-685b94fb64-j9d9j"] Apr 16 16:06:22.156253 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:06:22.156204 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd38406a_313a_4e05_beac_f73b4f576523.slice/crio-5e5b69f461510720b7f4885aa3106ae81d454a33a8ae804b85dafbf111bd3776 WatchSource:0}: Error finding container 5e5b69f461510720b7f4885aa3106ae81d454a33a8ae804b85dafbf111bd3776: Status 404 returned error can't find the container with id 5e5b69f461510720b7f4885aa3106ae81d454a33a8ae804b85dafbf111bd3776 Apr 16 16:06:22.579784 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:22.579741 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-685b94fb64-j9d9j" event={"ID":"cd38406a-313a-4e05-beac-f73b4f576523","Type":"ContainerStarted","Data":"5e5b69f461510720b7f4885aa3106ae81d454a33a8ae804b85dafbf111bd3776"} Apr 16 16:06:32.615879 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:32.615795 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-685b94fb64-j9d9j" event={"ID":"cd38406a-313a-4e05-beac-f73b4f576523","Type":"ContainerStarted","Data":"50ef10a361e22c4fdd0337a393f542999e3e46608fb3348c05b88112b792dde4"} Apr 16 16:06:32.617457 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:32.617415 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-8pkwl" event={"ID":"5d436e24-a61c-48a2-8a51-1d4972aa2081","Type":"ContainerStarted","Data":"2f1dc051913cd9f0923e5f1b95e72c59e23874d8ca37d343cc8c6261e72ca01b"} Apr 16 16:06:32.617632 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:32.617616 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console/downloads-586b57c7b4-8pkwl" Apr 16 16:06:32.629306 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:32.629277 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-8pkwl" Apr 16 16:06:32.637345 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:32.637298 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-685b94fb64-j9d9j" podStartSLOduration=1.44681279 podStartE2EDuration="11.637284915s" podCreationTimestamp="2026-04-16 16:06:21 +0000 UTC" firstStartedPulling="2026-04-16 16:06:22.158957065 +0000 UTC m=+224.922581754" lastFinishedPulling="2026-04-16 16:06:32.349429189 +0000 UTC m=+235.113053879" observedRunningTime="2026-04-16 16:06:32.635727282 +0000 UTC m=+235.399352010" watchObservedRunningTime="2026-04-16 16:06:32.637284915 +0000 UTC m=+235.400909625" Apr 16 16:06:32.655113 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:32.655058 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-8pkwl" podStartSLOduration=0.877289278 podStartE2EDuration="16.655039037s" podCreationTimestamp="2026-04-16 16:06:16 +0000 UTC" firstStartedPulling="2026-04-16 16:06:16.615174503 +0000 UTC m=+219.378799205" lastFinishedPulling="2026-04-16 16:06:32.392924261 +0000 UTC m=+235.156548964" observedRunningTime="2026-04-16 16:06:32.653419648 +0000 UTC m=+235.417044360" watchObservedRunningTime="2026-04-16 16:06:32.655039037 +0000 UTC m=+235.418663749" Apr 16 16:06:42.012502 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:42.012460 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-685b94fb64-j9d9j" Apr 16 16:06:42.012502 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:42.012513 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-685b94fb64-j9d9j" Apr 16 16:06:42.017010 
ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:42.016987 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-685b94fb64-j9d9j" Apr 16 16:06:42.077226 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:42.077191 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-685b94fb64-j9d9j"] Apr 16 16:06:42.653600 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:42.653571 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-685b94fb64-j9d9j" Apr 16 16:06:49.657168 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:49.657102 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs\") pod \"network-metrics-daemon-2b9mp\" (UID: \"64649692-472e-4f06-9640-7e6075d1e84f\") " pod="openshift-multus/network-metrics-daemon-2b9mp" Apr 16 16:06:49.659396 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:49.659374 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/64649692-472e-4f06-9640-7e6075d1e84f-metrics-certs\") pod \"network-metrics-daemon-2b9mp\" (UID: \"64649692-472e-4f06-9640-7e6075d1e84f\") " pod="openshift-multus/network-metrics-daemon-2b9mp" Apr 16 16:06:49.718540 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:49.718503 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vp75c\"" Apr 16 16:06:49.726613 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:49.726589 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2b9mp" Apr 16 16:06:49.852728 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:49.852693 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2b9mp"] Apr 16 16:06:49.856294 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:06:49.856264 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64649692_472e_4f06_9640_7e6075d1e84f.slice/crio-4c91207a5a901ccf50f2ea781772696d01190abaafc443625557e2899854ca92 WatchSource:0}: Error finding container 4c91207a5a901ccf50f2ea781772696d01190abaafc443625557e2899854ca92: Status 404 returned error can't find the container with id 4c91207a5a901ccf50f2ea781772696d01190abaafc443625557e2899854ca92 Apr 16 16:06:50.673811 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:50.673765 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2b9mp" event={"ID":"64649692-472e-4f06-9640-7e6075d1e84f","Type":"ContainerStarted","Data":"4c91207a5a901ccf50f2ea781772696d01190abaafc443625557e2899854ca92"} Apr 16 16:06:51.678507 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:51.678469 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2b9mp" event={"ID":"64649692-472e-4f06-9640-7e6075d1e84f","Type":"ContainerStarted","Data":"74716e5bcd3e641741315c3d02391c4c090bb7d85a4eac060b77cbd00d174af8"} Apr 16 16:06:51.678507 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:51.678514 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2b9mp" event={"ID":"64649692-472e-4f06-9640-7e6075d1e84f","Type":"ContainerStarted","Data":"76ebd6ad01ddc5c2669075bfe7f9ecc358ee3a5ab28ddcade5f6bffb485aa8da"} Apr 16 16:06:51.679791 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:51.679766 2577 generic.go:358] "Generic (PLEG): container finished" 
podID="0d70c29e-9509-41d0-b800-db4a02a3db76" containerID="6a99697eac19f46cfe942820d7951881d331dc16d5ada846c0821eac0bf7c2fa" exitCode=0 Apr 16 16:06:51.679902 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:51.679825 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-8n252" event={"ID":"0d70c29e-9509-41d0-b800-db4a02a3db76","Type":"ContainerDied","Data":"6a99697eac19f46cfe942820d7951881d331dc16d5ada846c0821eac0bf7c2fa"} Apr 16 16:06:51.680096 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:51.680085 2577 scope.go:117] "RemoveContainer" containerID="6a99697eac19f46cfe942820d7951881d331dc16d5ada846c0821eac0bf7c2fa" Apr 16 16:06:51.697008 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:51.696951 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2b9mp" podStartSLOduration=252.251652798 podStartE2EDuration="4m13.696933222s" podCreationTimestamp="2026-04-16 16:02:38 +0000 UTC" firstStartedPulling="2026-04-16 16:06:49.858678017 +0000 UTC m=+252.622302706" lastFinishedPulling="2026-04-16 16:06:51.30395844 +0000 UTC m=+254.067583130" observedRunningTime="2026-04-16 16:06:51.695903908 +0000 UTC m=+254.459528618" watchObservedRunningTime="2026-04-16 16:06:51.696933222 +0000 UTC m=+254.460557937" Apr 16 16:06:52.684235 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:52.684202 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-8n252" event={"ID":"0d70c29e-9509-41d0-b800-db4a02a3db76","Type":"ContainerStarted","Data":"b7424fc9abeee5d4dac11bd58a610364c335f44d599c9e19913a88939595c299"} Apr 16 16:06:57.704994 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:57.704961 2577 generic.go:358] "Generic (PLEG): container finished" podID="106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb" containerID="0257749a7b8925668df6e1d6012a4a73795165ffa5fe790e1393491a2f3e80a4" exitCode=0 Apr 16 16:06:57.705393 ip-10-0-129-182 
kubenswrapper[2577]: I0416 16:06:57.705035 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vfg5m" event={"ID":"106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb","Type":"ContainerDied","Data":"0257749a7b8925668df6e1d6012a4a73795165ffa5fe790e1393491a2f3e80a4"}
Apr 16 16:06:57.705393 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:57.705378 2577 scope.go:117] "RemoveContainer" containerID="0257749a7b8925668df6e1d6012a4a73795165ffa5fe790e1393491a2f3e80a4"
Apr 16 16:06:58.709686 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:06:58.709631 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-vfg5m" event={"ID":"106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb","Type":"ContainerStarted","Data":"5c7aeceec74599be36f5b3da2eb1024df65979d7ed648b302031932a0bffd6a9"}
Apr 16 16:07:08.671029 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:08.670987 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-685b94fb64-j9d9j" podUID="cd38406a-313a-4e05-beac-f73b4f576523" containerName="console" containerID="cri-o://50ef10a361e22c4fdd0337a393f542999e3e46608fb3348c05b88112b792dde4" gracePeriod=15
Apr 16 16:07:08.935184 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:08.935160 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-685b94fb64-j9d9j_cd38406a-313a-4e05-beac-f73b4f576523/console/0.log"
Apr 16 16:07:08.935306 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:08.935219 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-685b94fb64-j9d9j"
Apr 16 16:07:09.034734 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.034696 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd38406a-313a-4e05-beac-f73b4f576523-console-oauth-config\") pod \"cd38406a-313a-4e05-beac-f73b4f576523\" (UID: \"cd38406a-313a-4e05-beac-f73b4f576523\") "
Apr 16 16:07:09.034734 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.034735 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd38406a-313a-4e05-beac-f73b4f576523-console-serving-cert\") pod \"cd38406a-313a-4e05-beac-f73b4f576523\" (UID: \"cd38406a-313a-4e05-beac-f73b4f576523\") "
Apr 16 16:07:09.034974 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.034793 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd38406a-313a-4e05-beac-f73b4f576523-console-config\") pod \"cd38406a-313a-4e05-beac-f73b4f576523\" (UID: \"cd38406a-313a-4e05-beac-f73b4f576523\") "
Apr 16 16:07:09.034974 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.034835 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd38406a-313a-4e05-beac-f73b4f576523-oauth-serving-cert\") pod \"cd38406a-313a-4e05-beac-f73b4f576523\" (UID: \"cd38406a-313a-4e05-beac-f73b4f576523\") "
Apr 16 16:07:09.034974 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.034858 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pmkx\" (UniqueName: \"kubernetes.io/projected/cd38406a-313a-4e05-beac-f73b4f576523-kube-api-access-9pmkx\") pod \"cd38406a-313a-4e05-beac-f73b4f576523\" (UID: \"cd38406a-313a-4e05-beac-f73b4f576523\") "
Apr 16 16:07:09.035156 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.035006 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd38406a-313a-4e05-beac-f73b4f576523-service-ca\") pod \"cd38406a-313a-4e05-beac-f73b4f576523\" (UID: \"cd38406a-313a-4e05-beac-f73b4f576523\") "
Apr 16 16:07:09.035325 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.035299 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd38406a-313a-4e05-beac-f73b4f576523-console-config" (OuterVolumeSpecName: "console-config") pod "cd38406a-313a-4e05-beac-f73b4f576523" (UID: "cd38406a-313a-4e05-beac-f73b4f576523"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:07:09.035404 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.035331 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd38406a-313a-4e05-beac-f73b4f576523-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "cd38406a-313a-4e05-beac-f73b4f576523" (UID: "cd38406a-313a-4e05-beac-f73b4f576523"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:07:09.035440 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.035415 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd38406a-313a-4e05-beac-f73b4f576523-service-ca" (OuterVolumeSpecName: "service-ca") pod "cd38406a-313a-4e05-beac-f73b4f576523" (UID: "cd38406a-313a-4e05-beac-f73b4f576523"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:07:09.037033 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.037010 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd38406a-313a-4e05-beac-f73b4f576523-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "cd38406a-313a-4e05-beac-f73b4f576523" (UID: "cd38406a-313a-4e05-beac-f73b4f576523"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:07:09.037123 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.037073 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd38406a-313a-4e05-beac-f73b4f576523-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "cd38406a-313a-4e05-beac-f73b4f576523" (UID: "cd38406a-313a-4e05-beac-f73b4f576523"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:07:09.037123 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.037083 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd38406a-313a-4e05-beac-f73b4f576523-kube-api-access-9pmkx" (OuterVolumeSpecName: "kube-api-access-9pmkx") pod "cd38406a-313a-4e05-beac-f73b4f576523" (UID: "cd38406a-313a-4e05-beac-f73b4f576523"). InnerVolumeSpecName "kube-api-access-9pmkx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:07:09.135966 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.135924 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cd38406a-313a-4e05-beac-f73b4f576523-console-oauth-config\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:07:09.135966 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.135959 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd38406a-313a-4e05-beac-f73b4f576523-console-serving-cert\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:07:09.135966 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.135970 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cd38406a-313a-4e05-beac-f73b4f576523-console-config\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:07:09.136240 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.135980 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cd38406a-313a-4e05-beac-f73b4f576523-oauth-serving-cert\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:07:09.136240 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.135989 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9pmkx\" (UniqueName: \"kubernetes.io/projected/cd38406a-313a-4e05-beac-f73b4f576523-kube-api-access-9pmkx\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:07:09.136240 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.135998 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cd38406a-313a-4e05-beac-f73b4f576523-service-ca\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:07:09.743264 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.743232 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-685b94fb64-j9d9j_cd38406a-313a-4e05-beac-f73b4f576523/console/0.log"
Apr 16 16:07:09.743665 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.743272 2577 generic.go:358] "Generic (PLEG): container finished" podID="cd38406a-313a-4e05-beac-f73b4f576523" containerID="50ef10a361e22c4fdd0337a393f542999e3e46608fb3348c05b88112b792dde4" exitCode=2
Apr 16 16:07:09.743665 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.743300 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-685b94fb64-j9d9j" event={"ID":"cd38406a-313a-4e05-beac-f73b4f576523","Type":"ContainerDied","Data":"50ef10a361e22c4fdd0337a393f542999e3e46608fb3348c05b88112b792dde4"}
Apr 16 16:07:09.743665 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.743322 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-685b94fb64-j9d9j" event={"ID":"cd38406a-313a-4e05-beac-f73b4f576523","Type":"ContainerDied","Data":"5e5b69f461510720b7f4885aa3106ae81d454a33a8ae804b85dafbf111bd3776"}
Apr 16 16:07:09.743665 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.743337 2577 scope.go:117] "RemoveContainer" containerID="50ef10a361e22c4fdd0337a393f542999e3e46608fb3348c05b88112b792dde4"
Apr 16 16:07:09.743665 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.743352 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-685b94fb64-j9d9j"
Apr 16 16:07:09.759484 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.759464 2577 scope.go:117] "RemoveContainer" containerID="50ef10a361e22c4fdd0337a393f542999e3e46608fb3348c05b88112b792dde4"
Apr 16 16:07:09.759763 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:07:09.759742 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50ef10a361e22c4fdd0337a393f542999e3e46608fb3348c05b88112b792dde4\": container with ID starting with 50ef10a361e22c4fdd0337a393f542999e3e46608fb3348c05b88112b792dde4 not found: ID does not exist" containerID="50ef10a361e22c4fdd0337a393f542999e3e46608fb3348c05b88112b792dde4"
Apr 16 16:07:09.759814 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.759772 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50ef10a361e22c4fdd0337a393f542999e3e46608fb3348c05b88112b792dde4"} err="failed to get container status \"50ef10a361e22c4fdd0337a393f542999e3e46608fb3348c05b88112b792dde4\": rpc error: code = NotFound desc = could not find container \"50ef10a361e22c4fdd0337a393f542999e3e46608fb3348c05b88112b792dde4\": container with ID starting with 50ef10a361e22c4fdd0337a393f542999e3e46608fb3348c05b88112b792dde4 not found: ID does not exist"
Apr 16 16:07:09.766401 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.766377 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-685b94fb64-j9d9j"]
Apr 16 16:07:09.769477 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.769455 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-685b94fb64-j9d9j"]
Apr 16 16:07:09.919583 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:09.919549 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd38406a-313a-4e05-beac-f73b4f576523" path="/var/lib/kubelet/pods/cd38406a-313a-4e05-beac-f73b4f576523/volumes"
Apr 16 16:07:17.312954 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:07:17.312909 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-2fk94" podUID="98c6de58-d4f8-4f67-ab26-d582517a2717"
Apr 16 16:07:17.766946 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:17.766916 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2fk94"
Apr 16 16:07:20.731641 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:20.731610 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert\") pod \"ingress-canary-ckdqq\" (UID: \"24ed6202-e738-4abc-b26d-eec84a76b75b\") " pod="openshift-ingress-canary/ingress-canary-ckdqq"
Apr 16 16:07:20.732027 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:20.731780 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls\") pod \"dns-default-2fk94\" (UID: \"98c6de58-d4f8-4f67-ab26-d582517a2717\") " pod="openshift-dns/dns-default-2fk94"
Apr 16 16:07:20.734716 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:20.734682 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/98c6de58-d4f8-4f67-ab26-d582517a2717-metrics-tls\") pod \"dns-default-2fk94\" (UID: \"98c6de58-d4f8-4f67-ab26-d582517a2717\") " pod="openshift-dns/dns-default-2fk94"
Apr 16 16:07:20.734848 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:20.734727 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/24ed6202-e738-4abc-b26d-eec84a76b75b-cert\") pod \"ingress-canary-ckdqq\" (UID: \"24ed6202-e738-4abc-b26d-eec84a76b75b\") " pod="openshift-ingress-canary/ingress-canary-ckdqq"
Apr 16 16:07:20.770601 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:20.770553 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-v9cqx\""
Apr 16 16:07:20.778602 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:20.778572 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2fk94"
Apr 16 16:07:20.918322 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:20.918295 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wjzs4\""
Apr 16 16:07:20.926392 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:20.926369 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ckdqq"
Apr 16 16:07:20.939562 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:20.939528 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2fk94"]
Apr 16 16:07:20.946506 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:07:20.946465 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98c6de58_d4f8_4f67_ab26_d582517a2717.slice/crio-215178e0e8ee583a55f880b9a8296c32cbb7e7bd0256cd2d50c2c52023a35079 WatchSource:0}: Error finding container 215178e0e8ee583a55f880b9a8296c32cbb7e7bd0256cd2d50c2c52023a35079: Status 404 returned error can't find the container with id 215178e0e8ee583a55f880b9a8296c32cbb7e7bd0256cd2d50c2c52023a35079
Apr 16 16:07:21.071865 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:21.071816 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ckdqq"]
Apr 16 16:07:21.076116 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:07:21.076092 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24ed6202_e738_4abc_b26d_eec84a76b75b.slice/crio-89b6c6e6a69da17d04c3b63e6eb6794f969ceffb689d6f2ba320e3009367e558 WatchSource:0}: Error finding container 89b6c6e6a69da17d04c3b63e6eb6794f969ceffb689d6f2ba320e3009367e558: Status 404 returned error can't find the container with id 89b6c6e6a69da17d04c3b63e6eb6794f969ceffb689d6f2ba320e3009367e558
Apr 16 16:07:21.780741 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:21.780697 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2fk94" event={"ID":"98c6de58-d4f8-4f67-ab26-d582517a2717","Type":"ContainerStarted","Data":"215178e0e8ee583a55f880b9a8296c32cbb7e7bd0256cd2d50c2c52023a35079"}
Apr 16 16:07:21.781859 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:21.781834 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ckdqq" event={"ID":"24ed6202-e738-4abc-b26d-eec84a76b75b","Type":"ContainerStarted","Data":"89b6c6e6a69da17d04c3b63e6eb6794f969ceffb689d6f2ba320e3009367e558"}
Apr 16 16:07:21.858187 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:21.858146 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 16:07:21.858764 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:21.858707 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerName="alertmanager" containerID="cri-o://9216f21e9b713ce330de57220677d68a7458e4456d308998dab2d2f42a68166a" gracePeriod=120
Apr 16 16:07:21.859416 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:21.858985 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerName="kube-rbac-proxy" containerID="cri-o://297635f2622ddb88e58eb5994bf0040ca32369fe1fabe57674300bc875c11a60" gracePeriod=120
Apr 16 16:07:21.859416 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:21.859035 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerName="prom-label-proxy" containerID="cri-o://4d2ab7c62090eda8d3d8c372a46518440acd13fdf4e4fab77f09ad6ea23dd9fe" gracePeriod=120
Apr 16 16:07:21.859416 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:21.859063 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerName="kube-rbac-proxy-web" containerID="cri-o://5c20dbe095e5bbdcde9d530f925a94e19cb008328efb6861c7b4ac532848ab9e" gracePeriod=120
Apr 16 16:07:21.859416 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:21.859103 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerName="kube-rbac-proxy-metric" containerID="cri-o://1cbff76073ebdde51e4020d3d9d7b84cddb391d5fa61f7ea821f307ddfeedc83" gracePeriod=120
Apr 16 16:07:21.859416 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:21.859118 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerName="config-reloader" containerID="cri-o://38ea4a7b88b5f9a4f4a36d234976953dfebf4631b85da3b4e9d6958a810ce372" gracePeriod=120
Apr 16 16:07:22.789816 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:22.789778 2577 generic.go:358] "Generic (PLEG): container finished" podID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerID="4d2ab7c62090eda8d3d8c372a46518440acd13fdf4e4fab77f09ad6ea23dd9fe" exitCode=0
Apr 16 16:07:22.789816 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:22.789809 2577 generic.go:358] "Generic (PLEG): container finished" podID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerID="1cbff76073ebdde51e4020d3d9d7b84cddb391d5fa61f7ea821f307ddfeedc83" exitCode=0
Apr 16 16:07:22.789816 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:22.789815 2577 generic.go:358] "Generic (PLEG): container finished" podID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerID="297635f2622ddb88e58eb5994bf0040ca32369fe1fabe57674300bc875c11a60" exitCode=0
Apr 16 16:07:22.789816 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:22.789821 2577 generic.go:358] "Generic (PLEG): container finished" podID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerID="38ea4a7b88b5f9a4f4a36d234976953dfebf4631b85da3b4e9d6958a810ce372" exitCode=0
Apr 16 16:07:22.790413 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:22.789827 2577 generic.go:358] "Generic (PLEG): container finished" podID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerID="9216f21e9b713ce330de57220677d68a7458e4456d308998dab2d2f42a68166a" exitCode=0
Apr 16 16:07:22.790413 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:22.789819 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf","Type":"ContainerDied","Data":"4d2ab7c62090eda8d3d8c372a46518440acd13fdf4e4fab77f09ad6ea23dd9fe"}
Apr 16 16:07:22.790413 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:22.789864 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf","Type":"ContainerDied","Data":"1cbff76073ebdde51e4020d3d9d7b84cddb391d5fa61f7ea821f307ddfeedc83"}
Apr 16 16:07:22.790413 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:22.789882 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf","Type":"ContainerDied","Data":"297635f2622ddb88e58eb5994bf0040ca32369fe1fabe57674300bc875c11a60"}
Apr 16 16:07:22.790413 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:22.789897 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf","Type":"ContainerDied","Data":"38ea4a7b88b5f9a4f4a36d234976953dfebf4631b85da3b4e9d6958a810ce372"}
Apr 16 16:07:22.790413 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:22.789910 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf","Type":"ContainerDied","Data":"9216f21e9b713ce330de57220677d68a7458e4456d308998dab2d2f42a68166a"}
Apr 16 16:07:23.186996 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.186948 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:07:23.255230 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.254956 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-secret-alertmanager-main-tls\") pod \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") "
Apr 16 16:07:23.255230 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.255008 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-alertmanager-main-db\") pod \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") "
Apr 16 16:07:23.255230 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.255047 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-tls-assets\") pod \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") "
Apr 16 16:07:23.255230 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.255080 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-secret-alertmanager-kube-rbac-proxy-metric\") pod \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") "
Apr 16 16:07:23.255569 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.255537 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-alertmanager-trusted-ca-bundle\") pod \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") "
Apr 16 16:07:23.255629 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.255583 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-metrics-client-ca\") pod \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") "
Apr 16 16:07:23.255629 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.255608 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" (UID: "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:07:23.255629 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.255619 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-config-volume\") pod \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") "
Apr 16 16:07:23.255781 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.255657 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-secret-alertmanager-kube-rbac-proxy\") pod \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") "
Apr 16 16:07:23.255781 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.255682 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-cluster-tls-config\") pod \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") "
Apr 16 16:07:23.255781 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.255738 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-secret-alertmanager-kube-rbac-proxy-web\") pod \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") "
Apr 16 16:07:23.255781 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.255773 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwrfd\" (UniqueName: \"kubernetes.io/projected/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-kube-api-access-xwrfd\") pod \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") "
Apr 16 16:07:23.255961 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.255807 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-web-config\") pod \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") "
Apr 16 16:07:23.255961 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.255840 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-config-out\") pod \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\" (UID: \"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf\") "
Apr 16 16:07:23.255961 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.255917 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" (UID: "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:07:23.256154 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.256113 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-alertmanager-main-db\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:07:23.256248 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.256152 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:07:23.257902 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.257860 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" (UID: "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:07:23.264173 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.263933 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" (UID: "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:07:23.264173 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.263948 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-kube-api-access-xwrfd" (OuterVolumeSpecName: "kube-api-access-xwrfd") pod "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" (UID: "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf"). InnerVolumeSpecName "kube-api-access-xwrfd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:07:23.264173 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.264025 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" (UID: "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:07:23.264173 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.264030 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" (UID: "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:07:23.264173 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.264075 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" (UID: "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:07:23.265499 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.265440 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" (UID: "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:07:23.265636 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.265611 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-config-out" (OuterVolumeSpecName: "config-out") pod "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" (UID: "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:07:23.266836 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.266798 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-config-volume" (OuterVolumeSpecName: "config-volume") pod "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" (UID: "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:07:23.270297 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.270265 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" (UID: "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:07:23.277223 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.277190 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-web-config" (OuterVolumeSpecName: "web-config") pod "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" (UID: "c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:07:23.356907 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.356868 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:07:23.356907 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.356899 2577 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-cluster-tls-config\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:07:23.356907 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.356911 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:07:23.357156 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.356920 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xwrfd\" (UniqueName: \"kubernetes.io/projected/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-kube-api-access-xwrfd\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:07:23.357156 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.356953 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-web-config\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:07:23.357156 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.356962 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-config-out\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:07:23.357156 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.356971 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-secret-alertmanager-main-tls\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:07:23.357156 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.356979 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-tls-assets\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:07:23.357156 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.356988 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:07:23.357156 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.356997 2577 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-metrics-client-ca\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:07:23.357156 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.357006 2577 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf-config-volume\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:07:23.794601 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.794564 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2fk94" event={"ID":"98c6de58-d4f8-4f67-ab26-d582517a2717","Type":"ContainerStarted","Data":"6f14b57374461e347409614096fd3a8a384a58ffb240baa7b1f7a393c3de5a44"}
Apr 16 16:07:23.794601 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.794607 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2fk94" event={"ID":"98c6de58-d4f8-4f67-ab26-d582517a2717","Type":"ContainerStarted","Data":"ac75ca007b4631f7c2cac7a426b7ccc41560badb10a735e3d71c5e63e6a74b3c"}
Apr 16 16:07:23.795068 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.794694 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-2fk94"
Apr 16 16:07:23.795929 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.795911 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ckdqq" event={"ID":"24ed6202-e738-4abc-b26d-eec84a76b75b","Type":"ContainerStarted","Data":"ad1ccbaa20b8c0af25d4f5cd26e4948419e16f109a35568f8eed0c419eb32437"}
Apr 16 16:07:23.798839 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.798815 2577 generic.go:358] "Generic (PLEG): container finished" podID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerID="5c20dbe095e5bbdcde9d530f925a94e19cb008328efb6861c7b4ac532848ab9e" exitCode=0
Apr 16 16:07:23.798958 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.798862 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf","Type":"ContainerDied","Data":"5c20dbe095e5bbdcde9d530f925a94e19cb008328efb6861c7b4ac532848ab9e"}
Apr 16 16:07:23.798958 ip-10-0-129-182 kubenswrapper[2577]:
I0416 16:07:23.798879 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf","Type":"ContainerDied","Data":"67174a4ca56da03371edd9ec8e45f6649cab6ccd926f61d94a05baad721ae3f5"} Apr 16 16:07:23.798958 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.798897 2577 scope.go:117] "RemoveContainer" containerID="4d2ab7c62090eda8d3d8c372a46518440acd13fdf4e4fab77f09ad6ea23dd9fe" Apr 16 16:07:23.798958 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.798919 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:23.806333 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.806311 2577 scope.go:117] "RemoveContainer" containerID="1cbff76073ebdde51e4020d3d9d7b84cddb391d5fa61f7ea821f307ddfeedc83" Apr 16 16:07:23.812879 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.812836 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-2fk94" podStartSLOduration=251.717871349 podStartE2EDuration="4m13.812822976s" podCreationTimestamp="2026-04-16 16:03:10 +0000 UTC" firstStartedPulling="2026-04-16 16:07:20.948393037 +0000 UTC m=+283.712017726" lastFinishedPulling="2026-04-16 16:07:23.043344664 +0000 UTC m=+285.806969353" observedRunningTime="2026-04-16 16:07:23.811175145 +0000 UTC m=+286.574799847" watchObservedRunningTime="2026-04-16 16:07:23.812822976 +0000 UTC m=+286.576447687" Apr 16 16:07:23.813729 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.813715 2577 scope.go:117] "RemoveContainer" containerID="297635f2622ddb88e58eb5994bf0040ca32369fe1fabe57674300bc875c11a60" Apr 16 16:07:23.819970 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.819954 2577 scope.go:117] "RemoveContainer" containerID="5c20dbe095e5bbdcde9d530f925a94e19cb008328efb6861c7b4ac532848ab9e" Apr 16 16:07:23.826666 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.826648 
2577 scope.go:117] "RemoveContainer" containerID="38ea4a7b88b5f9a4f4a36d234976953dfebf4631b85da3b4e9d6958a810ce372" Apr 16 16:07:23.832117 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.832074 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ckdqq" podStartSLOduration=251.858886326 podStartE2EDuration="4m13.832063081s" podCreationTimestamp="2026-04-16 16:03:10 +0000 UTC" firstStartedPulling="2026-04-16 16:07:21.078114486 +0000 UTC m=+283.841739176" lastFinishedPulling="2026-04-16 16:07:23.051291226 +0000 UTC m=+285.814915931" observedRunningTime="2026-04-16 16:07:23.830800828 +0000 UTC m=+286.594425578" watchObservedRunningTime="2026-04-16 16:07:23.832063081 +0000 UTC m=+286.595687806" Apr 16 16:07:23.833589 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.833571 2577 scope.go:117] "RemoveContainer" containerID="9216f21e9b713ce330de57220677d68a7458e4456d308998dab2d2f42a68166a" Apr 16 16:07:23.840920 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.840899 2577 scope.go:117] "RemoveContainer" containerID="901c9298e782c74b6b32419e0d8ee3b69246f8ae5fa1e62e3283c434f889d01b" Apr 16 16:07:23.848319 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.848267 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:07:23.849322 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.849287 2577 scope.go:117] "RemoveContainer" containerID="4d2ab7c62090eda8d3d8c372a46518440acd13fdf4e4fab77f09ad6ea23dd9fe" Apr 16 16:07:23.849673 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:07:23.849633 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d2ab7c62090eda8d3d8c372a46518440acd13fdf4e4fab77f09ad6ea23dd9fe\": container with ID starting with 4d2ab7c62090eda8d3d8c372a46518440acd13fdf4e4fab77f09ad6ea23dd9fe not found: ID does not exist" 
containerID="4d2ab7c62090eda8d3d8c372a46518440acd13fdf4e4fab77f09ad6ea23dd9fe" Apr 16 16:07:23.849774 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.849680 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d2ab7c62090eda8d3d8c372a46518440acd13fdf4e4fab77f09ad6ea23dd9fe"} err="failed to get container status \"4d2ab7c62090eda8d3d8c372a46518440acd13fdf4e4fab77f09ad6ea23dd9fe\": rpc error: code = NotFound desc = could not find container \"4d2ab7c62090eda8d3d8c372a46518440acd13fdf4e4fab77f09ad6ea23dd9fe\": container with ID starting with 4d2ab7c62090eda8d3d8c372a46518440acd13fdf4e4fab77f09ad6ea23dd9fe not found: ID does not exist" Apr 16 16:07:23.849774 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.849703 2577 scope.go:117] "RemoveContainer" containerID="1cbff76073ebdde51e4020d3d9d7b84cddb391d5fa61f7ea821f307ddfeedc83" Apr 16 16:07:23.850186 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:07:23.850040 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cbff76073ebdde51e4020d3d9d7b84cddb391d5fa61f7ea821f307ddfeedc83\": container with ID starting with 1cbff76073ebdde51e4020d3d9d7b84cddb391d5fa61f7ea821f307ddfeedc83 not found: ID does not exist" containerID="1cbff76073ebdde51e4020d3d9d7b84cddb391d5fa61f7ea821f307ddfeedc83" Apr 16 16:07:23.850186 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.850077 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cbff76073ebdde51e4020d3d9d7b84cddb391d5fa61f7ea821f307ddfeedc83"} err="failed to get container status \"1cbff76073ebdde51e4020d3d9d7b84cddb391d5fa61f7ea821f307ddfeedc83\": rpc error: code = NotFound desc = could not find container \"1cbff76073ebdde51e4020d3d9d7b84cddb391d5fa61f7ea821f307ddfeedc83\": container with ID starting with 1cbff76073ebdde51e4020d3d9d7b84cddb391d5fa61f7ea821f307ddfeedc83 not found: ID does not exist" Apr 16 
16:07:23.850186 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.850099 2577 scope.go:117] "RemoveContainer" containerID="297635f2622ddb88e58eb5994bf0040ca32369fe1fabe57674300bc875c11a60" Apr 16 16:07:23.850460 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:07:23.850436 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"297635f2622ddb88e58eb5994bf0040ca32369fe1fabe57674300bc875c11a60\": container with ID starting with 297635f2622ddb88e58eb5994bf0040ca32369fe1fabe57674300bc875c11a60 not found: ID does not exist" containerID="297635f2622ddb88e58eb5994bf0040ca32369fe1fabe57674300bc875c11a60" Apr 16 16:07:23.850581 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.850465 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"297635f2622ddb88e58eb5994bf0040ca32369fe1fabe57674300bc875c11a60"} err="failed to get container status \"297635f2622ddb88e58eb5994bf0040ca32369fe1fabe57674300bc875c11a60\": rpc error: code = NotFound desc = could not find container \"297635f2622ddb88e58eb5994bf0040ca32369fe1fabe57674300bc875c11a60\": container with ID starting with 297635f2622ddb88e58eb5994bf0040ca32369fe1fabe57674300bc875c11a60 not found: ID does not exist" Apr 16 16:07:23.850581 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.850485 2577 scope.go:117] "RemoveContainer" containerID="5c20dbe095e5bbdcde9d530f925a94e19cb008328efb6861c7b4ac532848ab9e" Apr 16 16:07:23.850806 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:07:23.850780 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c20dbe095e5bbdcde9d530f925a94e19cb008328efb6861c7b4ac532848ab9e\": container with ID starting with 5c20dbe095e5bbdcde9d530f925a94e19cb008328efb6861c7b4ac532848ab9e not found: ID does not exist" containerID="5c20dbe095e5bbdcde9d530f925a94e19cb008328efb6861c7b4ac532848ab9e" Apr 16 16:07:23.850920 
ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.850810 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c20dbe095e5bbdcde9d530f925a94e19cb008328efb6861c7b4ac532848ab9e"} err="failed to get container status \"5c20dbe095e5bbdcde9d530f925a94e19cb008328efb6861c7b4ac532848ab9e\": rpc error: code = NotFound desc = could not find container \"5c20dbe095e5bbdcde9d530f925a94e19cb008328efb6861c7b4ac532848ab9e\": container with ID starting with 5c20dbe095e5bbdcde9d530f925a94e19cb008328efb6861c7b4ac532848ab9e not found: ID does not exist" Apr 16 16:07:23.850920 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.850828 2577 scope.go:117] "RemoveContainer" containerID="38ea4a7b88b5f9a4f4a36d234976953dfebf4631b85da3b4e9d6958a810ce372" Apr 16 16:07:23.851289 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:07:23.851257 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38ea4a7b88b5f9a4f4a36d234976953dfebf4631b85da3b4e9d6958a810ce372\": container with ID starting with 38ea4a7b88b5f9a4f4a36d234976953dfebf4631b85da3b4e9d6958a810ce372 not found: ID does not exist" containerID="38ea4a7b88b5f9a4f4a36d234976953dfebf4631b85da3b4e9d6958a810ce372" Apr 16 16:07:23.851361 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.851289 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38ea4a7b88b5f9a4f4a36d234976953dfebf4631b85da3b4e9d6958a810ce372"} err="failed to get container status \"38ea4a7b88b5f9a4f4a36d234976953dfebf4631b85da3b4e9d6958a810ce372\": rpc error: code = NotFound desc = could not find container \"38ea4a7b88b5f9a4f4a36d234976953dfebf4631b85da3b4e9d6958a810ce372\": container with ID starting with 38ea4a7b88b5f9a4f4a36d234976953dfebf4631b85da3b4e9d6958a810ce372 not found: ID does not exist" Apr 16 16:07:23.851361 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.851311 2577 scope.go:117] 
"RemoveContainer" containerID="9216f21e9b713ce330de57220677d68a7458e4456d308998dab2d2f42a68166a" Apr 16 16:07:23.851582 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:07:23.851564 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9216f21e9b713ce330de57220677d68a7458e4456d308998dab2d2f42a68166a\": container with ID starting with 9216f21e9b713ce330de57220677d68a7458e4456d308998dab2d2f42a68166a not found: ID does not exist" containerID="9216f21e9b713ce330de57220677d68a7458e4456d308998dab2d2f42a68166a" Apr 16 16:07:23.851625 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.851598 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9216f21e9b713ce330de57220677d68a7458e4456d308998dab2d2f42a68166a"} err="failed to get container status \"9216f21e9b713ce330de57220677d68a7458e4456d308998dab2d2f42a68166a\": rpc error: code = NotFound desc = could not find container \"9216f21e9b713ce330de57220677d68a7458e4456d308998dab2d2f42a68166a\": container with ID starting with 9216f21e9b713ce330de57220677d68a7458e4456d308998dab2d2f42a68166a not found: ID does not exist" Apr 16 16:07:23.851625 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.851613 2577 scope.go:117] "RemoveContainer" containerID="901c9298e782c74b6b32419e0d8ee3b69246f8ae5fa1e62e3283c434f889d01b" Apr 16 16:07:23.851843 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:07:23.851821 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"901c9298e782c74b6b32419e0d8ee3b69246f8ae5fa1e62e3283c434f889d01b\": container with ID starting with 901c9298e782c74b6b32419e0d8ee3b69246f8ae5fa1e62e3283c434f889d01b not found: ID does not exist" containerID="901c9298e782c74b6b32419e0d8ee3b69246f8ae5fa1e62e3283c434f889d01b" Apr 16 16:07:23.851938 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.851848 2577 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"901c9298e782c74b6b32419e0d8ee3b69246f8ae5fa1e62e3283c434f889d01b"} err="failed to get container status \"901c9298e782c74b6b32419e0d8ee3b69246f8ae5fa1e62e3283c434f889d01b\": rpc error: code = NotFound desc = could not find container \"901c9298e782c74b6b32419e0d8ee3b69246f8ae5fa1e62e3283c434f889d01b\": container with ID starting with 901c9298e782c74b6b32419e0d8ee3b69246f8ae5fa1e62e3283c434f889d01b not found: ID does not exist" Apr 16 16:07:23.853302 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.853283 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:07:23.883796 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.883764 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:07:23.884088 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.884073 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerName="config-reloader" Apr 16 16:07:23.884174 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.884091 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerName="config-reloader" Apr 16 16:07:23.884174 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.884105 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerName="kube-rbac-proxy-web" Apr 16 16:07:23.884174 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.884113 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerName="kube-rbac-proxy-web" Apr 16 16:07:23.884174 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.884143 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" 
containerName="alertmanager" Apr 16 16:07:23.884174 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.884152 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerName="alertmanager" Apr 16 16:07:23.884174 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.884166 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerName="init-config-reloader" Apr 16 16:07:23.884174 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.884174 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerName="init-config-reloader" Apr 16 16:07:23.884515 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.884192 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerName="prom-label-proxy" Apr 16 16:07:23.884515 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.884200 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerName="prom-label-proxy" Apr 16 16:07:23.884515 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.884210 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd38406a-313a-4e05-beac-f73b4f576523" containerName="console" Apr 16 16:07:23.884515 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.884218 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd38406a-313a-4e05-beac-f73b4f576523" containerName="console" Apr 16 16:07:23.884515 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.884233 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerName="kube-rbac-proxy" Apr 16 16:07:23.884515 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.884241 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" 
containerName="kube-rbac-proxy" Apr 16 16:07:23.884515 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.884257 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerName="kube-rbac-proxy-metric" Apr 16 16:07:23.884515 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.884265 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerName="kube-rbac-proxy-metric" Apr 16 16:07:23.884515 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.884346 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerName="kube-rbac-proxy-web" Apr 16 16:07:23.884515 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.884362 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerName="alertmanager" Apr 16 16:07:23.884515 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.884372 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerName="prom-label-proxy" Apr 16 16:07:23.884515 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.884381 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerName="kube-rbac-proxy" Apr 16 16:07:23.884515 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.884390 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerName="kube-rbac-proxy-metric" Apr 16 16:07:23.884515 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.884399 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd38406a-313a-4e05-beac-f73b4f576523" containerName="console" Apr 16 16:07:23.884515 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.884409 2577 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" containerName="config-reloader" Apr 16 16:07:23.889726 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.889705 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:23.892101 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.892078 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 16:07:23.892234 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.892210 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 16:07:23.892234 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.892225 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 16:07:23.892372 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.892247 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 16:07:23.892565 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.892548 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-gdl9n\"" Apr 16 16:07:23.892748 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.892734 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 16:07:23.892838 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.892804 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 16:07:23.893011 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.892998 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 16:07:23.893011 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.893005 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 16:07:23.902885 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.902861 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 16:07:23.906613 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.906591 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:07:23.917956 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.917934 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf" path="/var/lib/kubelet/pods/c5b1fbcf-7b9e-4bda-9ce7-1257d39791cf/volumes" Apr 16 16:07:23.961931 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.961898 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/049ea6b7-7647-4ba0-bc85-2f00639e58c5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:23.961931 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.961932 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/049ea6b7-7647-4ba0-bc85-2f00639e58c5-config-volume\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:23.962117 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.961956 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/049ea6b7-7647-4ba0-bc85-2f00639e58c5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:23.962117 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.962034 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/049ea6b7-7647-4ba0-bc85-2f00639e58c5-web-config\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:23.962117 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.962079 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/049ea6b7-7647-4ba0-bc85-2f00639e58c5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:23.962117 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.962103 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/049ea6b7-7647-4ba0-bc85-2f00639e58c5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:23.962275 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.962149 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/049ea6b7-7647-4ba0-bc85-2f00639e58c5-config-out\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 
16 16:07:23.962275 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.962176 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/049ea6b7-7647-4ba0-bc85-2f00639e58c5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:23.962275 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.962203 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/049ea6b7-7647-4ba0-bc85-2f00639e58c5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:23.962275 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.962263 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/049ea6b7-7647-4ba0-bc85-2f00639e58c5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:23.962400 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.962296 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/049ea6b7-7647-4ba0-bc85-2f00639e58c5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:23.962400 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.962311 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/049ea6b7-7647-4ba0-bc85-2f00639e58c5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:23.962400 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:23.962332 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfblr\" (UniqueName: \"kubernetes.io/projected/049ea6b7-7647-4ba0-bc85-2f00639e58c5-kube-api-access-vfblr\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.062796 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.062703 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/049ea6b7-7647-4ba0-bc85-2f00639e58c5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.062796 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.062741 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/049ea6b7-7647-4ba0-bc85-2f00639e58c5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.062796 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.062759 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vfblr\" (UniqueName: \"kubernetes.io/projected/049ea6b7-7647-4ba0-bc85-2f00639e58c5-kube-api-access-vfblr\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.063069 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.062880 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/049ea6b7-7647-4ba0-bc85-2f00639e58c5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.063069 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.062923 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/049ea6b7-7647-4ba0-bc85-2f00639e58c5-config-volume\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.063069 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.062951 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/049ea6b7-7647-4ba0-bc85-2f00639e58c5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.063069 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.063009 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/049ea6b7-7647-4ba0-bc85-2f00639e58c5-web-config\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.063069 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.063046 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/049ea6b7-7647-4ba0-bc85-2f00639e58c5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.063364 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.063074 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/049ea6b7-7647-4ba0-bc85-2f00639e58c5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.063364 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.063111 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/049ea6b7-7647-4ba0-bc85-2f00639e58c5-config-out\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.063364 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.063161 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/049ea6b7-7647-4ba0-bc85-2f00639e58c5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.063518 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.063499 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/049ea6b7-7647-4ba0-bc85-2f00639e58c5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.063857 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.063832 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/049ea6b7-7647-4ba0-bc85-2f00639e58c5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.064575 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.064551 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/049ea6b7-7647-4ba0-bc85-2f00639e58c5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.064765 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.064745 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/049ea6b7-7647-4ba0-bc85-2f00639e58c5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.064894 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.064878 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/049ea6b7-7647-4ba0-bc85-2f00639e58c5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.065817 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.065701 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/049ea6b7-7647-4ba0-bc85-2f00639e58c5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.066008 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.065984 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/049ea6b7-7647-4ba0-bc85-2f00639e58c5-web-config\") pod 
\"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.066459 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.066100 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/049ea6b7-7647-4ba0-bc85-2f00639e58c5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.066459 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.066444 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/049ea6b7-7647-4ba0-bc85-2f00639e58c5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.066645 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.066623 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/049ea6b7-7647-4ba0-bc85-2f00639e58c5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.066809 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.066790 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/049ea6b7-7647-4ba0-bc85-2f00639e58c5-config-volume\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.067375 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.067356 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/049ea6b7-7647-4ba0-bc85-2f00639e58c5-config-out\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.067485 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.067467 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/049ea6b7-7647-4ba0-bc85-2f00639e58c5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.068290 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.068276 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/049ea6b7-7647-4ba0-bc85-2f00639e58c5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.071528 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.071505 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfblr\" (UniqueName: \"kubernetes.io/projected/049ea6b7-7647-4ba0-bc85-2f00639e58c5-kube-api-access-vfblr\") pod \"alertmanager-main-0\" (UID: \"049ea6b7-7647-4ba0-bc85-2f00639e58c5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.199492 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.199461 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:07:24.341273 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.341244 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:07:24.347925 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:07:24.347891 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod049ea6b7_7647_4ba0_bc85_2f00639e58c5.slice/crio-514d120553b83941633ed34a2d07a496db88b2570eabe60d32d92e2243079496 WatchSource:0}: Error finding container 514d120553b83941633ed34a2d07a496db88b2570eabe60d32d92e2243079496: Status 404 returned error can't find the container with id 514d120553b83941633ed34a2d07a496db88b2570eabe60d32d92e2243079496 Apr 16 16:07:24.803766 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.803732 2577 generic.go:358] "Generic (PLEG): container finished" podID="049ea6b7-7647-4ba0-bc85-2f00639e58c5" containerID="f5d80aec0602dc630d555974e8422f50e0f650f16660657a133cad1ca887edd2" exitCode=0 Apr 16 16:07:24.804258 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.803822 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"049ea6b7-7647-4ba0-bc85-2f00639e58c5","Type":"ContainerDied","Data":"f5d80aec0602dc630d555974e8422f50e0f650f16660657a133cad1ca887edd2"} Apr 16 16:07:24.804258 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:24.803854 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"049ea6b7-7647-4ba0-bc85-2f00639e58c5","Type":"ContainerStarted","Data":"514d120553b83941633ed34a2d07a496db88b2570eabe60d32d92e2243079496"} Apr 16 16:07:25.811336 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:25.811302 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"049ea6b7-7647-4ba0-bc85-2f00639e58c5","Type":"ContainerStarted","Data":"8bbacea55cce2b2ff66dd6d973465c7aa1c4c48596e33a81943a5d54b32da9fe"} Apr 16 16:07:25.811336 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:25.811337 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"049ea6b7-7647-4ba0-bc85-2f00639e58c5","Type":"ContainerStarted","Data":"c98013a6113118ea7d97b8f43cc2e4d81be2ee51e3d9e1834eb0f9704469fb9b"} Apr 16 16:07:25.811770 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:25.811351 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"049ea6b7-7647-4ba0-bc85-2f00639e58c5","Type":"ContainerStarted","Data":"63d28007875e84c1379f94be7d54b35194eded7a0f7229f5088639f3d9b9482a"} Apr 16 16:07:25.811770 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:25.811363 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"049ea6b7-7647-4ba0-bc85-2f00639e58c5","Type":"ContainerStarted","Data":"9c8e0f80a1abef2b966e5589177a093e57dc71da4cf52d8d92cb56234b3e0251"} Apr 16 16:07:25.811770 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:25.811375 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"049ea6b7-7647-4ba0-bc85-2f00639e58c5","Type":"ContainerStarted","Data":"6c218785a6120d8ea6c3afc9b8b1f8e2587d1f3f182b91c4cd7015d2444652df"} Apr 16 16:07:25.811770 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:25.811384 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"049ea6b7-7647-4ba0-bc85-2f00639e58c5","Type":"ContainerStarted","Data":"ec57a701c9c1b0e7eb791c32b57dfe93e3fd3ec2571259f516ac22fc77b4add5"} Apr 16 16:07:25.842856 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:25.842801 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.8427834069999998 podStartE2EDuration="2.842783407s" podCreationTimestamp="2026-04-16 16:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:07:25.841246897 +0000 UTC m=+288.604871609" watchObservedRunningTime="2026-04-16 16:07:25.842783407 +0000 UTC m=+288.606408119" Apr 16 16:07:25.886672 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:25.886634 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-7b94b48476-s499c"] Apr 16 16:07:25.890901 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:25.890875 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:25.895594 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:25.895526 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-zczq2\"" Apr 16 16:07:25.895904 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:25.895548 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 16:07:25.896044 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:25.895626 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 16:07:25.897155 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:25.896250 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 16:07:25.897155 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:25.896448 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 16:07:25.898160 ip-10-0-129-182 
kubenswrapper[2577]: I0416 16:07:25.897555 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 16:07:25.900432 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:25.900409 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 16:07:25.907352 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:25.907318 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7b94b48476-s499c"] Apr 16 16:07:25.997286 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:25.997252 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/bacaae01-d5b9-4bfb-9444-11a0a9791f46-telemeter-client-tls\") pod \"telemeter-client-7b94b48476-s499c\" (UID: \"bacaae01-d5b9-4bfb-9444-11a0a9791f46\") " pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:25.997429 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:25.997291 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bacaae01-d5b9-4bfb-9444-11a0a9791f46-serving-certs-ca-bundle\") pod \"telemeter-client-7b94b48476-s499c\" (UID: \"bacaae01-d5b9-4bfb-9444-11a0a9791f46\") " pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:25.997429 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:25.997313 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/bacaae01-d5b9-4bfb-9444-11a0a9791f46-secret-telemeter-client\") pod \"telemeter-client-7b94b48476-s499c\" (UID: \"bacaae01-d5b9-4bfb-9444-11a0a9791f46\") " 
pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:25.997429 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:25.997414 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bacaae01-d5b9-4bfb-9444-11a0a9791f46-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7b94b48476-s499c\" (UID: \"bacaae01-d5b9-4bfb-9444-11a0a9791f46\") " pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:25.997550 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:25.997486 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bacaae01-d5b9-4bfb-9444-11a0a9791f46-metrics-client-ca\") pod \"telemeter-client-7b94b48476-s499c\" (UID: \"bacaae01-d5b9-4bfb-9444-11a0a9791f46\") " pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:25.997590 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:25.997519 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bacaae01-d5b9-4bfb-9444-11a0a9791f46-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7b94b48476-s499c\" (UID: \"bacaae01-d5b9-4bfb-9444-11a0a9791f46\") " pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:25.997624 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:25.997584 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgk68\" (UniqueName: \"kubernetes.io/projected/bacaae01-d5b9-4bfb-9444-11a0a9791f46-kube-api-access-pgk68\") pod \"telemeter-client-7b94b48476-s499c\" (UID: \"bacaae01-d5b9-4bfb-9444-11a0a9791f46\") " pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:25.997674 ip-10-0-129-182 
kubenswrapper[2577]: I0416 16:07:25.997620 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/bacaae01-d5b9-4bfb-9444-11a0a9791f46-federate-client-tls\") pod \"telemeter-client-7b94b48476-s499c\" (UID: \"bacaae01-d5b9-4bfb-9444-11a0a9791f46\") " pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:26.098628 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:26.098524 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bacaae01-d5b9-4bfb-9444-11a0a9791f46-metrics-client-ca\") pod \"telemeter-client-7b94b48476-s499c\" (UID: \"bacaae01-d5b9-4bfb-9444-11a0a9791f46\") " pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:26.098628 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:26.098584 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bacaae01-d5b9-4bfb-9444-11a0a9791f46-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7b94b48476-s499c\" (UID: \"bacaae01-d5b9-4bfb-9444-11a0a9791f46\") " pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:26.098628 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:26.098610 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pgk68\" (UniqueName: \"kubernetes.io/projected/bacaae01-d5b9-4bfb-9444-11a0a9791f46-kube-api-access-pgk68\") pod \"telemeter-client-7b94b48476-s499c\" (UID: \"bacaae01-d5b9-4bfb-9444-11a0a9791f46\") " pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:26.098914 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:26.098645 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: 
\"kubernetes.io/secret/bacaae01-d5b9-4bfb-9444-11a0a9791f46-federate-client-tls\") pod \"telemeter-client-7b94b48476-s499c\" (UID: \"bacaae01-d5b9-4bfb-9444-11a0a9791f46\") " pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:26.098914 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:26.098792 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/bacaae01-d5b9-4bfb-9444-11a0a9791f46-telemeter-client-tls\") pod \"telemeter-client-7b94b48476-s499c\" (UID: \"bacaae01-d5b9-4bfb-9444-11a0a9791f46\") " pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:26.098914 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:26.098845 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bacaae01-d5b9-4bfb-9444-11a0a9791f46-serving-certs-ca-bundle\") pod \"telemeter-client-7b94b48476-s499c\" (UID: \"bacaae01-d5b9-4bfb-9444-11a0a9791f46\") " pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:26.098914 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:26.098874 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/bacaae01-d5b9-4bfb-9444-11a0a9791f46-secret-telemeter-client\") pod \"telemeter-client-7b94b48476-s499c\" (UID: \"bacaae01-d5b9-4bfb-9444-11a0a9791f46\") " pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:26.098914 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:26.098911 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bacaae01-d5b9-4bfb-9444-11a0a9791f46-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7b94b48476-s499c\" (UID: \"bacaae01-d5b9-4bfb-9444-11a0a9791f46\") " 
pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:26.099413 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:26.099380 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bacaae01-d5b9-4bfb-9444-11a0a9791f46-metrics-client-ca\") pod \"telemeter-client-7b94b48476-s499c\" (UID: \"bacaae01-d5b9-4bfb-9444-11a0a9791f46\") " pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:26.099566 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:26.099543 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bacaae01-d5b9-4bfb-9444-11a0a9791f46-telemeter-trusted-ca-bundle\") pod \"telemeter-client-7b94b48476-s499c\" (UID: \"bacaae01-d5b9-4bfb-9444-11a0a9791f46\") " pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:26.099644 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:26.099615 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bacaae01-d5b9-4bfb-9444-11a0a9791f46-serving-certs-ca-bundle\") pod \"telemeter-client-7b94b48476-s499c\" (UID: \"bacaae01-d5b9-4bfb-9444-11a0a9791f46\") " pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:26.101280 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:26.101255 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/bacaae01-d5b9-4bfb-9444-11a0a9791f46-secret-telemeter-client\") pod \"telemeter-client-7b94b48476-s499c\" (UID: \"bacaae01-d5b9-4bfb-9444-11a0a9791f46\") " pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:26.101505 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:26.101481 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" 
(UniqueName: \"kubernetes.io/secret/bacaae01-d5b9-4bfb-9444-11a0a9791f46-telemeter-client-tls\") pod \"telemeter-client-7b94b48476-s499c\" (UID: \"bacaae01-d5b9-4bfb-9444-11a0a9791f46\") " pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:26.101579 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:26.101558 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/bacaae01-d5b9-4bfb-9444-11a0a9791f46-federate-client-tls\") pod \"telemeter-client-7b94b48476-s499c\" (UID: \"bacaae01-d5b9-4bfb-9444-11a0a9791f46\") " pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:26.101824 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:26.101807 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bacaae01-d5b9-4bfb-9444-11a0a9791f46-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-7b94b48476-s499c\" (UID: \"bacaae01-d5b9-4bfb-9444-11a0a9791f46\") " pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:26.107104 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:26.107079 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgk68\" (UniqueName: \"kubernetes.io/projected/bacaae01-d5b9-4bfb-9444-11a0a9791f46-kube-api-access-pgk68\") pod \"telemeter-client-7b94b48476-s499c\" (UID: \"bacaae01-d5b9-4bfb-9444-11a0a9791f46\") " pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:26.206553 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:26.206509 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" Apr 16 16:07:26.352650 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:26.352582 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-7b94b48476-s499c"] Apr 16 16:07:26.355115 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:07:26.355088 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbacaae01_d5b9_4bfb_9444_11a0a9791f46.slice/crio-1303c3c989c9fc41294f30989a14d5d809e8162be281a3454b84170e0bd0a464 WatchSource:0}: Error finding container 1303c3c989c9fc41294f30989a14d5d809e8162be281a3454b84170e0bd0a464: Status 404 returned error can't find the container with id 1303c3c989c9fc41294f30989a14d5d809e8162be281a3454b84170e0bd0a464 Apr 16 16:07:26.816375 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:26.816337 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" event={"ID":"bacaae01-d5b9-4bfb-9444-11a0a9791f46","Type":"ContainerStarted","Data":"1303c3c989c9fc41294f30989a14d5d809e8162be281a3454b84170e0bd0a464"} Apr 16 16:07:28.823313 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:28.823278 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" event={"ID":"bacaae01-d5b9-4bfb-9444-11a0a9791f46","Type":"ContainerStarted","Data":"4573e85d0b6b23dc2c530e743500738e4852356b290d2100e64c6cc28c239cd4"} Apr 16 16:07:28.823313 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:28.823315 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" event={"ID":"bacaae01-d5b9-4bfb-9444-11a0a9791f46","Type":"ContainerStarted","Data":"46c0bed76c8bd4bef13f298d14a864786b89f68391013a15dd82e62c5b9aa3f5"} Apr 16 16:07:28.823691 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:28.823326 2577 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" event={"ID":"bacaae01-d5b9-4bfb-9444-11a0a9791f46","Type":"ContainerStarted","Data":"4affabf9f9236c666395608afe0f74ba3b3319b0df1752e586831d67c7e0ac71"} Apr 16 16:07:28.848019 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:28.847960 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-7b94b48476-s499c" podStartSLOduration=2.340867411 podStartE2EDuration="3.847947097s" podCreationTimestamp="2026-04-16 16:07:25 +0000 UTC" firstStartedPulling="2026-04-16 16:07:26.35742763 +0000 UTC m=+289.121052322" lastFinishedPulling="2026-04-16 16:07:27.864507316 +0000 UTC m=+290.628132008" observedRunningTime="2026-04-16 16:07:28.847763623 +0000 UTC m=+291.611388335" watchObservedRunningTime="2026-04-16 16:07:28.847947097 +0000 UTC m=+291.611571809" Apr 16 16:07:29.505914 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.505874 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-56569f8bc7-twn6m"] Apr 16 16:07:29.509626 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.509600 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:29.514393 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.514373 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 16:07:29.514523 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.514372 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 16:07:29.514523 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.514372 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 16:07:29.514523 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.514378 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-r88v9\"" Apr 16 16:07:29.514523 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.514411 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 16:07:29.515425 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.515409 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 16:07:29.520326 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.520310 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 16:07:29.523197 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.523177 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56569f8bc7-twn6m"] Apr 16 16:07:29.626386 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.626347 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/16e1482f-7167-42f7-840e-8c2d3b04a3c8-console-config\") pod \"console-56569f8bc7-twn6m\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:29.626386 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.626384 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16e1482f-7167-42f7-840e-8c2d3b04a3c8-trusted-ca-bundle\") pod \"console-56569f8bc7-twn6m\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:29.626582 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.626405 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16e1482f-7167-42f7-840e-8c2d3b04a3c8-oauth-serving-cert\") pod \"console-56569f8bc7-twn6m\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:29.626582 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.626450 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n75ps\" (UniqueName: \"kubernetes.io/projected/16e1482f-7167-42f7-840e-8c2d3b04a3c8-kube-api-access-n75ps\") pod \"console-56569f8bc7-twn6m\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:29.626582 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.626478 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16e1482f-7167-42f7-840e-8c2d3b04a3c8-service-ca\") pod \"console-56569f8bc7-twn6m\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:29.626582 ip-10-0-129-182 
kubenswrapper[2577]: I0416 16:07:29.626527 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16e1482f-7167-42f7-840e-8c2d3b04a3c8-console-serving-cert\") pod \"console-56569f8bc7-twn6m\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:29.626582 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.626557 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16e1482f-7167-42f7-840e-8c2d3b04a3c8-console-oauth-config\") pod \"console-56569f8bc7-twn6m\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:29.727495 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.727458 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16e1482f-7167-42f7-840e-8c2d3b04a3c8-console-config\") pod \"console-56569f8bc7-twn6m\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:29.727495 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.727494 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16e1482f-7167-42f7-840e-8c2d3b04a3c8-trusted-ca-bundle\") pod \"console-56569f8bc7-twn6m\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:29.727737 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.727513 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16e1482f-7167-42f7-840e-8c2d3b04a3c8-oauth-serving-cert\") pod \"console-56569f8bc7-twn6m\" (UID: 
\"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:29.727737 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.727533 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n75ps\" (UniqueName: \"kubernetes.io/projected/16e1482f-7167-42f7-840e-8c2d3b04a3c8-kube-api-access-n75ps\") pod \"console-56569f8bc7-twn6m\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:29.727737 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.727564 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16e1482f-7167-42f7-840e-8c2d3b04a3c8-service-ca\") pod \"console-56569f8bc7-twn6m\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:29.727737 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.727589 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16e1482f-7167-42f7-840e-8c2d3b04a3c8-console-serving-cert\") pod \"console-56569f8bc7-twn6m\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:29.727737 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.727614 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16e1482f-7167-42f7-840e-8c2d3b04a3c8-console-oauth-config\") pod \"console-56569f8bc7-twn6m\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:29.728299 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.728270 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/16e1482f-7167-42f7-840e-8c2d3b04a3c8-console-config\") pod \"console-56569f8bc7-twn6m\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:29.728437 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.728309 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16e1482f-7167-42f7-840e-8c2d3b04a3c8-oauth-serving-cert\") pod \"console-56569f8bc7-twn6m\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:29.728506 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.728435 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16e1482f-7167-42f7-840e-8c2d3b04a3c8-service-ca\") pod \"console-56569f8bc7-twn6m\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:29.728564 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.728517 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16e1482f-7167-42f7-840e-8c2d3b04a3c8-trusted-ca-bundle\") pod \"console-56569f8bc7-twn6m\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:29.730054 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.730027 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16e1482f-7167-42f7-840e-8c2d3b04a3c8-console-oauth-config\") pod \"console-56569f8bc7-twn6m\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:29.730207 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.730189 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16e1482f-7167-42f7-840e-8c2d3b04a3c8-console-serving-cert\") pod \"console-56569f8bc7-twn6m\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:29.736483 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.736461 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n75ps\" (UniqueName: \"kubernetes.io/projected/16e1482f-7167-42f7-840e-8c2d3b04a3c8-kube-api-access-n75ps\") pod \"console-56569f8bc7-twn6m\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:29.819851 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.819759 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:29.964287 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:29.964254 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-56569f8bc7-twn6m"] Apr 16 16:07:29.966160 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:07:29.966120 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16e1482f_7167_42f7_840e_8c2d3b04a3c8.slice/crio-ef3bff261642ec7da6267d51adf1b2bf5d626c5fb348a5a9ba1d958bb2fcc9f3 WatchSource:0}: Error finding container ef3bff261642ec7da6267d51adf1b2bf5d626c5fb348a5a9ba1d958bb2fcc9f3: Status 404 returned error can't find the container with id ef3bff261642ec7da6267d51adf1b2bf5d626c5fb348a5a9ba1d958bb2fcc9f3 Apr 16 16:07:30.829980 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:30.829942 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56569f8bc7-twn6m" event={"ID":"16e1482f-7167-42f7-840e-8c2d3b04a3c8","Type":"ContainerStarted","Data":"3a81ae684aad5809a93662f86415a50a45111e7721021fd8b6bd6cb51883d6bb"} Apr 16 16:07:30.829980 ip-10-0-129-182 
kubenswrapper[2577]: I0416 16:07:30.829984 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56569f8bc7-twn6m" event={"ID":"16e1482f-7167-42f7-840e-8c2d3b04a3c8","Type":"ContainerStarted","Data":"ef3bff261642ec7da6267d51adf1b2bf5d626c5fb348a5a9ba1d958bb2fcc9f3"} Apr 16 16:07:30.849910 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:30.849862 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-56569f8bc7-twn6m" podStartSLOduration=1.8498469690000001 podStartE2EDuration="1.849846969s" podCreationTimestamp="2026-04-16 16:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:07:30.847914004 +0000 UTC m=+293.611538715" watchObservedRunningTime="2026-04-16 16:07:30.849846969 +0000 UTC m=+293.613471680" Apr 16 16:07:33.807116 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:33.807085 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-2fk94" Apr 16 16:07:37.784599 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:37.784569 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2894b_ff7fcfa4-9774-4831-b686-678a7f92a456/console-operator/2.log" Apr 16 16:07:37.785040 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:37.785017 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2894b_ff7fcfa4-9774-4831-b686-678a7f92a456/console-operator/2.log" Apr 16 16:07:37.795473 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:37.795449 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m2jms_a2826faf-f579-4f51-8772-9882e98d4593/ovn-acl-logging/0.log" Apr 16 16:07:37.796371 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:37.796354 2577 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m2jms_a2826faf-f579-4f51-8772-9882e98d4593/ovn-acl-logging/0.log" Apr 16 16:07:37.798639 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:37.798623 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 16:07:39.820502 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:39.820469 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:39.820502 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:39.820506 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:39.825254 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:39.825231 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:07:39.860371 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:07:39.860346 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:08:39.289843 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.289759 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c6dcb9ffd-7kwcj"] Apr 16 16:08:39.292654 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.292632 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:39.304939 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.304912 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c6dcb9ffd-7kwcj"] Apr 16 16:08:39.399024 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.398989 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52a2d57c-c665-4c97-b321-ce358ec8b0ac-trusted-ca-bundle\") pod \"console-5c6dcb9ffd-7kwcj\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") " pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:39.399210 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.399031 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52a2d57c-c665-4c97-b321-ce358ec8b0ac-console-serving-cert\") pod \"console-5c6dcb9ffd-7kwcj\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") " pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:39.399210 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.399052 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52a2d57c-c665-4c97-b321-ce358ec8b0ac-console-oauth-config\") pod \"console-5c6dcb9ffd-7kwcj\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") " pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:39.399210 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.399072 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52a2d57c-c665-4c97-b321-ce358ec8b0ac-console-config\") pod \"console-5c6dcb9ffd-7kwcj\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") " pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 
16:08:39.399210 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.399099 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52a2d57c-c665-4c97-b321-ce358ec8b0ac-oauth-serving-cert\") pod \"console-5c6dcb9ffd-7kwcj\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") " pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:39.399210 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.399124 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj6zp\" (UniqueName: \"kubernetes.io/projected/52a2d57c-c665-4c97-b321-ce358ec8b0ac-kube-api-access-sj6zp\") pod \"console-5c6dcb9ffd-7kwcj\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") " pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:39.399210 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.399185 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52a2d57c-c665-4c97-b321-ce358ec8b0ac-service-ca\") pod \"console-5c6dcb9ffd-7kwcj\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") " pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:39.499711 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.499677 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52a2d57c-c665-4c97-b321-ce358ec8b0ac-console-serving-cert\") pod \"console-5c6dcb9ffd-7kwcj\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") " pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:39.499711 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.499712 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52a2d57c-c665-4c97-b321-ce358ec8b0ac-console-oauth-config\") pod 
\"console-5c6dcb9ffd-7kwcj\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") " pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:39.499968 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.499731 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52a2d57c-c665-4c97-b321-ce358ec8b0ac-console-config\") pod \"console-5c6dcb9ffd-7kwcj\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") " pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:39.499968 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.499785 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52a2d57c-c665-4c97-b321-ce358ec8b0ac-oauth-serving-cert\") pod \"console-5c6dcb9ffd-7kwcj\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") " pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:39.499968 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.499821 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sj6zp\" (UniqueName: \"kubernetes.io/projected/52a2d57c-c665-4c97-b321-ce358ec8b0ac-kube-api-access-sj6zp\") pod \"console-5c6dcb9ffd-7kwcj\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") " pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:39.499968 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.499957 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52a2d57c-c665-4c97-b321-ce358ec8b0ac-service-ca\") pod \"console-5c6dcb9ffd-7kwcj\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") " pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:39.500213 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.500019 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/52a2d57c-c665-4c97-b321-ce358ec8b0ac-trusted-ca-bundle\") pod \"console-5c6dcb9ffd-7kwcj\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") " pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:39.500671 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.500646 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52a2d57c-c665-4c97-b321-ce358ec8b0ac-console-config\") pod \"console-5c6dcb9ffd-7kwcj\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") " pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:39.500671 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.500657 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52a2d57c-c665-4c97-b321-ce358ec8b0ac-oauth-serving-cert\") pod \"console-5c6dcb9ffd-7kwcj\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") " pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:39.500837 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.500743 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52a2d57c-c665-4c97-b321-ce358ec8b0ac-service-ca\") pod \"console-5c6dcb9ffd-7kwcj\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") " pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:39.501034 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.501015 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52a2d57c-c665-4c97-b321-ce358ec8b0ac-trusted-ca-bundle\") pod \"console-5c6dcb9ffd-7kwcj\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") " pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:39.502315 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.502287 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52a2d57c-c665-4c97-b321-ce358ec8b0ac-console-oauth-config\") pod \"console-5c6dcb9ffd-7kwcj\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") " pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:39.502409 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.502393 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52a2d57c-c665-4c97-b321-ce358ec8b0ac-console-serving-cert\") pod \"console-5c6dcb9ffd-7kwcj\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") " pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:39.507153 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.507112 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj6zp\" (UniqueName: \"kubernetes.io/projected/52a2d57c-c665-4c97-b321-ce358ec8b0ac-kube-api-access-sj6zp\") pod \"console-5c6dcb9ffd-7kwcj\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") " pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:39.603425 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.603334 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:39.728184 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.728158 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c6dcb9ffd-7kwcj"] Apr 16 16:08:39.730966 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:08:39.730927 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52a2d57c_c665_4c97_b321_ce358ec8b0ac.slice/crio-2236f6358f5b3115d31d1000850825281cefb36eedb6aba42a1e90f8c9a25d4a WatchSource:0}: Error finding container 2236f6358f5b3115d31d1000850825281cefb36eedb6aba42a1e90f8c9a25d4a: Status 404 returned error can't find the container with id 2236f6358f5b3115d31d1000850825281cefb36eedb6aba42a1e90f8c9a25d4a Apr 16 16:08:39.732770 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:39.732752 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:08:40.039343 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:40.039307 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c6dcb9ffd-7kwcj" event={"ID":"52a2d57c-c665-4c97-b321-ce358ec8b0ac","Type":"ContainerStarted","Data":"5af6c7e3a98168055ea44fac940ac4a93c96419db2dedfccf32c23c6c3bc3a32"} Apr 16 16:08:40.039343 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:40.039345 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c6dcb9ffd-7kwcj" event={"ID":"52a2d57c-c665-4c97-b321-ce358ec8b0ac","Type":"ContainerStarted","Data":"2236f6358f5b3115d31d1000850825281cefb36eedb6aba42a1e90f8c9a25d4a"} Apr 16 16:08:40.060028 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:40.059978 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c6dcb9ffd-7kwcj" podStartSLOduration=1.059961932 podStartE2EDuration="1.059961932s" podCreationTimestamp="2026-04-16 16:08:39 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:08:40.058589286 +0000 UTC m=+362.822214006" watchObservedRunningTime="2026-04-16 16:08:40.059961932 +0000 UTC m=+362.823586644" Apr 16 16:08:49.604235 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:49.604200 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:49.604786 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:49.604248 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:49.609735 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:49.609701 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:50.072034 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:50.072003 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5c6dcb9ffd-7kwcj" Apr 16 16:08:50.124251 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:08:50.124218 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56569f8bc7-twn6m"] Apr 16 16:09:15.147414 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:15.147355 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-56569f8bc7-twn6m" podUID="16e1482f-7167-42f7-840e-8c2d3b04a3c8" containerName="console" containerID="cri-o://3a81ae684aad5809a93662f86415a50a45111e7721021fd8b6bd6cb51883d6bb" gracePeriod=15 Apr 16 16:09:15.383167 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:15.383145 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56569f8bc7-twn6m_16e1482f-7167-42f7-840e-8c2d3b04a3c8/console/0.log" Apr 16 16:09:15.383298 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:15.383204 2577 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56569f8bc7-twn6m" Apr 16 16:09:15.508459 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:15.508421 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n75ps\" (UniqueName: \"kubernetes.io/projected/16e1482f-7167-42f7-840e-8c2d3b04a3c8-kube-api-access-n75ps\") pod \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " Apr 16 16:09:15.508459 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:15.508463 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16e1482f-7167-42f7-840e-8c2d3b04a3c8-oauth-serving-cert\") pod \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " Apr 16 16:09:15.508692 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:15.508524 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16e1482f-7167-42f7-840e-8c2d3b04a3c8-console-serving-cert\") pod \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " Apr 16 16:09:15.508692 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:15.508541 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16e1482f-7167-42f7-840e-8c2d3b04a3c8-console-oauth-config\") pod \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " Apr 16 16:09:15.508692 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:15.508582 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16e1482f-7167-42f7-840e-8c2d3b04a3c8-trusted-ca-bundle\") pod \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\" (UID: 
\"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " Apr 16 16:09:15.508692 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:15.508602 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16e1482f-7167-42f7-840e-8c2d3b04a3c8-console-config\") pod \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " Apr 16 16:09:15.508692 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:15.508629 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16e1482f-7167-42f7-840e-8c2d3b04a3c8-service-ca\") pod \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\" (UID: \"16e1482f-7167-42f7-840e-8c2d3b04a3c8\") " Apr 16 16:09:15.508973 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:15.508933 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16e1482f-7167-42f7-840e-8c2d3b04a3c8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "16e1482f-7167-42f7-840e-8c2d3b04a3c8" (UID: "16e1482f-7167-42f7-840e-8c2d3b04a3c8"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:09:15.509040 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:15.509012 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16e1482f-7167-42f7-840e-8c2d3b04a3c8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "16e1482f-7167-42f7-840e-8c2d3b04a3c8" (UID: "16e1482f-7167-42f7-840e-8c2d3b04a3c8"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:09:15.509096 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:15.509042 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16e1482f-7167-42f7-840e-8c2d3b04a3c8-console-config" (OuterVolumeSpecName: "console-config") pod "16e1482f-7167-42f7-840e-8c2d3b04a3c8" (UID: "16e1482f-7167-42f7-840e-8c2d3b04a3c8"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:09:15.509183 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:15.509106 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16e1482f-7167-42f7-840e-8c2d3b04a3c8-service-ca" (OuterVolumeSpecName: "service-ca") pod "16e1482f-7167-42f7-840e-8c2d3b04a3c8" (UID: "16e1482f-7167-42f7-840e-8c2d3b04a3c8"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:09:15.510656 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:15.510633 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16e1482f-7167-42f7-840e-8c2d3b04a3c8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "16e1482f-7167-42f7-840e-8c2d3b04a3c8" (UID: "16e1482f-7167-42f7-840e-8c2d3b04a3c8"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:09:15.510771 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:15.510684 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16e1482f-7167-42f7-840e-8c2d3b04a3c8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "16e1482f-7167-42f7-840e-8c2d3b04a3c8" (UID: "16e1482f-7167-42f7-840e-8c2d3b04a3c8"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:09:15.510771 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:15.510754 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16e1482f-7167-42f7-840e-8c2d3b04a3c8-kube-api-access-n75ps" (OuterVolumeSpecName: "kube-api-access-n75ps") pod "16e1482f-7167-42f7-840e-8c2d3b04a3c8" (UID: "16e1482f-7167-42f7-840e-8c2d3b04a3c8"). InnerVolumeSpecName "kube-api-access-n75ps". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:09:15.610300 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:15.610259 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16e1482f-7167-42f7-840e-8c2d3b04a3c8-trusted-ca-bundle\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:09:15.610300 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:15.610293 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16e1482f-7167-42f7-840e-8c2d3b04a3c8-console-config\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:09:15.610300 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:15.610303 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16e1482f-7167-42f7-840e-8c2d3b04a3c8-service-ca\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:09:15.610300 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:15.610311 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n75ps\" (UniqueName: \"kubernetes.io/projected/16e1482f-7167-42f7-840e-8c2d3b04a3c8-kube-api-access-n75ps\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:09:15.610567 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:15.610322 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16e1482f-7167-42f7-840e-8c2d3b04a3c8-oauth-serving-cert\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:09:15.610567 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:15.610331 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16e1482f-7167-42f7-840e-8c2d3b04a3c8-console-serving-cert\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:09:15.610567 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:15.610340 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16e1482f-7167-42f7-840e-8c2d3b04a3c8-console-oauth-config\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:09:16.150778 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:16.150751 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-56569f8bc7-twn6m_16e1482f-7167-42f7-840e-8c2d3b04a3c8/console/0.log"
Apr 16 16:09:16.151213 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:16.150789 2577 generic.go:358] "Generic (PLEG): container finished" podID="16e1482f-7167-42f7-840e-8c2d3b04a3c8" containerID="3a81ae684aad5809a93662f86415a50a45111e7721021fd8b6bd6cb51883d6bb" exitCode=2
Apr 16 16:09:16.151213 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:16.150847 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56569f8bc7-twn6m" event={"ID":"16e1482f-7167-42f7-840e-8c2d3b04a3c8","Type":"ContainerDied","Data":"3a81ae684aad5809a93662f86415a50a45111e7721021fd8b6bd6cb51883d6bb"}
Apr 16 16:09:16.151213 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:16.150876 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-56569f8bc7-twn6m" event={"ID":"16e1482f-7167-42f7-840e-8c2d3b04a3c8","Type":"ContainerDied","Data":"ef3bff261642ec7da6267d51adf1b2bf5d626c5fb348a5a9ba1d958bb2fcc9f3"}
Apr 16 16:09:16.151213 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:16.150871 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-56569f8bc7-twn6m"
Apr 16 16:09:16.151213 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:16.150945 2577 scope.go:117] "RemoveContainer" containerID="3a81ae684aad5809a93662f86415a50a45111e7721021fd8b6bd6cb51883d6bb"
Apr 16 16:09:16.158784 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:16.158764 2577 scope.go:117] "RemoveContainer" containerID="3a81ae684aad5809a93662f86415a50a45111e7721021fd8b6bd6cb51883d6bb"
Apr 16 16:09:16.159044 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:09:16.159026 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a81ae684aad5809a93662f86415a50a45111e7721021fd8b6bd6cb51883d6bb\": container with ID starting with 3a81ae684aad5809a93662f86415a50a45111e7721021fd8b6bd6cb51883d6bb not found: ID does not exist" containerID="3a81ae684aad5809a93662f86415a50a45111e7721021fd8b6bd6cb51883d6bb"
Apr 16 16:09:16.159094 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:16.159052 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a81ae684aad5809a93662f86415a50a45111e7721021fd8b6bd6cb51883d6bb"} err="failed to get container status \"3a81ae684aad5809a93662f86415a50a45111e7721021fd8b6bd6cb51883d6bb\": rpc error: code = NotFound desc = could not find container \"3a81ae684aad5809a93662f86415a50a45111e7721021fd8b6bd6cb51883d6bb\": container with ID starting with 3a81ae684aad5809a93662f86415a50a45111e7721021fd8b6bd6cb51883d6bb not found: ID does not exist"
Apr 16 16:09:16.168475 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:16.168451 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-56569f8bc7-twn6m"]
Apr 16 16:09:16.176989 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:16.174960 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-56569f8bc7-twn6m"]
Apr 16 16:09:17.918840 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:17.918805 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16e1482f-7167-42f7-840e-8c2d3b04a3c8" path="/var/lib/kubelet/pods/16e1482f-7167-42f7-840e-8c2d3b04a3c8/volumes"
Apr 16 16:09:54.302450 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:54.302408 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h"]
Apr 16 16:09:54.303028 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:54.302922 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16e1482f-7167-42f7-840e-8c2d3b04a3c8" containerName="console"
Apr 16 16:09:54.303028 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:54.302941 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="16e1482f-7167-42f7-840e-8c2d3b04a3c8" containerName="console"
Apr 16 16:09:54.303181 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:54.303034 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="16e1482f-7167-42f7-840e-8c2d3b04a3c8" containerName="console"
Apr 16 16:09:54.305791 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:54.305771 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h"
Apr 16 16:09:54.308232 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:54.308212 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 16:09:54.308329 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:54.308274 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-5jqn9\""
Apr 16 16:09:54.309112 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:54.309097 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 16:09:54.315439 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:54.315417 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h"]
Apr 16 16:09:54.340458 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:54.340427 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/efda044d-9148-4846-9083-ef325b35bde1-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h\" (UID: \"efda044d-9148-4846-9083-ef325b35bde1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h"
Apr 16 16:09:54.340624 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:54.340476 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/efda044d-9148-4846-9083-ef325b35bde1-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h\" (UID: \"efda044d-9148-4846-9083-ef325b35bde1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h"
Apr 16 16:09:54.340624 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:54.340573 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m28gl\" (UniqueName: \"kubernetes.io/projected/efda044d-9148-4846-9083-ef325b35bde1-kube-api-access-m28gl\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h\" (UID: \"efda044d-9148-4846-9083-ef325b35bde1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h"
Apr 16 16:09:54.441489 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:54.441438 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m28gl\" (UniqueName: \"kubernetes.io/projected/efda044d-9148-4846-9083-ef325b35bde1-kube-api-access-m28gl\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h\" (UID: \"efda044d-9148-4846-9083-ef325b35bde1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h"
Apr 16 16:09:54.441701 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:54.441534 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/efda044d-9148-4846-9083-ef325b35bde1-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h\" (UID: \"efda044d-9148-4846-9083-ef325b35bde1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h"
Apr 16 16:09:54.441701 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:54.441560 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/efda044d-9148-4846-9083-ef325b35bde1-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h\" (UID: \"efda044d-9148-4846-9083-ef325b35bde1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h"
Apr 16 16:09:54.441922 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:54.441905 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/efda044d-9148-4846-9083-ef325b35bde1-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h\" (UID: \"efda044d-9148-4846-9083-ef325b35bde1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h"
Apr 16 16:09:54.441981 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:54.441933 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/efda044d-9148-4846-9083-ef325b35bde1-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h\" (UID: \"efda044d-9148-4846-9083-ef325b35bde1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h"
Apr 16 16:09:54.449992 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:54.449964 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m28gl\" (UniqueName: \"kubernetes.io/projected/efda044d-9148-4846-9083-ef325b35bde1-kube-api-access-m28gl\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h\" (UID: \"efda044d-9148-4846-9083-ef325b35bde1\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h"
Apr 16 16:09:54.615993 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:54.615908 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h"
Apr 16 16:09:54.743426 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:54.743394 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h"]
Apr 16 16:09:54.747352 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:09:54.747322 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefda044d_9148_4846_9083_ef325b35bde1.slice/crio-bf39216fbf638a7e25106fccc5e57bb19930aa63155edf61e392bed741f8a26d WatchSource:0}: Error finding container bf39216fbf638a7e25106fccc5e57bb19930aa63155edf61e392bed741f8a26d: Status 404 returned error can't find the container with id bf39216fbf638a7e25106fccc5e57bb19930aa63155edf61e392bed741f8a26d
Apr 16 16:09:55.270825 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:09:55.270790 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h" event={"ID":"efda044d-9148-4846-9083-ef325b35bde1","Type":"ContainerStarted","Data":"bf39216fbf638a7e25106fccc5e57bb19930aa63155edf61e392bed741f8a26d"}
Apr 16 16:10:00.297656 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:00.297618 2577 generic.go:358] "Generic (PLEG): container finished" podID="efda044d-9148-4846-9083-ef325b35bde1" containerID="063fa4ab96886ef339fa2ea17003b691d09ae7811a4c2f52d7bcf7f77afe6ad5" exitCode=0
Apr 16 16:10:00.298061 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:00.297681 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h" event={"ID":"efda044d-9148-4846-9083-ef325b35bde1","Type":"ContainerDied","Data":"063fa4ab96886ef339fa2ea17003b691d09ae7811a4c2f52d7bcf7f77afe6ad5"}
Apr 16 16:10:02.304596 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:02.304509 2577 generic.go:358] "Generic (PLEG): container finished" podID="efda044d-9148-4846-9083-ef325b35bde1" containerID="5ab538ab8a01729a621676613e541d1eccb52aff9e73d0946f8a7daca8418ed8" exitCode=0
Apr 16 16:10:02.305017 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:02.304595 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h" event={"ID":"efda044d-9148-4846-9083-ef325b35bde1","Type":"ContainerDied","Data":"5ab538ab8a01729a621676613e541d1eccb52aff9e73d0946f8a7daca8418ed8"}
Apr 16 16:10:08.327816 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:08.327732 2577 generic.go:358] "Generic (PLEG): container finished" podID="efda044d-9148-4846-9083-ef325b35bde1" containerID="dc7d0e863caf35c50e2daedfd43ecb7e47dcccfb2e56e8b9218de476f4f1b1ec" exitCode=0
Apr 16 16:10:08.328181 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:08.327819 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h" event={"ID":"efda044d-9148-4846-9083-ef325b35bde1","Type":"ContainerDied","Data":"dc7d0e863caf35c50e2daedfd43ecb7e47dcccfb2e56e8b9218de476f4f1b1ec"}
Apr 16 16:10:09.469466 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:09.469440 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h"
Apr 16 16:10:09.570458 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:09.570420 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m28gl\" (UniqueName: \"kubernetes.io/projected/efda044d-9148-4846-9083-ef325b35bde1-kube-api-access-m28gl\") pod \"efda044d-9148-4846-9083-ef325b35bde1\" (UID: \"efda044d-9148-4846-9083-ef325b35bde1\") "
Apr 16 16:10:09.570650 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:09.570509 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/efda044d-9148-4846-9083-ef325b35bde1-util\") pod \"efda044d-9148-4846-9083-ef325b35bde1\" (UID: \"efda044d-9148-4846-9083-ef325b35bde1\") "
Apr 16 16:10:09.570650 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:09.570548 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/efda044d-9148-4846-9083-ef325b35bde1-bundle\") pod \"efda044d-9148-4846-9083-ef325b35bde1\" (UID: \"efda044d-9148-4846-9083-ef325b35bde1\") "
Apr 16 16:10:09.571123 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:09.571096 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efda044d-9148-4846-9083-ef325b35bde1-bundle" (OuterVolumeSpecName: "bundle") pod "efda044d-9148-4846-9083-ef325b35bde1" (UID: "efda044d-9148-4846-9083-ef325b35bde1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:10:09.572707 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:09.572687 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efda044d-9148-4846-9083-ef325b35bde1-kube-api-access-m28gl" (OuterVolumeSpecName: "kube-api-access-m28gl") pod "efda044d-9148-4846-9083-ef325b35bde1" (UID: "efda044d-9148-4846-9083-ef325b35bde1"). InnerVolumeSpecName "kube-api-access-m28gl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:10:09.575169 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:09.575104 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efda044d-9148-4846-9083-ef325b35bde1-util" (OuterVolumeSpecName: "util") pod "efda044d-9148-4846-9083-ef325b35bde1" (UID: "efda044d-9148-4846-9083-ef325b35bde1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:10:09.672094 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:09.672007 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m28gl\" (UniqueName: \"kubernetes.io/projected/efda044d-9148-4846-9083-ef325b35bde1-kube-api-access-m28gl\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:10:09.672094 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:09.672038 2577 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/efda044d-9148-4846-9083-ef325b35bde1-util\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:10:09.672094 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:09.672048 2577 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/efda044d-9148-4846-9083-ef325b35bde1-bundle\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:10:10.336726 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:10.336637 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h" event={"ID":"efda044d-9148-4846-9083-ef325b35bde1","Type":"ContainerDied","Data":"bf39216fbf638a7e25106fccc5e57bb19930aa63155edf61e392bed741f8a26d"}
Apr 16 16:10:10.336726 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:10.336670 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf39216fbf638a7e25106fccc5e57bb19930aa63155edf61e392bed741f8a26d"
Apr 16 16:10:10.336726 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:10.336694 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c8tj5h"
Apr 16 16:10:20.820058 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:20.820027 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-r2hzv"]
Apr 16 16:10:20.820432 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:20.820355 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="efda044d-9148-4846-9083-ef325b35bde1" containerName="util"
Apr 16 16:10:20.820432 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:20.820367 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="efda044d-9148-4846-9083-ef325b35bde1" containerName="util"
Apr 16 16:10:20.820432 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:20.820382 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="efda044d-9148-4846-9083-ef325b35bde1" containerName="extract"
Apr 16 16:10:20.820432 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:20.820387 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="efda044d-9148-4846-9083-ef325b35bde1" containerName="extract"
Apr 16 16:10:20.820432 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:20.820394 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="efda044d-9148-4846-9083-ef325b35bde1" containerName="pull"
Apr 16 16:10:20.820432 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:20.820400 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="efda044d-9148-4846-9083-ef325b35bde1" containerName="pull"
Apr 16 16:10:20.820615 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:20.820461 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="efda044d-9148-4846-9083-ef325b35bde1" containerName="extract"
Apr 16 16:10:20.826636 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:20.826617 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-r2hzv"
Apr 16 16:10:20.829511 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:20.829491 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 16 16:10:20.830461 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:20.830430 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-rwkrh\""
Apr 16 16:10:20.830567 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:20.830514 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 16 16:10:20.830567 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:20.830526 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 16 16:10:20.830683 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:20.830586 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 16 16:10:20.832980 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:20.832963 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 16 16:10:20.850201 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:20.850174 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-r2hzv"]
Apr 16 16:10:20.962208 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:20.962122 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/30f8b014-51a7-4367-b39b-0e97d53e2fb4-certificates\") pod \"keda-operator-ffbb595cb-r2hzv\" (UID: \"30f8b014-51a7-4367-b39b-0e97d53e2fb4\") " pod="openshift-keda/keda-operator-ffbb595cb-r2hzv"
Apr 16 16:10:20.962393 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:20.962277 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xqmb\" (UniqueName: \"kubernetes.io/projected/30f8b014-51a7-4367-b39b-0e97d53e2fb4-kube-api-access-8xqmb\") pod \"keda-operator-ffbb595cb-r2hzv\" (UID: \"30f8b014-51a7-4367-b39b-0e97d53e2fb4\") " pod="openshift-keda/keda-operator-ffbb595cb-r2hzv"
Apr 16 16:10:20.962393 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:20.962310 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/30f8b014-51a7-4367-b39b-0e97d53e2fb4-cabundle0\") pod \"keda-operator-ffbb595cb-r2hzv\" (UID: \"30f8b014-51a7-4367-b39b-0e97d53e2fb4\") " pod="openshift-keda/keda-operator-ffbb595cb-r2hzv"
Apr 16 16:10:21.063485 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:21.063447 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xqmb\" (UniqueName: \"kubernetes.io/projected/30f8b014-51a7-4367-b39b-0e97d53e2fb4-kube-api-access-8xqmb\") pod \"keda-operator-ffbb595cb-r2hzv\" (UID: \"30f8b014-51a7-4367-b39b-0e97d53e2fb4\") " pod="openshift-keda/keda-operator-ffbb595cb-r2hzv"
Apr 16 16:10:21.063485 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:21.063494 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/30f8b014-51a7-4367-b39b-0e97d53e2fb4-cabundle0\") pod \"keda-operator-ffbb595cb-r2hzv\" (UID: \"30f8b014-51a7-4367-b39b-0e97d53e2fb4\") " pod="openshift-keda/keda-operator-ffbb595cb-r2hzv"
Apr 16 16:10:21.063751 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:21.063525 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/30f8b014-51a7-4367-b39b-0e97d53e2fb4-certificates\") pod \"keda-operator-ffbb595cb-r2hzv\" (UID: \"30f8b014-51a7-4367-b39b-0e97d53e2fb4\") " pod="openshift-keda/keda-operator-ffbb595cb-r2hzv"
Apr 16 16:10:21.063751 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:10:21.063614 2577 secret.go:281] references non-existent secret key: ca.crt
Apr 16 16:10:21.063751 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:10:21.063626 2577 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 16:10:21.063751 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:10:21.063635 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-r2hzv: references non-existent secret key: ca.crt
Apr 16 16:10:21.063751 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:10:21.063694 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30f8b014-51a7-4367-b39b-0e97d53e2fb4-certificates podName:30f8b014-51a7-4367-b39b-0e97d53e2fb4 nodeName:}" failed. No retries permitted until 2026-04-16 16:10:21.563677907 +0000 UTC m=+464.327302596 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/30f8b014-51a7-4367-b39b-0e97d53e2fb4-certificates") pod "keda-operator-ffbb595cb-r2hzv" (UID: "30f8b014-51a7-4367-b39b-0e97d53e2fb4") : references non-existent secret key: ca.crt
Apr 16 16:10:21.064123 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:21.064104 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/30f8b014-51a7-4367-b39b-0e97d53e2fb4-cabundle0\") pod \"keda-operator-ffbb595cb-r2hzv\" (UID: \"30f8b014-51a7-4367-b39b-0e97d53e2fb4\") " pod="openshift-keda/keda-operator-ffbb595cb-r2hzv"
Apr 16 16:10:21.072155 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:21.072075 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xqmb\" (UniqueName: \"kubernetes.io/projected/30f8b014-51a7-4367-b39b-0e97d53e2fb4-kube-api-access-8xqmb\") pod \"keda-operator-ffbb595cb-r2hzv\" (UID: \"30f8b014-51a7-4367-b39b-0e97d53e2fb4\") " pod="openshift-keda/keda-operator-ffbb595cb-r2hzv"
Apr 16 16:10:21.352697 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:21.352605 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-wrvz9"]
Apr 16 16:10:21.356220 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:21.356196 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-wrvz9"
Apr 16 16:10:21.359888 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:21.359865 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 16 16:10:21.366151 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:21.366101 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-wrvz9"]
Apr 16 16:10:21.467394 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:21.467364 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e288cf8c-df0f-4d66-8983-00afa12b91e3-certificates\") pod \"keda-admission-cf49989db-wrvz9\" (UID: \"e288cf8c-df0f-4d66-8983-00afa12b91e3\") " pod="openshift-keda/keda-admission-cf49989db-wrvz9"
Apr 16 16:10:21.467588 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:21.467441 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxgbw\" (UniqueName: \"kubernetes.io/projected/e288cf8c-df0f-4d66-8983-00afa12b91e3-kube-api-access-xxgbw\") pod \"keda-admission-cf49989db-wrvz9\" (UID: \"e288cf8c-df0f-4d66-8983-00afa12b91e3\") " pod="openshift-keda/keda-admission-cf49989db-wrvz9"
Apr 16 16:10:21.569097 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:21.569051 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/30f8b014-51a7-4367-b39b-0e97d53e2fb4-certificates\") pod \"keda-operator-ffbb595cb-r2hzv\" (UID: \"30f8b014-51a7-4367-b39b-0e97d53e2fb4\") " pod="openshift-keda/keda-operator-ffbb595cb-r2hzv"
Apr 16 16:10:21.569414 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:21.569381 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e288cf8c-df0f-4d66-8983-00afa12b91e3-certificates\") pod \"keda-admission-cf49989db-wrvz9\" (UID: \"e288cf8c-df0f-4d66-8983-00afa12b91e3\") " pod="openshift-keda/keda-admission-cf49989db-wrvz9"
Apr 16 16:10:21.569539 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:10:21.569488 2577 secret.go:281] references non-existent secret key: ca.crt
Apr 16 16:10:21.569539 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:10:21.569506 2577 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 16:10:21.569539 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:10:21.569516 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-r2hzv: references non-existent secret key: ca.crt
Apr 16 16:10:21.569680 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:10:21.569567 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30f8b014-51a7-4367-b39b-0e97d53e2fb4-certificates podName:30f8b014-51a7-4367-b39b-0e97d53e2fb4 nodeName:}" failed. No retries permitted until 2026-04-16 16:10:22.569551259 +0000 UTC m=+465.333175968 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/30f8b014-51a7-4367-b39b-0e97d53e2fb4-certificates") pod "keda-operator-ffbb595cb-r2hzv" (UID: "30f8b014-51a7-4367-b39b-0e97d53e2fb4") : references non-existent secret key: ca.crt
Apr 16 16:10:21.569680 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:21.569582 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxgbw\" (UniqueName: \"kubernetes.io/projected/e288cf8c-df0f-4d66-8983-00afa12b91e3-kube-api-access-xxgbw\") pod \"keda-admission-cf49989db-wrvz9\" (UID: \"e288cf8c-df0f-4d66-8983-00afa12b91e3\") " pod="openshift-keda/keda-admission-cf49989db-wrvz9"
Apr 16 16:10:21.571992 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:21.571965 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e288cf8c-df0f-4d66-8983-00afa12b91e3-certificates\") pod \"keda-admission-cf49989db-wrvz9\" (UID: \"e288cf8c-df0f-4d66-8983-00afa12b91e3\") " pod="openshift-keda/keda-admission-cf49989db-wrvz9"
Apr 16 16:10:21.578694 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:21.578666 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxgbw\" (UniqueName: \"kubernetes.io/projected/e288cf8c-df0f-4d66-8983-00afa12b91e3-kube-api-access-xxgbw\") pod \"keda-admission-cf49989db-wrvz9\" (UID: \"e288cf8c-df0f-4d66-8983-00afa12b91e3\") " pod="openshift-keda/keda-admission-cf49989db-wrvz9"
Apr 16 16:10:21.667159 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:21.667034 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-wrvz9"
Apr 16 16:10:21.820203 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:21.820167 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-wrvz9"]
Apr 16 16:10:21.823814 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:10:21.823787 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode288cf8c_df0f_4d66_8983_00afa12b91e3.slice/crio-0eafefad2c781c255907dbfbf874fad7fd05a2d9630d4c226aa4289732442d2f WatchSource:0}: Error finding container 0eafefad2c781c255907dbfbf874fad7fd05a2d9630d4c226aa4289732442d2f: Status 404 returned error can't find the container with id 0eafefad2c781c255907dbfbf874fad7fd05a2d9630d4c226aa4289732442d2f
Apr 16 16:10:22.376598 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:22.376562 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-wrvz9" event={"ID":"e288cf8c-df0f-4d66-8983-00afa12b91e3","Type":"ContainerStarted","Data":"0eafefad2c781c255907dbfbf874fad7fd05a2d9630d4c226aa4289732442d2f"}
Apr 16 16:10:22.578224 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:22.578177 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/30f8b014-51a7-4367-b39b-0e97d53e2fb4-certificates\") pod \"keda-operator-ffbb595cb-r2hzv\" (UID: \"30f8b014-51a7-4367-b39b-0e97d53e2fb4\") " pod="openshift-keda/keda-operator-ffbb595cb-r2hzv"
Apr 16 16:10:22.578403 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:10:22.578330 2577 secret.go:281] references non-existent secret key: ca.crt
Apr 16 16:10:22.578403 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:10:22.578347 2577 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 16:10:22.578403 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:10:22.578356 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-r2hzv: references non-existent secret key: ca.crt
Apr 16 16:10:22.578578 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:10:22.578417 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30f8b014-51a7-4367-b39b-0e97d53e2fb4-certificates podName:30f8b014-51a7-4367-b39b-0e97d53e2fb4 nodeName:}" failed. No retries permitted until 2026-04-16 16:10:24.578397839 +0000 UTC m=+467.342022544 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/30f8b014-51a7-4367-b39b-0e97d53e2fb4-certificates") pod "keda-operator-ffbb595cb-r2hzv" (UID: "30f8b014-51a7-4367-b39b-0e97d53e2fb4") : references non-existent secret key: ca.crt
Apr 16 16:10:23.380596 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:23.380561 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-wrvz9" event={"ID":"e288cf8c-df0f-4d66-8983-00afa12b91e3","Type":"ContainerStarted","Data":"d19a5301c5fed54f6be5bb16838504d044d9aefde940e17fe69d2f76f7bb4a8f"}
Apr 16 16:10:23.380966 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:23.380673 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-wrvz9"
Apr 16 16:10:23.396644 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:23.396580 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-wrvz9" podStartSLOduration=1.019563646 podStartE2EDuration="2.396559636s" podCreationTimestamp="2026-04-16 16:10:21 +0000 UTC" firstStartedPulling="2026-04-16 16:10:21.82507803 +0000 UTC m=+464.588702719" lastFinishedPulling="2026-04-16 16:10:23.202074017 +0000 UTC m=+465.965698709" observedRunningTime="2026-04-16 16:10:23.396044487 +0000 UTC m=+466.159669211" watchObservedRunningTime="2026-04-16 16:10:23.396559636 +0000 UTC m=+466.160184349"
Apr 16 16:10:24.597285 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:24.597239 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/30f8b014-51a7-4367-b39b-0e97d53e2fb4-certificates\") pod \"keda-operator-ffbb595cb-r2hzv\" (UID: \"30f8b014-51a7-4367-b39b-0e97d53e2fb4\") " pod="openshift-keda/keda-operator-ffbb595cb-r2hzv"
Apr 16 16:10:24.597680 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:10:24.597390 2577 secret.go:281] references non-existent secret key: ca.crt
Apr 16 16:10:24.597680 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:10:24.597408 2577 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 16:10:24.597680 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:10:24.597417 2577 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-r2hzv: references non-existent secret key: ca.crt
Apr 16 16:10:24.597680 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:10:24.597479 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/30f8b014-51a7-4367-b39b-0e97d53e2fb4-certificates podName:30f8b014-51a7-4367-b39b-0e97d53e2fb4 nodeName:}" failed. No retries permitted until 2026-04-16 16:10:28.597464596 +0000 UTC m=+471.361089285 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/30f8b014-51a7-4367-b39b-0e97d53e2fb4-certificates") pod "keda-operator-ffbb595cb-r2hzv" (UID: "30f8b014-51a7-4367-b39b-0e97d53e2fb4") : references non-existent secret key: ca.crt
Apr 16 16:10:28.635702 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:28.635657 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/30f8b014-51a7-4367-b39b-0e97d53e2fb4-certificates\") pod \"keda-operator-ffbb595cb-r2hzv\" (UID: \"30f8b014-51a7-4367-b39b-0e97d53e2fb4\") " pod="openshift-keda/keda-operator-ffbb595cb-r2hzv"
Apr 16 16:10:28.638263 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:28.638240 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/30f8b014-51a7-4367-b39b-0e97d53e2fb4-certificates\") pod \"keda-operator-ffbb595cb-r2hzv\" (UID: \"30f8b014-51a7-4367-b39b-0e97d53e2fb4\") " pod="openshift-keda/keda-operator-ffbb595cb-r2hzv"
Apr 16 16:10:28.936875 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:28.936848 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-r2hzv" Apr 16 16:10:29.057059 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:29.056994 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-r2hzv"] Apr 16 16:10:29.059619 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:10:29.059593 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30f8b014_51a7_4367_b39b_0e97d53e2fb4.slice/crio-b98aba40fec6d0a69e1719d01a8ebd2316d23c8b273bd98ec5e322fbdf44f983 WatchSource:0}: Error finding container b98aba40fec6d0a69e1719d01a8ebd2316d23c8b273bd98ec5e322fbdf44f983: Status 404 returned error can't find the container with id b98aba40fec6d0a69e1719d01a8ebd2316d23c8b273bd98ec5e322fbdf44f983 Apr 16 16:10:29.400043 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:29.399957 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-r2hzv" event={"ID":"30f8b014-51a7-4367-b39b-0e97d53e2fb4","Type":"ContainerStarted","Data":"b98aba40fec6d0a69e1719d01a8ebd2316d23c8b273bd98ec5e322fbdf44f983"} Apr 16 16:10:32.410488 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:32.410452 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-r2hzv" event={"ID":"30f8b014-51a7-4367-b39b-0e97d53e2fb4","Type":"ContainerStarted","Data":"6902b1b83b77fe327206ae96948362711075d0be089de9a6d9837cb6b01fa11a"} Apr 16 16:10:32.410871 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:32.410586 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-r2hzv" Apr 16 16:10:32.426096 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:32.426043 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-r2hzv" podStartSLOduration=9.211029425 podStartE2EDuration="12.426031921s" 
podCreationTimestamp="2026-04-16 16:10:20 +0000 UTC" firstStartedPulling="2026-04-16 16:10:29.060882277 +0000 UTC m=+471.824506967" lastFinishedPulling="2026-04-16 16:10:32.275884774 +0000 UTC m=+475.039509463" observedRunningTime="2026-04-16 16:10:32.424538045 +0000 UTC m=+475.188162768" watchObservedRunningTime="2026-04-16 16:10:32.426031921 +0000 UTC m=+475.189656631" Apr 16 16:10:44.386189 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:44.386150 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-wrvz9" Apr 16 16:10:53.415525 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:10:53.415493 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-r2hzv" Apr 16 16:11:27.605002 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:11:27.604916 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-tmmd5"] Apr 16 16:11:27.611070 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:11:27.611044 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmmd5" Apr 16 16:11:27.616382 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:11:27.616354 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 16:11:27.616874 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:11:27.616380 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-thbxm\"" Apr 16 16:11:27.617295 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:11:27.616404 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 16:11:27.617398 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:11:27.616466 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 16:11:27.617951 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:11:27.617929 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-tmmd5"] Apr 16 16:11:27.745770 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:11:27.745728 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbt92\" (UniqueName: \"kubernetes.io/projected/5b50687c-6dca-4515-8303-4455d36b5c34-kube-api-access-rbt92\") pod \"llmisvc-controller-manager-68cc5db7c4-tmmd5\" (UID: \"5b50687c-6dca-4515-8303-4455d36b5c34\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmmd5" Apr 16 16:11:27.745993 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:11:27.745793 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b50687c-6dca-4515-8303-4455d36b5c34-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-tmmd5\" (UID: \"5b50687c-6dca-4515-8303-4455d36b5c34\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmmd5" 
Apr 16 16:11:27.846778 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:11:27.846727 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbt92\" (UniqueName: \"kubernetes.io/projected/5b50687c-6dca-4515-8303-4455d36b5c34-kube-api-access-rbt92\") pod \"llmisvc-controller-manager-68cc5db7c4-tmmd5\" (UID: \"5b50687c-6dca-4515-8303-4455d36b5c34\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmmd5" Apr 16 16:11:27.846948 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:11:27.846797 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b50687c-6dca-4515-8303-4455d36b5c34-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-tmmd5\" (UID: \"5b50687c-6dca-4515-8303-4455d36b5c34\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmmd5" Apr 16 16:11:27.849421 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:11:27.849395 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5b50687c-6dca-4515-8303-4455d36b5c34-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-tmmd5\" (UID: \"5b50687c-6dca-4515-8303-4455d36b5c34\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmmd5" Apr 16 16:11:27.854673 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:11:27.854647 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbt92\" (UniqueName: \"kubernetes.io/projected/5b50687c-6dca-4515-8303-4455d36b5c34-kube-api-access-rbt92\") pod \"llmisvc-controller-manager-68cc5db7c4-tmmd5\" (UID: \"5b50687c-6dca-4515-8303-4455d36b5c34\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmmd5" Apr 16 16:11:27.924035 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:11:27.924001 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmmd5" Apr 16 16:11:28.070109 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:11:28.070083 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-tmmd5"] Apr 16 16:11:28.071978 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:11:28.071950 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5b50687c_6dca_4515_8303_4455d36b5c34.slice/crio-576bb02e99a28c90763796c7de01f716ff2a2c72aa5d677636e7589a0f473045 WatchSource:0}: Error finding container 576bb02e99a28c90763796c7de01f716ff2a2c72aa5d677636e7589a0f473045: Status 404 returned error can't find the container with id 576bb02e99a28c90763796c7de01f716ff2a2c72aa5d677636e7589a0f473045 Apr 16 16:11:28.584415 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:11:28.584380 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmmd5" event={"ID":"5b50687c-6dca-4515-8303-4455d36b5c34","Type":"ContainerStarted","Data":"576bb02e99a28c90763796c7de01f716ff2a2c72aa5d677636e7589a0f473045"} Apr 16 16:11:30.593182 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:11:30.593141 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmmd5" event={"ID":"5b50687c-6dca-4515-8303-4455d36b5c34","Type":"ContainerStarted","Data":"72bad11a6ff0f865af376b755894659c1b5cc94df508ecb131ed523370564505"} Apr 16 16:11:30.593560 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:11:30.593192 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmmd5" Apr 16 16:11:30.612125 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:11:30.612072 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmmd5" podStartSLOduration=1.616938093 podStartE2EDuration="3.612057496s" 
podCreationTimestamp="2026-04-16 16:11:27 +0000 UTC" firstStartedPulling="2026-04-16 16:11:28.073226344 +0000 UTC m=+530.836851033" lastFinishedPulling="2026-04-16 16:11:30.068345731 +0000 UTC m=+532.831970436" observedRunningTime="2026-04-16 16:11:30.609626673 +0000 UTC m=+533.373251382" watchObservedRunningTime="2026-04-16 16:11:30.612057496 +0000 UTC m=+533.375682260" Apr 16 16:12:01.598723 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:01.598690 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmmd5" Apr 16 16:12:33.922528 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:33.922495 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6d558c6874-khbd8"] Apr 16 16:12:33.925966 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:33.925945 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:33.936184 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:33.936159 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d558c6874-khbd8"] Apr 16 16:12:34.031080 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:34.031043 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5bbbc794-64c0-4ffa-8843-be6ee3b4546c-trusted-ca-bundle\") pod \"console-6d558c6874-khbd8\" (UID: \"5bbbc794-64c0-4ffa-8843-be6ee3b4546c\") " pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:34.031282 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:34.031188 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5bbbc794-64c0-4ffa-8843-be6ee3b4546c-console-config\") pod \"console-6d558c6874-khbd8\" (UID: \"5bbbc794-64c0-4ffa-8843-be6ee3b4546c\") " 
pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:34.031282 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:34.031225 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5bbbc794-64c0-4ffa-8843-be6ee3b4546c-oauth-serving-cert\") pod \"console-6d558c6874-khbd8\" (UID: \"5bbbc794-64c0-4ffa-8843-be6ee3b4546c\") " pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:34.031282 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:34.031256 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5bbbc794-64c0-4ffa-8843-be6ee3b4546c-console-oauth-config\") pod \"console-6d558c6874-khbd8\" (UID: \"5bbbc794-64c0-4ffa-8843-be6ee3b4546c\") " pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:34.031282 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:34.031280 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5bbbc794-64c0-4ffa-8843-be6ee3b4546c-console-serving-cert\") pod \"console-6d558c6874-khbd8\" (UID: \"5bbbc794-64c0-4ffa-8843-be6ee3b4546c\") " pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:34.031447 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:34.031307 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74l92\" (UniqueName: \"kubernetes.io/projected/5bbbc794-64c0-4ffa-8843-be6ee3b4546c-kube-api-access-74l92\") pod \"console-6d558c6874-khbd8\" (UID: \"5bbbc794-64c0-4ffa-8843-be6ee3b4546c\") " pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:34.031447 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:34.031399 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/5bbbc794-64c0-4ffa-8843-be6ee3b4546c-service-ca\") pod \"console-6d558c6874-khbd8\" (UID: \"5bbbc794-64c0-4ffa-8843-be6ee3b4546c\") " pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:34.131818 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:34.131777 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5bbbc794-64c0-4ffa-8843-be6ee3b4546c-console-config\") pod \"console-6d558c6874-khbd8\" (UID: \"5bbbc794-64c0-4ffa-8843-be6ee3b4546c\") " pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:34.131818 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:34.131821 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5bbbc794-64c0-4ffa-8843-be6ee3b4546c-oauth-serving-cert\") pod \"console-6d558c6874-khbd8\" (UID: \"5bbbc794-64c0-4ffa-8843-be6ee3b4546c\") " pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:34.132070 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:34.131857 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5bbbc794-64c0-4ffa-8843-be6ee3b4546c-console-oauth-config\") pod \"console-6d558c6874-khbd8\" (UID: \"5bbbc794-64c0-4ffa-8843-be6ee3b4546c\") " pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:34.132070 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:34.131883 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5bbbc794-64c0-4ffa-8843-be6ee3b4546c-console-serving-cert\") pod \"console-6d558c6874-khbd8\" (UID: \"5bbbc794-64c0-4ffa-8843-be6ee3b4546c\") " pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:34.132070 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:34.131906 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-74l92\" (UniqueName: \"kubernetes.io/projected/5bbbc794-64c0-4ffa-8843-be6ee3b4546c-kube-api-access-74l92\") pod \"console-6d558c6874-khbd8\" (UID: \"5bbbc794-64c0-4ffa-8843-be6ee3b4546c\") " pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:34.132266 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:34.132084 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5bbbc794-64c0-4ffa-8843-be6ee3b4546c-service-ca\") pod \"console-6d558c6874-khbd8\" (UID: \"5bbbc794-64c0-4ffa-8843-be6ee3b4546c\") " pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:34.132266 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:34.132157 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5bbbc794-64c0-4ffa-8843-be6ee3b4546c-trusted-ca-bundle\") pod \"console-6d558c6874-khbd8\" (UID: \"5bbbc794-64c0-4ffa-8843-be6ee3b4546c\") " pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:34.132571 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:34.132541 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5bbbc794-64c0-4ffa-8843-be6ee3b4546c-console-config\") pod \"console-6d558c6874-khbd8\" (UID: \"5bbbc794-64c0-4ffa-8843-be6ee3b4546c\") " pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:34.132706 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:34.132591 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5bbbc794-64c0-4ffa-8843-be6ee3b4546c-oauth-serving-cert\") pod \"console-6d558c6874-khbd8\" (UID: \"5bbbc794-64c0-4ffa-8843-be6ee3b4546c\") " pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:34.132770 ip-10-0-129-182 kubenswrapper[2577]: I0416 
16:12:34.132737 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5bbbc794-64c0-4ffa-8843-be6ee3b4546c-service-ca\") pod \"console-6d558c6874-khbd8\" (UID: \"5bbbc794-64c0-4ffa-8843-be6ee3b4546c\") " pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:34.132931 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:34.132912 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5bbbc794-64c0-4ffa-8843-be6ee3b4546c-trusted-ca-bundle\") pod \"console-6d558c6874-khbd8\" (UID: \"5bbbc794-64c0-4ffa-8843-be6ee3b4546c\") " pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:34.134432 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:34.134404 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5bbbc794-64c0-4ffa-8843-be6ee3b4546c-console-serving-cert\") pod \"console-6d558c6874-khbd8\" (UID: \"5bbbc794-64c0-4ffa-8843-be6ee3b4546c\") " pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:34.134532 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:34.134433 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5bbbc794-64c0-4ffa-8843-be6ee3b4546c-console-oauth-config\") pod \"console-6d558c6874-khbd8\" (UID: \"5bbbc794-64c0-4ffa-8843-be6ee3b4546c\") " pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:34.139858 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:34.139839 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74l92\" (UniqueName: \"kubernetes.io/projected/5bbbc794-64c0-4ffa-8843-be6ee3b4546c-kube-api-access-74l92\") pod \"console-6d558c6874-khbd8\" (UID: \"5bbbc794-64c0-4ffa-8843-be6ee3b4546c\") " pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:34.237272 
ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:34.237231 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:34.359967 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:34.359939 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d558c6874-khbd8"] Apr 16 16:12:34.365592 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:12:34.365559 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bbbc794_64c0_4ffa_8843_be6ee3b4546c.slice/crio-b203668ead795a63799aade1d97804a084e9f4a6438b575f8eb968ac9ce2381a WatchSource:0}: Error finding container b203668ead795a63799aade1d97804a084e9f4a6438b575f8eb968ac9ce2381a: Status 404 returned error can't find the container with id b203668ead795a63799aade1d97804a084e9f4a6438b575f8eb968ac9ce2381a Apr 16 16:12:34.810559 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:34.810502 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d558c6874-khbd8" event={"ID":"5bbbc794-64c0-4ffa-8843-be6ee3b4546c","Type":"ContainerStarted","Data":"b21758f24fee6208502d09babeff035e91449d3b19385d61dd81403159256dc5"} Apr 16 16:12:34.810559 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:34.810565 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d558c6874-khbd8" event={"ID":"5bbbc794-64c0-4ffa-8843-be6ee3b4546c","Type":"ContainerStarted","Data":"b203668ead795a63799aade1d97804a084e9f4a6438b575f8eb968ac9ce2381a"} Apr 16 16:12:34.828338 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:34.828291 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6d558c6874-khbd8" podStartSLOduration=1.828275155 podStartE2EDuration="1.828275155s" podCreationTimestamp="2026-04-16 16:12:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:12:34.826149282 +0000 UTC m=+597.589773984" watchObservedRunningTime="2026-04-16 16:12:34.828275155 +0000 UTC m=+597.591899866" Apr 16 16:12:37.814178 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:37.814153 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2894b_ff7fcfa4-9774-4831-b686-678a7f92a456/console-operator/2.log" Apr 16 16:12:37.815249 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:37.815227 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2894b_ff7fcfa4-9774-4831-b686-678a7f92a456/console-operator/2.log" Apr 16 16:12:37.819985 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:37.819961 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m2jms_a2826faf-f579-4f51-8772-9882e98d4593/ovn-acl-logging/0.log" Apr 16 16:12:37.821295 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:37.821277 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m2jms_a2826faf-f579-4f51-8772-9882e98d4593/ovn-acl-logging/0.log" Apr 16 16:12:44.238164 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:44.238116 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:44.238164 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:44.238168 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:44.243765 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:44.243743 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:44.847210 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:44.847180 2577 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-console/console-6d558c6874-khbd8" Apr 16 16:12:44.900450 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:44.900419 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c6dcb9ffd-7kwcj"] Apr 16 16:12:51.782572 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:51.782538 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-9vcg7"] Apr 16 16:12:51.785691 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:51.785673 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-9vcg7" Apr 16 16:12:51.788008 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:51.787985 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 16:12:51.788143 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:51.787985 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-smnt5\"" Apr 16 16:12:51.791794 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:51.791762 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-9vcg7"] Apr 16 16:12:51.892850 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:51.892813 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld6x9\" (UniqueName: \"kubernetes.io/projected/3809161a-dca1-49bc-ae65-a6a7ebcb5221-kube-api-access-ld6x9\") pod \"s3-init-9vcg7\" (UID: \"3809161a-dca1-49bc-ae65-a6a7ebcb5221\") " pod="kserve/s3-init-9vcg7" Apr 16 16:12:51.993586 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:51.993552 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ld6x9\" (UniqueName: \"kubernetes.io/projected/3809161a-dca1-49bc-ae65-a6a7ebcb5221-kube-api-access-ld6x9\") pod \"s3-init-9vcg7\" (UID: \"3809161a-dca1-49bc-ae65-a6a7ebcb5221\") " pod="kserve/s3-init-9vcg7" Apr 16 16:12:52.003060 
ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:52.003021 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld6x9\" (UniqueName: \"kubernetes.io/projected/3809161a-dca1-49bc-ae65-a6a7ebcb5221-kube-api-access-ld6x9\") pod \"s3-init-9vcg7\" (UID: \"3809161a-dca1-49bc-ae65-a6a7ebcb5221\") " pod="kserve/s3-init-9vcg7" Apr 16 16:12:52.102973 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:52.102885 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-9vcg7" Apr 16 16:12:52.223745 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:52.223719 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-9vcg7"] Apr 16 16:12:52.226364 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:12:52.226329 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3809161a_dca1_49bc_ae65_a6a7ebcb5221.slice/crio-853a78e92235dca8d92a32b8243506c012e8b95dd0cbe2e4a25f43b1b8bc6a7a WatchSource:0}: Error finding container 853a78e92235dca8d92a32b8243506c012e8b95dd0cbe2e4a25f43b1b8bc6a7a: Status 404 returned error can't find the container with id 853a78e92235dca8d92a32b8243506c012e8b95dd0cbe2e4a25f43b1b8bc6a7a Apr 16 16:12:52.879917 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:52.879864 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-9vcg7" event={"ID":"3809161a-dca1-49bc-ae65-a6a7ebcb5221","Type":"ContainerStarted","Data":"853a78e92235dca8d92a32b8243506c012e8b95dd0cbe2e4a25f43b1b8bc6a7a"} Apr 16 16:12:56.897865 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:56.897824 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-9vcg7" event={"ID":"3809161a-dca1-49bc-ae65-a6a7ebcb5221","Type":"ContainerStarted","Data":"0d70080aad43ffba2c3be8bb46c2175f12a8206ce5583013721c02e75ef88ea4"} Apr 16 16:12:56.912859 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:56.912781 2577 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-9vcg7" podStartSLOduration=1.4114981979999999 podStartE2EDuration="5.912764569s" podCreationTimestamp="2026-04-16 16:12:51 +0000 UTC" firstStartedPulling="2026-04-16 16:12:52.228173621 +0000 UTC m=+614.991798311" lastFinishedPulling="2026-04-16 16:12:56.729439989 +0000 UTC m=+619.493064682" observedRunningTime="2026-04-16 16:12:56.910967596 +0000 UTC m=+619.674592308" watchObservedRunningTime="2026-04-16 16:12:56.912764569 +0000 UTC m=+619.676389279"
Apr 16 16:12:59.909187 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:59.909151 2577 generic.go:358] "Generic (PLEG): container finished" podID="3809161a-dca1-49bc-ae65-a6a7ebcb5221" containerID="0d70080aad43ffba2c3be8bb46c2175f12a8206ce5583013721c02e75ef88ea4" exitCode=0
Apr 16 16:12:59.909554 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:12:59.909159 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-9vcg7" event={"ID":"3809161a-dca1-49bc-ae65-a6a7ebcb5221","Type":"ContainerDied","Data":"0d70080aad43ffba2c3be8bb46c2175f12a8206ce5583013721c02e75ef88ea4"}
Apr 16 16:13:01.042933 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:01.042905 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-9vcg7"
Apr 16 16:13:01.182378 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:01.182349 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld6x9\" (UniqueName: \"kubernetes.io/projected/3809161a-dca1-49bc-ae65-a6a7ebcb5221-kube-api-access-ld6x9\") pod \"3809161a-dca1-49bc-ae65-a6a7ebcb5221\" (UID: \"3809161a-dca1-49bc-ae65-a6a7ebcb5221\") "
Apr 16 16:13:01.184646 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:01.184618 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3809161a-dca1-49bc-ae65-a6a7ebcb5221-kube-api-access-ld6x9" (OuterVolumeSpecName: "kube-api-access-ld6x9") pod "3809161a-dca1-49bc-ae65-a6a7ebcb5221" (UID: "3809161a-dca1-49bc-ae65-a6a7ebcb5221"). InnerVolumeSpecName "kube-api-access-ld6x9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:13:01.283066 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:01.283028 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ld6x9\" (UniqueName: \"kubernetes.io/projected/3809161a-dca1-49bc-ae65-a6a7ebcb5221-kube-api-access-ld6x9\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:13:01.918004 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:01.917974 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-9vcg7"
Apr 16 16:13:01.918464 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:01.918440 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-9vcg7" event={"ID":"3809161a-dca1-49bc-ae65-a6a7ebcb5221","Type":"ContainerDied","Data":"853a78e92235dca8d92a32b8243506c012e8b95dd0cbe2e4a25f43b1b8bc6a7a"}
Apr 16 16:13:01.918579 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:01.918470 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="853a78e92235dca8d92a32b8243506c012e8b95dd0cbe2e4a25f43b1b8bc6a7a"
Apr 16 16:13:09.925086 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:09.925032 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5c6dcb9ffd-7kwcj" podUID="52a2d57c-c665-4c97-b321-ce358ec8b0ac" containerName="console" containerID="cri-o://5af6c7e3a98168055ea44fac940ac4a93c96419db2dedfccf32c23c6c3bc3a32" gracePeriod=15
Apr 16 16:13:10.068800 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.068755 2577 patch_prober.go:28] interesting pod/console-5c6dcb9ffd-7kwcj container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.132.0.27:8443/health\": dial tcp 10.132.0.27:8443: connect: connection refused" start-of-body=
Apr 16 16:13:10.068983 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.068818 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-5c6dcb9ffd-7kwcj" podUID="52a2d57c-c665-4c97-b321-ce358ec8b0ac" containerName="console" probeResult="failure" output="Get \"https://10.132.0.27:8443/health\": dial tcp 10.132.0.27:8443: connect: connection refused"
Apr 16 16:13:10.167042 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.167019 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c6dcb9ffd-7kwcj_52a2d57c-c665-4c97-b321-ce358ec8b0ac/console/0.log"
Apr 16 16:13:10.167203 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.167079 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c6dcb9ffd-7kwcj"
Apr 16 16:13:10.251861 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.251772 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52a2d57c-c665-4c97-b321-ce358ec8b0ac-oauth-serving-cert\") pod \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") "
Apr 16 16:13:10.251861 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.251816 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52a2d57c-c665-4c97-b321-ce358ec8b0ac-console-config\") pod \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") "
Apr 16 16:13:10.251861 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.251837 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52a2d57c-c665-4c97-b321-ce358ec8b0ac-service-ca\") pod \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") "
Apr 16 16:13:10.252104 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.251953 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52a2d57c-c665-4c97-b321-ce358ec8b0ac-console-serving-cert\") pod \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") "
Apr 16 16:13:10.252104 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.251984 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52a2d57c-c665-4c97-b321-ce358ec8b0ac-console-oauth-config\") pod \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") "
Apr 16 16:13:10.252104 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.252078 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj6zp\" (UniqueName: \"kubernetes.io/projected/52a2d57c-c665-4c97-b321-ce358ec8b0ac-kube-api-access-sj6zp\") pod \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") "
Apr 16 16:13:10.252301 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.252161 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52a2d57c-c665-4c97-b321-ce358ec8b0ac-trusted-ca-bundle\") pod \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\" (UID: \"52a2d57c-c665-4c97-b321-ce358ec8b0ac\") "
Apr 16 16:13:10.252301 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.252238 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52a2d57c-c665-4c97-b321-ce358ec8b0ac-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "52a2d57c-c665-4c97-b321-ce358ec8b0ac" (UID: "52a2d57c-c665-4c97-b321-ce358ec8b0ac"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:13:10.252413 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.252306 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52a2d57c-c665-4c97-b321-ce358ec8b0ac-service-ca" (OuterVolumeSpecName: "service-ca") pod "52a2d57c-c665-4c97-b321-ce358ec8b0ac" (UID: "52a2d57c-c665-4c97-b321-ce358ec8b0ac"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:13:10.252413 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.252362 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52a2d57c-c665-4c97-b321-ce358ec8b0ac-console-config" (OuterVolumeSpecName: "console-config") pod "52a2d57c-c665-4c97-b321-ce358ec8b0ac" (UID: "52a2d57c-c665-4c97-b321-ce358ec8b0ac"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:13:10.252526 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.252459 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/52a2d57c-c665-4c97-b321-ce358ec8b0ac-oauth-serving-cert\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:13:10.252526 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.252477 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/52a2d57c-c665-4c97-b321-ce358ec8b0ac-console-config\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:13:10.252526 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.252492 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/52a2d57c-c665-4c97-b321-ce358ec8b0ac-service-ca\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:13:10.252718 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.252696 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52a2d57c-c665-4c97-b321-ce358ec8b0ac-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "52a2d57c-c665-4c97-b321-ce358ec8b0ac" (UID: "52a2d57c-c665-4c97-b321-ce358ec8b0ac"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:13:10.254318 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.254289 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52a2d57c-c665-4c97-b321-ce358ec8b0ac-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "52a2d57c-c665-4c97-b321-ce358ec8b0ac" (UID: "52a2d57c-c665-4c97-b321-ce358ec8b0ac"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:13:10.254318 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.254305 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52a2d57c-c665-4c97-b321-ce358ec8b0ac-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "52a2d57c-c665-4c97-b321-ce358ec8b0ac" (UID: "52a2d57c-c665-4c97-b321-ce358ec8b0ac"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:13:10.254469 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.254352 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52a2d57c-c665-4c97-b321-ce358ec8b0ac-kube-api-access-sj6zp" (OuterVolumeSpecName: "kube-api-access-sj6zp") pod "52a2d57c-c665-4c97-b321-ce358ec8b0ac" (UID: "52a2d57c-c665-4c97-b321-ce358ec8b0ac"). InnerVolumeSpecName "kube-api-access-sj6zp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:13:10.352881 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.352844 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/52a2d57c-c665-4c97-b321-ce358ec8b0ac-console-serving-cert\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:13:10.352881 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.352876 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/52a2d57c-c665-4c97-b321-ce358ec8b0ac-console-oauth-config\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:13:10.352881 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.352889 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sj6zp\" (UniqueName: \"kubernetes.io/projected/52a2d57c-c665-4c97-b321-ce358ec8b0ac-kube-api-access-sj6zp\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:13:10.353176 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.352904 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/52a2d57c-c665-4c97-b321-ce358ec8b0ac-trusted-ca-bundle\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:13:10.948927 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.948904 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c6dcb9ffd-7kwcj_52a2d57c-c665-4c97-b321-ce358ec8b0ac/console/0.log"
Apr 16 16:13:10.949402 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.948945 2577 generic.go:358] "Generic (PLEG): container finished" podID="52a2d57c-c665-4c97-b321-ce358ec8b0ac" containerID="5af6c7e3a98168055ea44fac940ac4a93c96419db2dedfccf32c23c6c3bc3a32" exitCode=2
Apr 16 16:13:10.949402 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.948985 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c6dcb9ffd-7kwcj" event={"ID":"52a2d57c-c665-4c97-b321-ce358ec8b0ac","Type":"ContainerDied","Data":"5af6c7e3a98168055ea44fac940ac4a93c96419db2dedfccf32c23c6c3bc3a32"}
Apr 16 16:13:10.949402 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.949020 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c6dcb9ffd-7kwcj"
Apr 16 16:13:10.949402 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.949033 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c6dcb9ffd-7kwcj" event={"ID":"52a2d57c-c665-4c97-b321-ce358ec8b0ac","Type":"ContainerDied","Data":"2236f6358f5b3115d31d1000850825281cefb36eedb6aba42a1e90f8c9a25d4a"}
Apr 16 16:13:10.949402 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.949058 2577 scope.go:117] "RemoveContainer" containerID="5af6c7e3a98168055ea44fac940ac4a93c96419db2dedfccf32c23c6c3bc3a32"
Apr 16 16:13:10.958187 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.958168 2577 scope.go:117] "RemoveContainer" containerID="5af6c7e3a98168055ea44fac940ac4a93c96419db2dedfccf32c23c6c3bc3a32"
Apr 16 16:13:10.958463 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:13:10.958446 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5af6c7e3a98168055ea44fac940ac4a93c96419db2dedfccf32c23c6c3bc3a32\": container with ID starting with 5af6c7e3a98168055ea44fac940ac4a93c96419db2dedfccf32c23c6c3bc3a32 not found: ID does not exist" containerID="5af6c7e3a98168055ea44fac940ac4a93c96419db2dedfccf32c23c6c3bc3a32"
Apr 16 16:13:10.958515 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.958470 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5af6c7e3a98168055ea44fac940ac4a93c96419db2dedfccf32c23c6c3bc3a32"} err="failed to get container status \"5af6c7e3a98168055ea44fac940ac4a93c96419db2dedfccf32c23c6c3bc3a32\": rpc error: code = NotFound desc = could not find container \"5af6c7e3a98168055ea44fac940ac4a93c96419db2dedfccf32c23c6c3bc3a32\": container with ID starting with 5af6c7e3a98168055ea44fac940ac4a93c96419db2dedfccf32c23c6c3bc3a32 not found: ID does not exist"
Apr 16 16:13:10.970980 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.970955 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c6dcb9ffd-7kwcj"]
Apr 16 16:13:10.976409 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:10.976387 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5c6dcb9ffd-7kwcj"]
Apr 16 16:13:11.919217 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:13:11.919185 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52a2d57c-c665-4c97-b321-ce358ec8b0ac" path="/var/lib/kubelet/pods/52a2d57c-c665-4c97-b321-ce358ec8b0ac/volumes"
Apr 16 16:16:26.751037 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:26.750996 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-3b643-7f87dff547-9r7xz"]
Apr 16 16:16:26.751583 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:26.751535 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52a2d57c-c665-4c97-b321-ce358ec8b0ac" containerName="console"
Apr 16 16:16:26.751583 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:26.751554 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="52a2d57c-c665-4c97-b321-ce358ec8b0ac" containerName="console"
Apr 16 16:16:26.751699 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:26.751597 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3809161a-dca1-49bc-ae65-a6a7ebcb5221" containerName="s3-init"
Apr 16 16:16:26.751699 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:26.751605 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="3809161a-dca1-49bc-ae65-a6a7ebcb5221" containerName="s3-init"
Apr 16 16:16:26.751699 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:26.751691 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="52a2d57c-c665-4c97-b321-ce358ec8b0ac" containerName="console"
Apr 16 16:16:26.751841 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:26.751706 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="3809161a-dca1-49bc-ae65-a6a7ebcb5221" containerName="s3-init"
Apr 16 16:16:26.753753 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:26.753733 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-3b643-7f87dff547-9r7xz"
Apr 16 16:16:26.756076 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:26.756052 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 16:16:26.756208 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:26.756181 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-wqwt4\""
Apr 16 16:16:26.762573 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:26.762547 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-3b643-7f87dff547-9r7xz"]
Apr 16 16:16:26.776570 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:26.776540 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c95eb78-181b-4996-a4a5-b1f30325d319-openshift-service-ca-bundle\") pod \"model-chainer-raw-3b643-7f87dff547-9r7xz\" (UID: \"2c95eb78-181b-4996-a4a5-b1f30325d319\") " pod="kserve-ci-e2e-test/model-chainer-raw-3b643-7f87dff547-9r7xz"
Apr 16 16:16:26.877165 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:26.877116 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c95eb78-181b-4996-a4a5-b1f30325d319-openshift-service-ca-bundle\") pod \"model-chainer-raw-3b643-7f87dff547-9r7xz\" (UID: \"2c95eb78-181b-4996-a4a5-b1f30325d319\") " pod="kserve-ci-e2e-test/model-chainer-raw-3b643-7f87dff547-9r7xz"
Apr 16 16:16:26.877741 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:26.877711 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c95eb78-181b-4996-a4a5-b1f30325d319-openshift-service-ca-bundle\") pod \"model-chainer-raw-3b643-7f87dff547-9r7xz\" (UID: \"2c95eb78-181b-4996-a4a5-b1f30325d319\") " pod="kserve-ci-e2e-test/model-chainer-raw-3b643-7f87dff547-9r7xz"
Apr 16 16:16:27.064990 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:27.064910 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-3b643-7f87dff547-9r7xz"
Apr 16 16:16:27.195833 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:27.195798 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-3b643-7f87dff547-9r7xz"]
Apr 16 16:16:27.199775 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:16:27.199736 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c95eb78_181b_4996_a4a5_b1f30325d319.slice/crio-7980846bafb8c2e1734ee1c058b02d83b2362732f4d38db1e00a14bff72ccbe3 WatchSource:0}: Error finding container 7980846bafb8c2e1734ee1c058b02d83b2362732f4d38db1e00a14bff72ccbe3: Status 404 returned error can't find the container with id 7980846bafb8c2e1734ee1c058b02d83b2362732f4d38db1e00a14bff72ccbe3
Apr 16 16:16:27.201512 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:27.201493 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 16:16:27.604190 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:27.604156 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-3b643-7f87dff547-9r7xz" event={"ID":"2c95eb78-181b-4996-a4a5-b1f30325d319","Type":"ContainerStarted","Data":"7980846bafb8c2e1734ee1c058b02d83b2362732f4d38db1e00a14bff72ccbe3"}
Apr 16 16:16:30.617267 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:30.617232 2577 generic.go:358] "Generic (PLEG): container finished" podID="2c95eb78-181b-4996-a4a5-b1f30325d319" containerID="616699b44e1a5c8c450f9586d56d9901f8f215e4e0e7454e5059b7ea0082df34" exitCode=1
Apr 16 16:16:30.617868 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:30.617290 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-3b643-7f87dff547-9r7xz" event={"ID":"2c95eb78-181b-4996-a4a5-b1f30325d319","Type":"ContainerDied","Data":"616699b44e1a5c8c450f9586d56d9901f8f215e4e0e7454e5059b7ea0082df34"}
Apr 16 16:16:30.617868 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:30.617516 2577 scope.go:117] "RemoveContainer" containerID="616699b44e1a5c8c450f9586d56d9901f8f215e4e0e7454e5059b7ea0082df34"
Apr 16 16:16:31.622668 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:31.622626 2577 generic.go:358] "Generic (PLEG): container finished" podID="2c95eb78-181b-4996-a4a5-b1f30325d319" containerID="04dea6e521c91d3ca9d8846997a600b43753cec4befe53934cc20950a009044c" exitCode=1
Apr 16 16:16:31.623116 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:31.622666 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-3b643-7f87dff547-9r7xz" event={"ID":"2c95eb78-181b-4996-a4a5-b1f30325d319","Type":"ContainerDied","Data":"04dea6e521c91d3ca9d8846997a600b43753cec4befe53934cc20950a009044c"}
Apr 16 16:16:31.623116 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:31.622708 2577 scope.go:117] "RemoveContainer" containerID="616699b44e1a5c8c450f9586d56d9901f8f215e4e0e7454e5059b7ea0082df34"
Apr 16 16:16:31.623116 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:31.622923 2577 scope.go:117] "RemoveContainer" containerID="04dea6e521c91d3ca9d8846997a600b43753cec4befe53934cc20950a009044c"
Apr 16 16:16:31.623279 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:16:31.623154 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"model-chainer-raw-3b643\" with CrashLoopBackOff: \"back-off 10s restarting failed container=model-chainer-raw-3b643 pod=model-chainer-raw-3b643-7f87dff547-9r7xz_kserve-ci-e2e-test(2c95eb78-181b-4996-a4a5-b1f30325d319)\"" pod="kserve-ci-e2e-test/model-chainer-raw-3b643-7f87dff547-9r7xz" podUID="2c95eb78-181b-4996-a4a5-b1f30325d319"
Apr 16 16:16:32.065092 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:32.065055 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-3b643-7f87dff547-9r7xz"
Apr 16 16:16:32.627691 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:32.627662 2577 scope.go:117] "RemoveContainer" containerID="04dea6e521c91d3ca9d8846997a600b43753cec4befe53934cc20950a009044c"
Apr 16 16:16:32.628056 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:16:32.627877 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"model-chainer-raw-3b643\" with CrashLoopBackOff: \"back-off 10s restarting failed container=model-chainer-raw-3b643 pod=model-chainer-raw-3b643-7f87dff547-9r7xz_kserve-ci-e2e-test(2c95eb78-181b-4996-a4a5-b1f30325d319)\"" pod="kserve-ci-e2e-test/model-chainer-raw-3b643-7f87dff547-9r7xz" podUID="2c95eb78-181b-4996-a4a5-b1f30325d319"
Apr 16 16:16:33.630477 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:33.630451 2577 scope.go:117] "RemoveContainer" containerID="04dea6e521c91d3ca9d8846997a600b43753cec4befe53934cc20950a009044c"
Apr 16 16:16:33.630849 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:16:33.630679 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"model-chainer-raw-3b643\" with CrashLoopBackOff: \"back-off 10s restarting failed container=model-chainer-raw-3b643 pod=model-chainer-raw-3b643-7f87dff547-9r7xz_kserve-ci-e2e-test(2c95eb78-181b-4996-a4a5-b1f30325d319)\"" pod="kserve-ci-e2e-test/model-chainer-raw-3b643-7f87dff547-9r7xz" podUID="2c95eb78-181b-4996-a4a5-b1f30325d319"
Apr 16 16:16:36.781407 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:36.781375 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-3b643-7f87dff547-9r7xz"]
Apr 16 16:16:36.914081 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:36.914053 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-3b643-7f87dff547-9r7xz"
Apr 16 16:16:36.969806 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:36.969771 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c95eb78-181b-4996-a4a5-b1f30325d319-openshift-service-ca-bundle\") pod \"2c95eb78-181b-4996-a4a5-b1f30325d319\" (UID: \"2c95eb78-181b-4996-a4a5-b1f30325d319\") "
Apr 16 16:16:36.970204 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:36.970182 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c95eb78-181b-4996-a4a5-b1f30325d319-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "2c95eb78-181b-4996-a4a5-b1f30325d319" (UID: "2c95eb78-181b-4996-a4a5-b1f30325d319"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:16:37.070723 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:37.070625 2577 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c95eb78-181b-4996-a4a5-b1f30325d319-openshift-service-ca-bundle\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\""
Apr 16 16:16:37.651997 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:37.651967 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-3b643-7f87dff547-9r7xz"
Apr 16 16:16:37.652189 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:37.651966 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-3b643-7f87dff547-9r7xz" event={"ID":"2c95eb78-181b-4996-a4a5-b1f30325d319","Type":"ContainerDied","Data":"7980846bafb8c2e1734ee1c058b02d83b2362732f4d38db1e00a14bff72ccbe3"}
Apr 16 16:16:37.652189 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:37.652083 2577 scope.go:117] "RemoveContainer" containerID="04dea6e521c91d3ca9d8846997a600b43753cec4befe53934cc20950a009044c"
Apr 16 16:16:37.673785 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:37.673755 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-3b643-7f87dff547-9r7xz"]
Apr 16 16:16:37.677114 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:37.677086 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-3b643-7f87dff547-9r7xz"]
Apr 16 16:16:37.918997 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:16:37.918926 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c95eb78-181b-4996-a4a5-b1f30325d319" path="/var/lib/kubelet/pods/2c95eb78-181b-4996-a4a5-b1f30325d319/volumes"
Apr 16 16:17:37.838702 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:17:37.838622 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2894b_ff7fcfa4-9774-4831-b686-678a7f92a456/console-operator/2.log"
Apr 16 16:17:37.841145 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:17:37.841105 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2894b_ff7fcfa4-9774-4831-b686-678a7f92a456/console-operator/2.log"
Apr 16 16:17:37.844927 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:17:37.844908 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m2jms_a2826faf-f579-4f51-8772-9882e98d4593/ovn-acl-logging/0.log"
Apr 16 16:17:37.846848 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:17:37.846831 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m2jms_a2826faf-f579-4f51-8772-9882e98d4593/ovn-acl-logging/0.log"
Apr 16 16:18:07.052787 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:07.052752 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-724bc-d588cbf89-2tdkd"]
Apr 16 16:18:07.053250 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:07.053119 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c95eb78-181b-4996-a4a5-b1f30325d319" containerName="model-chainer-raw-3b643"
Apr 16 16:18:07.053250 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:07.053144 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c95eb78-181b-4996-a4a5-b1f30325d319" containerName="model-chainer-raw-3b643"
Apr 16 16:18:07.053250 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:07.053222 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c95eb78-181b-4996-a4a5-b1f30325d319" containerName="model-chainer-raw-3b643"
Apr 16 16:18:07.056227 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:07.056212 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-724bc-d588cbf89-2tdkd"
Apr 16 16:18:07.058430 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:07.058407 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 16 16:18:07.058575 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:07.058502 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-wqwt4\""
Apr 16 16:18:07.062335 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:07.062309 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-724bc-d588cbf89-2tdkd"]
Apr 16 16:18:07.160122 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:07.160080 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44c561b0-a63e-4057-8b7b-792102cd2f53-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-724bc-d588cbf89-2tdkd\" (UID: \"44c561b0-a63e-4057-8b7b-792102cd2f53\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-724bc-d588cbf89-2tdkd"
Apr 16 16:18:07.261375 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:07.261340 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44c561b0-a63e-4057-8b7b-792102cd2f53-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-724bc-d588cbf89-2tdkd\" (UID: \"44c561b0-a63e-4057-8b7b-792102cd2f53\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-724bc-d588cbf89-2tdkd"
Apr 16 16:18:07.262138 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:07.262107 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44c561b0-a63e-4057-8b7b-792102cd2f53-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-724bc-d588cbf89-2tdkd\" (UID: \"44c561b0-a63e-4057-8b7b-792102cd2f53\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-724bc-d588cbf89-2tdkd"
Apr 16 16:18:07.367737 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:07.367645 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-724bc-d588cbf89-2tdkd"
Apr 16 16:18:07.490316 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:07.490292 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-724bc-d588cbf89-2tdkd"]
Apr 16 16:18:07.492326 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:18:07.492299 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44c561b0_a63e_4057_8b7b_792102cd2f53.slice/crio-124674a5e88636b6827509c080f8df5bc44c537ab5d84e3ddbf81860a9e0e0d8 WatchSource:0}: Error finding container 124674a5e88636b6827509c080f8df5bc44c537ab5d84e3ddbf81860a9e0e0d8: Status 404 returned error can't find the container with id 124674a5e88636b6827509c080f8df5bc44c537ab5d84e3ddbf81860a9e0e0d8
Apr 16 16:18:07.950218 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:07.950181 2577 generic.go:358] "Generic (PLEG): container finished" podID="44c561b0-a63e-4057-8b7b-792102cd2f53" containerID="dd83d42f83d412237888d2d6c06fa0eded87bb2e0ab4cb4c49a222b1a0fc3b76" exitCode=1
Apr 16 16:18:07.950416 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:07.950265 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-724bc-d588cbf89-2tdkd" event={"ID":"44c561b0-a63e-4057-8b7b-792102cd2f53","Type":"ContainerDied","Data":"dd83d42f83d412237888d2d6c06fa0eded87bb2e0ab4cb4c49a222b1a0fc3b76"}
Apr 16 16:18:07.950416 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:07.950298 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-724bc-d588cbf89-2tdkd" event={"ID":"44c561b0-a63e-4057-8b7b-792102cd2f53","Type":"ContainerStarted","Data":"124674a5e88636b6827509c080f8df5bc44c537ab5d84e3ddbf81860a9e0e0d8"}
Apr 16 16:18:07.950504 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:07.950492 2577 scope.go:117] "RemoveContainer" containerID="dd83d42f83d412237888d2d6c06fa0eded87bb2e0ab4cb4c49a222b1a0fc3b76"
Apr 16 16:18:08.955467 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:08.955433 2577 generic.go:358] "Generic (PLEG): container finished" podID="44c561b0-a63e-4057-8b7b-792102cd2f53" containerID="5897b5d4b746a2d0f0d7e0694966e91c6498e3dea1a5a9495114ea72d8b737ba" exitCode=1
Apr 16 16:18:08.955890 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:08.955551 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-724bc-d588cbf89-2tdkd" event={"ID":"44c561b0-a63e-4057-8b7b-792102cd2f53","Type":"ContainerDied","Data":"5897b5d4b746a2d0f0d7e0694966e91c6498e3dea1a5a9495114ea72d8b737ba"}
Apr 16 16:18:08.955890 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:08.955577 2577 scope.go:117] "RemoveContainer" containerID="dd83d42f83d412237888d2d6c06fa0eded87bb2e0ab4cb4c49a222b1a0fc3b76"
Apr 16 16:18:08.955890 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:08.955713 2577 scope.go:117] "RemoveContainer" containerID="5897b5d4b746a2d0f0d7e0694966e91c6498e3dea1a5a9495114ea72d8b737ba"
Apr 16 16:18:08.956018 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:18:08.955937 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"model-chainer-raw-hpa-724bc\" with CrashLoopBackOff: \"back-off 10s restarting failed container=model-chainer-raw-hpa-724bc pod=model-chainer-raw-hpa-724bc-d588cbf89-2tdkd_kserve-ci-e2e-test(44c561b0-a63e-4057-8b7b-792102cd2f53)\"" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-724bc-d588cbf89-2tdkd" podUID="44c561b0-a63e-4057-8b7b-792102cd2f53"
Apr 16 16:18:09.960591 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:09.960555 2577 scope.go:117] "RemoveContainer" containerID="5897b5d4b746a2d0f0d7e0694966e91c6498e3dea1a5a9495114ea72d8b737ba"
Apr 16 16:18:09.960968 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:18:09.960782 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"model-chainer-raw-hpa-724bc\" with CrashLoopBackOff: \"back-off 10s restarting failed container=model-chainer-raw-hpa-724bc pod=model-chainer-raw-hpa-724bc-d588cbf89-2tdkd_kserve-ci-e2e-test(44c561b0-a63e-4057-8b7b-792102cd2f53)\"" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-724bc-d588cbf89-2tdkd" podUID="44c561b0-a63e-4057-8b7b-792102cd2f53"
Apr 16 16:18:12.368463 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:12.368421 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-724bc-d588cbf89-2tdkd"
Apr 16 16:18:12.368931 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:12.368796 2577 scope.go:117] "RemoveContainer" containerID="5897b5d4b746a2d0f0d7e0694966e91c6498e3dea1a5a9495114ea72d8b737ba"
Apr 16 16:18:12.368992 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:18:12.368975 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"model-chainer-raw-hpa-724bc\" with CrashLoopBackOff: \"back-off 10s restarting failed container=model-chainer-raw-hpa-724bc pod=model-chainer-raw-hpa-724bc-d588cbf89-2tdkd_kserve-ci-e2e-test(44c561b0-a63e-4057-8b7b-792102cd2f53)\"" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-724bc-d588cbf89-2tdkd" podUID="44c561b0-a63e-4057-8b7b-792102cd2f53"
Apr 16 16:18:17.101882 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:17.101846 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-724bc-d588cbf89-2tdkd"]
Apr 16 16:18:17.232604 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:17.232581 2577 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-724bc-d588cbf89-2tdkd" Apr 16 16:18:17.279640 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:17.279607 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-74f70-predictor-767887b594-gh5xm"] Apr 16 16:18:17.279955 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:17.279944 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c95eb78-181b-4996-a4a5-b1f30325d319" containerName="model-chainer-raw-3b643" Apr 16 16:18:17.280004 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:17.279957 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c95eb78-181b-4996-a4a5-b1f30325d319" containerName="model-chainer-raw-3b643" Apr 16 16:18:17.280004 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:17.279966 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44c561b0-a63e-4057-8b7b-792102cd2f53" containerName="model-chainer-raw-hpa-724bc" Apr 16 16:18:17.280004 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:17.279972 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c561b0-a63e-4057-8b7b-792102cd2f53" containerName="model-chainer-raw-hpa-724bc" Apr 16 16:18:17.280004 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:17.279982 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44c561b0-a63e-4057-8b7b-792102cd2f53" containerName="model-chainer-raw-hpa-724bc" Apr 16 16:18:17.280004 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:17.279988 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c561b0-a63e-4057-8b7b-792102cd2f53" containerName="model-chainer-raw-hpa-724bc" Apr 16 16:18:17.280481 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:17.280450 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c95eb78-181b-4996-a4a5-b1f30325d319" containerName="model-chainer-raw-3b643" Apr 16 16:18:17.280684 ip-10-0-129-182 
kubenswrapper[2577]: I0416 16:18:17.280664 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="44c561b0-a63e-4057-8b7b-792102cd2f53" containerName="model-chainer-raw-hpa-724bc" Apr 16 16:18:17.288627 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:17.288595 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-74f70-predictor-767887b594-gh5xm" Apr 16 16:18:17.289552 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:17.289528 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-74f70-predictor-767887b594-gh5xm"] Apr 16 16:18:17.298714 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:17.298694 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-74f70-predictor-767887b594-gh5xm" Apr 16 16:18:17.347793 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:17.347751 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44c561b0-a63e-4057-8b7b-792102cd2f53-openshift-service-ca-bundle\") pod \"44c561b0-a63e-4057-8b7b-792102cd2f53\" (UID: \"44c561b0-a63e-4057-8b7b-792102cd2f53\") " Apr 16 16:18:17.348183 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:17.348154 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/44c561b0-a63e-4057-8b7b-792102cd2f53-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "44c561b0-a63e-4057-8b7b-792102cd2f53" (UID: "44c561b0-a63e-4057-8b7b-792102cd2f53"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:18:17.425746 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:17.425584 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-74f70-predictor-767887b594-gh5xm"] Apr 16 16:18:17.428408 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:18:17.428381 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf02fbfe_8138_493d_92ae_dba214ad3704.slice/crio-842b0d521080c38e979f328c965715ae6f2bf06c88eb62d7bda0f1109157c90e WatchSource:0}: Error finding container 842b0d521080c38e979f328c965715ae6f2bf06c88eb62d7bda0f1109157c90e: Status 404 returned error can't find the container with id 842b0d521080c38e979f328c965715ae6f2bf06c88eb62d7bda0f1109157c90e Apr 16 16:18:17.449356 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:17.449318 2577 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/44c561b0-a63e-4057-8b7b-792102cd2f53-openshift-service-ca-bundle\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\"" Apr 16 16:18:17.986852 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:17.986818 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-74f70-predictor-767887b594-gh5xm" event={"ID":"cf02fbfe-8138-493d-92ae-dba214ad3704","Type":"ContainerStarted","Data":"842b0d521080c38e979f328c965715ae6f2bf06c88eb62d7bda0f1109157c90e"} Apr 16 16:18:17.988100 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:17.988068 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-724bc-d588cbf89-2tdkd" event={"ID":"44c561b0-a63e-4057-8b7b-792102cd2f53","Type":"ContainerDied","Data":"124674a5e88636b6827509c080f8df5bc44c537ab5d84e3ddbf81860a9e0e0d8"} Apr 16 16:18:17.988233 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:17.988109 2577 scope.go:117] 
"RemoveContainer" containerID="5897b5d4b746a2d0f0d7e0694966e91c6498e3dea1a5a9495114ea72d8b737ba" Apr 16 16:18:17.988233 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:17.988086 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-724bc-d588cbf89-2tdkd" Apr 16 16:18:18.004378 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:18.004342 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-724bc-d588cbf89-2tdkd"] Apr 16 16:18:18.010096 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:18.010068 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-724bc-d588cbf89-2tdkd"] Apr 16 16:18:18.993754 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:18.993716 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-74f70-predictor-767887b594-gh5xm" event={"ID":"cf02fbfe-8138-493d-92ae-dba214ad3704","Type":"ContainerStarted","Data":"135d43b0e0fc5d1a3764ea2773d01a902c6fe4554a40b3e0527622e166484de9"} Apr 16 16:18:18.994234 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:18.993924 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-raw-74f70-predictor-767887b594-gh5xm" Apr 16 16:18:18.995390 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:18.995373 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-raw-74f70-predictor-767887b594-gh5xm" Apr 16 16:18:19.022608 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:19.022561 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-raw-74f70-predictor-767887b594-gh5xm" podStartSLOduration=1.083158651 podStartE2EDuration="2.02254667s" podCreationTimestamp="2026-04-16 16:18:17 +0000 UTC" firstStartedPulling="2026-04-16 16:18:17.430185695 +0000 UTC 
m=+940.193810388" lastFinishedPulling="2026-04-16 16:18:18.369573709 +0000 UTC m=+941.133198407" observedRunningTime="2026-04-16 16:18:19.021062612 +0000 UTC m=+941.784687322" watchObservedRunningTime="2026-04-16 16:18:19.02254667 +0000 UTC m=+941.786171380" Apr 16 16:18:19.919229 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:18:19.919197 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44c561b0-a63e-4057-8b7b-792102cd2f53" path="/var/lib/kubelet/pods/44c561b0-a63e-4057-8b7b-792102cd2f53/volumes" Apr 16 16:19:52.348731 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:19:52.348694 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-raw-74f70-predictor-767887b594-gh5xm_cf02fbfe-8138-493d-92ae-dba214ad3704/kserve-container/0.log" Apr 16 16:19:52.651893 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:19:52.651797 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-74f70-predictor-767887b594-gh5xm"] Apr 16 16:19:52.652061 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:19:52.652026 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-raw-74f70-predictor-767887b594-gh5xm" podUID="cf02fbfe-8138-493d-92ae-dba214ad3704" containerName="kserve-container" containerID="cri-o://135d43b0e0fc5d1a3764ea2773d01a902c6fe4554a40b3e0527622e166484de9" gracePeriod=30 Apr 16 16:19:52.896837 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:19:52.896814 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-74f70-predictor-767887b594-gh5xm" Apr 16 16:19:53.326733 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:19:53.326697 2577 generic.go:358] "Generic (PLEG): container finished" podID="cf02fbfe-8138-493d-92ae-dba214ad3704" containerID="135d43b0e0fc5d1a3764ea2773d01a902c6fe4554a40b3e0527622e166484de9" exitCode=2 Apr 16 16:19:53.326905 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:19:53.326766 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-74f70-predictor-767887b594-gh5xm" event={"ID":"cf02fbfe-8138-493d-92ae-dba214ad3704","Type":"ContainerDied","Data":"135d43b0e0fc5d1a3764ea2773d01a902c6fe4554a40b3e0527622e166484de9"} Apr 16 16:19:53.326905 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:19:53.326790 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-raw-74f70-predictor-767887b594-gh5xm" event={"ID":"cf02fbfe-8138-493d-92ae-dba214ad3704","Type":"ContainerDied","Data":"842b0d521080c38e979f328c965715ae6f2bf06c88eb62d7bda0f1109157c90e"} Apr 16 16:19:53.326905 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:19:53.326804 2577 scope.go:117] "RemoveContainer" containerID="135d43b0e0fc5d1a3764ea2773d01a902c6fe4554a40b3e0527622e166484de9" Apr 16 16:19:53.326905 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:19:53.326768 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-raw-74f70-predictor-767887b594-gh5xm" Apr 16 16:19:53.335541 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:19:53.335522 2577 scope.go:117] "RemoveContainer" containerID="135d43b0e0fc5d1a3764ea2773d01a902c6fe4554a40b3e0527622e166484de9" Apr 16 16:19:53.335787 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:19:53.335769 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"135d43b0e0fc5d1a3764ea2773d01a902c6fe4554a40b3e0527622e166484de9\": container with ID starting with 135d43b0e0fc5d1a3764ea2773d01a902c6fe4554a40b3e0527622e166484de9 not found: ID does not exist" containerID="135d43b0e0fc5d1a3764ea2773d01a902c6fe4554a40b3e0527622e166484de9" Apr 16 16:19:53.335842 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:19:53.335796 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"135d43b0e0fc5d1a3764ea2773d01a902c6fe4554a40b3e0527622e166484de9"} err="failed to get container status \"135d43b0e0fc5d1a3764ea2773d01a902c6fe4554a40b3e0527622e166484de9\": rpc error: code = NotFound desc = could not find container \"135d43b0e0fc5d1a3764ea2773d01a902c6fe4554a40b3e0527622e166484de9\": container with ID starting with 135d43b0e0fc5d1a3764ea2773d01a902c6fe4554a40b3e0527622e166484de9 not found: ID does not exist" Apr 16 16:19:53.346701 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:19:53.346669 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-74f70-predictor-767887b594-gh5xm"] Apr 16 16:19:53.351737 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:19:53.351716 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-raw-74f70-predictor-767887b594-gh5xm"] Apr 16 16:19:53.918610 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:19:53.918572 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="cf02fbfe-8138-493d-92ae-dba214ad3704" path="/var/lib/kubelet/pods/cf02fbfe-8138-493d-92ae-dba214ad3704/volumes" Apr 16 16:22:37.864013 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:22:37.863985 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2894b_ff7fcfa4-9774-4831-b686-678a7f92a456/console-operator/2.log" Apr 16 16:22:37.867077 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:22:37.867055 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2894b_ff7fcfa4-9774-4831-b686-678a7f92a456/console-operator/2.log" Apr 16 16:22:37.869836 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:22:37.869817 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m2jms_a2826faf-f579-4f51-8772-9882e98d4593/ovn-acl-logging/0.log" Apr 16 16:22:37.872385 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:22:37.872367 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m2jms_a2826faf-f579-4f51-8772-9882e98d4593/ovn-acl-logging/0.log" Apr 16 16:26:52.672384 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:26:52.672348 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-l6xzd/must-gather-56vxn"] Apr 16 16:26:52.672864 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:26:52.672689 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cf02fbfe-8138-493d-92ae-dba214ad3704" containerName="kserve-container" Apr 16 16:26:52.672864 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:26:52.672699 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf02fbfe-8138-493d-92ae-dba214ad3704" containerName="kserve-container" Apr 16 16:26:52.672864 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:26:52.672748 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="cf02fbfe-8138-493d-92ae-dba214ad3704" 
containerName="kserve-container" Apr 16 16:26:52.672864 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:26:52.672759 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="44c561b0-a63e-4057-8b7b-792102cd2f53" containerName="model-chainer-raw-hpa-724bc" Apr 16 16:26:52.675700 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:26:52.675684 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l6xzd/must-gather-56vxn" Apr 16 16:26:52.677981 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:26:52.677957 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-l6xzd\"/\"kube-root-ca.crt\"" Apr 16 16:26:52.678099 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:26:52.678071 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-l6xzd\"/\"openshift-service-ca.crt\"" Apr 16 16:26:52.678714 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:26:52.678701 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-l6xzd\"/\"default-dockercfg-hrzx4\"" Apr 16 16:26:52.682045 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:26:52.682021 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-l6xzd/must-gather-56vxn"] Apr 16 16:26:52.843100 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:26:52.843067 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0abff368-40aa-47fe-a037-b930d71d6938-must-gather-output\") pod \"must-gather-56vxn\" (UID: \"0abff368-40aa-47fe-a037-b930d71d6938\") " pod="openshift-must-gather-l6xzd/must-gather-56vxn" Apr 16 16:26:52.843100 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:26:52.843101 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qhc9\" (UniqueName: 
\"kubernetes.io/projected/0abff368-40aa-47fe-a037-b930d71d6938-kube-api-access-6qhc9\") pod \"must-gather-56vxn\" (UID: \"0abff368-40aa-47fe-a037-b930d71d6938\") " pod="openshift-must-gather-l6xzd/must-gather-56vxn" Apr 16 16:26:52.944493 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:26:52.944461 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0abff368-40aa-47fe-a037-b930d71d6938-must-gather-output\") pod \"must-gather-56vxn\" (UID: \"0abff368-40aa-47fe-a037-b930d71d6938\") " pod="openshift-must-gather-l6xzd/must-gather-56vxn" Apr 16 16:26:52.944680 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:26:52.944498 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6qhc9\" (UniqueName: \"kubernetes.io/projected/0abff368-40aa-47fe-a037-b930d71d6938-kube-api-access-6qhc9\") pod \"must-gather-56vxn\" (UID: \"0abff368-40aa-47fe-a037-b930d71d6938\") " pod="openshift-must-gather-l6xzd/must-gather-56vxn" Apr 16 16:26:52.944856 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:26:52.944834 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0abff368-40aa-47fe-a037-b930d71d6938-must-gather-output\") pod \"must-gather-56vxn\" (UID: \"0abff368-40aa-47fe-a037-b930d71d6938\") " pod="openshift-must-gather-l6xzd/must-gather-56vxn" Apr 16 16:26:52.952346 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:26:52.952324 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qhc9\" (UniqueName: \"kubernetes.io/projected/0abff368-40aa-47fe-a037-b930d71d6938-kube-api-access-6qhc9\") pod \"must-gather-56vxn\" (UID: \"0abff368-40aa-47fe-a037-b930d71d6938\") " pod="openshift-must-gather-l6xzd/must-gather-56vxn" Apr 16 16:26:53.002984 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:26:53.002948 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l6xzd/must-gather-56vxn" Apr 16 16:26:53.121486 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:26:53.121454 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-l6xzd/must-gather-56vxn"] Apr 16 16:26:53.124498 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:26:53.124472 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0abff368_40aa_47fe_a037_b930d71d6938.slice/crio-a26eaa05ce58e731c84e902c7de88dfe3499d202c2b67554431c76044de281c0 WatchSource:0}: Error finding container a26eaa05ce58e731c84e902c7de88dfe3499d202c2b67554431c76044de281c0: Status 404 returned error can't find the container with id a26eaa05ce58e731c84e902c7de88dfe3499d202c2b67554431c76044de281c0 Apr 16 16:26:53.126065 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:26:53.126045 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:26:53.733831 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:26:53.733793 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l6xzd/must-gather-56vxn" event={"ID":"0abff368-40aa-47fe-a037-b930d71d6938","Type":"ContainerStarted","Data":"a26eaa05ce58e731c84e902c7de88dfe3499d202c2b67554431c76044de281c0"} Apr 16 16:26:58.756083 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:26:58.756042 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l6xzd/must-gather-56vxn" event={"ID":"0abff368-40aa-47fe-a037-b930d71d6938","Type":"ContainerStarted","Data":"f37749f681cf45e6a0e32cc482a5da2dfd7c6d7cb48fac1a053706fc13cfabf4"} Apr 16 16:26:58.756083 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:26:58.756082 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l6xzd/must-gather-56vxn" 
event={"ID":"0abff368-40aa-47fe-a037-b930d71d6938","Type":"ContainerStarted","Data":"6912d749e420db2b6cf0b429b27d5380c45c6c349ff4aeeab3a605d6658c6440"} Apr 16 16:26:58.773089 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:26:58.773041 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-l6xzd/must-gather-56vxn" podStartSLOduration=2.082465782 podStartE2EDuration="6.773027945s" podCreationTimestamp="2026-04-16 16:26:52 +0000 UTC" firstStartedPulling="2026-04-16 16:26:53.126210569 +0000 UTC m=+1455.889835261" lastFinishedPulling="2026-04-16 16:26:57.816772732 +0000 UTC m=+1460.580397424" observedRunningTime="2026-04-16 16:26:58.770510288 +0000 UTC m=+1461.534134999" watchObservedRunningTime="2026-04-16 16:26:58.773027945 +0000 UTC m=+1461.536652696" Apr 16 16:27:15.820917 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:15.820830 2577 generic.go:358] "Generic (PLEG): container finished" podID="0abff368-40aa-47fe-a037-b930d71d6938" containerID="6912d749e420db2b6cf0b429b27d5380c45c6c349ff4aeeab3a605d6658c6440" exitCode=0 Apr 16 16:27:15.820917 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:15.820879 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l6xzd/must-gather-56vxn" event={"ID":"0abff368-40aa-47fe-a037-b930d71d6938","Type":"ContainerDied","Data":"6912d749e420db2b6cf0b429b27d5380c45c6c349ff4aeeab3a605d6658c6440"} Apr 16 16:27:15.821345 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:15.821216 2577 scope.go:117] "RemoveContainer" containerID="6912d749e420db2b6cf0b429b27d5380c45c6c349ff4aeeab3a605d6658c6440" Apr 16 16:27:16.457337 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:16.457307 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-l6xzd_must-gather-56vxn_0abff368-40aa-47fe-a037-b930d71d6938/gather/0.log" Apr 16 16:27:19.643006 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:19.642982 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-gtdb4_489cc865-55f8-43eb-9e86-909e0581fa83/global-pull-secret-syncer/0.log" Apr 16 16:27:19.847743 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:19.847713 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-dkplw_36fdb312-0ed7-418b-b720-16a3c46fff51/konnectivity-agent/0.log" Apr 16 16:27:19.866807 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:19.866781 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-182.ec2.internal_f8506f0b2e52254b607e38e68b2f4dae/haproxy/0.log" Apr 16 16:27:21.823892 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:21.823860 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-l6xzd/must-gather-56vxn"] Apr 16 16:27:21.824300 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:21.824090 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-l6xzd/must-gather-56vxn" podUID="0abff368-40aa-47fe-a037-b930d71d6938" containerName="copy" containerID="cri-o://f37749f681cf45e6a0e32cc482a5da2dfd7c6d7cb48fac1a053706fc13cfabf4" gracePeriod=2 Apr 16 16:27:21.826237 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:21.826205 2577 status_manager.go:895] "Failed to get status for pod" podUID="0abff368-40aa-47fe-a037-b930d71d6938" pod="openshift-must-gather-l6xzd/must-gather-56vxn" err="pods \"must-gather-56vxn\" is forbidden: User \"system:node:ip-10-0-129-182.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-l6xzd\": no relationship found between node 'ip-10-0-129-182.ec2.internal' and this object" Apr 16 16:27:21.828591 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:21.827870 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-l6xzd/must-gather-56vxn"] Apr 16 16:27:22.049078 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:22.049053 2577 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-l6xzd_must-gather-56vxn_0abff368-40aa-47fe-a037-b930d71d6938/copy/0.log" Apr 16 16:27:22.049379 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:22.049365 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l6xzd/must-gather-56vxn" Apr 16 16:27:22.095753 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:22.095674 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qhc9\" (UniqueName: \"kubernetes.io/projected/0abff368-40aa-47fe-a037-b930d71d6938-kube-api-access-6qhc9\") pod \"0abff368-40aa-47fe-a037-b930d71d6938\" (UID: \"0abff368-40aa-47fe-a037-b930d71d6938\") " Apr 16 16:27:22.095753 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:22.095721 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0abff368-40aa-47fe-a037-b930d71d6938-must-gather-output\") pod \"0abff368-40aa-47fe-a037-b930d71d6938\" (UID: \"0abff368-40aa-47fe-a037-b930d71d6938\") " Apr 16 16:27:22.097167 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:22.097125 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0abff368-40aa-47fe-a037-b930d71d6938-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0abff368-40aa-47fe-a037-b930d71d6938" (UID: "0abff368-40aa-47fe-a037-b930d71d6938"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:27:22.097942 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:22.097923 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0abff368-40aa-47fe-a037-b930d71d6938-kube-api-access-6qhc9" (OuterVolumeSpecName: "kube-api-access-6qhc9") pod "0abff368-40aa-47fe-a037-b930d71d6938" (UID: "0abff368-40aa-47fe-a037-b930d71d6938"). 
InnerVolumeSpecName "kube-api-access-6qhc9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:27:22.198238 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:22.198203 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6qhc9\" (UniqueName: \"kubernetes.io/projected/0abff368-40aa-47fe-a037-b930d71d6938-kube-api-access-6qhc9\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\"" Apr 16 16:27:22.198238 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:22.198238 2577 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0abff368-40aa-47fe-a037-b930d71d6938-must-gather-output\") on node \"ip-10-0-129-182.ec2.internal\" DevicePath \"\"" Apr 16 16:27:22.847441 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:22.847412 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-l6xzd_must-gather-56vxn_0abff368-40aa-47fe-a037-b930d71d6938/copy/0.log" Apr 16 16:27:22.847833 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:22.847748 2577 generic.go:358] "Generic (PLEG): container finished" podID="0abff368-40aa-47fe-a037-b930d71d6938" containerID="f37749f681cf45e6a0e32cc482a5da2dfd7c6d7cb48fac1a053706fc13cfabf4" exitCode=143 Apr 16 16:27:22.847833 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:22.847815 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-l6xzd/must-gather-56vxn" Apr 16 16:27:22.847939 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:22.847845 2577 scope.go:117] "RemoveContainer" containerID="f37749f681cf45e6a0e32cc482a5da2dfd7c6d7cb48fac1a053706fc13cfabf4" Apr 16 16:27:22.856985 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:22.856967 2577 scope.go:117] "RemoveContainer" containerID="6912d749e420db2b6cf0b429b27d5380c45c6c349ff4aeeab3a605d6658c6440" Apr 16 16:27:22.867715 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:22.867698 2577 scope.go:117] "RemoveContainer" containerID="f37749f681cf45e6a0e32cc482a5da2dfd7c6d7cb48fac1a053706fc13cfabf4" Apr 16 16:27:22.868001 ip-10-0-129-182 kubenswrapper[2577]: E0416 16:27:22.867981 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f37749f681cf45e6a0e32cc482a5da2dfd7c6d7cb48fac1a053706fc13cfabf4\": container with ID starting with f37749f681cf45e6a0e32cc482a5da2dfd7c6d7cb48fac1a053706fc13cfabf4 not found: ID does not exist" containerID="f37749f681cf45e6a0e32cc482a5da2dfd7c6d7cb48fac1a053706fc13cfabf4" Apr 16 16:27:22.868054 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:22.868013 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f37749f681cf45e6a0e32cc482a5da2dfd7c6d7cb48fac1a053706fc13cfabf4"} err="failed to get container status \"f37749f681cf45e6a0e32cc482a5da2dfd7c6d7cb48fac1a053706fc13cfabf4\": rpc error: code = NotFound desc = could not find container \"f37749f681cf45e6a0e32cc482a5da2dfd7c6d7cb48fac1a053706fc13cfabf4\": container with ID starting with f37749f681cf45e6a0e32cc482a5da2dfd7c6d7cb48fac1a053706fc13cfabf4 not found: ID does not exist" Apr 16 16:27:22.868054 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:22.868033 2577 scope.go:117] "RemoveContainer" containerID="6912d749e420db2b6cf0b429b27d5380c45c6c349ff4aeeab3a605d6658c6440" Apr 16 16:27:22.868349 
ip-10-0-129-182 kubenswrapper[2577]: E0416 16:27:22.868326 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6912d749e420db2b6cf0b429b27d5380c45c6c349ff4aeeab3a605d6658c6440\": container with ID starting with 6912d749e420db2b6cf0b429b27d5380c45c6c349ff4aeeab3a605d6658c6440 not found: ID does not exist" containerID="6912d749e420db2b6cf0b429b27d5380c45c6c349ff4aeeab3a605d6658c6440" Apr 16 16:27:22.868443 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:22.868354 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6912d749e420db2b6cf0b429b27d5380c45c6c349ff4aeeab3a605d6658c6440"} err="failed to get container status \"6912d749e420db2b6cf0b429b27d5380c45c6c349ff4aeeab3a605d6658c6440\": rpc error: code = NotFound desc = could not find container \"6912d749e420db2b6cf0b429b27d5380c45c6c349ff4aeeab3a605d6658c6440\": container with ID starting with 6912d749e420db2b6cf0b429b27d5380c45c6c349ff4aeeab3a605d6658c6440 not found: ID does not exist" Apr 16 16:27:23.240504 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:23.240476 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_049ea6b7-7647-4ba0-bc85-2f00639e58c5/alertmanager/0.log" Apr 16 16:27:23.273304 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:23.273271 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_049ea6b7-7647-4ba0-bc85-2f00639e58c5/config-reloader/0.log" Apr 16 16:27:23.298933 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:23.298905 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_049ea6b7-7647-4ba0-bc85-2f00639e58c5/kube-rbac-proxy-web/0.log" Apr 16 16:27:23.321012 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:23.320987 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_049ea6b7-7647-4ba0-bc85-2f00639e58c5/kube-rbac-proxy/0.log" Apr 16 16:27:23.343735 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:23.343707 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_049ea6b7-7647-4ba0-bc85-2f00639e58c5/kube-rbac-proxy-metric/0.log" Apr 16 16:27:23.367542 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:23.367514 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_049ea6b7-7647-4ba0-bc85-2f00639e58c5/prom-label-proxy/0.log" Apr 16 16:27:23.392935 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:23.392902 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_049ea6b7-7647-4ba0-bc85-2f00639e58c5/init-config-reloader/0.log" Apr 16 16:27:23.428873 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:23.428841 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-c7z5l_5a2e5aa5-e21e-48f0-b556-613a38c7c168/cluster-monitoring-operator/0.log" Apr 16 16:27:23.458227 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:23.458203 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-rfpjk_152001c5-8370-4271-b714-21043a493948/kube-state-metrics/0.log" Apr 16 16:27:23.487928 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:23.487888 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-rfpjk_152001c5-8370-4271-b714-21043a493948/kube-rbac-proxy-main/0.log" Apr 16 16:27:23.514995 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:23.514916 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-rfpjk_152001c5-8370-4271-b714-21043a493948/kube-rbac-proxy-self/0.log" Apr 16 16:27:23.582468 ip-10-0-129-182 
kubenswrapper[2577]: I0416 16:27:23.582439 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-q4vvz_725494dc-ff16-4479-a274-23d904bd29ca/monitoring-plugin/0.log" Apr 16 16:27:23.614772 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:23.614744 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-44r2t_387cec09-a424-4de4-8906-da52b3743df9/node-exporter/0.log" Apr 16 16:27:23.638575 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:23.638553 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-44r2t_387cec09-a424-4de4-8906-da52b3743df9/kube-rbac-proxy/0.log" Apr 16 16:27:23.660636 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:23.660609 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-44r2t_387cec09-a424-4de4-8906-da52b3743df9/init-textfile/0.log" Apr 16 16:27:23.918662 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:23.918584 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0abff368-40aa-47fe-a037-b930d71d6938" path="/var/lib/kubelet/pods/0abff368-40aa-47fe-a037-b930d71d6938/volumes" Apr 16 16:27:24.107597 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:24.107569 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-95j9v_7e7187f1-cf40-463f-b670-90c9d4fc7c3d/prometheus-operator/0.log" Apr 16 16:27:24.133793 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:24.133762 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-95j9v_7e7187f1-cf40-463f-b670-90c9d4fc7c3d/kube-rbac-proxy/0.log" Apr 16 16:27:24.189721 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:24.189683 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_telemeter-client-7b94b48476-s499c_bacaae01-d5b9-4bfb-9444-11a0a9791f46/telemeter-client/0.log" Apr 16 16:27:24.216817 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:24.216793 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7b94b48476-s499c_bacaae01-d5b9-4bfb-9444-11a0a9791f46/reload/0.log" Apr 16 16:27:24.243371 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:24.243348 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-7b94b48476-s499c_bacaae01-d5b9-4bfb-9444-11a0a9791f46/kube-rbac-proxy/0.log" Apr 16 16:27:26.031950 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:26.031921 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2894b_ff7fcfa4-9774-4831-b686-678a7f92a456/console-operator/2.log" Apr 16 16:27:26.037386 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:26.037365 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2894b_ff7fcfa4-9774-4831-b686-678a7f92a456/console-operator/3.log" Apr 16 16:27:26.408396 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:26.408315 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d558c6874-khbd8_5bbbc794-64c0-4ffa-8843-be6ee3b4546c/console/0.log" Apr 16 16:27:26.438085 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:26.438062 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-8pkwl_5d436e24-a61c-48a2-8a51-1d4972aa2081/download-server/0.log" Apr 16 16:27:26.840228 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:26.840197 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-wjq4m_e55b08d2-3aff-471f-9d16-c3b3eb293f68/volume-data-source-validator/0.log" Apr 16 16:27:26.979324 
ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:26.979290 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4jnll/perf-node-gather-daemonset-dkls2"] Apr 16 16:27:26.979685 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:26.979670 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0abff368-40aa-47fe-a037-b930d71d6938" containerName="copy" Apr 16 16:27:26.979777 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:26.979689 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0abff368-40aa-47fe-a037-b930d71d6938" containerName="copy" Apr 16 16:27:26.979777 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:26.979719 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0abff368-40aa-47fe-a037-b930d71d6938" containerName="gather" Apr 16 16:27:26.979777 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:26.979727 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0abff368-40aa-47fe-a037-b930d71d6938" containerName="gather" Apr 16 16:27:26.979923 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:26.979836 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="0abff368-40aa-47fe-a037-b930d71d6938" containerName="copy" Apr 16 16:27:26.979923 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:26.979853 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="0abff368-40aa-47fe-a037-b930d71d6938" containerName="gather" Apr 16 16:27:26.983617 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:26.983597 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-dkls2" Apr 16 16:27:26.985687 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:26.985670 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4jnll\"/\"openshift-service-ca.crt\"" Apr 16 16:27:26.986901 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:26.986729 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-4jnll\"/\"kube-root-ca.crt\"" Apr 16 16:27:26.986901 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:26.986778 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-4jnll\"/\"default-dockercfg-bq59w\"" Apr 16 16:27:26.992446 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:26.992425 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4jnll/perf-node-gather-daemonset-dkls2"] Apr 16 16:27:27.038436 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:27.038398 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ee8f7af-206e-40ca-aaaa-63fd4fe505cf-sys\") pod \"perf-node-gather-daemonset-dkls2\" (UID: \"6ee8f7af-206e-40ca-aaaa-63fd4fe505cf\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-dkls2" Apr 16 16:27:27.038822 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:27.038458 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6ee8f7af-206e-40ca-aaaa-63fd4fe505cf-podres\") pod \"perf-node-gather-daemonset-dkls2\" (UID: \"6ee8f7af-206e-40ca-aaaa-63fd4fe505cf\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-dkls2" Apr 16 16:27:27.038822 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:27.038478 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ee8f7af-206e-40ca-aaaa-63fd4fe505cf-lib-modules\") pod \"perf-node-gather-daemonset-dkls2\" (UID: \"6ee8f7af-206e-40ca-aaaa-63fd4fe505cf\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-dkls2" Apr 16 16:27:27.038822 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:27.038506 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdrqn\" (UniqueName: \"kubernetes.io/projected/6ee8f7af-206e-40ca-aaaa-63fd4fe505cf-kube-api-access-mdrqn\") pod \"perf-node-gather-daemonset-dkls2\" (UID: \"6ee8f7af-206e-40ca-aaaa-63fd4fe505cf\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-dkls2" Apr 16 16:27:27.038822 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:27.038537 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6ee8f7af-206e-40ca-aaaa-63fd4fe505cf-proc\") pod \"perf-node-gather-daemonset-dkls2\" (UID: \"6ee8f7af-206e-40ca-aaaa-63fd4fe505cf\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-dkls2" Apr 16 16:27:27.139166 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:27.139049 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6ee8f7af-206e-40ca-aaaa-63fd4fe505cf-podres\") pod \"perf-node-gather-daemonset-dkls2\" (UID: \"6ee8f7af-206e-40ca-aaaa-63fd4fe505cf\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-dkls2" Apr 16 16:27:27.139166 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:27.139091 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ee8f7af-206e-40ca-aaaa-63fd4fe505cf-lib-modules\") pod \"perf-node-gather-daemonset-dkls2\" (UID: \"6ee8f7af-206e-40ca-aaaa-63fd4fe505cf\") " 
pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-dkls2" Apr 16 16:27:27.139166 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:27.139123 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdrqn\" (UniqueName: \"kubernetes.io/projected/6ee8f7af-206e-40ca-aaaa-63fd4fe505cf-kube-api-access-mdrqn\") pod \"perf-node-gather-daemonset-dkls2\" (UID: \"6ee8f7af-206e-40ca-aaaa-63fd4fe505cf\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-dkls2" Apr 16 16:27:27.139166 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:27.139170 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6ee8f7af-206e-40ca-aaaa-63fd4fe505cf-proc\") pod \"perf-node-gather-daemonset-dkls2\" (UID: \"6ee8f7af-206e-40ca-aaaa-63fd4fe505cf\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-dkls2" Apr 16 16:27:27.139413 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:27.139221 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ee8f7af-206e-40ca-aaaa-63fd4fe505cf-sys\") pod \"perf-node-gather-daemonset-dkls2\" (UID: \"6ee8f7af-206e-40ca-aaaa-63fd4fe505cf\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-dkls2" Apr 16 16:27:27.139413 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:27.139256 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6ee8f7af-206e-40ca-aaaa-63fd4fe505cf-podres\") pod \"perf-node-gather-daemonset-dkls2\" (UID: \"6ee8f7af-206e-40ca-aaaa-63fd4fe505cf\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-dkls2" Apr 16 16:27:27.139413 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:27.139275 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6ee8f7af-206e-40ca-aaaa-63fd4fe505cf-proc\") 
pod \"perf-node-gather-daemonset-dkls2\" (UID: \"6ee8f7af-206e-40ca-aaaa-63fd4fe505cf\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-dkls2" Apr 16 16:27:27.139413 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:27.139290 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ee8f7af-206e-40ca-aaaa-63fd4fe505cf-lib-modules\") pod \"perf-node-gather-daemonset-dkls2\" (UID: \"6ee8f7af-206e-40ca-aaaa-63fd4fe505cf\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-dkls2" Apr 16 16:27:27.139413 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:27.139295 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6ee8f7af-206e-40ca-aaaa-63fd4fe505cf-sys\") pod \"perf-node-gather-daemonset-dkls2\" (UID: \"6ee8f7af-206e-40ca-aaaa-63fd4fe505cf\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-dkls2" Apr 16 16:27:27.147026 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:27.147008 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdrqn\" (UniqueName: \"kubernetes.io/projected/6ee8f7af-206e-40ca-aaaa-63fd4fe505cf-kube-api-access-mdrqn\") pod \"perf-node-gather-daemonset-dkls2\" (UID: \"6ee8f7af-206e-40ca-aaaa-63fd4fe505cf\") " pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-dkls2" Apr 16 16:27:27.294387 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:27.294349 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-dkls2" Apr 16 16:27:27.417518 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:27.417489 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4jnll/perf-node-gather-daemonset-dkls2"] Apr 16 16:27:27.419868 ip-10-0-129-182 kubenswrapper[2577]: W0416 16:27:27.419839 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6ee8f7af_206e_40ca_aaaa_63fd4fe505cf.slice/crio-7df45d8ab4cf7d2e6356c9558f195d8ba2337c46b6feb328f99848f2b219cce1 WatchSource:0}: Error finding container 7df45d8ab4cf7d2e6356c9558f195d8ba2337c46b6feb328f99848f2b219cce1: Status 404 returned error can't find the container with id 7df45d8ab4cf7d2e6356c9558f195d8ba2337c46b6feb328f99848f2b219cce1 Apr 16 16:27:27.521901 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:27.521881 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2fk94_98c6de58-d4f8-4f67-ab26-d582517a2717/dns/0.log" Apr 16 16:27:27.541817 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:27.541796 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2fk94_98c6de58-d4f8-4f67-ab26-d582517a2717/kube-rbac-proxy/0.log" Apr 16 16:27:27.652731 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:27.652702 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-jt7zj_6463e7fd-1eba-4228-8782-edee4a55c601/dns-node-resolver/0.log" Apr 16 16:27:27.866939 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:27.866901 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-dkls2" event={"ID":"6ee8f7af-206e-40ca-aaaa-63fd4fe505cf","Type":"ContainerStarted","Data":"e236dd4757ec45da65914dd93cda041f0ebd127aba80af72329232fe65970707"} Apr 16 16:27:27.866939 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:27.866943 2577 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-dkls2" event={"ID":"6ee8f7af-206e-40ca-aaaa-63fd4fe505cf","Type":"ContainerStarted","Data":"7df45d8ab4cf7d2e6356c9558f195d8ba2337c46b6feb328f99848f2b219cce1"} Apr 16 16:27:27.867177 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:27.866982 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-dkls2" Apr 16 16:27:27.882479 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:27.882439 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-dkls2" podStartSLOduration=1.882425183 podStartE2EDuration="1.882425183s" podCreationTimestamp="2026-04-16 16:27:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:27:27.881499755 +0000 UTC m=+1490.645124462" watchObservedRunningTime="2026-04-16 16:27:27.882425183 +0000 UTC m=+1490.646049893" Apr 16 16:27:28.153782 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:28.153708 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vjpkx_1b804e33-713f-4c6f-a4a1-0181d77254e1/node-ca/0.log" Apr 16 16:27:29.190094 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:29.190056 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-ckdqq_24ed6202-e738-4abc-b26d-eec84a76b75b/serve-healthcheck-canary/0.log" Apr 16 16:27:29.624273 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:29.624192 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-8n252_0d70c29e-9509-41d0-b800-db4a02a3db76/insights-operator/1.log" Apr 16 16:27:29.624432 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:29.624340 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-8n252_0d70c29e-9509-41d0-b800-db4a02a3db76/insights-operator/0.log" Apr 16 16:27:29.708046 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:29.708020 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pxhmv_f190f975-83f3-4c5e-977d-1e7688992f0e/kube-rbac-proxy/0.log" Apr 16 16:27:29.729039 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:29.729010 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pxhmv_f190f975-83f3-4c5e-977d-1e7688992f0e/exporter/0.log" Apr 16 16:27:29.750309 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:29.750278 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-pxhmv_f190f975-83f3-4c5e-977d-1e7688992f0e/extractor/0.log" Apr 16 16:27:31.738963 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:31.738930 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-tmmd5_5b50687c-6dca-4515-8303-4455d36b5c34/manager/0.log" Apr 16 16:27:31.852337 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:31.852307 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-9vcg7_3809161a-dca1-49bc-ae65-a6a7ebcb5221/s3-init/0.log" Apr 16 16:27:33.879206 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:33.879183 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-4jnll/perf-node-gather-daemonset-dkls2" Apr 16 16:27:35.629573 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:35.629538 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-wnvth_f1e35f44-03e0-4625-857c-c50176b17d74/migrator/0.log" Apr 16 16:27:35.650014 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:35.649986 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-wnvth_f1e35f44-03e0-4625-857c-c50176b17d74/graceful-termination/0.log" Apr 16 16:27:36.009976 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:36.009940 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-vfg5m_106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb/kube-storage-version-migrator-operator/1.log" Apr 16 16:27:36.010845 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:36.010825 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-vfg5m_106f6ad6-d8e6-4fc6-9a3a-e824bb4270cb/kube-storage-version-migrator-operator/0.log" Apr 16 16:27:37.011166 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:37.011116 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jftn9_6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf/kube-multus-additional-cni-plugins/0.log" Apr 16 16:27:37.032338 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:37.032315 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jftn9_6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf/egress-router-binary-copy/0.log" Apr 16 16:27:37.055945 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:37.055921 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jftn9_6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf/cni-plugins/0.log" Apr 16 16:27:37.076910 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:37.076845 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jftn9_6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf/bond-cni-plugin/0.log" Apr 16 16:27:37.097102 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:37.097078 2577 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jftn9_6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf/routeoverride-cni/0.log" Apr 16 16:27:37.117843 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:37.117816 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jftn9_6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf/whereabouts-cni-bincopy/0.log" Apr 16 16:27:37.136994 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:37.136971 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-jftn9_6fbc43fe-a6ce-49d9-81b1-17e5e07dfacf/whereabouts-cni/0.log" Apr 16 16:27:37.512682 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:37.512657 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-wfqhd_024bdd5b-3034-463f-aa2d-3e55d292bbd0/kube-multus/0.log" Apr 16 16:27:37.534529 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:37.534507 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2b9mp_64649692-472e-4f06-9640-7e6075d1e84f/network-metrics-daemon/0.log" Apr 16 16:27:37.555339 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:37.555313 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2b9mp_64649692-472e-4f06-9640-7e6075d1e84f/kube-rbac-proxy/0.log" Apr 16 16:27:37.890079 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:37.890054 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2894b_ff7fcfa4-9774-4831-b686-678a7f92a456/console-operator/2.log" Apr 16 16:27:37.892907 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:37.892884 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-2894b_ff7fcfa4-9774-4831-b686-678a7f92a456/console-operator/2.log" Apr 16 16:27:37.895773 ip-10-0-129-182 
kubenswrapper[2577]: I0416 16:27:37.895747 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m2jms_a2826faf-f579-4f51-8772-9882e98d4593/ovn-acl-logging/0.log" Apr 16 16:27:37.898658 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:37.898643 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m2jms_a2826faf-f579-4f51-8772-9882e98d4593/ovn-acl-logging/0.log" Apr 16 16:27:38.690282 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:38.690237 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m2jms_a2826faf-f579-4f51-8772-9882e98d4593/ovn-controller/0.log" Apr 16 16:27:38.712071 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:38.712041 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m2jms_a2826faf-f579-4f51-8772-9882e98d4593/ovn-acl-logging/0.log" Apr 16 16:27:38.717336 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:38.717311 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m2jms_a2826faf-f579-4f51-8772-9882e98d4593/ovn-acl-logging/1.log" Apr 16 16:27:38.736701 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:38.736673 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m2jms_a2826faf-f579-4f51-8772-9882e98d4593/kube-rbac-proxy-node/0.log" Apr 16 16:27:38.759007 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:38.758970 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m2jms_a2826faf-f579-4f51-8772-9882e98d4593/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 16:27:38.778335 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:38.778312 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m2jms_a2826faf-f579-4f51-8772-9882e98d4593/northd/0.log" Apr 16 16:27:38.801059 
ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:38.801036 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m2jms_a2826faf-f579-4f51-8772-9882e98d4593/nbdb/0.log" Apr 16 16:27:38.822264 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:38.822237 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m2jms_a2826faf-f579-4f51-8772-9882e98d4593/sbdb/0.log" Apr 16 16:27:38.921032 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:38.921008 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m2jms_a2826faf-f579-4f51-8772-9882e98d4593/ovnkube-controller/0.log" Apr 16 16:27:40.137651 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:40.137617 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-j96wv_2d37db46-a278-4fec-8cea-0900b1dfb12d/network-check-target-container/0.log" Apr 16 16:27:41.041580 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:41.041552 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-8bxsq_0f6a5a36-0127-4a26-a722-43d4a0b49496/iptables-alerter/0.log" Apr 16 16:27:41.688930 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:41.688899 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-4mstk_bb2a3f4f-a59a-4bd9-867e-c1fc03475427/tuned/0.log" Apr 16 16:27:43.508100 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:43.508074 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-667775844f-czs75_3520921b-a85a-4467-a9fe-2c2eb4308569/cluster-samples-operator/0.log" Apr 16 16:27:43.523643 ip-10-0-129-182 kubenswrapper[2577]: I0416 16:27:43.523611 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-667775844f-czs75_3520921b-a85a-4467-a9fe-2c2eb4308569/cluster-samples-operator-watch/0.log"