Apr 28 19:15:44.950205 ip-10-0-132-160 systemd[1]: Starting Kubernetes Kubelet...
Apr 28 19:15:45.373981 ip-10-0-132-160 kubenswrapper[2582]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:15:45.373981 ip-10-0-132-160 kubenswrapper[2582]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 28 19:15:45.373981 ip-10-0-132-160 kubenswrapper[2582]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:15:45.373981 ip-10-0-132-160 kubenswrapper[2582]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 28 19:15:45.373981 ip-10-0-132-160 kubenswrapper[2582]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 28 19:15:45.376227 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.376134 2582 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 28 19:15:45.379629 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379612 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:15:45.379629 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379629 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:15:45.379699 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379633 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:15:45.379699 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379637 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:15:45.379699 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379640 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:15:45.379699 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379643 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:15:45.379699 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379646 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:15:45.379699 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379649 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:15:45.379699 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379652 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:15:45.379699 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379659 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:15:45.379699 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379662 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:15:45.379699 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379665 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:15:45.379699 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379668 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:15:45.379699 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379671 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:15:45.379699 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379674 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:15:45.379699 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379676 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:15:45.379699 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379679 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:15:45.379699 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379682 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:15:45.379699 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379684 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:15:45.379699 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379687 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:15:45.379699 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379689 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:15:45.379699 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379692 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:15:45.380196 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379695 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:15:45.380196 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379697 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:15:45.380196 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379700 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:15:45.380196 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379703 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:15:45.380196 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379705 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:15:45.380196 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379710 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:15:45.380196 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379714 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:15:45.380196 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379716 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:15:45.380196 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379719 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:15:45.380196 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379722 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:15:45.380196 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379727 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:15:45.380196 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379729 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:15:45.380196 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379732 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:15:45.380196 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379735 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:15:45.380196 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379738 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:15:45.380196 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379741 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:15:45.380196 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379743 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:15:45.380196 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379746 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:15:45.380196 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379748 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:15:45.380821 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379751 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:15:45.380821 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379754 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:15:45.380821 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379756 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:15:45.380821 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379758 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:15:45.380821 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379761 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:15:45.380821 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379764 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:15:45.380821 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379766 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:15:45.380821 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379769 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:15:45.380821 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379771 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:15:45.380821 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379773 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:15:45.380821 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379776 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:15:45.380821 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379778 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:15:45.380821 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379781 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:15:45.380821 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379783 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:15:45.380821 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379787 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:15:45.380821 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379790 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:15:45.380821 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379792 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:15:45.380821 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379795 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:15:45.380821 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379797 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:15:45.380821 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379800 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:15:45.381437 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379802 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:15:45.381437 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379805 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:15:45.381437 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379811 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:15:45.381437 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379814 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:15:45.381437 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379817 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:15:45.381437 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379820 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:15:45.381437 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379823 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:15:45.381437 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379826 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:15:45.381437 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379829 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:15:45.381437 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379831 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:15:45.381437 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379834 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:15:45.381437 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379837 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:15:45.381437 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379840 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:15:45.381437 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379842 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:15:45.381437 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379847 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:15:45.381437 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379850 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:15:45.381437 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379853 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:15:45.381437 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379856 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:15:45.381437 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379858 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:15:45.381915 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379861 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:15:45.381915 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379863 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:15:45.381915 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379866 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:15:45.381915 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379869 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:15:45.381915 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379871 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:15:45.381915 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.379874 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:15:45.383883 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383866 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:15:45.383883 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383883 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:15:45.383962 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383888 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:15:45.383962 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383910 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:15:45.383962 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383916 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:15:45.383962 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383920 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:15:45.383962 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383923 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:15:45.383962 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383926 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:15:45.383962 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383929 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:15:45.383962 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383932 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:15:45.383962 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383935 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:15:45.383962 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383938 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:15:45.383962 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383941 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:15:45.383962 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383943 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:15:45.383962 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383947 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:15:45.383962 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383949 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:15:45.383962 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383952 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:15:45.383962 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383954 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:15:45.383962 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383957 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:15:45.383962 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383959 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:15:45.383962 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383962 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:15:45.384414 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383965 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:15:45.384414 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383967 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:15:45.384414 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383970 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:15:45.384414 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383973 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:15:45.384414 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383976 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:15:45.384414 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383980 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:15:45.384414 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383983 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:15:45.384414 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383985 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:15:45.384414 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383987 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:15:45.384414 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.383990 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:15:45.384414 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384014 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:15:45.384414 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384018 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:15:45.384414 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384021 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:15:45.384414 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384024 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:15:45.384414 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384027 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:15:45.384414 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384030 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:15:45.384414 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384032 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:15:45.384414 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384034 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:15:45.384414 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384037 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:15:45.384414 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384039 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:15:45.384923 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384042 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:15:45.384923 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384044 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:15:45.384923 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384046 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:15:45.384923 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384049 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:15:45.384923 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384051 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:15:45.384923 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384055 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:15:45.384923 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384057 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:15:45.384923 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384060 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:15:45.384923 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384062 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:15:45.384923 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384065 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:15:45.384923 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384068 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:15:45.384923 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384070 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:15:45.384923 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384073 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:15:45.384923 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384076 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:15:45.384923 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384078 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:15:45.384923 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384081 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:15:45.384923 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384084 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:15:45.384923 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384086 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:15:45.384923 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384089 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:15:45.384923 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384092 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:15:45.385408 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384095 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:15:45.385408 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384097 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:15:45.385408 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384100 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:15:45.385408 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384103 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:15:45.385408 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384105 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:15:45.385408 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384108 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:15:45.385408 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384110 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:15:45.385408 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384113 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:15:45.385408 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384115 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:15:45.385408 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384118 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:15:45.385408 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384122 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:15:45.385408 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384126 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:15:45.385408 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384128 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:15:45.385408 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384131 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:15:45.385408 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384135 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:15:45.385408 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384137 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:15:45.385408 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384141 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:15:45.385408 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384150 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:15:45.385408 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384153 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:15:45.385863 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384155 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:15:45.385863 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384158 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:15:45.385863 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384161 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:15:45.385863 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384163 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:15:45.385863 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384166 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:15:45.385863 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384168 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:15:45.385863 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384247 2582 flags.go:64] FLAG: --address="0.0.0.0"
Apr 28 19:15:45.385863 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384255 2582 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 28 19:15:45.385863 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384262 2582 flags.go:64] FLAG: --anonymous-auth="true"
Apr 28 19:15:45.385863 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384268 2582 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 28 19:15:45.385863 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384273 2582 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 28 19:15:45.385863 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384276 2582 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 28 19:15:45.385863 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384280 2582 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 28 19:15:45.385863 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384285 2582 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 28 19:15:45.385863 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384288 2582 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 28 19:15:45.385863 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384291 2582 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 28 19:15:45.385863 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384295 2582 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 28 19:15:45.385863 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384298 2582 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 28 19:15:45.385863 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384301 2582 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 28 19:15:45.385863 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384304 2582 flags.go:64] FLAG: --cgroup-root=""
Apr 28 19:15:45.385863 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384307 2582 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 28 19:15:45.385863 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384310 2582 flags.go:64] FLAG: --client-ca-file=""
Apr 28 19:15:45.385863 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384313 2582 flags.go:64] FLAG: --cloud-config=""
Apr 28 19:15:45.386450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384315 2582 flags.go:64] FLAG: --cloud-provider="external"
Apr 28 19:15:45.386450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384318 2582 flags.go:64] FLAG: --cluster-dns="[]"
Apr 28 19:15:45.386450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384323 2582 flags.go:64] FLAG: --cluster-domain=""
Apr 28 19:15:45.386450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384326 2582 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 28 19:15:45.386450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384329 2582 flags.go:64] FLAG: --config-dir=""
Apr 28 19:15:45.386450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384331 2582 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 28 19:15:45.386450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384335 2582 flags.go:64] FLAG: --container-log-max-files="5"
Apr 28 19:15:45.386450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384339 2582 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 28 19:15:45.386450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384348 2582 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 28 19:15:45.386450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384351 2582 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 28 19:15:45.386450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384354 2582 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 28 19:15:45.386450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384357 2582 flags.go:64] FLAG: --contention-profiling="false"
Apr 28 19:15:45.386450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384360 2582 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 28 19:15:45.386450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384363 2582 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 28 19:15:45.386450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384366 2582 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 28 19:15:45.386450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384369 2582 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 28 19:15:45.386450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384374 2582 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 28 19:15:45.386450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384377 2582 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 28 19:15:45.386450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384380 2582 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 28 19:15:45.386450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384382 2582 flags.go:64] FLAG: --enable-load-reader="false"
Apr 28 19:15:45.386450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384386 2582 flags.go:64] FLAG: --enable-server="true"
Apr 28 19:15:45.386450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384388 2582 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 28 19:15:45.386450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384393 2582 flags.go:64] FLAG: --event-burst="100"
Apr 28 19:15:45.386450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384397 2582 flags.go:64] FLAG: --event-qps="50"
Apr 28 19:15:45.386450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384399 2582 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 28 19:15:45.387099 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384402 2582 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 28 19:15:45.387099 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384405 2582 flags.go:64] FLAG: --eviction-hard=""
Apr 28 19:15:45.387099 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384409 2582 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 28 19:15:45.387099 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384412 2582 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 28 19:15:45.387099 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384415 2582 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 28 19:15:45.387099 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384418 2582 flags.go:64] FLAG: --eviction-soft=""
Apr 28 19:15:45.387099 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384421 2582 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 28 19:15:45.387099 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384424 2582 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 28 19:15:45.387099 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384427 2582 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 28 19:15:45.387099 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384430 2582 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 28 19:15:45.387099 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384433 2582 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 28 19:15:45.387099 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384436 2582 flags.go:64] FLAG: --fail-swap-on="true"
Apr 28 19:15:45.387099 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384439 2582 flags.go:64] FLAG: --feature-gates=""
Apr 28 19:15:45.387099 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384443 2582 flags.go:64] FLAG:
--file-check-frequency="20s" Apr 28 19:15:45.387099 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384446 2582 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 28 19:15:45.387099 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384449 2582 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 28 19:15:45.387099 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384458 2582 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 28 19:15:45.387099 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384461 2582 flags.go:64] FLAG: --healthz-port="10248" Apr 28 19:15:45.387099 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384465 2582 flags.go:64] FLAG: --help="false" Apr 28 19:15:45.387099 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384468 2582 flags.go:64] FLAG: --hostname-override="ip-10-0-132-160.ec2.internal" Apr 28 19:15:45.387099 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384471 2582 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 28 19:15:45.387099 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384474 2582 flags.go:64] FLAG: --http-check-frequency="20s" Apr 28 19:15:45.387099 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384477 2582 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 28 19:15:45.387648 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384481 2582 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 28 19:15:45.387648 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384484 2582 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 28 19:15:45.387648 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384487 2582 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 28 19:15:45.387648 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384489 2582 flags.go:64] FLAG: --image-service-endpoint="" Apr 28 19:15:45.387648 ip-10-0-132-160 
kubenswrapper[2582]: I0428 19:15:45.384492 2582 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 28 19:15:45.387648 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384495 2582 flags.go:64] FLAG: --kube-api-burst="100" Apr 28 19:15:45.387648 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384498 2582 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 28 19:15:45.387648 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384501 2582 flags.go:64] FLAG: --kube-api-qps="50" Apr 28 19:15:45.387648 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384504 2582 flags.go:64] FLAG: --kube-reserved="" Apr 28 19:15:45.387648 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384506 2582 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 28 19:15:45.387648 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384509 2582 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 28 19:15:45.387648 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384512 2582 flags.go:64] FLAG: --kubelet-cgroups="" Apr 28 19:15:45.387648 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384515 2582 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 28 19:15:45.387648 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384517 2582 flags.go:64] FLAG: --lock-file="" Apr 28 19:15:45.387648 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384521 2582 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 28 19:15:45.387648 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384523 2582 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 28 19:15:45.387648 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384526 2582 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 28 19:15:45.387648 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384532 2582 flags.go:64] FLAG: --log-json-split-stream="false" Apr 28 19:15:45.387648 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384535 2582 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 28 19:15:45.387648 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384537 2582 flags.go:64] FLAG: --log-text-split-stream="false" Apr 28 19:15:45.387648 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384540 2582 flags.go:64] FLAG: --logging-format="text" Apr 28 19:15:45.387648 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384543 2582 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 28 19:15:45.387648 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384546 2582 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 28 19:15:45.387648 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384550 2582 flags.go:64] FLAG: --manifest-url="" Apr 28 19:15:45.388231 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384553 2582 flags.go:64] FLAG: --manifest-url-header="" Apr 28 19:15:45.388231 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384557 2582 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 28 19:15:45.388231 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384566 2582 flags.go:64] FLAG: --max-open-files="1000000" Apr 28 19:15:45.388231 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384570 2582 flags.go:64] FLAG: --max-pods="110" Apr 28 19:15:45.388231 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384573 2582 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 28 19:15:45.388231 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384576 2582 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 28 19:15:45.388231 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384579 2582 flags.go:64] FLAG: --memory-manager-policy="None" Apr 28 19:15:45.388231 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384582 2582 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 28 19:15:45.388231 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384585 2582 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 28 19:15:45.388231 
ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384588 2582 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 28 19:15:45.388231 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384591 2582 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 28 19:15:45.388231 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384601 2582 flags.go:64] FLAG: --node-status-max-images="50" Apr 28 19:15:45.388231 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384604 2582 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 28 19:15:45.388231 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384607 2582 flags.go:64] FLAG: --oom-score-adj="-999" Apr 28 19:15:45.388231 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384610 2582 flags.go:64] FLAG: --pod-cidr="" Apr 28 19:15:45.388231 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384613 2582 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 28 19:15:45.388231 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384619 2582 flags.go:64] FLAG: --pod-manifest-path="" Apr 28 19:15:45.388231 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384621 2582 flags.go:64] FLAG: --pod-max-pids="-1" Apr 28 19:15:45.388231 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384624 2582 flags.go:64] FLAG: --pods-per-core="0" Apr 28 19:15:45.388231 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384627 2582 flags.go:64] FLAG: --port="10250" Apr 28 19:15:45.388231 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384630 2582 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 28 19:15:45.388231 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384633 2582 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0b3c31d14a8150618" Apr 28 19:15:45.388231 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384636 2582 flags.go:64] FLAG: --qos-reserved="" Apr 
28 19:15:45.388231 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384639 2582 flags.go:64] FLAG: --read-only-port="10255" Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384642 2582 flags.go:64] FLAG: --register-node="true" Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384644 2582 flags.go:64] FLAG: --register-schedulable="true" Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384647 2582 flags.go:64] FLAG: --register-with-taints="" Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384651 2582 flags.go:64] FLAG: --registry-burst="10" Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384654 2582 flags.go:64] FLAG: --registry-qps="5" Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384656 2582 flags.go:64] FLAG: --reserved-cpus="" Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384659 2582 flags.go:64] FLAG: --reserved-memory="" Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384663 2582 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384666 2582 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384669 2582 flags.go:64] FLAG: --rotate-certificates="false" Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384672 2582 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384680 2582 flags.go:64] FLAG: --runonce="false" Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384683 2582 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384686 2582 flags.go:64] FLAG: --runtime-request-timeout="2m0s" 
Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384689 2582 flags.go:64] FLAG: --seccomp-default="false" Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384692 2582 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384695 2582 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384698 2582 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384701 2582 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384704 2582 flags.go:64] FLAG: --storage-driver-password="root" Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384707 2582 flags.go:64] FLAG: --storage-driver-secure="false" Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384710 2582 flags.go:64] FLAG: --storage-driver-table="stats" Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384712 2582 flags.go:64] FLAG: --storage-driver-user="root" Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384715 2582 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384719 2582 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 28 19:15:45.388996 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384721 2582 flags.go:64] FLAG: --system-cgroups="" Apr 28 19:15:45.389623 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384724 2582 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 28 19:15:45.389623 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384730 2582 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 28 19:15:45.389623 ip-10-0-132-160 
kubenswrapper[2582]: I0428 19:15:45.384732 2582 flags.go:64] FLAG: --tls-cert-file="" Apr 28 19:15:45.389623 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384735 2582 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 28 19:15:45.389623 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384742 2582 flags.go:64] FLAG: --tls-min-version="" Apr 28 19:15:45.389623 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384744 2582 flags.go:64] FLAG: --tls-private-key-file="" Apr 28 19:15:45.389623 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384747 2582 flags.go:64] FLAG: --topology-manager-policy="none" Apr 28 19:15:45.389623 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384750 2582 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 28 19:15:45.389623 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384752 2582 flags.go:64] FLAG: --topology-manager-scope="container" Apr 28 19:15:45.389623 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384755 2582 flags.go:64] FLAG: --v="2" Apr 28 19:15:45.389623 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384760 2582 flags.go:64] FLAG: --version="false" Apr 28 19:15:45.389623 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384764 2582 flags.go:64] FLAG: --vmodule="" Apr 28 19:15:45.389623 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384768 2582 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 28 19:15:45.389623 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.384772 2582 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 28 19:15:45.389623 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384865 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 28 19:15:45.389623 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384869 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 28 19:15:45.389623 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384872 2582 feature_gate.go:328] unrecognized 
feature gate: PinnedImages Apr 28 19:15:45.389623 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384875 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 28 19:15:45.389623 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384883 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 28 19:15:45.389623 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384886 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 28 19:15:45.389623 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384889 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 28 19:15:45.389623 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384892 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 28 19:15:45.389623 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384909 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 28 19:15:45.390227 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384912 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 28 19:15:45.390227 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384915 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 28 19:15:45.390227 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384918 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 28 19:15:45.390227 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384921 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 28 19:15:45.390227 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384924 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 28 19:15:45.390227 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384928 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 28 19:15:45.390227 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384932 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 28 19:15:45.390227 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384934 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 28 19:15:45.390227 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384937 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 28 19:15:45.390227 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384940 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 28 19:15:45.390227 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384942 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 28 19:15:45.390227 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384946 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 28 19:15:45.390227 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384950 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 28 19:15:45.390227 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384953 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 28 19:15:45.390227 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384956 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 28 19:15:45.390227 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384959 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 28 19:15:45.390227 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384962 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 28 19:15:45.390227 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384964 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 28 19:15:45.390227 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384967 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 28 19:15:45.390724 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384969 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 28 19:15:45.390724 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384971 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 28 19:15:45.390724 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384974 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 28 19:15:45.390724 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384977 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 28 19:15:45.390724 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384980 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 28 19:15:45.390724 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384982 2582 feature_gate.go:328] 
unrecognized feature gate: RouteAdvertisements Apr 28 19:15:45.390724 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384984 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 28 19:15:45.390724 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384987 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 28 19:15:45.390724 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.384989 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 28 19:15:45.390724 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385000 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 28 19:15:45.390724 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385002 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 28 19:15:45.390724 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385008 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 28 19:15:45.390724 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385010 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 28 19:15:45.390724 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385014 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 28 19:15:45.390724 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385016 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 28 19:15:45.390724 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385019 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 28 19:15:45.390724 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385021 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 28 19:15:45.390724 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385024 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 28 
19:15:45.390724 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385026 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 28 19:15:45.390724 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385029 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 28 19:15:45.391265 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385031 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 28 19:15:45.391265 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385034 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 28 19:15:45.391265 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385036 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 28 19:15:45.391265 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385039 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 28 19:15:45.391265 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385041 2582 feature_gate.go:328] unrecognized feature gate: Example Apr 28 19:15:45.391265 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385044 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 28 19:15:45.391265 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385046 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 28 19:15:45.391265 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385049 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 28 19:15:45.391265 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385052 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 28 19:15:45.391265 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385055 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 28 19:15:45.391265 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385057 2582 feature_gate.go:328] unrecognized 
feature gate: VSphereMultiNetworks Apr 28 19:15:45.391265 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385059 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 28 19:15:45.391265 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385062 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 28 19:15:45.391265 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385064 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 28 19:15:45.391265 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385067 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 28 19:15:45.391265 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385070 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 28 19:15:45.391265 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385072 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 28 19:15:45.391265 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385075 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 28 19:15:45.391265 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385077 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 28 19:15:45.391265 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385080 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 28 19:15:45.391745 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385082 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 28 19:15:45.391745 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385085 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 28 19:15:45.391745 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385094 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 28 19:15:45.391745 ip-10-0-132-160 kubenswrapper[2582]: W0428 
19:15:45.385098 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 28 19:15:45.391745 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385101 2582 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 28 19:15:45.391745 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385104 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 28 19:15:45.391745 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385106 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 28 19:15:45.391745 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385109 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 28 19:15:45.391745 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385111 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 28 19:15:45.391745 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385114 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 28 19:15:45.391745 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385116 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 28 19:15:45.391745 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385119 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 28 19:15:45.391745 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385121 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 28 19:15:45.391745 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385124 2582 feature_gate.go:328] unrecognized feature gate: Example2 Apr 28 19:15:45.391745 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385127 2582 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 28 19:15:45.391745 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385129 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 28 19:15:45.391745 
ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385131 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:15:45.391745 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.385134 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:15:45.392205 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.385816 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 28 19:15:45.393201 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.393180 2582 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 28 19:15:45.393237 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.393203 2582 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 28 19:15:45.393267 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393261 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:15:45.393267 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393266 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:15:45.393322 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393270 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:15:45.393322 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393273 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:15:45.393322 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393276 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:15:45.393322 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393279 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:15:45.393322 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393282 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:15:45.393322 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393285 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:15:45.393322 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393287 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:15:45.393322 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393290 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:15:45.393322 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393293 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:15:45.393322 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393295 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:15:45.393322 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393298 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:15:45.393322 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393308 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:15:45.393322 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393311 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:15:45.393322 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393314 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:15:45.393322 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393316 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:15:45.393322 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393320 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:15:45.393322 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393322 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:15:45.393322 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393325 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:15:45.393322 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393327 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:15:45.393322 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393330 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:15:45.393818 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393333 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:15:45.393818 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393336 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:15:45.393818 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393338 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:15:45.393818 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393341 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:15:45.393818 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393343 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:15:45.393818 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393345 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:15:45.393818 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393348 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:15:45.393818 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393350 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:15:45.393818 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393353 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:15:45.393818 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393355 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:15:45.393818 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393358 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:15:45.393818 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393360 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:15:45.393818 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393362 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:15:45.393818 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393365 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:15:45.393818 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393368 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:15:45.393818 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393370 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:15:45.393818 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393372 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:15:45.393818 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393375 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:15:45.393818 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393377 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:15:45.393818 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393380 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:15:45.394340 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393382 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:15:45.394340 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393385 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:15:45.394340 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393387 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:15:45.394340 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393390 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:15:45.394340 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393400 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:15:45.394340 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393403 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:15:45.394340 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393405 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:15:45.394340 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393407 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:15:45.394340 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393410 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:15:45.394340 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393412 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:15:45.394340 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393415 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:15:45.394340 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393417 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:15:45.394340 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393419 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:15:45.394340 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393423 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:15:45.394340 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393425 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:15:45.394340 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393428 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:15:45.394340 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393431 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:15:45.394340 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393433 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:15:45.394340 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393436 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:15:45.394340 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393438 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:15:45.394866 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393441 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:15:45.394866 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393443 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:15:45.394866 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393446 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:15:45.394866 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393449 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:15:45.394866 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393451 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:15:45.394866 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393453 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:15:45.394866 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393456 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:15:45.394866 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393459 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:15:45.394866 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393463 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:15:45.394866 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393467 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:15:45.394866 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393470 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:15:45.394866 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393473 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:15:45.394866 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393476 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:15:45.394866 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393478 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:15:45.394866 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393481 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:15:45.394866 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393484 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:15:45.394866 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393488 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:15:45.394866 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393497 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:15:45.394866 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393500 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:15:45.395368 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393503 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:15:45.395368 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393506 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:15:45.395368 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393508 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:15:45.395368 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393511 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:15:45.395368 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393514 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:15:45.395368 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.393519 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 28 19:15:45.395368 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393623 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 28 19:15:45.395368 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393628 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 28 19:15:45.395368 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393632 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 28 19:15:45.395368 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393634 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 28 19:15:45.395368 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393637 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 28 19:15:45.395368 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393640 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 28 19:15:45.395368 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393642 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 28 19:15:45.395368 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393644 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 28 19:15:45.395368 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393647 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 28 19:15:45.395735 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393649 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 28 19:15:45.395735 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393652 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 28 19:15:45.395735 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393654 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 28 19:15:45.395735 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393657 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 28 19:15:45.395735 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393659 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 28 19:15:45.395735 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393662 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 28 19:15:45.395735 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393664 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 28 19:15:45.395735 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393667 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 28 19:15:45.395735 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393669 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 28 19:15:45.395735 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393671 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 28 19:15:45.395735 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393674 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 28 19:15:45.395735 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393677 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 28 19:15:45.395735 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393679 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 28 19:15:45.395735 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393681 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 28 19:15:45.395735 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393684 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 28 19:15:45.395735 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393687 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 28 19:15:45.395735 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393691 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 28 19:15:45.395735 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393693 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 28 19:15:45.395735 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393696 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 28 19:15:45.395735 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393698 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 28 19:15:45.396240 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393701 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 28 19:15:45.396240 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393703 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 28 19:15:45.396240 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393706 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 28 19:15:45.396240 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393709 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 28 19:15:45.396240 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393711 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 28 19:15:45.396240 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393714 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 28 19:15:45.396240 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393717 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 28 19:15:45.396240 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393719 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 28 19:15:45.396240 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393722 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 28 19:15:45.396240 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393724 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 28 19:15:45.396240 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393728 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 28 19:15:45.396240 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393730 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 28 19:15:45.396240 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393733 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 28 19:15:45.396240 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393736 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 28 19:15:45.396240 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393740 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 28 19:15:45.396240 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393743 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 28 19:15:45.396240 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393745 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 28 19:15:45.396240 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393748 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 28 19:15:45.396240 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393750 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 28 19:15:45.396700 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393752 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 28 19:15:45.396700 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393755 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 28 19:15:45.396700 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393758 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 28 19:15:45.396700 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393760 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 28 19:15:45.396700 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393763 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 28 19:15:45.396700 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393767 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 28 19:15:45.396700 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393770 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 28 19:15:45.396700 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393773 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 28 19:15:45.396700 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393776 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 28 19:15:45.396700 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393779 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 28 19:15:45.396700 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393783 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 28 19:15:45.396700 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393785 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 28 19:15:45.396700 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393788 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 28 19:15:45.396700 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393791 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 28 19:15:45.396700 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393793 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 28 19:15:45.396700 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393796 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 28 19:15:45.396700 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393799 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 28 19:15:45.396700 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393801 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 28 19:15:45.396700 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393804 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 28 19:15:45.396700 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393806 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 28 19:15:45.397203 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393809 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 28 19:15:45.397203 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393811 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 28 19:15:45.397203 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393814 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 28 19:15:45.397203 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393816 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 28 19:15:45.397203 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393819 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 28 19:15:45.397203 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393822 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 28 19:15:45.397203 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393824 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 28 19:15:45.397203 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393827 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 28 19:15:45.397203 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393830 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 28 19:15:45.397203 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393832 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 28 19:15:45.397203 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393835 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 28 19:15:45.397203 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393837 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 28 19:15:45.397203 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393839 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 28 19:15:45.397203 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393842 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 28 19:15:45.397203 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393844 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 28 19:15:45.397203 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393847 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 28 19:15:45.397203 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393849 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 28 19:15:45.397203 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:45.393852 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 28 19:15:45.397644 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.393857 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 28 19:15:45.397644 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.394606 2582 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 28 19:15:45.397644 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.397219 2582 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 28 19:15:45.398099 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.398087 2582 server.go:1019] "Starting client certificate rotation"
Apr 28 19:15:45.398196 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.398181 2582 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 28 19:15:45.398224 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.398214 2582 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 28 19:15:45.421863 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.421842 2582 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 28 19:15:45.427631 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.427593 2582 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 28 19:15:45.443816 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.443664 2582 log.go:25] "Validated CRI v1 runtime API"
Apr 28 19:15:45.449738 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.449718 2582 log.go:25] "Validated CRI v1 image API"
Apr 28 19:15:45.450983 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.450952 2582 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 28 19:15:45.453462 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.453438 2582 fs.go:135] Filesystem UUIDs: map[53c8fdab-8c73-4993-882a-5b5ff8463951:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 8fbb4217-7cbb-4a68-a8a5-cab94a1e16cd:/dev/nvme0n1p3]
Apr 28 19:15:45.453537 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.453461 2582 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 28 19:15:45.456087 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.456065 2582 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 28 19:15:45.459710 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.459593 2582 manager.go:217] Machine: {Timestamp:2026-04-28 19:15:45.457708289 +0000 UTC m=+0.392753430 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101899 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec24739a7c6ba62af6647933f8b4a2f3 SystemUUID:ec24739a-7c6b-a62a-f664-7933f8b4a2f3 BootID:bdb8ad71-513c-4190-ae0b-988f54577f3b Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:f7:44:9d:9f:6f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:f7:44:9d:9f:6f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ba:4e:e9:cc:fd:0b Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 28 19:15:45.459710 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.459706 2582 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 28 19:15:45.459803 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.459795 2582 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 28 19:15:45.462740 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.462710 2582 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 28 19:15:45.462909 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.462743 2582 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-160.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 28 19:15:45.462953 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.462925 2582 topology_manager.go:138] "Creating topology manager with none policy" Apr 28 19:15:45.462953 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.462935 2582 container_manager_linux.go:306] "Creating device plugin manager" Apr 28 19:15:45.462953 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.462949 2582 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 28 19:15:45.463651 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.463641 2582 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 28 19:15:45.464467 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.464457 2582 state_mem.go:36] "Initialized new in-memory state store" Apr 28 19:15:45.464578 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.464569 2582 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 28 19:15:45.467322 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.467312 2582 kubelet.go:491] "Attempting to sync node with API server" Apr 28 19:15:45.467363 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.467327 2582 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 28 19:15:45.467363 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.467339 2582 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 28 19:15:45.467363 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.467350 2582 kubelet.go:397] "Adding apiserver pod source" Apr 28 19:15:45.467363 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.467358 2582 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 28 19:15:45.468686 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.468672 2582 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 28 19:15:45.468760 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.468690 2582 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 28 19:15:45.472060 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.472045 2582 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 28 19:15:45.473359 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.473345 2582 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 28 19:15:45.475153 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.475139 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 28 19:15:45.475153 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.475157 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 28 19:15:45.475254 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.475163 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 28 19:15:45.475254 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.475169 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 28 19:15:45.475254 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.475174 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 28 19:15:45.475254 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.475180 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 28 19:15:45.475254 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.475186 2582 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 28 19:15:45.475254 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.475192 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 28 19:15:45.475254 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.475199 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 28 19:15:45.475254 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.475204 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 28 19:15:45.475254 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.475220 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 28 19:15:45.475254 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.475229 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 28 19:15:45.476275 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.476263 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 28 19:15:45.476313 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.476277 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 28 19:15:45.479824 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.479811 2582 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 28 19:15:45.479892 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.479848 2582 server.go:1295] "Started kubelet" Apr 28 19:15:45.480312 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.480045 2582 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 28 19:15:45.480553 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.480489 2582 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 28 19:15:45.480633 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.480574 2582 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 28 19:15:45.480717 ip-10-0-132-160 systemd[1]: Started Kubernetes Kubelet. 
Apr 28 19:15:45.481749 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.481727 2582 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 28 19:15:45.486845 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.486535 2582 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-160.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 28 19:15:45.487205 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:45.487176 2582 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-160.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 28 19:15:45.487296 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:45.487200 2582 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 28 19:15:45.487526 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.487509 2582 server.go:317] "Adding debug handlers to kubelet server" Apr 28 19:15:45.490553 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.490529 2582 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 28 19:15:45.490651 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.490579 2582 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 28 19:15:45.491311 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.491285 2582 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 28 19:15:45.491311 ip-10-0-132-160 
kubenswrapper[2582]: I0428 19:15:45.491320 2582 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 28 19:15:45.491522 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.491433 2582 factory.go:55] Registering systemd factory Apr 28 19:15:45.491522 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.491449 2582 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 28 19:15:45.491522 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.491465 2582 factory.go:223] Registration of the systemd container factory successfully Apr 28 19:15:45.491522 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.491503 2582 reconstruct.go:97] "Volume reconstruction finished" Apr 28 19:15:45.491522 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.491512 2582 reconciler.go:26] "Reconciler: start to sync state" Apr 28 19:15:45.491733 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:45.491539 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-160.ec2.internal\" not found" Apr 28 19:15:45.491877 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.491860 2582 factory.go:153] Registering CRI-O factory Apr 28 19:15:45.491877 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.491876 2582 factory.go:223] Registration of the crio container factory successfully Apr 28 19:15:45.492026 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.491947 2582 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 28 19:15:45.492026 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.491972 2582 factory.go:103] Registering Raw factory Apr 28 19:15:45.492026 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.491987 2582 manager.go:1196] Started watching for new ooms in manager Apr 28 19:15:45.492324 ip-10-0-132-160 
kubenswrapper[2582]: I0428 19:15:45.492314 2582 manager.go:319] Starting recovery of all containers Apr 28 19:15:45.492356 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:45.491297 2582 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-160.ec2.internal.18aa9b4a17c6119a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-160.ec2.internal,UID:ip-10-0-132-160.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-160.ec2.internal,},FirstTimestamp:2026-04-28 19:15:45.47982377 +0000 UTC m=+0.414868922,LastTimestamp:2026-04-28 19:15:45.47982377 +0000 UTC m=+0.414868922,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-160.ec2.internal,}" Apr 28 19:15:45.492968 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:45.492944 2582 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 28 19:15:45.498299 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:45.498248 2582 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-132-160.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 28 19:15:45.498431 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:45.498306 2582 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 28 19:15:45.503507 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.503487 2582 manager.go:324] Recovery completed Apr 28 19:15:45.507980 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.507960 2582 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qh7mn" Apr 28 19:15:45.508114 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.508102 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:15:45.510397 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.510383 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:15:45.510456 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.510418 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:15:45.510456 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.510430 2582 kubelet_node_status.go:736] 
"Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:15:45.511101 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.511046 2582 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 28 19:15:45.511101 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.511059 2582 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 28 19:15:45.511101 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.511079 2582 state_mem.go:36] "Initialized new in-memory state store" Apr 28 19:15:45.513166 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:45.513098 2582 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-160.ec2.internal.18aa9b4a19989c8c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-160.ec2.internal,UID:ip-10-0-132-160.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-132-160.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-132-160.ec2.internal,},FirstTimestamp:2026-04-28 19:15:45.510399116 +0000 UTC m=+0.445444256,LastTimestamp:2026-04-28 19:15:45.510399116 +0000 UTC m=+0.445444256,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-160.ec2.internal,}" Apr 28 19:15:45.514192 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.514177 2582 policy_none.go:49] "None policy: Start" Apr 28 19:15:45.514258 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.514197 2582 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 28 19:15:45.514258 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.514212 2582 state_mem.go:35] "Initializing new 
in-memory state store" Apr 28 19:15:45.520965 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.520946 2582 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-qh7mn" Apr 28 19:15:45.523245 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:45.523179 2582 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-160.ec2.internal.18aa9b4a1998fb10 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-160.ec2.internal,UID:ip-10-0-132-160.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-132-160.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-132-160.ec2.internal,},FirstTimestamp:2026-04-28 19:15:45.510423312 +0000 UTC m=+0.445468452,LastTimestamp:2026-04-28 19:15:45.510423312 +0000 UTC m=+0.445468452,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-160.ec2.internal,}" Apr 28 19:15:45.549513 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.549494 2582 manager.go:341] "Starting Device Plugin manager" Apr 28 19:15:45.575438 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:45.549534 2582 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 28 19:15:45.575438 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.549545 2582 server.go:85] "Starting device plugin registration server" Apr 28 19:15:45.575438 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.549820 2582 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 28 19:15:45.575438 ip-10-0-132-160 kubenswrapper[2582]: 
I0428 19:15:45.549833 2582 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 28 19:15:45.575438 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.549960 2582 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 28 19:15:45.575438 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.550049 2582 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 28 19:15:45.575438 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.550058 2582 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 28 19:15:45.575438 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:45.550800 2582 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 28 19:15:45.575438 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:45.550833 2582 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-160.ec2.internal\" not found" Apr 28 19:15:45.630681 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.630578 2582 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 28 19:15:45.632025 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.632009 2582 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 28 19:15:45.632130 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.632038 2582 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 28 19:15:45.632130 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.632069 2582 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 28 19:15:45.632130 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.632079 2582 kubelet.go:2451] "Starting kubelet main sync loop" Apr 28 19:15:45.632254 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:45.632171 2582 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 28 19:15:45.635232 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.635210 2582 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:15:45.650511 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.650487 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:15:45.651772 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.651754 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:15:45.651881 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.651787 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:15:45.651881 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.651801 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:15:45.651881 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.651824 2582 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-160.ec2.internal" Apr 28 19:15:45.662372 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.662356 2582 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-160.ec2.internal" Apr 28 19:15:45.662448 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:45.662381 2582 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-160.ec2.internal\": node \"ip-10-0-132-160.ec2.internal\" not found" Apr 28 
19:15:45.681605 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:45.681582 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-160.ec2.internal\" not found" Apr 28 19:15:45.733165 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.733096 2582 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-160.ec2.internal"] Apr 28 19:15:45.733266 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.733220 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:15:45.734692 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.734676 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:15:45.734774 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.734705 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:15:45.734774 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.734715 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:15:45.736871 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.736858 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:15:45.737023 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.737009 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal" Apr 28 19:15:45.737061 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.737039 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:15:45.737703 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.737683 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:15:45.737811 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.737718 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:15:45.737811 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.737731 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:15:45.737811 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.737689 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:15:45.737811 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.737783 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:15:45.737811 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.737793 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:15:45.740154 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.740136 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-160.ec2.internal" Apr 28 19:15:45.740234 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.740161 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 28 19:15:45.740797 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.740783 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasSufficientMemory" Apr 28 19:15:45.740861 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.740805 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasNoDiskPressure" Apr 28 19:15:45.740861 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.740819 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeHasSufficientPID" Apr 28 19:15:45.756529 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:45.756499 2582 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-160.ec2.internal\" not found" node="ip-10-0-132-160.ec2.internal" Apr 28 19:15:45.760815 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:45.760798 2582 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-160.ec2.internal\" not found" node="ip-10-0-132-160.ec2.internal" Apr 28 19:15:45.781764 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:45.781742 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-160.ec2.internal\" not found" Apr 28 19:15:45.793005 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.792982 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a0393c54dc21367a93a647d0297c0f90-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal\" (UID: \"a0393c54dc21367a93a647d0297c0f90\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal" Apr 28 19:15:45.793097 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.793010 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a0393c54dc21367a93a647d0297c0f90-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal\" (UID: \"a0393c54dc21367a93a647d0297c0f90\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal" Apr 28 19:15:45.793097 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.793028 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/206ff994154571336dcc99880b36f4f2-config\") pod \"kube-apiserver-proxy-ip-10-0-132-160.ec2.internal\" (UID: \"206ff994154571336dcc99880b36f4f2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-160.ec2.internal" Apr 28 19:15:45.882123 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:45.882028 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-160.ec2.internal\" not found" Apr 28 19:15:45.893729 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.893705 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a0393c54dc21367a93a647d0297c0f90-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal\" (UID: \"a0393c54dc21367a93a647d0297c0f90\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal" Apr 28 19:15:45.893877 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.893746 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/a0393c54dc21367a93a647d0297c0f90-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal\" (UID: \"a0393c54dc21367a93a647d0297c0f90\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal" Apr 28 19:15:45.893877 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.893774 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/206ff994154571336dcc99880b36f4f2-config\") pod \"kube-apiserver-proxy-ip-10-0-132-160.ec2.internal\" (UID: \"206ff994154571336dcc99880b36f4f2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-160.ec2.internal" Apr 28 19:15:45.893877 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.893819 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/206ff994154571336dcc99880b36f4f2-config\") pod \"kube-apiserver-proxy-ip-10-0-132-160.ec2.internal\" (UID: \"206ff994154571336dcc99880b36f4f2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-160.ec2.internal" Apr 28 19:15:45.893877 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.893846 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a0393c54dc21367a93a647d0297c0f90-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal\" (UID: \"a0393c54dc21367a93a647d0297c0f90\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal" Apr 28 19:15:45.893877 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:45.893823 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a0393c54dc21367a93a647d0297c0f90-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal\" (UID: \"a0393c54dc21367a93a647d0297c0f90\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal" Apr 28 19:15:45.982844 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:45.982809 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-160.ec2.internal\" not found" Apr 28 19:15:46.058346 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.058292 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal" Apr 28 19:15:46.062842 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.062824 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-160.ec2.internal" Apr 28 19:15:46.083281 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:46.083251 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-160.ec2.internal\" not found" Apr 28 19:15:46.183960 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:46.183852 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-160.ec2.internal\" not found" Apr 28 19:15:46.284445 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:46.284407 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-160.ec2.internal\" not found" Apr 28 19:15:46.384976 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:46.384943 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-160.ec2.internal\" not found" Apr 28 19:15:46.397113 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.397087 2582 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:15:46.397966 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.397944 2582 transport.go:147] "Certificate rotation detected, shutting down client connections 
to start using new credentials" Apr 28 19:15:46.398128 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.398107 2582 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 28 19:15:46.468367 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.468288 2582 apiserver.go:52] "Watching apiserver" Apr 28 19:15:46.488019 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.487994 2582 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 28 19:15:46.490402 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.490380 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-fmthr","openshift-cluster-node-tuning-operator/tuned-8l89d","openshift-image-registry/node-ca-qtt7w","openshift-multus/multus-89vjp","openshift-multus/multus-additional-cni-plugins-qtgvt","openshift-network-operator/iptables-alerter-nvcxv","openshift-ovn-kubernetes/ovnkube-node-6xgsz","kube-system/konnectivity-agent-h8xww","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2","openshift-dns/node-resolver-dn855","openshift-multus/network-metrics-daemon-q2wj9"] Apr 28 19:15:46.491366 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.491328 2582 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 28 19:15:46.491474 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.491337 2582 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal" Apr 28 19:15:46.492859 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.492844 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fmthr" Apr 28 19:15:46.492946 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:46.492928 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fmthr" podUID="789c39a9-aea8-4abd-b196-f303f6c6f063" Apr 28 19:15:46.497045 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.497024 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc82k\" (UniqueName: \"kubernetes.io/projected/789c39a9-aea8-4abd-b196-f303f6c6f063-kube-api-access-cc82k\") pod \"network-check-target-fmthr\" (UID: \"789c39a9-aea8-4abd-b196-f303f6c6f063\") " pod="openshift-network-diagnostics/network-check-target-fmthr" Apr 28 19:15:46.497374 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.497359 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8l89d" Apr 28 19:15:46.499420 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.499390 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-qtt7w" Apr 28 19:15:46.500131 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.500112 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-z7lvq\"" Apr 28 19:15:46.500255 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.500234 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 28 19:15:46.500320 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.500234 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:15:46.501685 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.501672 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-89vjp" Apr 28 19:15:46.501918 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.501845 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qtgvt" Apr 28 19:15:46.502400 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.502384 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 28 19:15:46.502491 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.502398 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-kbrhp\"" Apr 28 19:15:46.502491 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.502386 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 28 19:15:46.502491 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.502385 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 28 19:15:46.504320 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.504301 2582 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 28 19:15:46.504428 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.504409 2582 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-160.ec2.internal" Apr 28 19:15:46.506874 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.505560 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 28 19:15:46.506874 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.505650 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 28 19:15:46.506874 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.505978 2582 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"default-dockercfg-n7tb2\"" Apr 28 19:15:46.506874 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.506015 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-9bdgz\"" Apr 28 19:15:46.506874 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.506245 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 28 19:15:46.506874 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.506518 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 28 19:15:46.506874 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.506684 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 28 19:15:46.506874 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.506744 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 28 19:15:46.507542 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.507522 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-h8xww" Apr 28 19:15:46.507706 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.507685 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:15:46.510142 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.510120 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2" Apr 28 19:15:46.510582 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.510558 2582 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 28 19:15:46.510582 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.510569 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 28 19:15:46.510721 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.510649 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 28 19:15:46.511804 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.511790 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 28 19:15:46.511868 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.511838 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 28 19:15:46.512004 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.511989 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-swf9m\"" Apr 28 19:15:46.512073 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.512011 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 28 19:15:46.512073 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.512031 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 28 19:15:46.512073 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.511993 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 28 19:15:46.512208 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.512078 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-97snv\"" Apr 28 19:15:46.512208 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.512164 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 28 19:15:46.512328 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.512314 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-dn855" Apr 28 19:15:46.512979 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.512964 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 28 19:15:46.513147 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.513132 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-bf2sf\"" Apr 28 19:15:46.514586 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.514570 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q2wj9" Apr 28 19:15:46.514674 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:46.514645 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q2wj9" podUID="ae2a816c-4f04-45d7-bb27-80786c738721" Apr 28 19:15:46.516721 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.516701 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal"] Apr 28 19:15:46.516820 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.516807 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-nvcxv" Apr 28 19:15:46.517047 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.517028 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 28 19:15:46.517131 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.517046 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 28 19:15:46.517575 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.517454 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-132-160.ec2.internal"] Apr 28 19:15:46.517917 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.517797 2582 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 28 19:15:46.518758 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.518741 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 28 19:15:46.518976 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.518957 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 28 19:15:46.519107 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.519092 2582 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-w6sdf\"" Apr 28 19:15:46.521025 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.521003 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 28 19:15:46.522617 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.522467 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:15:46.522709 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.522475 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 28 19:15:46.522709 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.522581 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-hvftn\"" Apr 28 19:15:46.522807 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.522747 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-27 19:10:45 +0000 UTC" deadline="2027-10-01 07:32:42.34632574 +0000 UTC" Apr 28 19:15:46.522807 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.522771 2582 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12492h16m55.823557717s" Apr 28 19:15:46.538526 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.538503 2582 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-6m9d7" Apr 28 19:15:46.548948 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.548922 2582 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-6m9d7" Apr 28 19:15:46.560327 ip-10-0-132-160 kubenswrapper[2582]: 
I0428 19:15:46.560298 2582 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 28 19:15:46.592918 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.592879 2582 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 28 19:15:46.598148 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.598123 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-etc-sysconfig\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d" Apr 28 19:15:46.598282 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.598170 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-lib-modules\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d" Apr 28 19:15:46.598282 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.598190 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-host-run-netns\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp" Apr 28 19:15:46.598282 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.598206 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d2c7c82c-bdc2-456a-b466-42dee787562e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qtgvt\" (UID: \"d2c7c82c-bdc2-456a-b466-42dee787562e\") " 
pod="openshift-multus/multus-additional-cni-plugins-qtgvt" Apr 28 19:15:46.598282 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.598235 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r547k\" (UniqueName: \"kubernetes.io/projected/d2c7c82c-bdc2-456a-b466-42dee787562e-kube-api-access-r547k\") pod \"multus-additional-cni-plugins-qtgvt\" (UID: \"d2c7c82c-bdc2-456a-b466-42dee787562e\") " pod="openshift-multus/multus-additional-cni-plugins-qtgvt" Apr 28 19:15:46.598282 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.598259 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d2c7c82c-bdc2-456a-b466-42dee787562e-system-cni-dir\") pod \"multus-additional-cni-plugins-qtgvt\" (UID: \"d2c7c82c-bdc2-456a-b466-42dee787562e\") " pod="openshift-multus/multus-additional-cni-plugins-qtgvt" Apr 28 19:15:46.598282 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.598282 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwbrx\" (UniqueName: \"kubernetes.io/projected/5a509ee1-53b3-4bd7-822e-06cb6363beff-kube-api-access-rwbrx\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d" Apr 28 19:15:46.598520 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.598321 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-etc-modprobe-d\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d" Apr 28 19:15:46.598520 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.598341 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-sys\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d" Apr 28 19:15:46.598682 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.598399 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-os-release\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp" Apr 28 19:15:46.598838 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.598810 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2h42\" (UniqueName: \"kubernetes.io/projected/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-kube-api-access-h2h42\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp" Apr 28 19:15:46.599305 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.598883 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-var-lib-openvswitch\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:15:46.599305 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.598979 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1c539cc3-1090-4486-ab6c-9d184f87803d-serviceca\") pod \"node-ca-qtt7w\" (UID: \"1c539cc3-1090-4486-ab6c-9d184f87803d\") " pod="openshift-image-registry/node-ca-qtt7w" Apr 28 19:15:46.599305 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.599046 2582 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktg5w\" (UniqueName: \"kubernetes.io/projected/51106027-8f90-4285-9257-0da036866696-kube-api-access-ktg5w\") pod \"node-resolver-dn855\" (UID: \"51106027-8f90-4285-9257-0da036866696\") " pod="openshift-dns/node-resolver-dn855" Apr 28 19:15:46.599305 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.599101 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d2c7c82c-bdc2-456a-b466-42dee787562e-os-release\") pod \"multus-additional-cni-plugins-qtgvt\" (UID: \"d2c7c82c-bdc2-456a-b466-42dee787562e\") " pod="openshift-multus/multus-additional-cni-plugins-qtgvt" Apr 28 19:15:46.599305 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.599130 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d2c7c82c-bdc2-456a-b466-42dee787562e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qtgvt\" (UID: \"d2c7c82c-bdc2-456a-b466-42dee787562e\") " pod="openshift-multus/multus-additional-cni-plugins-qtgvt" Apr 28 19:15:46.599305 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.599151 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f830cd46-083b-489f-b13d-5a749b919ab7-env-overrides\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:15:46.599305 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.599179 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f830cd46-083b-489f-b13d-5a749b919ab7-ovnkube-script-lib\") pod \"ovnkube-node-6xgsz\" (UID: 
\"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:15:46.599305 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.599218 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-cni-binary-copy\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp" Apr 28 19:15:46.599305 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.599262 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d2c7c82c-bdc2-456a-b466-42dee787562e-cni-binary-copy\") pod \"multus-additional-cni-plugins-qtgvt\" (UID: \"d2c7c82c-bdc2-456a-b466-42dee787562e\") " pod="openshift-multus/multus-additional-cni-plugins-qtgvt" Apr 28 19:15:46.599305 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.599290 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-host-slash\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:15:46.599744 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.599348 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-host-cni-netd\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:15:46.599744 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.599409 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/daa35919-aaa2-4021-afc5-aeb72485f1ea-registration-dir\") pod \"aws-ebs-csi-driver-node-kqcc2\" (UID: \"daa35919-aaa2-4021-afc5-aeb72485f1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2" Apr 28 19:15:46.599744 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.599450 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-etc-kubernetes\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d" Apr 28 19:15:46.599744 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.599509 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-etc-systemd\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d" Apr 28 19:15:46.599744 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.599561 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-multus-cni-dir\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp" Apr 28 19:15:46.599744 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.599602 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/daa35919-aaa2-4021-afc5-aeb72485f1ea-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kqcc2\" (UID: \"daa35919-aaa2-4021-afc5-aeb72485f1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2" Apr 28 19:15:46.599744 ip-10-0-132-160 kubenswrapper[2582]: I0428 
19:15:46.599640 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5a509ee1-53b3-4bd7-822e-06cb6363beff-etc-tuned\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d" Apr 28 19:15:46.599744 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.599682 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/51106027-8f90-4285-9257-0da036866696-hosts-file\") pod \"node-resolver-dn855\" (UID: \"51106027-8f90-4285-9257-0da036866696\") " pod="openshift-dns/node-resolver-dn855" Apr 28 19:15:46.599744 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.599724 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-host-run-k8s-cni-cncf-io\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp" Apr 28 19:15:46.600136 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.599757 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-hostroot\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp" Apr 28 19:15:46.600136 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.599801 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-host-run-multus-certs\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp" Apr 28 
19:15:46.600136 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.599824 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-log-socket\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.600136 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.599856 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/daa35919-aaa2-4021-afc5-aeb72485f1ea-sys-fs\") pod \"aws-ebs-csi-driver-node-kqcc2\" (UID: \"daa35919-aaa2-4021-afc5-aeb72485f1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2"
Apr 28 19:15:46.600136 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.599885 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-var-lib-kubelet\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d"
Apr 28 19:15:46.600136 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.600111 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-host-var-lib-kubelet\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.600391 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.600142 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-etc-kubernetes\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.600391 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.600166 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-systemd-units\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.600391 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.600212 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-etc-openvswitch\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.600391 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.600260 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-etc-sysctl-conf\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d"
Apr 28 19:15:46.600391 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.600310 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.600603 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.600402 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjxx9\" (UniqueName: \"kubernetes.io/projected/f830cd46-083b-489f-b13d-5a749b919ab7-kube-api-access-kjxx9\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.600603 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.600464 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/51106027-8f90-4285-9257-0da036866696-tmp-dir\") pod \"node-resolver-dn855\" (UID: \"51106027-8f90-4285-9257-0da036866696\") " pod="openshift-dns/node-resolver-dn855"
Apr 28 19:15:46.600698 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.600663 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-cnibin\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.600744 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.600696 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-multus-socket-dir-parent\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.600785 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.600742 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-multus-daemon-config\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.601360 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.600813 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-host-kubelet\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.601360 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.600859 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-host-cni-bin\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.601360 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.600888 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/daa35919-aaa2-4021-afc5-aeb72485f1ea-etc-selinux\") pod \"aws-ebs-csi-driver-node-kqcc2\" (UID: \"daa35919-aaa2-4021-afc5-aeb72485f1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2"
Apr 28 19:15:46.601360 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.600970 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phndr\" (UniqueName: \"kubernetes.io/projected/13bd3061-759d-43b2-bf3d-0c09c0a62063-kube-api-access-phndr\") pod \"iptables-alerter-nvcxv\" (UID: \"13bd3061-759d-43b2-bf3d-0c09c0a62063\") " pod="openshift-network-operator/iptables-alerter-nvcxv"
Apr 28 19:15:46.601360 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.601000 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-host-var-lib-cni-bin\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.601360 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.601025 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-host-run-ovn-kubernetes\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.601360 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.601052 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-run\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d"
Apr 28 19:15:46.601360 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.601100 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwr56\" (UniqueName: \"kubernetes.io/projected/1c539cc3-1090-4486-ab6c-9d184f87803d-kube-api-access-xwr56\") pod \"node-ca-qtt7w\" (UID: \"1c539cc3-1090-4486-ab6c-9d184f87803d\") " pod="openshift-image-registry/node-ca-qtt7w"
Apr 28 19:15:46.601360 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.601257 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/13bd3061-759d-43b2-bf3d-0c09c0a62063-iptables-alerter-script\") pod \"iptables-alerter-nvcxv\" (UID: \"13bd3061-759d-43b2-bf3d-0c09c0a62063\") " pod="openshift-network-operator/iptables-alerter-nvcxv"
Apr 28 19:15:46.601360 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.601316 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f830cd46-083b-489f-b13d-5a749b919ab7-ovnkube-config\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.601360 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.601350 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1c539cc3-1090-4486-ab6c-9d184f87803d-host\") pod \"node-ca-qtt7w\" (UID: \"1c539cc3-1090-4486-ab6c-9d184f87803d\") " pod="openshift-image-registry/node-ca-qtt7w"
Apr 28 19:15:46.601882 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.601378 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ad20da8d-206b-440c-8c98-5039db8e6f65-agent-certs\") pod \"konnectivity-agent-h8xww\" (UID: \"ad20da8d-206b-440c-8c98-5039db8e6f65\") " pod="kube-system/konnectivity-agent-h8xww"
Apr 28 19:15:46.601882 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.601436 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ad20da8d-206b-440c-8c98-5039db8e6f65-konnectivity-ca\") pod \"konnectivity-agent-h8xww\" (UID: \"ad20da8d-206b-440c-8c98-5039db8e6f65\") " pod="kube-system/konnectivity-agent-h8xww"
Apr 28 19:15:46.601882 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.601522 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/daa35919-aaa2-4021-afc5-aeb72485f1ea-device-dir\") pod \"aws-ebs-csi-driver-node-kqcc2\" (UID: \"daa35919-aaa2-4021-afc5-aeb72485f1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2"
Apr 28 19:15:46.601882 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.601607 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5a509ee1-53b3-4bd7-822e-06cb6363beff-tmp\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d"
Apr 28 19:15:46.601882 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.601687 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13bd3061-759d-43b2-bf3d-0c09c0a62063-host-slash\") pod \"iptables-alerter-nvcxv\" (UID: \"13bd3061-759d-43b2-bf3d-0c09c0a62063\") " pod="openshift-network-operator/iptables-alerter-nvcxv"
Apr 28 19:15:46.601882 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.601738 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-system-cni-dir\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.601882 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.601792 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-multus-conf-dir\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.601882 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.601844 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-host\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d"
Apr 28 19:15:46.602222 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.601890 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs\") pod \"network-metrics-daemon-q2wj9\" (UID: \"ae2a816c-4f04-45d7-bb27-80786c738721\") " pod="openshift-multus/network-metrics-daemon-q2wj9"
Apr 28 19:15:46.602222 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.601975 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d2c7c82c-bdc2-456a-b466-42dee787562e-cnibin\") pod \"multus-additional-cni-plugins-qtgvt\" (UID: \"d2c7c82c-bdc2-456a-b466-42dee787562e\") " pod="openshift-multus/multus-additional-cni-plugins-qtgvt"
Apr 28 19:15:46.602222 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.602034 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-host-run-netns\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.602222 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.602079 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-run-systemd\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.602502 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.602132 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-run-openvswitch\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.602546 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.602532 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-node-log\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.602582 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.602567 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f830cd46-083b-489f-b13d-5a749b919ab7-ovn-node-metrics-cert\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.602622 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.602603 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cc82k\" (UniqueName: \"kubernetes.io/projected/789c39a9-aea8-4abd-b196-f303f6c6f063-kube-api-access-cc82k\") pod \"network-check-target-fmthr\" (UID: \"789c39a9-aea8-4abd-b196-f303f6c6f063\") " pod="openshift-network-diagnostics/network-check-target-fmthr"
Apr 28 19:15:46.602665 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.602630 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/daa35919-aaa2-4021-afc5-aeb72485f1ea-socket-dir\") pod \"aws-ebs-csi-driver-node-kqcc2\" (UID: \"daa35919-aaa2-4021-afc5-aeb72485f1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2"
Apr 28 19:15:46.602710 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.602674 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs7q5\" (UniqueName: \"kubernetes.io/projected/daa35919-aaa2-4021-afc5-aeb72485f1ea-kube-api-access-rs7q5\") pod \"aws-ebs-csi-driver-node-kqcc2\" (UID: \"daa35919-aaa2-4021-afc5-aeb72485f1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2"
Apr 28 19:15:46.602744 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.602710 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-etc-sysctl-d\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d"
Apr 28 19:15:46.602786 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.602737 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mxxm\" (UniqueName: \"kubernetes.io/projected/ae2a816c-4f04-45d7-bb27-80786c738721-kube-api-access-6mxxm\") pod \"network-metrics-daemon-q2wj9\" (UID: \"ae2a816c-4f04-45d7-bb27-80786c738721\") " pod="openshift-multus/network-metrics-daemon-q2wj9"
Apr 28 19:15:46.602822 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.602785 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-host-var-lib-cni-multus\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.602822 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.602808 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d2c7c82c-bdc2-456a-b466-42dee787562e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qtgvt\" (UID: \"d2c7c82c-bdc2-456a-b466-42dee787562e\") " pod="openshift-multus/multus-additional-cni-plugins-qtgvt"
Apr 28 19:15:46.602913 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.602830 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-run-ovn\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.610770 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:46.610749 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 28 19:15:46.610770 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:46.610771 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 28 19:15:46.610950 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:46.610781 2582 projected.go:194] Error preparing data for projected volume kube-api-access-cc82k for pod openshift-network-diagnostics/network-check-target-fmthr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:15:46.610950 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:46.610859 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/789c39a9-aea8-4abd-b196-f303f6c6f063-kube-api-access-cc82k podName:789c39a9-aea8-4abd-b196-f303f6c6f063 nodeName:}" failed. No retries permitted until 2026-04-28 19:15:47.110832844 +0000 UTC m=+2.045877994 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cc82k" (UniqueName: "kubernetes.io/projected/789c39a9-aea8-4abd-b196-f303f6c6f063-kube-api-access-cc82k") pod "network-check-target-fmthr" (UID: "789c39a9-aea8-4abd-b196-f303f6c6f063") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:15:46.629305 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:46.629256 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod206ff994154571336dcc99880b36f4f2.slice/crio-f4c81d8d8762d361dbda1464e6f3fea65bbe93f14d3ed8acb32ca0556763ec8c WatchSource:0}: Error finding container f4c81d8d8762d361dbda1464e6f3fea65bbe93f14d3ed8acb32ca0556763ec8c: Status 404 returned error can't find the container with id f4c81d8d8762d361dbda1464e6f3fea65bbe93f14d3ed8acb32ca0556763ec8c
Apr 28 19:15:46.629536 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:46.629512 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0393c54dc21367a93a647d0297c0f90.slice/crio-475fe9c03c2ccbe1e316465ec05769f26efeab2603867a65257b49a4546e3815 WatchSource:0}: Error finding container 475fe9c03c2ccbe1e316465ec05769f26efeab2603867a65257b49a4546e3815: Status 404 returned error can't find the container with id 475fe9c03c2ccbe1e316465ec05769f26efeab2603867a65257b49a4546e3815
Apr 28 19:15:46.634007 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.633987 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 28 19:15:46.635629 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.635563 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-160.ec2.internal" event={"ID":"206ff994154571336dcc99880b36f4f2","Type":"ContainerStarted","Data":"f4c81d8d8762d361dbda1464e6f3fea65bbe93f14d3ed8acb32ca0556763ec8c"}
Apr 28 19:15:46.636620 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.636600 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal" event={"ID":"a0393c54dc21367a93a647d0297c0f90","Type":"ContainerStarted","Data":"475fe9c03c2ccbe1e316465ec05769f26efeab2603867a65257b49a4546e3815"}
Apr 28 19:15:46.703738 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.703698 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/daa35919-aaa2-4021-afc5-aeb72485f1ea-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kqcc2\" (UID: \"daa35919-aaa2-4021-afc5-aeb72485f1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2"
Apr 28 19:15:46.703738 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.703740 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5a509ee1-53b3-4bd7-822e-06cb6363beff-etc-tuned\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d"
Apr 28 19:15:46.704011 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.703758 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/51106027-8f90-4285-9257-0da036866696-hosts-file\") pod \"node-resolver-dn855\" (UID: \"51106027-8f90-4285-9257-0da036866696\") " pod="openshift-dns/node-resolver-dn855"
Apr 28 19:15:46.704011 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.703776 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-host-run-k8s-cni-cncf-io\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.704011 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.703792 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-hostroot\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.704011 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.703807 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-host-run-multus-certs\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.704011 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.703828 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-log-socket\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.704011 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.703853 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/51106027-8f90-4285-9257-0da036866696-hosts-file\") pod \"node-resolver-dn855\" (UID: \"51106027-8f90-4285-9257-0da036866696\") " pod="openshift-dns/node-resolver-dn855"
Apr 28 19:15:46.704011 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.703868 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-host-run-k8s-cni-cncf-io\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.704011 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.703875 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-hostroot\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.704011 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.703864 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/daa35919-aaa2-4021-afc5-aeb72485f1ea-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kqcc2\" (UID: \"daa35919-aaa2-4021-afc5-aeb72485f1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2"
Apr 28 19:15:46.704011 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.703877 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/daa35919-aaa2-4021-afc5-aeb72485f1ea-sys-fs\") pod \"aws-ebs-csi-driver-node-kqcc2\" (UID: \"daa35919-aaa2-4021-afc5-aeb72485f1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2"
Apr 28 19:15:46.704011 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.703929 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-log-socket\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.704011 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.703931 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-host-run-multus-certs\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.704011 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.703953 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-var-lib-kubelet\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d"
Apr 28 19:15:46.704011 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.703972 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/daa35919-aaa2-4021-afc5-aeb72485f1ea-sys-fs\") pod \"aws-ebs-csi-driver-node-kqcc2\" (UID: \"daa35919-aaa2-4021-afc5-aeb72485f1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2"
Apr 28 19:15:46.704011 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.703985 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-host-var-lib-kubelet\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.704011 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704010 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-etc-kubernetes\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.704615 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704031 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-host-var-lib-kubelet\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.704615 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704029 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-var-lib-kubelet\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d"
Apr 28 19:15:46.704615 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704064 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-etc-kubernetes\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.704615 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704061 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-systemd-units\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.704615 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704095 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-etc-openvswitch\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.704615 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704110 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-etc-sysctl-conf\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d"
Apr 28 19:15:46.704615 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704144 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.704615 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704149 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-etc-openvswitch\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.704615 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704163 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjxx9\" (UniqueName: \"kubernetes.io/projected/f830cd46-083b-489f-b13d-5a749b919ab7-kube-api-access-kjxx9\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.704615 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704159 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-systemd-units\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.704615 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704141 2582 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 28 19:15:46.704615 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704200 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.704615 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704252 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-etc-sysctl-conf\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d"
Apr 28 19:15:46.704615 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704290 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/51106027-8f90-4285-9257-0da036866696-tmp-dir\") pod \"node-resolver-dn855\" (UID: \"51106027-8f90-4285-9257-0da036866696\") " pod="openshift-dns/node-resolver-dn855"
Apr 28 19:15:46.704615 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704332 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-cnibin\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.704615 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704359 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-multus-socket-dir-parent\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.704615 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704389 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-multus-daemon-config\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.704615 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704414 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-host-kubelet\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.705403 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704438 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-host-cni-bin\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.705403 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704446 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-multus-socket-dir-parent\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.705403 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704455 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName:
\"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-host-kubelet\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:15:46.705403 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704438 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-cnibin\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp" Apr 28 19:15:46.705403 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704482 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-host-cni-bin\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:15:46.705403 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704514 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/daa35919-aaa2-4021-afc5-aeb72485f1ea-etc-selinux\") pod \"aws-ebs-csi-driver-node-kqcc2\" (UID: \"daa35919-aaa2-4021-afc5-aeb72485f1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2" Apr 28 19:15:46.705403 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704543 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-phndr\" (UniqueName: \"kubernetes.io/projected/13bd3061-759d-43b2-bf3d-0c09c0a62063-kube-api-access-phndr\") pod \"iptables-alerter-nvcxv\" (UID: \"13bd3061-759d-43b2-bf3d-0c09c0a62063\") " pod="openshift-network-operator/iptables-alerter-nvcxv" Apr 28 19:15:46.705403 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704581 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/daa35919-aaa2-4021-afc5-aeb72485f1ea-etc-selinux\") pod \"aws-ebs-csi-driver-node-kqcc2\" (UID: \"daa35919-aaa2-4021-afc5-aeb72485f1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2" Apr 28 19:15:46.705403 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704596 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-host-var-lib-cni-bin\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp" Apr 28 19:15:46.705403 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704601 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/51106027-8f90-4285-9257-0da036866696-tmp-dir\") pod \"node-resolver-dn855\" (UID: \"51106027-8f90-4285-9257-0da036866696\") " pod="openshift-dns/node-resolver-dn855" Apr 28 19:15:46.705403 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704636 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-host-run-ovn-kubernetes\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:15:46.705403 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704665 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-run\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d" Apr 28 19:15:46.705403 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704704 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-host-var-lib-cni-bin\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp" Apr 28 19:15:46.705403 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704709 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwr56\" (UniqueName: \"kubernetes.io/projected/1c539cc3-1090-4486-ab6c-9d184f87803d-kube-api-access-xwr56\") pod \"node-ca-qtt7w\" (UID: \"1c539cc3-1090-4486-ab6c-9d184f87803d\") " pod="openshift-image-registry/node-ca-qtt7w" Apr 28 19:15:46.705403 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704710 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-host-run-ovn-kubernetes\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:15:46.705403 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704740 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/13bd3061-759d-43b2-bf3d-0c09c0a62063-iptables-alerter-script\") pod \"iptables-alerter-nvcxv\" (UID: \"13bd3061-759d-43b2-bf3d-0c09c0a62063\") " pod="openshift-network-operator/iptables-alerter-nvcxv" Apr 28 19:15:46.705403 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704752 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-run\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d" Apr 28 19:15:46.705403 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704765 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/f830cd46-083b-489f-b13d-5a749b919ab7-ovnkube-config\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:15:46.706223 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704794 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1c539cc3-1090-4486-ab6c-9d184f87803d-host\") pod \"node-ca-qtt7w\" (UID: \"1c539cc3-1090-4486-ab6c-9d184f87803d\") " pod="openshift-image-registry/node-ca-qtt7w" Apr 28 19:15:46.706223 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704827 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ad20da8d-206b-440c-8c98-5039db8e6f65-agent-certs\") pod \"konnectivity-agent-h8xww\" (UID: \"ad20da8d-206b-440c-8c98-5039db8e6f65\") " pod="kube-system/konnectivity-agent-h8xww" Apr 28 19:15:46.706223 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704851 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ad20da8d-206b-440c-8c98-5039db8e6f65-konnectivity-ca\") pod \"konnectivity-agent-h8xww\" (UID: \"ad20da8d-206b-440c-8c98-5039db8e6f65\") " pod="kube-system/konnectivity-agent-h8xww" Apr 28 19:15:46.706223 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.704881 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/daa35919-aaa2-4021-afc5-aeb72485f1ea-device-dir\") pod \"aws-ebs-csi-driver-node-kqcc2\" (UID: \"daa35919-aaa2-4021-afc5-aeb72485f1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2" Apr 28 19:15:46.706223 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705031 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/1c539cc3-1090-4486-ab6c-9d184f87803d-host\") pod \"node-ca-qtt7w\" (UID: \"1c539cc3-1090-4486-ab6c-9d184f87803d\") " pod="openshift-image-registry/node-ca-qtt7w" Apr 28 19:15:46.706223 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705081 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/daa35919-aaa2-4021-afc5-aeb72485f1ea-device-dir\") pod \"aws-ebs-csi-driver-node-kqcc2\" (UID: \"daa35919-aaa2-4021-afc5-aeb72485f1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2" Apr 28 19:15:46.706223 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705108 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5a509ee1-53b3-4bd7-822e-06cb6363beff-tmp\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d" Apr 28 19:15:46.706223 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705131 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13bd3061-759d-43b2-bf3d-0c09c0a62063-host-slash\") pod \"iptables-alerter-nvcxv\" (UID: \"13bd3061-759d-43b2-bf3d-0c09c0a62063\") " pod="openshift-network-operator/iptables-alerter-nvcxv" Apr 28 19:15:46.706223 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705151 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-system-cni-dir\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp" Apr 28 19:15:46.706223 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705158 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-multus-daemon-config\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp" Apr 28 19:15:46.706223 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705170 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-multus-conf-dir\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp" Apr 28 19:15:46.706223 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705189 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-host\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d" Apr 28 19:15:46.706223 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705211 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs\") pod \"network-metrics-daemon-q2wj9\" (UID: \"ae2a816c-4f04-45d7-bb27-80786c738721\") " pod="openshift-multus/network-metrics-daemon-q2wj9" Apr 28 19:15:46.706223 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705233 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13bd3061-759d-43b2-bf3d-0c09c0a62063-host-slash\") pod \"iptables-alerter-nvcxv\" (UID: \"13bd3061-759d-43b2-bf3d-0c09c0a62063\") " pod="openshift-network-operator/iptables-alerter-nvcxv" Apr 28 19:15:46.706223 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705233 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/d2c7c82c-bdc2-456a-b466-42dee787562e-cnibin\") pod \"multus-additional-cni-plugins-qtgvt\" (UID: \"d2c7c82c-bdc2-456a-b466-42dee787562e\") " pod="openshift-multus/multus-additional-cni-plugins-qtgvt" Apr 28 19:15:46.706223 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705272 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-host-run-netns\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:15:46.706223 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705295 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-run-systemd\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:15:46.706223 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705315 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/13bd3061-759d-43b2-bf3d-0c09c0a62063-iptables-alerter-script\") pod \"iptables-alerter-nvcxv\" (UID: \"13bd3061-759d-43b2-bf3d-0c09c0a62063\") " pod="openshift-network-operator/iptables-alerter-nvcxv" Apr 28 19:15:46.707062 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705319 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-run-openvswitch\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:15:46.707062 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705349 2582 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f830cd46-083b-489f-b13d-5a749b919ab7-ovnkube-config\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:15:46.707062 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705362 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-node-log\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:15:46.707062 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705403 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-multus-conf-dir\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp" Apr 28 19:15:46.707062 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705408 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-node-log\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:15:46.707062 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705440 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f830cd46-083b-489f-b13d-5a749b919ab7-ovn-node-metrics-cert\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:15:46.707062 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705465 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/d2c7c82c-bdc2-456a-b466-42dee787562e-cnibin\") pod \"multus-additional-cni-plugins-qtgvt\" (UID: \"d2c7c82c-bdc2-456a-b466-42dee787562e\") " pod="openshift-multus/multus-additional-cni-plugins-qtgvt" Apr 28 19:15:46.707062 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:46.705472 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:15:46.707062 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705488 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/daa35919-aaa2-4021-afc5-aeb72485f1ea-socket-dir\") pod \"aws-ebs-csi-driver-node-kqcc2\" (UID: \"daa35919-aaa2-4021-afc5-aeb72485f1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2" Apr 28 19:15:46.707062 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705502 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-host-run-netns\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:15:46.707062 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705512 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rs7q5\" (UniqueName: \"kubernetes.io/projected/daa35919-aaa2-4021-afc5-aeb72485f1ea-kube-api-access-rs7q5\") pod \"aws-ebs-csi-driver-node-kqcc2\" (UID: \"daa35919-aaa2-4021-afc5-aeb72485f1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2" Apr 28 19:15:46.707062 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705526 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ad20da8d-206b-440c-8c98-5039db8e6f65-konnectivity-ca\") pod 
\"konnectivity-agent-h8xww\" (UID: \"ad20da8d-206b-440c-8c98-5039db8e6f65\") " pod="kube-system/konnectivity-agent-h8xww" Apr 28 19:15:46.707062 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:46.705545 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs podName:ae2a816c-4f04-45d7-bb27-80786c738721 nodeName:}" failed. No retries permitted until 2026-04-28 19:15:47.205513263 +0000 UTC m=+2.140558392 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs") pod "network-metrics-daemon-q2wj9" (UID: "ae2a816c-4f04-45d7-bb27-80786c738721") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:15:46.707062 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705542 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-run-systemd\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:15:46.707062 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705591 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-host\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d" Apr 28 19:15:46.707062 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705617 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-run-openvswitch\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:15:46.707062 
ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705669 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-etc-sysctl-d\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d" Apr 28 19:15:46.707809 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705692 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6mxxm\" (UniqueName: \"kubernetes.io/projected/ae2a816c-4f04-45d7-bb27-80786c738721-kube-api-access-6mxxm\") pod \"network-metrics-daemon-q2wj9\" (UID: \"ae2a816c-4f04-45d7-bb27-80786c738721\") " pod="openshift-multus/network-metrics-daemon-q2wj9" Apr 28 19:15:46.707809 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705727 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-host-var-lib-cni-multus\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp" Apr 28 19:15:46.707809 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705801 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-host-var-lib-cni-multus\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp" Apr 28 19:15:46.707809 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705798 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-etc-sysctl-d\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d" Apr 28 
19:15:46.707809 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705855 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-system-cni-dir\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp" Apr 28 19:15:46.707809 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705913 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d2c7c82c-bdc2-456a-b466-42dee787562e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qtgvt\" (UID: \"d2c7c82c-bdc2-456a-b466-42dee787562e\") " pod="openshift-multus/multus-additional-cni-plugins-qtgvt" Apr 28 19:15:46.707809 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705948 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-run-ovn\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:15:46.707809 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705973 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-etc-sysconfig\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d" Apr 28 19:15:46.707809 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.705996 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-lib-modules\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d" Apr 
28 19:15:46.707809 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706019 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/daa35919-aaa2-4021-afc5-aeb72485f1ea-socket-dir\") pod \"aws-ebs-csi-driver-node-kqcc2\" (UID: \"daa35919-aaa2-4021-afc5-aeb72485f1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2" Apr 28 19:15:46.707809 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706027 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-host-run-netns\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp" Apr 28 19:15:46.707809 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706055 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d2c7c82c-bdc2-456a-b466-42dee787562e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qtgvt\" (UID: \"d2c7c82c-bdc2-456a-b466-42dee787562e\") " pod="openshift-multus/multus-additional-cni-plugins-qtgvt" Apr 28 19:15:46.707809 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706069 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-etc-sysconfig\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d" Apr 28 19:15:46.707809 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706086 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r547k\" (UniqueName: \"kubernetes.io/projected/d2c7c82c-bdc2-456a-b466-42dee787562e-kube-api-access-r547k\") pod \"multus-additional-cni-plugins-qtgvt\" (UID: 
\"d2c7c82c-bdc2-456a-b466-42dee787562e\") " pod="openshift-multus/multus-additional-cni-plugins-qtgvt"
Apr 28 19:15:46.707809 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706104 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-run-ovn\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.707809 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706114 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d2c7c82c-bdc2-456a-b466-42dee787562e-system-cni-dir\") pod \"multus-additional-cni-plugins-qtgvt\" (UID: \"d2c7c82c-bdc2-456a-b466-42dee787562e\") " pod="openshift-multus/multus-additional-cni-plugins-qtgvt"
Apr 28 19:15:46.707809 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706139 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwbrx\" (UniqueName: \"kubernetes.io/projected/5a509ee1-53b3-4bd7-822e-06cb6363beff-kube-api-access-rwbrx\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d"
Apr 28 19:15:46.708556 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706138 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-host-run-netns\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.708556 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706168 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-etc-modprobe-d\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d"
Apr 28 19:15:46.708556 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706193 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-sys\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d"
Apr 28 19:15:46.708556 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706217 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-os-release\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.708556 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706241 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h2h42\" (UniqueName: \"kubernetes.io/projected/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-kube-api-access-h2h42\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.708556 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706234 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-lib-modules\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d"
Apr 28 19:15:46.708556 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706348 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-var-lib-openvswitch\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.708556 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706517 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1c539cc3-1090-4486-ab6c-9d184f87803d-serviceca\") pod \"node-ca-qtt7w\" (UID: \"1c539cc3-1090-4486-ab6c-9d184f87803d\") " pod="openshift-image-registry/node-ca-qtt7w"
Apr 28 19:15:46.708556 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706544 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktg5w\" (UniqueName: \"kubernetes.io/projected/51106027-8f90-4285-9257-0da036866696-kube-api-access-ktg5w\") pod \"node-resolver-dn855\" (UID: \"51106027-8f90-4285-9257-0da036866696\") " pod="openshift-dns/node-resolver-dn855"
Apr 28 19:15:46.708556 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706571 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d2c7c82c-bdc2-456a-b466-42dee787562e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qtgvt\" (UID: \"d2c7c82c-bdc2-456a-b466-42dee787562e\") " pod="openshift-multus/multus-additional-cni-plugins-qtgvt"
Apr 28 19:15:46.708556 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706580 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d2c7c82c-bdc2-456a-b466-42dee787562e-os-release\") pod \"multus-additional-cni-plugins-qtgvt\" (UID: \"d2c7c82c-bdc2-456a-b466-42dee787562e\") " pod="openshift-multus/multus-additional-cni-plugins-qtgvt"
Apr 28 19:15:46.708556 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706606 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d2c7c82c-bdc2-456a-b466-42dee787562e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qtgvt\" (UID: \"d2c7c82c-bdc2-456a-b466-42dee787562e\") " pod="openshift-multus/multus-additional-cni-plugins-qtgvt"
Apr 28 19:15:46.708556 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706632 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f830cd46-083b-489f-b13d-5a749b919ab7-env-overrides\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.708556 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706672 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f830cd46-083b-489f-b13d-5a749b919ab7-ovnkube-script-lib\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.708556 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706702 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-cni-binary-copy\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.708556 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706726 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d2c7c82c-bdc2-456a-b466-42dee787562e-cni-binary-copy\") pod \"multus-additional-cni-plugins-qtgvt\" (UID: \"d2c7c82c-bdc2-456a-b466-42dee787562e\") " pod="openshift-multus/multus-additional-cni-plugins-qtgvt"
Apr 28 19:15:46.708556 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706757 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-host-slash\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.709104 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706782 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-host-cni-netd\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.709104 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706808 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/daa35919-aaa2-4021-afc5-aeb72485f1ea-registration-dir\") pod \"aws-ebs-csi-driver-node-kqcc2\" (UID: \"daa35919-aaa2-4021-afc5-aeb72485f1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2"
Apr 28 19:15:46.709104 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706842 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-etc-kubernetes\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d"
Apr 28 19:15:46.709104 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706876 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-etc-systemd\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d"
Apr 28 19:15:46.709104 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.706922 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-multus-cni-dir\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.709104 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.707014 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d2c7c82c-bdc2-456a-b466-42dee787562e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qtgvt\" (UID: \"d2c7c82c-bdc2-456a-b466-42dee787562e\") " pod="openshift-multus/multus-additional-cni-plugins-qtgvt"
Apr 28 19:15:46.709104 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.707046 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-multus-cni-dir\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.709104 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.707073 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d2c7c82c-bdc2-456a-b466-42dee787562e-system-cni-dir\") pod \"multus-additional-cni-plugins-qtgvt\" (UID: \"d2c7c82c-bdc2-456a-b466-42dee787562e\") " pod="openshift-multus/multus-additional-cni-plugins-qtgvt"
Apr 28 19:15:46.709104 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.707100 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-etc-modprobe-d\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d"
Apr 28 19:15:46.709104 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.707109 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d2c7c82c-bdc2-456a-b466-42dee787562e-os-release\") pod \"multus-additional-cni-plugins-qtgvt\" (UID: \"d2c7c82c-bdc2-456a-b466-42dee787562e\") " pod="openshift-multus/multus-additional-cni-plugins-qtgvt"
Apr 28 19:15:46.709104 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.707161 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-var-lib-openvswitch\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.709104 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.707265 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d2c7c82c-bdc2-456a-b466-42dee787562e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qtgvt\" (UID: \"d2c7c82c-bdc2-456a-b466-42dee787562e\") " pod="openshift-multus/multus-additional-cni-plugins-qtgvt"
Apr 28 19:15:46.709104 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.707364 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-sys\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d"
Apr 28 19:15:46.709104 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.707424 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-os-release\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.709104 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.707477 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-host-slash\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.709104 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.707703 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f830cd46-083b-489f-b13d-5a749b919ab7-env-overrides\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.709104 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.707749 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/5a509ee1-53b3-4bd7-822e-06cb6363beff-etc-tuned\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d"
Apr 28 19:15:46.709605 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.707774 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f830cd46-083b-489f-b13d-5a749b919ab7-host-cni-netd\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.709605 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.707830 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/daa35919-aaa2-4021-afc5-aeb72485f1ea-registration-dir\") pod \"aws-ebs-csi-driver-node-kqcc2\" (UID: \"daa35919-aaa2-4021-afc5-aeb72485f1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2"
Apr 28 19:15:46.709605 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.707882 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-etc-kubernetes\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d"
Apr 28 19:15:46.709605 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.707955 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/5a509ee1-53b3-4bd7-822e-06cb6363beff-etc-systemd\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d"
Apr 28 19:15:46.709605 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.708038 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-cni-binary-copy\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.709605 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.708271 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/5a509ee1-53b3-4bd7-822e-06cb6363beff-tmp\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d"
Apr 28 19:15:46.709605 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.708345 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1c539cc3-1090-4486-ab6c-9d184f87803d-serviceca\") pod \"node-ca-qtt7w\" (UID: \"1c539cc3-1090-4486-ab6c-9d184f87803d\") " pod="openshift-image-registry/node-ca-qtt7w"
Apr 28 19:15:46.709605 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.708547 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f830cd46-083b-489f-b13d-5a749b919ab7-ovnkube-script-lib\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.709605 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.708577 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f830cd46-083b-489f-b13d-5a749b919ab7-ovn-node-metrics-cert\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.709605 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.708701 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ad20da8d-206b-440c-8c98-5039db8e6f65-agent-certs\") pod \"konnectivity-agent-h8xww\" (UID: \"ad20da8d-206b-440c-8c98-5039db8e6f65\") " pod="kube-system/konnectivity-agent-h8xww"
Apr 28 19:15:46.713101 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.713083 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d2c7c82c-bdc2-456a-b466-42dee787562e-cni-binary-copy\") pod \"multus-additional-cni-plugins-qtgvt\" (UID: \"d2c7c82c-bdc2-456a-b466-42dee787562e\") " pod="openshift-multus/multus-additional-cni-plugins-qtgvt"
Apr 28 19:15:46.716653 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.716622 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phndr\" (UniqueName: \"kubernetes.io/projected/13bd3061-759d-43b2-bf3d-0c09c0a62063-kube-api-access-phndr\") pod \"iptables-alerter-nvcxv\" (UID: \"13bd3061-759d-43b2-bf3d-0c09c0a62063\") " pod="openshift-network-operator/iptables-alerter-nvcxv"
Apr 28 19:15:46.716790 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.716624 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjxx9\" (UniqueName: \"kubernetes.io/projected/f830cd46-083b-489f-b13d-5a749b919ab7-kube-api-access-kjxx9\") pod \"ovnkube-node-6xgsz\" (UID: \"f830cd46-083b-489f-b13d-5a749b919ab7\") " pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.718003 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.717698 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwr56\" (UniqueName: \"kubernetes.io/projected/1c539cc3-1090-4486-ab6c-9d184f87803d-kube-api-access-xwr56\") pod \"node-ca-qtt7w\" (UID: \"1c539cc3-1090-4486-ab6c-9d184f87803d\") " pod="openshift-image-registry/node-ca-qtt7w"
Apr 28 19:15:46.718626 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.718539 2582 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 28 19:15:46.718959 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.718923 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r547k\" (UniqueName: \"kubernetes.io/projected/d2c7c82c-bdc2-456a-b466-42dee787562e-kube-api-access-r547k\") pod \"multus-additional-cni-plugins-qtgvt\" (UID: \"d2c7c82c-bdc2-456a-b466-42dee787562e\") " pod="openshift-multus/multus-additional-cni-plugins-qtgvt"
Apr 28 19:15:46.720440 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.720375 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2h42\" (UniqueName: \"kubernetes.io/projected/ae3db714-6a3b-402e-af99-fdeffaa6cdfa-kube-api-access-h2h42\") pod \"multus-89vjp\" (UID: \"ae3db714-6a3b-402e-af99-fdeffaa6cdfa\") " pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.720549 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.720528 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs7q5\" (UniqueName: \"kubernetes.io/projected/daa35919-aaa2-4021-afc5-aeb72485f1ea-kube-api-access-rs7q5\") pod \"aws-ebs-csi-driver-node-kqcc2\" (UID: \"daa35919-aaa2-4021-afc5-aeb72485f1ea\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2"
Apr 28 19:15:46.722541 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.720701 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktg5w\" (UniqueName: \"kubernetes.io/projected/51106027-8f90-4285-9257-0da036866696-kube-api-access-ktg5w\") pod \"node-resolver-dn855\" (UID: \"51106027-8f90-4285-9257-0da036866696\") " pod="openshift-dns/node-resolver-dn855"
Apr 28 19:15:46.722541 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.721298 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwbrx\" (UniqueName: \"kubernetes.io/projected/5a509ee1-53b3-4bd7-822e-06cb6363beff-kube-api-access-rwbrx\") pod \"tuned-8l89d\" (UID: \"5a509ee1-53b3-4bd7-822e-06cb6363beff\") " pod="openshift-cluster-node-tuning-operator/tuned-8l89d"
Apr 28 19:15:46.722541 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.721742 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mxxm\" (UniqueName: \"kubernetes.io/projected/ae2a816c-4f04-45d7-bb27-80786c738721-kube-api-access-6mxxm\") pod \"network-metrics-daemon-q2wj9\" (UID: \"ae2a816c-4f04-45d7-bb27-80786c738721\") " pod="openshift-multus/network-metrics-daemon-q2wj9"
Apr 28 19:15:46.822163 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.822123 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8l89d"
Apr 28 19:15:46.829035 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:46.829009 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a509ee1_53b3_4bd7_822e_06cb6363beff.slice/crio-2f531b59b7adb4fc7299465c5448e04ebbbf2d462c5ff4634153a7f998ec7f7f WatchSource:0}: Error finding container 2f531b59b7adb4fc7299465c5448e04ebbbf2d462c5ff4634153a7f998ec7f7f: Status 404 returned error can't find the container with id 2f531b59b7adb4fc7299465c5448e04ebbbf2d462c5ff4634153a7f998ec7f7f
Apr 28 19:15:46.837000 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.836980 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qtt7w"
Apr 28 19:15:46.843680 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:46.843657 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c539cc3_1090_4486_ab6c_9d184f87803d.slice/crio-5a7cb7d4ad664267467f891e3df1aae9310193279d0be44cf1b1e9b47534d175 WatchSource:0}: Error finding container 5a7cb7d4ad664267467f891e3df1aae9310193279d0be44cf1b1e9b47534d175: Status 404 returned error can't find the container with id 5a7cb7d4ad664267467f891e3df1aae9310193279d0be44cf1b1e9b47534d175
Apr 28 19:15:46.847680 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.847658 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-89vjp"
Apr 28 19:15:46.855294 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:46.855267 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae3db714_6a3b_402e_af99_fdeffaa6cdfa.slice/crio-9e85d0485aa3f986871bc23540eee5ed57f861084d2f9946fd7820fa459d3679 WatchSource:0}: Error finding container 9e85d0485aa3f986871bc23540eee5ed57f861084d2f9946fd7820fa459d3679: Status 404 returned error can't find the container with id 9e85d0485aa3f986871bc23540eee5ed57f861084d2f9946fd7820fa459d3679
Apr 28 19:15:46.862494 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.862474 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qtgvt"
Apr 28 19:15:46.868067 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.868049 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-h8xww"
Apr 28 19:15:46.868736 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:46.868713 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2c7c82c_bdc2_456a_b466_42dee787562e.slice/crio-b427cd85b80d223b1533825ae53909c693668f186f4b65d5b77bc278a72738b7 WatchSource:0}: Error finding container b427cd85b80d223b1533825ae53909c693668f186f4b65d5b77bc278a72738b7: Status 404 returned error can't find the container with id b427cd85b80d223b1533825ae53909c693668f186f4b65d5b77bc278a72738b7
Apr 28 19:15:46.874275 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:46.874250 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad20da8d_206b_440c_8c98_5039db8e6f65.slice/crio-3858f3ba01ed7fd3e03b9da6f8e4781fc2b8c859859ca9e6ec1758b0cef94c35 WatchSource:0}: Error finding container 3858f3ba01ed7fd3e03b9da6f8e4781fc2b8c859859ca9e6ec1758b0cef94c35: Status 404 returned error can't find the container with id 3858f3ba01ed7fd3e03b9da6f8e4781fc2b8c859859ca9e6ec1758b0cef94c35
Apr 28 19:15:46.904983 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.904958 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:15:46.910751 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.910729 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2"
Apr 28 19:15:46.911585 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:46.911560 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf830cd46_083b_489f_b13d_5a749b919ab7.slice/crio-f690c4f192e453fa129eb5f2abd3c9ec62977cebcec73aacdc909f714ebe4728 WatchSource:0}: Error finding container f690c4f192e453fa129eb5f2abd3c9ec62977cebcec73aacdc909f714ebe4728: Status 404 returned error can't find the container with id f690c4f192e453fa129eb5f2abd3c9ec62977cebcec73aacdc909f714ebe4728
Apr 28 19:15:46.917283 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:46.917259 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaa35919_aaa2_4021_afc5_aeb72485f1ea.slice/crio-26f3a0bba5b2b320788a7806988b1c42fde6381123f12dda58faa15058cb1827 WatchSource:0}: Error finding container 26f3a0bba5b2b320788a7806988b1c42fde6381123f12dda58faa15058cb1827: Status 404 returned error can't find the container with id 26f3a0bba5b2b320788a7806988b1c42fde6381123f12dda58faa15058cb1827
Apr 28 19:15:46.937218 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.937191 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-dn855"
Apr 28 19:15:46.942889 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:46.942869 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-nvcxv"
Apr 28 19:15:46.943585 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:46.943564 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51106027_8f90_4285_9257_0da036866696.slice/crio-fdbe876e97a234cf634d9c6bf23bca66b1634e3d0da91663a3b783e45cab10e5 WatchSource:0}: Error finding container fdbe876e97a234cf634d9c6bf23bca66b1634e3d0da91663a3b783e45cab10e5: Status 404 returned error can't find the container with id fdbe876e97a234cf634d9c6bf23bca66b1634e3d0da91663a3b783e45cab10e5
Apr 28 19:15:46.949153 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:15:46.949129 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13bd3061_759d_43b2_bf3d_0c09c0a62063.slice/crio-68a9225ea92dee8181fac621291da9aef8672cf446b6a2acbadbb00a8195e7ac WatchSource:0}: Error finding container 68a9225ea92dee8181fac621291da9aef8672cf446b6a2acbadbb00a8195e7ac: Status 404 returned error can't find the container with id 68a9225ea92dee8181fac621291da9aef8672cf446b6a2acbadbb00a8195e7ac
Apr 28 19:15:47.211645 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:47.211555 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cc82k\" (UniqueName: \"kubernetes.io/projected/789c39a9-aea8-4abd-b196-f303f6c6f063-kube-api-access-cc82k\") pod \"network-check-target-fmthr\" (UID: \"789c39a9-aea8-4abd-b196-f303f6c6f063\") " pod="openshift-network-diagnostics/network-check-target-fmthr"
Apr 28 19:15:47.211817 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:47.211659 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs\") pod \"network-metrics-daemon-q2wj9\" (UID: \"ae2a816c-4f04-45d7-bb27-80786c738721\") " pod="openshift-multus/network-metrics-daemon-q2wj9"
Apr 28 19:15:47.211817 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:47.211745 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 28 19:15:47.211817 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:47.211768 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:15:47.211817 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:47.211770 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 28 19:15:47.211817 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:47.211790 2582 projected.go:194] Error preparing data for projected volume kube-api-access-cc82k for pod openshift-network-diagnostics/network-check-target-fmthr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:15:47.212100 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:47.211827 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs podName:ae2a816c-4f04-45d7-bb27-80786c738721 nodeName:}" failed. No retries permitted until 2026-04-28 19:15:48.211809031 +0000 UTC m=+3.146854158 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs") pod "network-metrics-daemon-q2wj9" (UID: "ae2a816c-4f04-45d7-bb27-80786c738721") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 28 19:15:47.212100 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:47.211845 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/789c39a9-aea8-4abd-b196-f303f6c6f063-kube-api-access-cc82k podName:789c39a9-aea8-4abd-b196-f303f6c6f063 nodeName:}" failed. No retries permitted until 2026-04-28 19:15:48.211835321 +0000 UTC m=+3.146880454 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cc82k" (UniqueName: "kubernetes.io/projected/789c39a9-aea8-4abd-b196-f303f6c6f063-kube-api-access-cc82k") pod "network-check-target-fmthr" (UID: "789c39a9-aea8-4abd-b196-f303f6c6f063") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:15:47.550620 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:47.550529 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-27 19:10:46 +0000 UTC" deadline="2027-11-09 22:41:30.564684539 +0000 UTC"
Apr 28 19:15:47.550620 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:47.550573 2582 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13443h25m43.014115476s"
Apr 28 19:15:47.635799 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:47.635767 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q2wj9"
Apr 28 19:15:47.635993 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:47.635926 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q2wj9" podUID="ae2a816c-4f04-45d7-bb27-80786c738721"
Apr 28 19:15:47.676670 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:47.676599 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dn855" event={"ID":"51106027-8f90-4285-9257-0da036866696","Type":"ContainerStarted","Data":"fdbe876e97a234cf634d9c6bf23bca66b1634e3d0da91663a3b783e45cab10e5"}
Apr 28 19:15:47.691193 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:47.691148 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2" event={"ID":"daa35919-aaa2-4021-afc5-aeb72485f1ea","Type":"ContainerStarted","Data":"26f3a0bba5b2b320788a7806988b1c42fde6381123f12dda58faa15058cb1827"}
Apr 28 19:15:47.704146 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:47.704083 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" event={"ID":"f830cd46-083b-489f-b13d-5a749b919ab7","Type":"ContainerStarted","Data":"f690c4f192e453fa129eb5f2abd3c9ec62977cebcec73aacdc909f714ebe4728"}
Apr 28 19:15:47.714655 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:47.714617 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8l89d" event={"ID":"5a509ee1-53b3-4bd7-822e-06cb6363beff","Type":"ContainerStarted","Data":"2f531b59b7adb4fc7299465c5448e04ebbbf2d462c5ff4634153a7f998ec7f7f"}
Apr 28 19:15:47.724359 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:47.724332 2582 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 28 19:15:47.726999 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:47.726963 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nvcxv" event={"ID":"13bd3061-759d-43b2-bf3d-0c09c0a62063","Type":"ContainerStarted","Data":"68a9225ea92dee8181fac621291da9aef8672cf446b6a2acbadbb00a8195e7ac"}
Apr 28 19:15:47.735830 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:47.735795 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-h8xww" event={"ID":"ad20da8d-206b-440c-8c98-5039db8e6f65","Type":"ContainerStarted","Data":"3858f3ba01ed7fd3e03b9da6f8e4781fc2b8c859859ca9e6ec1758b0cef94c35"}
Apr 28 19:15:47.740029 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:47.739952 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qtgvt" event={"ID":"d2c7c82c-bdc2-456a-b466-42dee787562e","Type":"ContainerStarted","Data":"b427cd85b80d223b1533825ae53909c693668f186f4b65d5b77bc278a72738b7"}
Apr 28 19:15:47.744278 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:47.744205 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-89vjp" event={"ID":"ae3db714-6a3b-402e-af99-fdeffaa6cdfa","Type":"ContainerStarted","Data":"9e85d0485aa3f986871bc23540eee5ed57f861084d2f9946fd7820fa459d3679"}
Apr 28 19:15:47.754089 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:47.754050 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qtt7w" event={"ID":"1c539cc3-1090-4486-ab6c-9d184f87803d","Type":"ContainerStarted","Data":"5a7cb7d4ad664267467f891e3df1aae9310193279d0be44cf1b1e9b47534d175"}
Apr 28 19:15:48.218414 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:48.218373 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs\") pod \"network-metrics-daemon-q2wj9\" (UID: \"ae2a816c-4f04-45d7-bb27-80786c738721\") " pod="openshift-multus/network-metrics-daemon-q2wj9"
Apr 28 19:15:48.218608 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:48.218429 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cc82k\" (UniqueName: \"kubernetes.io/projected/789c39a9-aea8-4abd-b196-f303f6c6f063-kube-api-access-cc82k\") pod \"network-check-target-fmthr\" (UID: \"789c39a9-aea8-4abd-b196-f303f6c6f063\") " pod="openshift-network-diagnostics/network-check-target-fmthr"
Apr 28 19:15:48.218701 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:48.218651 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 28 19:15:48.218701 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:48.218671 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 28 19:15:48.218701 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:48.218684 2582 projected.go:194] Error preparing data for projected volume kube-api-access-cc82k for pod openshift-network-diagnostics/network-check-target-fmthr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 28 19:15:48.218910 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:48.218745 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/789c39a9-aea8-4abd-b196-f303f6c6f063-kube-api-access-cc82k podName:789c39a9-aea8-4abd-b196-f303f6c6f063 nodeName:}" failed. No retries permitted until 2026-04-28 19:15:50.218725383 +0000 UTC m=+5.153770511 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "kube-api-access-cc82k" (UniqueName: "kubernetes.io/projected/789c39a9-aea8-4abd-b196-f303f6c6f063-kube-api-access-cc82k") pod "network-check-target-fmthr" (UID: "789c39a9-aea8-4abd-b196-f303f6c6f063") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:15:48.219176 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:48.219158 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:15:48.219269 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:48.219211 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs podName:ae2a816c-4f04-45d7-bb27-80786c738721 nodeName:}" failed. No retries permitted until 2026-04-28 19:15:50.219196585 +0000 UTC m=+5.154241715 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs") pod "network-metrics-daemon-q2wj9" (UID: "ae2a816c-4f04-45d7-bb27-80786c738721") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:15:48.551287 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:48.551193 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-27 19:10:46 +0000 UTC" deadline="2028-02-11 06:26:15.628503882 +0000 UTC" Apr 28 19:15:48.551287 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:48.551236 2582 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15683h10m27.077274635s" Apr 28 19:15:48.633177 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:48.632667 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fmthr" Apr 28 19:15:48.633177 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:48.632807 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fmthr" podUID="789c39a9-aea8-4abd-b196-f303f6c6f063" Apr 28 19:15:49.635501 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:49.634963 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q2wj9" Apr 28 19:15:49.635501 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:49.635097 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q2wj9" podUID="ae2a816c-4f04-45d7-bb27-80786c738721" Apr 28 19:15:50.234635 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:50.234591 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs\") pod \"network-metrics-daemon-q2wj9\" (UID: \"ae2a816c-4f04-45d7-bb27-80786c738721\") " pod="openshift-multus/network-metrics-daemon-q2wj9" Apr 28 19:15:50.234779 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:50.234648 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cc82k\" (UniqueName: \"kubernetes.io/projected/789c39a9-aea8-4abd-b196-f303f6c6f063-kube-api-access-cc82k\") pod \"network-check-target-fmthr\" (UID: \"789c39a9-aea8-4abd-b196-f303f6c6f063\") " pod="openshift-network-diagnostics/network-check-target-fmthr" Apr 28 19:15:50.234830 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:50.234802 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:15:50.234830 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:50.234821 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:15:50.234955 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:50.234835 2582 projected.go:194] Error preparing data for projected volume kube-api-access-cc82k for pod openshift-network-diagnostics/network-check-target-fmthr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:15:50.234955 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:50.234918 2582 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/789c39a9-aea8-4abd-b196-f303f6c6f063-kube-api-access-cc82k podName:789c39a9-aea8-4abd-b196-f303f6c6f063 nodeName:}" failed. No retries permitted until 2026-04-28 19:15:54.234879122 +0000 UTC m=+9.169924270 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cc82k" (UniqueName: "kubernetes.io/projected/789c39a9-aea8-4abd-b196-f303f6c6f063-kube-api-access-cc82k") pod "network-check-target-fmthr" (UID: "789c39a9-aea8-4abd-b196-f303f6c6f063") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:15:50.235367 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:50.235346 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:15:50.235418 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:50.235411 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs podName:ae2a816c-4f04-45d7-bb27-80786c738721 nodeName:}" failed. No retries permitted until 2026-04-28 19:15:54.23539154 +0000 UTC m=+9.170436683 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs") pod "network-metrics-daemon-q2wj9" (UID: "ae2a816c-4f04-45d7-bb27-80786c738721") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:15:50.633260 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:50.632747 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fmthr" Apr 28 19:15:50.633260 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:50.632879 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fmthr" podUID="789c39a9-aea8-4abd-b196-f303f6c6f063" Apr 28 19:15:51.633163 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:51.632746 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q2wj9" Apr 28 19:15:51.633163 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:51.632860 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q2wj9" podUID="ae2a816c-4f04-45d7-bb27-80786c738721" Apr 28 19:15:52.632510 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:52.632472 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fmthr" Apr 28 19:15:52.632707 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:52.632651 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fmthr" podUID="789c39a9-aea8-4abd-b196-f303f6c6f063" Apr 28 19:15:53.635594 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:53.635100 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q2wj9" Apr 28 19:15:53.635594 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:53.635225 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q2wj9" podUID="ae2a816c-4f04-45d7-bb27-80786c738721" Apr 28 19:15:54.271199 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:54.271156 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs\") pod \"network-metrics-daemon-q2wj9\" (UID: \"ae2a816c-4f04-45d7-bb27-80786c738721\") " pod="openshift-multus/network-metrics-daemon-q2wj9" Apr 28 19:15:54.271401 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:54.271219 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cc82k\" (UniqueName: \"kubernetes.io/projected/789c39a9-aea8-4abd-b196-f303f6c6f063-kube-api-access-cc82k\") pod \"network-check-target-fmthr\" (UID: \"789c39a9-aea8-4abd-b196-f303f6c6f063\") " pod="openshift-network-diagnostics/network-check-target-fmthr" Apr 28 19:15:54.271473 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:54.271401 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:15:54.271473 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:54.271422 2582 projected.go:289] 
Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:15:54.271473 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:54.271434 2582 projected.go:194] Error preparing data for projected volume kube-api-access-cc82k for pod openshift-network-diagnostics/network-check-target-fmthr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:15:54.271568 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:54.271493 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/789c39a9-aea8-4abd-b196-f303f6c6f063-kube-api-access-cc82k podName:789c39a9-aea8-4abd-b196-f303f6c6f063 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:02.271473422 +0000 UTC m=+17.206518554 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cc82k" (UniqueName: "kubernetes.io/projected/789c39a9-aea8-4abd-b196-f303f6c6f063-kube-api-access-cc82k") pod "network-check-target-fmthr" (UID: "789c39a9-aea8-4abd-b196-f303f6c6f063") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:15:54.271918 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:54.271881 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:15:54.272017 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:54.271951 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs podName:ae2a816c-4f04-45d7-bb27-80786c738721 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:02.271936645 +0000 UTC m=+17.206981773 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs") pod "network-metrics-daemon-q2wj9" (UID: "ae2a816c-4f04-45d7-bb27-80786c738721") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:15:54.632326 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:54.632256 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fmthr" Apr 28 19:15:54.632472 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:54.632367 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fmthr" podUID="789c39a9-aea8-4abd-b196-f303f6c6f063" Apr 28 19:15:55.635013 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:55.634981 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q2wj9" Apr 28 19:15:55.635478 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:55.635102 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q2wj9" podUID="ae2a816c-4f04-45d7-bb27-80786c738721" Apr 28 19:15:56.632509 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:56.632471 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fmthr" Apr 28 19:15:56.632706 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:56.632586 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fmthr" podUID="789c39a9-aea8-4abd-b196-f303f6c6f063" Apr 28 19:15:57.633167 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:57.633131 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q2wj9" Apr 28 19:15:57.633594 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:57.633283 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q2wj9" podUID="ae2a816c-4f04-45d7-bb27-80786c738721" Apr 28 19:15:58.633329 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:58.633296 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fmthr" Apr 28 19:15:58.633720 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:58.633411 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fmthr" podUID="789c39a9-aea8-4abd-b196-f303f6c6f063" Apr 28 19:15:59.304435 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:59.304401 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-rb4pd"] Apr 28 19:15:59.333746 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:59.333714 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb4pd" Apr 28 19:15:59.333919 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:59.333811 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rb4pd" podUID="82e1e8da-199f-4f4a-b552-16b36c427bd1" Apr 28 19:15:59.407634 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:59.407593 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/82e1e8da-199f-4f4a-b552-16b36c427bd1-dbus\") pod \"global-pull-secret-syncer-rb4pd\" (UID: \"82e1e8da-199f-4f4a-b552-16b36c427bd1\") " pod="kube-system/global-pull-secret-syncer-rb4pd" Apr 28 19:15:59.407822 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:59.407695 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/82e1e8da-199f-4f4a-b552-16b36c427bd1-kubelet-config\") pod \"global-pull-secret-syncer-rb4pd\" (UID: \"82e1e8da-199f-4f4a-b552-16b36c427bd1\") " pod="kube-system/global-pull-secret-syncer-rb4pd" Apr 28 19:15:59.407822 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:59.407723 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/82e1e8da-199f-4f4a-b552-16b36c427bd1-original-pull-secret\") pod \"global-pull-secret-syncer-rb4pd\" (UID: \"82e1e8da-199f-4f4a-b552-16b36c427bd1\") " pod="kube-system/global-pull-secret-syncer-rb4pd" Apr 28 19:15:59.508748 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:59.508710 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/82e1e8da-199f-4f4a-b552-16b36c427bd1-kubelet-config\") pod \"global-pull-secret-syncer-rb4pd\" (UID: \"82e1e8da-199f-4f4a-b552-16b36c427bd1\") " pod="kube-system/global-pull-secret-syncer-rb4pd" Apr 28 19:15:59.508748 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:59.508751 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/82e1e8da-199f-4f4a-b552-16b36c427bd1-original-pull-secret\") pod \"global-pull-secret-syncer-rb4pd\" (UID: \"82e1e8da-199f-4f4a-b552-16b36c427bd1\") " pod="kube-system/global-pull-secret-syncer-rb4pd" Apr 28 19:15:59.509012 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:59.508803 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/82e1e8da-199f-4f4a-b552-16b36c427bd1-dbus\") pod \"global-pull-secret-syncer-rb4pd\" (UID: \"82e1e8da-199f-4f4a-b552-16b36c427bd1\") " pod="kube-system/global-pull-secret-syncer-rb4pd" Apr 28 19:15:59.509012 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:59.508861 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/82e1e8da-199f-4f4a-b552-16b36c427bd1-kubelet-config\") pod \"global-pull-secret-syncer-rb4pd\" (UID: \"82e1e8da-199f-4f4a-b552-16b36c427bd1\") " pod="kube-system/global-pull-secret-syncer-rb4pd" Apr 28 19:15:59.509012 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:59.508960 
2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/82e1e8da-199f-4f4a-b552-16b36c427bd1-dbus\") pod \"global-pull-secret-syncer-rb4pd\" (UID: \"82e1e8da-199f-4f4a-b552-16b36c427bd1\") " pod="kube-system/global-pull-secret-syncer-rb4pd" Apr 28 19:15:59.509012 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:59.509001 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 28 19:15:59.509137 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:59.509064 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82e1e8da-199f-4f4a-b552-16b36c427bd1-original-pull-secret podName:82e1e8da-199f-4f4a-b552-16b36c427bd1 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:00.009047416 +0000 UTC m=+14.944092548 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/82e1e8da-199f-4f4a-b552-16b36c427bd1-original-pull-secret") pod "global-pull-secret-syncer-rb4pd" (UID: "82e1e8da-199f-4f4a-b552-16b36c427bd1") : object "kube-system"/"original-pull-secret" not registered Apr 28 19:15:59.633042 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:15:59.632959 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q2wj9" Apr 28 19:15:59.633209 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:15:59.633110 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q2wj9" podUID="ae2a816c-4f04-45d7-bb27-80786c738721" Apr 28 19:16:00.013307 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:00.013222 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/82e1e8da-199f-4f4a-b552-16b36c427bd1-original-pull-secret\") pod \"global-pull-secret-syncer-rb4pd\" (UID: \"82e1e8da-199f-4f4a-b552-16b36c427bd1\") " pod="kube-system/global-pull-secret-syncer-rb4pd" Apr 28 19:16:00.013731 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:00.013388 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:00.013731 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:00.013475 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82e1e8da-199f-4f4a-b552-16b36c427bd1-original-pull-secret podName:82e1e8da-199f-4f4a-b552-16b36c427bd1 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:01.013452253 +0000 UTC m=+15.948497386 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/82e1e8da-199f-4f4a-b552-16b36c427bd1-original-pull-secret") pod "global-pull-secret-syncer-rb4pd" (UID: "82e1e8da-199f-4f4a-b552-16b36c427bd1") : object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:00.633002 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:00.632962 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb4pd" Apr 28 19:16:00.633165 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:00.632963 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fmthr" Apr 28 19:16:00.633165 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:00.633094 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rb4pd" podUID="82e1e8da-199f-4f4a-b552-16b36c427bd1" Apr 28 19:16:00.633269 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:00.633173 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fmthr" podUID="789c39a9-aea8-4abd-b196-f303f6c6f063" Apr 28 19:16:01.020107 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:01.020070 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/82e1e8da-199f-4f4a-b552-16b36c427bd1-original-pull-secret\") pod \"global-pull-secret-syncer-rb4pd\" (UID: \"82e1e8da-199f-4f4a-b552-16b36c427bd1\") " pod="kube-system/global-pull-secret-syncer-rb4pd" Apr 28 19:16:01.020539 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:01.020208 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:01.020539 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:01.020290 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82e1e8da-199f-4f4a-b552-16b36c427bd1-original-pull-secret podName:82e1e8da-199f-4f4a-b552-16b36c427bd1 nodeName:}" failed. 
No retries permitted until 2026-04-28 19:16:03.020270284 +0000 UTC m=+17.955315426 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/82e1e8da-199f-4f4a-b552-16b36c427bd1-original-pull-secret") pod "global-pull-secret-syncer-rb4pd" (UID: "82e1e8da-199f-4f4a-b552-16b36c427bd1") : object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:01.633128 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:01.633091 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q2wj9" Apr 28 19:16:01.633291 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:01.633240 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q2wj9" podUID="ae2a816c-4f04-45d7-bb27-80786c738721" Apr 28 19:16:02.331073 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:02.331028 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs\") pod \"network-metrics-daemon-q2wj9\" (UID: \"ae2a816c-4f04-45d7-bb27-80786c738721\") " pod="openshift-multus/network-metrics-daemon-q2wj9" Apr 28 19:16:02.331575 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:02.331083 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cc82k\" (UniqueName: \"kubernetes.io/projected/789c39a9-aea8-4abd-b196-f303f6c6f063-kube-api-access-cc82k\") pod \"network-check-target-fmthr\" (UID: \"789c39a9-aea8-4abd-b196-f303f6c6f063\") " pod="openshift-network-diagnostics/network-check-target-fmthr" Apr 28 19:16:02.331575 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:02.331185 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:02.331575 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:02.331215 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:02.331575 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:02.331228 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:02.331575 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:02.331241 2582 projected.go:194] Error preparing data for projected volume kube-api-access-cc82k for pod openshift-network-diagnostics/network-check-target-fmthr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:02.331575 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:02.331280 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs podName:ae2a816c-4f04-45d7-bb27-80786c738721 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:18.33125812 +0000 UTC m=+33.266303262 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs") pod "network-metrics-daemon-q2wj9" (UID: "ae2a816c-4f04-45d7-bb27-80786c738721") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:02.331575 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:02.331303 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/789c39a9-aea8-4abd-b196-f303f6c6f063-kube-api-access-cc82k podName:789c39a9-aea8-4abd-b196-f303f6c6f063 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:18.33129321 +0000 UTC m=+33.266338338 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cc82k" (UniqueName: "kubernetes.io/projected/789c39a9-aea8-4abd-b196-f303f6c6f063-kube-api-access-cc82k") pod "network-check-target-fmthr" (UID: "789c39a9-aea8-4abd-b196-f303f6c6f063") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:02.632475 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:02.632396 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb4pd" Apr 28 19:16:02.632475 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:02.632428 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fmthr" Apr 28 19:16:02.632654 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:02.632525 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rb4pd" podUID="82e1e8da-199f-4f4a-b552-16b36c427bd1" Apr 28 19:16:02.632654 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:02.632630 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fmthr" podUID="789c39a9-aea8-4abd-b196-f303f6c6f063" Apr 28 19:16:03.037131 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:03.037039 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/82e1e8da-199f-4f4a-b552-16b36c427bd1-original-pull-secret\") pod \"global-pull-secret-syncer-rb4pd\" (UID: \"82e1e8da-199f-4f4a-b552-16b36c427bd1\") " pod="kube-system/global-pull-secret-syncer-rb4pd" Apr 28 19:16:03.037289 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:03.037172 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:03.037289 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:03.037242 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82e1e8da-199f-4f4a-b552-16b36c427bd1-original-pull-secret podName:82e1e8da-199f-4f4a-b552-16b36c427bd1 nodeName:}" failed. 
No retries permitted until 2026-04-28 19:16:07.037223341 +0000 UTC m=+21.972268479 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/82e1e8da-199f-4f4a-b552-16b36c427bd1-original-pull-secret") pod "global-pull-secret-syncer-rb4pd" (UID: "82e1e8da-199f-4f4a-b552-16b36c427bd1") : object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:03.633240 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:03.633210 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q2wj9" Apr 28 19:16:03.633695 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:03.633358 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q2wj9" podUID="ae2a816c-4f04-45d7-bb27-80786c738721" Apr 28 19:16:04.633175 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:04.633137 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb4pd" Apr 28 19:16:04.633476 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:04.633137 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fmthr" Apr 28 19:16:04.633476 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:04.633261 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rb4pd" podUID="82e1e8da-199f-4f4a-b552-16b36c427bd1" Apr 28 19:16:04.633476 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:04.633311 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fmthr" podUID="789c39a9-aea8-4abd-b196-f303f6c6f063" Apr 28 19:16:05.636081 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:05.635382 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q2wj9" Apr 28 19:16:05.636081 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:05.635803 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q2wj9" podUID="ae2a816c-4f04-45d7-bb27-80786c738721" Apr 28 19:16:05.793002 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:05.792968 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-160.ec2.internal" event={"ID":"206ff994154571336dcc99880b36f4f2","Type":"ContainerStarted","Data":"3f2a51e75ef568fcfc5cf30789ba1939a41175418918ab0c8011af5ed257f322"} Apr 28 19:16:05.795685 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:05.795664 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" event={"ID":"f830cd46-083b-489f-b13d-5a749b919ab7","Type":"ContainerStarted","Data":"e60b5afcacc996ac64083622cd5e66114298ce62a370a274de231d529dcab377"} Apr 28 19:16:05.795745 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:05.795694 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" event={"ID":"f830cd46-083b-489f-b13d-5a749b919ab7","Type":"ContainerStarted","Data":"612a1599cfdc5ab93a8c8111154c1df7d336e0482fdd27e147178e11d7a5d9b8"} Apr 28 19:16:05.795745 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:05.795709 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" event={"ID":"f830cd46-083b-489f-b13d-5a749b919ab7","Type":"ContainerStarted","Data":"c2ce8cdfdaaddd87fd7c16d202dd079d0b11e4492083219f877171f94d346b29"} Apr 28 19:16:05.799607 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:05.799578 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8l89d" event={"ID":"5a509ee1-53b3-4bd7-822e-06cb6363beff","Type":"ContainerStarted","Data":"69e796e3878f0ed0d2696f1f16a478ef331e5729ee73d55591fc4c9e0579df3f"} Apr 28 19:16:05.801673 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:05.801649 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-89vjp" 
event={"ID":"ae3db714-6a3b-402e-af99-fdeffaa6cdfa","Type":"ContainerStarted","Data":"d03243f731c800aed6d00259fe97a34e4b6da47987bb046a1844c1a2a2ba48d7"} Apr 28 19:16:05.811925 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:05.811473 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-160.ec2.internal" podStartSLOduration=19.811453395 podStartE2EDuration="19.811453395s" podCreationTimestamp="2026-04-28 19:15:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:16:05.810309899 +0000 UTC m=+20.745355049" watchObservedRunningTime="2026-04-28 19:16:05.811453395 +0000 UTC m=+20.746498545" Apr 28 19:16:05.830067 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:05.829950 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-8l89d" podStartSLOduration=2.220902512 podStartE2EDuration="20.829884012s" podCreationTimestamp="2026-04-28 19:15:45 +0000 UTC" firstStartedPulling="2026-04-28 19:15:46.830513272 +0000 UTC m=+1.765558399" lastFinishedPulling="2026-04-28 19:16:05.439494772 +0000 UTC m=+20.374539899" observedRunningTime="2026-04-28 19:16:05.829391817 +0000 UTC m=+20.764436966" watchObservedRunningTime="2026-04-28 19:16:05.829884012 +0000 UTC m=+20.764929162" Apr 28 19:16:06.633142 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:06.633102 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb4pd" Apr 28 19:16:06.633309 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:06.633102 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fmthr" Apr 28 19:16:06.633309 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:06.633260 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rb4pd" podUID="82e1e8da-199f-4f4a-b552-16b36c427bd1" Apr 28 19:16:06.633438 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:06.633305 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fmthr" podUID="789c39a9-aea8-4abd-b196-f303f6c6f063" Apr 28 19:16:06.805193 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:06.805152 2582 generic.go:358] "Generic (PLEG): container finished" podID="a0393c54dc21367a93a647d0297c0f90" containerID="ea15bb2d3a029c7d0e69c1bc7033d5c56b8b659e66ce79f574d5afbb561b753f" exitCode=0 Apr 28 19:16:06.805859 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:06.805245 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal" event={"ID":"a0393c54dc21367a93a647d0297c0f90","Type":"ContainerDied","Data":"ea15bb2d3a029c7d0e69c1bc7033d5c56b8b659e66ce79f574d5afbb561b753f"} Apr 28 19:16:06.806680 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:06.806658 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nvcxv" 
event={"ID":"13bd3061-759d-43b2-bf3d-0c09c0a62063","Type":"ContainerStarted","Data":"5c9780363f25b694ae64f0136bf02e0a24b73b283529c287c0f6e5e7a56eb689"} Apr 28 19:16:06.808089 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:06.808066 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-h8xww" event={"ID":"ad20da8d-206b-440c-8c98-5039db8e6f65","Type":"ContainerStarted","Data":"e5f4909df536d2a1c01ac39a843dcd25923befae418eaa9643acdcc255dc132b"} Apr 28 19:16:06.809738 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:06.809718 2582 generic.go:358] "Generic (PLEG): container finished" podID="d2c7c82c-bdc2-456a-b466-42dee787562e" containerID="afcf83339601922e49bad5168a9282d6357a7829e8d4a1aa997f8937555f5ead" exitCode=0 Apr 28 19:16:06.809822 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:06.809780 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qtgvt" event={"ID":"d2c7c82c-bdc2-456a-b466-42dee787562e","Type":"ContainerDied","Data":"afcf83339601922e49bad5168a9282d6357a7829e8d4a1aa997f8937555f5ead"} Apr 28 19:16:06.811221 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:06.811194 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qtt7w" event={"ID":"1c539cc3-1090-4486-ab6c-9d184f87803d","Type":"ContainerStarted","Data":"2e54129e5d62afd39e26fdb67755a244001633f3a09a449cca2739aa17f0adfe"} Apr 28 19:16:06.812605 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:06.812582 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dn855" event={"ID":"51106027-8f90-4285-9257-0da036866696","Type":"ContainerStarted","Data":"82c496872d73919415002a9ba7f87c0ae8a0228e013644f62c1346f5e13cb03f"} Apr 28 19:16:06.815648 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:06.815625 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2" 
event={"ID":"daa35919-aaa2-4021-afc5-aeb72485f1ea","Type":"ContainerStarted","Data":"ed9610ac12c54fc220a4d625dea128109285e9f5f90b989a51d78056f602c2a4"} Apr 28 19:16:06.818596 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:06.818531 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" event={"ID":"f830cd46-083b-489f-b13d-5a749b919ab7","Type":"ContainerStarted","Data":"8eb7bbdc25135d2393e354c769921fa8c239543797c653fda1d74a00a0aaee61"} Apr 28 19:16:06.818596 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:06.818559 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" event={"ID":"f830cd46-083b-489f-b13d-5a749b919ab7","Type":"ContainerStarted","Data":"e97caada93cf51d8fb33d8292263f7c094b8195e189a5cbd547ce4f9d3c6da82"} Apr 28 19:16:06.818596 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:06.818575 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" event={"ID":"f830cd46-083b-489f-b13d-5a749b919ab7","Type":"ContainerStarted","Data":"027acf3972338766894c128cef21bad563f48a236bf64442cdb38a7e658c6c8a"} Apr 28 19:16:06.824365 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:06.824329 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-89vjp" podStartSLOduration=3.152241151 podStartE2EDuration="21.824317465s" podCreationTimestamp="2026-04-28 19:15:45 +0000 UTC" firstStartedPulling="2026-04-28 19:15:46.85702392 +0000 UTC m=+1.792069047" lastFinishedPulling="2026-04-28 19:16:05.529100232 +0000 UTC m=+20.464145361" observedRunningTime="2026-04-28 19:16:05.849578964 +0000 UTC m=+20.784624112" watchObservedRunningTime="2026-04-28 19:16:06.824317465 +0000 UTC m=+21.759362613" Apr 28 19:16:06.876618 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:06.876572 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-dn855" 
podStartSLOduration=3.383113274 podStartE2EDuration="21.876557839s" podCreationTimestamp="2026-04-28 19:15:45 +0000 UTC" firstStartedPulling="2026-04-28 19:15:46.945986764 +0000 UTC m=+1.881031892" lastFinishedPulling="2026-04-28 19:16:05.439431315 +0000 UTC m=+20.374476457" observedRunningTime="2026-04-28 19:16:06.876172255 +0000 UTC m=+21.811217405" watchObservedRunningTime="2026-04-28 19:16:06.876557839 +0000 UTC m=+21.811602987" Apr 28 19:16:06.918082 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:06.918013 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qtt7w" podStartSLOduration=3.355440057 podStartE2EDuration="21.917994905s" podCreationTimestamp="2026-04-28 19:15:45 +0000 UTC" firstStartedPulling="2026-04-28 19:15:46.845226702 +0000 UTC m=+1.780271830" lastFinishedPulling="2026-04-28 19:16:05.407781539 +0000 UTC m=+20.342826678" observedRunningTime="2026-04-28 19:16:06.91741609 +0000 UTC m=+21.852461240" watchObservedRunningTime="2026-04-28 19:16:06.917994905 +0000 UTC m=+21.853040054" Apr 28 19:16:06.918311 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:06.918277 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-nvcxv" podStartSLOduration=2.43291233 podStartE2EDuration="20.918270543s" podCreationTimestamp="2026-04-28 19:15:46 +0000 UTC" firstStartedPulling="2026-04-28 19:15:46.95057444 +0000 UTC m=+1.885619570" lastFinishedPulling="2026-04-28 19:16:05.435932656 +0000 UTC m=+20.370977783" observedRunningTime="2026-04-28 19:16:06.89955442 +0000 UTC m=+21.834599568" watchObservedRunningTime="2026-04-28 19:16:06.918270543 +0000 UTC m=+21.853315696" Apr 28 19:16:06.937310 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:06.937254 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-h8xww" podStartSLOduration=3.404993324 podStartE2EDuration="21.937234483s" 
podCreationTimestamp="2026-04-28 19:15:45 +0000 UTC" firstStartedPulling="2026-04-28 19:15:46.875690961 +0000 UTC m=+1.810736089" lastFinishedPulling="2026-04-28 19:16:05.407932121 +0000 UTC m=+20.342977248" observedRunningTime="2026-04-28 19:16:06.937046797 +0000 UTC m=+21.872091946" watchObservedRunningTime="2026-04-28 19:16:06.937234483 +0000 UTC m=+21.872279632" Apr 28 19:16:07.069190 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:07.068985 2582 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 28 19:16:07.072662 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:07.072643 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/82e1e8da-199f-4f4a-b552-16b36c427bd1-original-pull-secret\") pod \"global-pull-secret-syncer-rb4pd\" (UID: \"82e1e8da-199f-4f4a-b552-16b36c427bd1\") " pod="kube-system/global-pull-secret-syncer-rb4pd" Apr 28 19:16:07.072752 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:07.072740 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:07.072800 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:07.072792 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82e1e8da-199f-4f4a-b552-16b36c427bd1-original-pull-secret podName:82e1e8da-199f-4f4a-b552-16b36c427bd1 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:15.07277587 +0000 UTC m=+30.007820997 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/82e1e8da-199f-4f4a-b552-16b36c427bd1-original-pull-secret") pod "global-pull-secret-syncer-rb4pd" (UID: "82e1e8da-199f-4f4a-b552-16b36c427bd1") : object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:07.560359 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:07.560193 2582 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-28T19:16:07.069154215Z","UUID":"a45002be-0744-4510-b822-ccd268649c60","Handler":null,"Name":"","Endpoint":""} Apr 28 19:16:07.562777 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:07.562744 2582 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 28 19:16:07.562777 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:07.562778 2582 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 28 19:16:07.633351 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:07.633320 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q2wj9" Apr 28 19:16:07.633554 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:07.633526 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q2wj9" podUID="ae2a816c-4f04-45d7-bb27-80786c738721" Apr 28 19:16:07.826880 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:07.826384 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal" event={"ID":"a0393c54dc21367a93a647d0297c0f90","Type":"ContainerStarted","Data":"a64274aae005d0216743c20e455c192d201991a5fbe98923ac5e3033c62ab2c0"} Apr 28 19:16:07.828594 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:07.828473 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2" event={"ID":"daa35919-aaa2-4021-afc5-aeb72485f1ea","Type":"ContainerStarted","Data":"d8ef595fc6740208fb3baff4979cc1acdda337c641c22ef8dc9f8b62156eaa2e"} Apr 28 19:16:08.632583 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:08.632546 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fmthr" Apr 28 19:16:08.632753 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:08.632546 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb4pd" Apr 28 19:16:08.632753 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:08.632688 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fmthr" podUID="789c39a9-aea8-4abd-b196-f303f6c6f063" Apr 28 19:16:08.632830 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:08.632793 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rb4pd" podUID="82e1e8da-199f-4f4a-b552-16b36c427bd1" Apr 28 19:16:08.832697 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:08.832654 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2" event={"ID":"daa35919-aaa2-4021-afc5-aeb72485f1ea","Type":"ContainerStarted","Data":"402fa9572f03f2b81c96dadfaf30a2916443763d03aa03c1983dcc9fe8882333"} Apr 28 19:16:08.835912 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:08.835866 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" event={"ID":"f830cd46-083b-489f-b13d-5a749b919ab7","Type":"ContainerStarted","Data":"fe46fa2e3e2b1d169de687297bdd5b15ba5789ddf43a5614dd93900e57ddafb4"} Apr 28 19:16:08.854913 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:08.854849 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-160.ec2.internal" podStartSLOduration=22.854831426 podStartE2EDuration="22.854831426s" podCreationTimestamp="2026-04-28 19:15:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:16:07.847202167 +0000 UTC m=+22.782247314" watchObservedRunningTime="2026-04-28 19:16:08.854831426 +0000 UTC m=+23.789876577" Apr 28 19:16:08.855071 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:08.855022 2582 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kqcc2" podStartSLOduration=2.915597446 podStartE2EDuration="23.855016006s" podCreationTimestamp="2026-04-28 19:15:45 +0000 UTC" firstStartedPulling="2026-04-28 19:15:46.918771432 +0000 UTC m=+1.853816562" lastFinishedPulling="2026-04-28 19:16:07.858189992 +0000 UTC m=+22.793235122" observedRunningTime="2026-04-28 19:16:08.853306422 +0000 UTC m=+23.788351571" watchObservedRunningTime="2026-04-28 19:16:08.855016006 +0000 UTC m=+23.790061154" Apr 28 19:16:09.632653 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:09.632612 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q2wj9" Apr 28 19:16:09.632829 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:09.632762 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q2wj9" podUID="ae2a816c-4f04-45d7-bb27-80786c738721" Apr 28 19:16:10.632726 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:10.632690 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fmthr" Apr 28 19:16:10.632726 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:10.632710 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb4pd" Apr 28 19:16:10.633288 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:10.632795 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fmthr" podUID="789c39a9-aea8-4abd-b196-f303f6c6f063" Apr 28 19:16:10.633288 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:10.632951 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rb4pd" podUID="82e1e8da-199f-4f4a-b552-16b36c427bd1" Apr 28 19:16:11.365948 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:11.365698 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-h8xww" Apr 28 19:16:11.369113 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:11.367320 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-h8xww" Apr 28 19:16:11.632842 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:11.632759 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q2wj9" Apr 28 19:16:11.633321 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:11.632892 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q2wj9" podUID="ae2a816c-4f04-45d7-bb27-80786c738721" Apr 28 19:16:11.842750 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:11.842716 2582 generic.go:358] "Generic (PLEG): container finished" podID="d2c7c82c-bdc2-456a-b466-42dee787562e" containerID="8cf1302fccc54fc59818ce7de835f1ec0924f1d8671cca9f1b1109e9a990c45e" exitCode=0 Apr 28 19:16:11.842997 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:11.842801 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qtgvt" event={"ID":"d2c7c82c-bdc2-456a-b466-42dee787562e","Type":"ContainerDied","Data":"8cf1302fccc54fc59818ce7de835f1ec0924f1d8671cca9f1b1109e9a990c45e"} Apr 28 19:16:11.846131 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:11.846108 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" event={"ID":"f830cd46-083b-489f-b13d-5a749b919ab7","Type":"ContainerStarted","Data":"b8bd5267caaf0f827ef5fe89414e24a4a6bf7ee9287b7aec712c6c337380ca1c"} Apr 28 19:16:11.846363 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:11.846336 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-h8xww" Apr 28 19:16:11.846632 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:11.846616 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:16:11.846688 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:11.846640 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:16:11.846858 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:11.846847 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-h8xww" Apr 28 19:16:11.862182 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:11.862158 
2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:16:11.862536 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:11.862521 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:16:11.913020 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:11.912937 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" podStartSLOduration=8.345547293 podStartE2EDuration="26.912923891s" podCreationTimestamp="2026-04-28 19:15:45 +0000 UTC" firstStartedPulling="2026-04-28 19:15:46.913789064 +0000 UTC m=+1.848834191" lastFinishedPulling="2026-04-28 19:16:05.481165648 +0000 UTC m=+20.416210789" observedRunningTime="2026-04-28 19:16:11.912733017 +0000 UTC m=+26.847778166" watchObservedRunningTime="2026-04-28 19:16:11.912923891 +0000 UTC m=+26.847969040" Apr 28 19:16:12.352978 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:12.352942 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz" Apr 28 19:16:12.633174 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:12.633098 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb4pd" Apr 28 19:16:12.633534 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:12.633101 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fmthr" Apr 28 19:16:12.633534 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:12.633205 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rb4pd" podUID="82e1e8da-199f-4f4a-b552-16b36c427bd1" Apr 28 19:16:12.633534 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:12.633304 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fmthr" podUID="789c39a9-aea8-4abd-b196-f303f6c6f063" Apr 28 19:16:12.850490 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:12.850307 2582 generic.go:358] "Generic (PLEG): container finished" podID="d2c7c82c-bdc2-456a-b466-42dee787562e" containerID="b896680f3fb1ad6009987ad2f561535758f38c3c19497bcf71da2f64ea3816e0" exitCode=0 Apr 28 19:16:12.850614 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:12.850381 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qtgvt" event={"ID":"d2c7c82c-bdc2-456a-b466-42dee787562e","Type":"ContainerDied","Data":"b896680f3fb1ad6009987ad2f561535758f38c3c19497bcf71da2f64ea3816e0"} Apr 28 19:16:13.065230 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:13.065185 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rb4pd"] Apr 28 19:16:13.065439 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:13.065334 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb4pd" Apr 28 19:16:13.065513 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:13.065448 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-rb4pd" podUID="82e1e8da-199f-4f4a-b552-16b36c427bd1" Apr 28 19:16:13.065990 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:13.065970 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fmthr"] Apr 28 19:16:13.066092 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:13.066072 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fmthr" Apr 28 19:16:13.066154 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:13.066138 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fmthr" podUID="789c39a9-aea8-4abd-b196-f303f6c6f063" Apr 28 19:16:13.078350 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:13.078311 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q2wj9"] Apr 28 19:16:13.078488 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:13.078448 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q2wj9" Apr 28 19:16:13.078546 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:13.078528 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q2wj9" podUID="ae2a816c-4f04-45d7-bb27-80786c738721" Apr 28 19:16:13.854393 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:13.854348 2582 generic.go:358] "Generic (PLEG): container finished" podID="d2c7c82c-bdc2-456a-b466-42dee787562e" containerID="e6eb603873cb246fa3a993701f66b8beb75e942b0fa9815478e28339cc1d62d6" exitCode=0 Apr 28 19:16:13.854851 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:13.854400 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qtgvt" event={"ID":"d2c7c82c-bdc2-456a-b466-42dee787562e","Type":"ContainerDied","Data":"e6eb603873cb246fa3a993701f66b8beb75e942b0fa9815478e28339cc1d62d6"} Apr 28 19:16:14.632787 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:14.632751 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb4pd" Apr 28 19:16:14.633003 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:14.632886 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rb4pd" podUID="82e1e8da-199f-4f4a-b552-16b36c427bd1" Apr 28 19:16:14.633003 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:14.632751 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q2wj9" Apr 28 19:16:14.633122 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:14.633012 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-q2wj9" podUID="ae2a816c-4f04-45d7-bb27-80786c738721" Apr 28 19:16:14.633122 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:14.632751 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fmthr" Apr 28 19:16:14.633122 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:14.633082 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fmthr" podUID="789c39a9-aea8-4abd-b196-f303f6c6f063" Apr 28 19:16:15.136268 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:15.136228 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/82e1e8da-199f-4f4a-b552-16b36c427bd1-original-pull-secret\") pod \"global-pull-secret-syncer-rb4pd\" (UID: \"82e1e8da-199f-4f4a-b552-16b36c427bd1\") " pod="kube-system/global-pull-secret-syncer-rb4pd" Apr 28 19:16:15.136866 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:15.136392 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:15.136866 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:15.136453 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82e1e8da-199f-4f4a-b552-16b36c427bd1-original-pull-secret podName:82e1e8da-199f-4f4a-b552-16b36c427bd1 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:31.1364352 +0000 UTC m=+46.071480335 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/82e1e8da-199f-4f4a-b552-16b36c427bd1-original-pull-secret") pod "global-pull-secret-syncer-rb4pd" (UID: "82e1e8da-199f-4f4a-b552-16b36c427bd1") : object "kube-system"/"original-pull-secret" not registered Apr 28 19:16:16.633184 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:16.633143 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fmthr" Apr 28 19:16:16.633800 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:16.633211 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q2wj9" Apr 28 19:16:16.633800 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:16.633307 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fmthr" podUID="789c39a9-aea8-4abd-b196-f303f6c6f063" Apr 28 19:16:16.633800 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:16.633356 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-q2wj9" podUID="ae2a816c-4f04-45d7-bb27-80786c738721" Apr 28 19:16:16.633800 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:16.633377 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb4pd" Apr 28 19:16:16.633800 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:16.633447 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-rb4pd" podUID="82e1e8da-199f-4f4a-b552-16b36c427bd1" Apr 28 19:16:18.360326 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.360286 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs\") pod \"network-metrics-daemon-q2wj9\" (UID: \"ae2a816c-4f04-45d7-bb27-80786c738721\") " pod="openshift-multus/network-metrics-daemon-q2wj9" Apr 28 19:16:18.360326 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.360331 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cc82k\" (UniqueName: \"kubernetes.io/projected/789c39a9-aea8-4abd-b196-f303f6c6f063-kube-api-access-cc82k\") pod \"network-check-target-fmthr\" (UID: \"789c39a9-aea8-4abd-b196-f303f6c6f063\") " pod="openshift-network-diagnostics/network-check-target-fmthr" Apr 28 19:16:18.360891 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:18.360461 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:18.360891 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:18.360475 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 28 19:16:18.360891 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:18.360510 2582 projected.go:289] Couldn't get 
configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 28 19:16:18.360891 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:18.360526 2582 projected.go:194] Error preparing data for projected volume kube-api-access-cc82k for pod openshift-network-diagnostics/network-check-target-fmthr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:18.360891 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:18.360545 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs podName:ae2a816c-4f04-45d7-bb27-80786c738721 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:50.360521815 +0000 UTC m=+65.295566957 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs") pod "network-metrics-daemon-q2wj9" (UID: "ae2a816c-4f04-45d7-bb27-80786c738721") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 28 19:16:18.360891 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:18.360574 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/789c39a9-aea8-4abd-b196-f303f6c6f063-kube-api-access-cc82k podName:789c39a9-aea8-4abd-b196-f303f6c6f063 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:50.36055857 +0000 UTC m=+65.295603698 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cc82k" (UniqueName: "kubernetes.io/projected/789c39a9-aea8-4abd-b196-f303f6c6f063-kube-api-access-cc82k") pod "network-check-target-fmthr" (UID: "789c39a9-aea8-4abd-b196-f303f6c6f063") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 28 19:16:18.363105 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.363079 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-160.ec2.internal" event="NodeReady" Apr 28 19:16:18.363242 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.363212 2582 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 28 19:16:18.420278 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.420250 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-txhn5"] Apr 28 19:16:18.451175 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.451136 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-t6pm8"] Apr 28 19:16:18.451347 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.451319 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-txhn5" Apr 28 19:16:18.454783 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.454547 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 28 19:16:18.454783 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.454597 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 28 19:16:18.454783 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.454639 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qcg5d\"" Apr 28 19:16:18.466270 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.466247 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-txhn5"] Apr 28 19:16:18.466394 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.466279 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-t6pm8"] Apr 28 19:16:18.466394 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.466364 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t6pm8" Apr 28 19:16:18.469130 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.469110 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 28 19:16:18.469253 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.469169 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rkrph\"" Apr 28 19:16:18.469253 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.469209 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 28 19:16:18.469366 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.469112 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 28 19:16:18.561876 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.561839 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cf70a1bd-b873-41bc-a143-901d7c76665c-tmp-dir\") pod \"dns-default-txhn5\" (UID: \"cf70a1bd-b873-41bc-a143-901d7c76665c\") " pod="openshift-dns/dns-default-txhn5" Apr 28 19:16:18.561876 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.561874 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlzlk\" (UniqueName: \"kubernetes.io/projected/d392608a-a370-4a06-8556-ce4952638d04-kube-api-access-zlzlk\") pod \"ingress-canary-t6pm8\" (UID: \"d392608a-a370-4a06-8556-ce4952638d04\") " pod="openshift-ingress-canary/ingress-canary-t6pm8" Apr 28 19:16:18.562120 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.561925 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls\") pod \"dns-default-txhn5\" (UID: \"cf70a1bd-b873-41bc-a143-901d7c76665c\") " pod="openshift-dns/dns-default-txhn5" Apr 28 19:16:18.562120 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.562048 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf70a1bd-b873-41bc-a143-901d7c76665c-config-volume\") pod \"dns-default-txhn5\" (UID: \"cf70a1bd-b873-41bc-a143-901d7c76665c\") " pod="openshift-dns/dns-default-txhn5" Apr 28 19:16:18.562120 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.562090 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert\") pod \"ingress-canary-t6pm8\" (UID: \"d392608a-a370-4a06-8556-ce4952638d04\") " pod="openshift-ingress-canary/ingress-canary-t6pm8" Apr 28 19:16:18.562244 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.562176 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z9vd\" (UniqueName: \"kubernetes.io/projected/cf70a1bd-b873-41bc-a143-901d7c76665c-kube-api-access-6z9vd\") pod \"dns-default-txhn5\" (UID: \"cf70a1bd-b873-41bc-a143-901d7c76665c\") " pod="openshift-dns/dns-default-txhn5" Apr 28 19:16:18.633349 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.633260 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb4pd" Apr 28 19:16:18.633349 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.633284 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q2wj9" Apr 28 19:16:18.633560 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.633260 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fmthr" Apr 28 19:16:18.636262 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.636229 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 28 19:16:18.636422 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.636396 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-djdhf\"" Apr 28 19:16:18.636488 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.636423 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 28 19:16:18.636488 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.636478 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 28 19:16:18.636595 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.636394 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 28 19:16:18.636657 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.636613 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kx25j\"" Apr 28 19:16:18.662908 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.662869 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cf70a1bd-b873-41bc-a143-901d7c76665c-tmp-dir\") pod \"dns-default-txhn5\" (UID: \"cf70a1bd-b873-41bc-a143-901d7c76665c\") " pod="openshift-dns/dns-default-txhn5" Apr 28 19:16:18.663054 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.662916 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlzlk\" (UniqueName: 
\"kubernetes.io/projected/d392608a-a370-4a06-8556-ce4952638d04-kube-api-access-zlzlk\") pod \"ingress-canary-t6pm8\" (UID: \"d392608a-a370-4a06-8556-ce4952638d04\") " pod="openshift-ingress-canary/ingress-canary-t6pm8" Apr 28 19:16:18.663054 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.662944 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls\") pod \"dns-default-txhn5\" (UID: \"cf70a1bd-b873-41bc-a143-901d7c76665c\") " pod="openshift-dns/dns-default-txhn5" Apr 28 19:16:18.663054 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.662978 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf70a1bd-b873-41bc-a143-901d7c76665c-config-volume\") pod \"dns-default-txhn5\" (UID: \"cf70a1bd-b873-41bc-a143-901d7c76665c\") " pod="openshift-dns/dns-default-txhn5" Apr 28 19:16:18.663054 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.663003 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert\") pod \"ingress-canary-t6pm8\" (UID: \"d392608a-a370-4a06-8556-ce4952638d04\") " pod="openshift-ingress-canary/ingress-canary-t6pm8" Apr 28 19:16:18.663054 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.663045 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6z9vd\" (UniqueName: \"kubernetes.io/projected/cf70a1bd-b873-41bc-a143-901d7c76665c-kube-api-access-6z9vd\") pod \"dns-default-txhn5\" (UID: \"cf70a1bd-b873-41bc-a143-901d7c76665c\") " pod="openshift-dns/dns-default-txhn5" Apr 28 19:16:18.663345 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:18.663112 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 
19:16:18.663345 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:18.663168 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls podName:cf70a1bd-b873-41bc-a143-901d7c76665c nodeName:}" failed. No retries permitted until 2026-04-28 19:16:19.163149505 +0000 UTC m=+34.098194632 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls") pod "dns-default-txhn5" (UID: "cf70a1bd-b873-41bc-a143-901d7c76665c") : secret "dns-default-metrics-tls" not found Apr 28 19:16:18.663345 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.663221 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/cf70a1bd-b873-41bc-a143-901d7c76665c-tmp-dir\") pod \"dns-default-txhn5\" (UID: \"cf70a1bd-b873-41bc-a143-901d7c76665c\") " pod="openshift-dns/dns-default-txhn5" Apr 28 19:16:18.663345 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:18.663249 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:16:18.663345 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:18.663292 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert podName:d392608a-a370-4a06-8556-ce4952638d04 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:19.163279655 +0000 UTC m=+34.098324785 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert") pod "ingress-canary-t6pm8" (UID: "d392608a-a370-4a06-8556-ce4952638d04") : secret "canary-serving-cert" not found Apr 28 19:16:18.663657 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.663635 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cf70a1bd-b873-41bc-a143-901d7c76665c-config-volume\") pod \"dns-default-txhn5\" (UID: \"cf70a1bd-b873-41bc-a143-901d7c76665c\") " pod="openshift-dns/dns-default-txhn5" Apr 28 19:16:18.674542 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.674508 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z9vd\" (UniqueName: \"kubernetes.io/projected/cf70a1bd-b873-41bc-a143-901d7c76665c-kube-api-access-6z9vd\") pod \"dns-default-txhn5\" (UID: \"cf70a1bd-b873-41bc-a143-901d7c76665c\") " pod="openshift-dns/dns-default-txhn5" Apr 28 19:16:18.674663 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:18.674582 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlzlk\" (UniqueName: \"kubernetes.io/projected/d392608a-a370-4a06-8556-ce4952638d04-kube-api-access-zlzlk\") pod \"ingress-canary-t6pm8\" (UID: \"d392608a-a370-4a06-8556-ce4952638d04\") " pod="openshift-ingress-canary/ingress-canary-t6pm8" Apr 28 19:16:19.165536 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:19.165496 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls\") pod \"dns-default-txhn5\" (UID: \"cf70a1bd-b873-41bc-a143-901d7c76665c\") " pod="openshift-dns/dns-default-txhn5" Apr 28 19:16:19.165730 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:19.165559 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert\") pod \"ingress-canary-t6pm8\" (UID: \"d392608a-a370-4a06-8556-ce4952638d04\") " pod="openshift-ingress-canary/ingress-canary-t6pm8" Apr 28 19:16:19.165730 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:19.165662 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 28 19:16:19.165866 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:19.165735 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert podName:d392608a-a370-4a06-8556-ce4952638d04 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:20.165718978 +0000 UTC m=+35.100764128 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert") pod "ingress-canary-t6pm8" (UID: "d392608a-a370-4a06-8556-ce4952638d04") : secret "canary-serving-cert" not found Apr 28 19:16:19.165866 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:19.165662 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 28 19:16:19.165866 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:19.165822 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls podName:cf70a1bd-b873-41bc-a143-901d7c76665c nodeName:}" failed. No retries permitted until 2026-04-28 19:16:20.165804548 +0000 UTC m=+35.100849685 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls") pod "dns-default-txhn5" (UID: "cf70a1bd-b873-41bc-a143-901d7c76665c") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:20.173450 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:20.173413 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls\") pod \"dns-default-txhn5\" (UID: \"cf70a1bd-b873-41bc-a143-901d7c76665c\") " pod="openshift-dns/dns-default-txhn5"
Apr 28 19:16:20.173862 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:20.173472 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert\") pod \"ingress-canary-t6pm8\" (UID: \"d392608a-a370-4a06-8556-ce4952638d04\") " pod="openshift-ingress-canary/ingress-canary-t6pm8"
Apr 28 19:16:20.173862 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:20.173585 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:20.173862 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:20.173585 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:20.173862 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:20.173651 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls podName:cf70a1bd-b873-41bc-a143-901d7c76665c nodeName:}" failed. No retries permitted until 2026-04-28 19:16:22.173635695 +0000 UTC m=+37.108680825 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls") pod "dns-default-txhn5" (UID: "cf70a1bd-b873-41bc-a143-901d7c76665c") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:20.173862 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:20.173666 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert podName:d392608a-a370-4a06-8556-ce4952638d04 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:22.173659842 +0000 UTC m=+37.108704969 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert") pod "ingress-canary-t6pm8" (UID: "d392608a-a370-4a06-8556-ce4952638d04") : secret "canary-serving-cert" not found
Apr 28 19:16:20.870756 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:20.870719 2582 generic.go:358] "Generic (PLEG): container finished" podID="d2c7c82c-bdc2-456a-b466-42dee787562e" containerID="05899aebb9413856414362caee19a2ec2ed7da543099b33c16fb3c5e801f6513" exitCode=0
Apr 28 19:16:20.870936 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:20.870768 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qtgvt" event={"ID":"d2c7c82c-bdc2-456a-b466-42dee787562e","Type":"ContainerDied","Data":"05899aebb9413856414362caee19a2ec2ed7da543099b33c16fb3c5e801f6513"}
Apr 28 19:16:21.874837 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:21.874802 2582 generic.go:358] "Generic (PLEG): container finished" podID="d2c7c82c-bdc2-456a-b466-42dee787562e" containerID="167f3ada0f07d167935e9c58be4f6ec76ffe79900d5fa7f42006da99e3b97c2f" exitCode=0
Apr 28 19:16:21.875206 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:21.874845 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qtgvt" event={"ID":"d2c7c82c-bdc2-456a-b466-42dee787562e","Type":"ContainerDied","Data":"167f3ada0f07d167935e9c58be4f6ec76ffe79900d5fa7f42006da99e3b97c2f"}
Apr 28 19:16:22.190876 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:22.190775 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls\") pod \"dns-default-txhn5\" (UID: \"cf70a1bd-b873-41bc-a143-901d7c76665c\") " pod="openshift-dns/dns-default-txhn5"
Apr 28 19:16:22.190876 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:22.190832 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert\") pod \"ingress-canary-t6pm8\" (UID: \"d392608a-a370-4a06-8556-ce4952638d04\") " pod="openshift-ingress-canary/ingress-canary-t6pm8"
Apr 28 19:16:22.191110 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:22.190951 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:22.191110 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:22.190974 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:22.191110 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:22.191035 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert podName:d392608a-a370-4a06-8556-ce4952638d04 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:26.19101553 +0000 UTC m=+41.126060660 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert") pod "ingress-canary-t6pm8" (UID: "d392608a-a370-4a06-8556-ce4952638d04") : secret "canary-serving-cert" not found
Apr 28 19:16:22.191110 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:22.191052 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls podName:cf70a1bd-b873-41bc-a143-901d7c76665c nodeName:}" failed. No retries permitted until 2026-04-28 19:16:26.191043635 +0000 UTC m=+41.126088762 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls") pod "dns-default-txhn5" (UID: "cf70a1bd-b873-41bc-a143-901d7c76665c") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:22.879152 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:22.879117 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qtgvt" event={"ID":"d2c7c82c-bdc2-456a-b466-42dee787562e","Type":"ContainerStarted","Data":"64555b604717b29fce329dbb5a1c38243b65906a193676d414aa5da4db5800ea"}
Apr 28 19:16:22.907972 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:22.907913 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qtgvt" podStartSLOduration=4.901054334 podStartE2EDuration="37.907872861s" podCreationTimestamp="2026-04-28 19:15:45 +0000 UTC" firstStartedPulling="2026-04-28 19:15:46.871099853 +0000 UTC m=+1.806144983" lastFinishedPulling="2026-04-28 19:16:19.877918378 +0000 UTC m=+34.812963510" observedRunningTime="2026-04-28 19:16:22.906583781 +0000 UTC m=+37.841628929" watchObservedRunningTime="2026-04-28 19:16:22.907872861 +0000 UTC m=+37.842918012"
Apr 28 19:16:26.223998 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:26.223953 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls\") pod \"dns-default-txhn5\" (UID: \"cf70a1bd-b873-41bc-a143-901d7c76665c\") " pod="openshift-dns/dns-default-txhn5"
Apr 28 19:16:26.223998 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:26.224004 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert\") pod \"ingress-canary-t6pm8\" (UID: \"d392608a-a370-4a06-8556-ce4952638d04\") " pod="openshift-ingress-canary/ingress-canary-t6pm8"
Apr 28 19:16:26.224402 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:26.224106 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:26.224402 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:26.224106 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:26.224402 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:26.224173 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls podName:cf70a1bd-b873-41bc-a143-901d7c76665c nodeName:}" failed. No retries permitted until 2026-04-28 19:16:34.22415413 +0000 UTC m=+49.159199269 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls") pod "dns-default-txhn5" (UID: "cf70a1bd-b873-41bc-a143-901d7c76665c") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:26.224402 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:26.224190 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert podName:d392608a-a370-4a06-8556-ce4952638d04 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:34.224182453 +0000 UTC m=+49.159227585 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert") pod "ingress-canary-t6pm8" (UID: "d392608a-a370-4a06-8556-ce4952638d04") : secret "canary-serving-cert" not found
Apr 28 19:16:31.159439 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:31.159397 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/82e1e8da-199f-4f4a-b552-16b36c427bd1-original-pull-secret\") pod \"global-pull-secret-syncer-rb4pd\" (UID: \"82e1e8da-199f-4f4a-b552-16b36c427bd1\") " pod="kube-system/global-pull-secret-syncer-rb4pd"
Apr 28 19:16:31.163044 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:31.163023 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/82e1e8da-199f-4f4a-b552-16b36c427bd1-original-pull-secret\") pod \"global-pull-secret-syncer-rb4pd\" (UID: \"82e1e8da-199f-4f4a-b552-16b36c427bd1\") " pod="kube-system/global-pull-secret-syncer-rb4pd"
Apr 28 19:16:31.244568 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:31.244530 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-rb4pd"
Apr 28 19:16:31.389022 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:31.388992 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-rb4pd"]
Apr 28 19:16:31.395779 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:16:31.395749 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82e1e8da_199f_4f4a_b552_16b36c427bd1.slice/crio-ada68a96d7ac5d2eafb6fa7269b2b0175cdb3860d15a970d3b51060a49b0538b WatchSource:0}: Error finding container ada68a96d7ac5d2eafb6fa7269b2b0175cdb3860d15a970d3b51060a49b0538b: Status 404 returned error can't find the container with id ada68a96d7ac5d2eafb6fa7269b2b0175cdb3860d15a970d3b51060a49b0538b
Apr 28 19:16:31.898014 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:31.897975 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rb4pd" event={"ID":"82e1e8da-199f-4f4a-b552-16b36c427bd1","Type":"ContainerStarted","Data":"ada68a96d7ac5d2eafb6fa7269b2b0175cdb3860d15a970d3b51060a49b0538b"}
Apr 28 19:16:34.281414 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:34.281377 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls\") pod \"dns-default-txhn5\" (UID: \"cf70a1bd-b873-41bc-a143-901d7c76665c\") " pod="openshift-dns/dns-default-txhn5"
Apr 28 19:16:34.281830 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:34.281434 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert\") pod \"ingress-canary-t6pm8\" (UID: \"d392608a-a370-4a06-8556-ce4952638d04\") " pod="openshift-ingress-canary/ingress-canary-t6pm8"
Apr 28 19:16:34.281830 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:34.281547 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:34.281830 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:34.281619 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert podName:d392608a-a370-4a06-8556-ce4952638d04 nodeName:}" failed. No retries permitted until 2026-04-28 19:16:50.281602995 +0000 UTC m=+65.216648122 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert") pod "ingress-canary-t6pm8" (UID: "d392608a-a370-4a06-8556-ce4952638d04") : secret "canary-serving-cert" not found
Apr 28 19:16:34.281830 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:34.281547 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:34.281830 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:34.281692 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls podName:cf70a1bd-b873-41bc-a143-901d7c76665c nodeName:}" failed. No retries permitted until 2026-04-28 19:16:50.281676699 +0000 UTC m=+65.216721830 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls") pod "dns-default-txhn5" (UID: "cf70a1bd-b873-41bc-a143-901d7c76665c") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:36.908640 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:36.908602 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-rb4pd" event={"ID":"82e1e8da-199f-4f4a-b552-16b36c427bd1","Type":"ContainerStarted","Data":"422874434d519e9ab50f4819734f5f9763a23b35b85ae9a03eaee06cb810673e"}
Apr 28 19:16:36.936100 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:36.936051 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-rb4pd" podStartSLOduration=33.402937045 podStartE2EDuration="37.936035385s" podCreationTimestamp="2026-04-28 19:15:59 +0000 UTC" firstStartedPulling="2026-04-28 19:16:31.397332949 +0000 UTC m=+46.332378077" lastFinishedPulling="2026-04-28 19:16:35.930431275 +0000 UTC m=+50.865476417" observedRunningTime="2026-04-28 19:16:36.935252633 +0000 UTC m=+51.870297783" watchObservedRunningTime="2026-04-28 19:16:36.936035385 +0000 UTC m=+51.871080534"
Apr 28 19:16:43.865472 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:43.865446 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6xgsz"
Apr 28 19:16:50.297268 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:50.297228 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls\") pod \"dns-default-txhn5\" (UID: \"cf70a1bd-b873-41bc-a143-901d7c76665c\") " pod="openshift-dns/dns-default-txhn5"
Apr 28 19:16:50.297268 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:50.297275 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert\") pod \"ingress-canary-t6pm8\" (UID: \"d392608a-a370-4a06-8556-ce4952638d04\") " pod="openshift-ingress-canary/ingress-canary-t6pm8"
Apr 28 19:16:50.297788 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:50.297367 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:16:50.297788 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:50.297377 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:16:50.297788 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:50.297423 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert podName:d392608a-a370-4a06-8556-ce4952638d04 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:22.297410947 +0000 UTC m=+97.232456074 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert") pod "ingress-canary-t6pm8" (UID: "d392608a-a370-4a06-8556-ce4952638d04") : secret "canary-serving-cert" not found
Apr 28 19:16:50.297788 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:50.297437 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls podName:cf70a1bd-b873-41bc-a143-901d7c76665c nodeName:}" failed. No retries permitted until 2026-04-28 19:17:22.297431139 +0000 UTC m=+97.232476265 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls") pod "dns-default-txhn5" (UID: "cf70a1bd-b873-41bc-a143-901d7c76665c") : secret "dns-default-metrics-tls" not found
Apr 28 19:16:50.398014 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:50.397978 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs\") pod \"network-metrics-daemon-q2wj9\" (UID: \"ae2a816c-4f04-45d7-bb27-80786c738721\") " pod="openshift-multus/network-metrics-daemon-q2wj9"
Apr 28 19:16:50.398171 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:50.398019 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cc82k\" (UniqueName: \"kubernetes.io/projected/789c39a9-aea8-4abd-b196-f303f6c6f063-kube-api-access-cc82k\") pod \"network-check-target-fmthr\" (UID: \"789c39a9-aea8-4abd-b196-f303f6c6f063\") " pod="openshift-network-diagnostics/network-check-target-fmthr"
Apr 28 19:16:50.401070 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:50.401048 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 28 19:16:50.401154 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:50.401136 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 28 19:16:50.408390 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:50.408368 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 28 19:16:50.408488 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:16:50.408430 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs podName:ae2a816c-4f04-45d7-bb27-80786c738721 nodeName:}" failed. No retries permitted until 2026-04-28 19:17:54.408414178 +0000 UTC m=+129.343459305 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs") pod "network-metrics-daemon-q2wj9" (UID: "ae2a816c-4f04-45d7-bb27-80786c738721") : secret "metrics-daemon-secret" not found
Apr 28 19:16:50.411135 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:50.411119 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 28 19:16:50.421483 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:50.421454 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc82k\" (UniqueName: \"kubernetes.io/projected/789c39a9-aea8-4abd-b196-f303f6c6f063-kube-api-access-cc82k\") pod \"network-check-target-fmthr\" (UID: \"789c39a9-aea8-4abd-b196-f303f6c6f063\") " pod="openshift-network-diagnostics/network-check-target-fmthr"
Apr 28 19:16:50.455329 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:50.455300 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-kx25j\""
Apr 28 19:16:50.462965 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:50.462944 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fmthr"
Apr 28 19:16:50.598062 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:50.598030 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fmthr"]
Apr 28 19:16:50.601663 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:16:50.601636 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod789c39a9_aea8_4abd_b196_f303f6c6f063.slice/crio-cadc1cb9f5a7ef6777ef691935690b3f3f021ad62f751e138a1657b434dfd879 WatchSource:0}: Error finding container cadc1cb9f5a7ef6777ef691935690b3f3f021ad62f751e138a1657b434dfd879: Status 404 returned error can't find the container with id cadc1cb9f5a7ef6777ef691935690b3f3f021ad62f751e138a1657b434dfd879
Apr 28 19:16:50.936160 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:50.936071 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fmthr" event={"ID":"789c39a9-aea8-4abd-b196-f303f6c6f063","Type":"ContainerStarted","Data":"cadc1cb9f5a7ef6777ef691935690b3f3f021ad62f751e138a1657b434dfd879"}
Apr 28 19:16:53.024222 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:53.024183 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f966467c6-xjd5r"]
Apr 28 19:16:53.029232 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:53.029205 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f966467c6-xjd5r"
Apr 28 19:16:53.032607 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:53.032584 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 28 19:16:53.033457 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:53.033433 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 28 19:16:53.034283 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:53.034144 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 28 19:16:53.034283 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:53.034174 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 28 19:16:53.047552 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:53.047525 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f966467c6-xjd5r"]
Apr 28 19:16:53.117961 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:53.117915 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/770c7da0-75e9-4ec2-ad6f-8daa8e878aee-tmp\") pod \"klusterlet-addon-workmgr-f966467c6-xjd5r\" (UID: \"770c7da0-75e9-4ec2-ad6f-8daa8e878aee\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f966467c6-xjd5r"
Apr 28 19:16:53.117961 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:53.117965 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84dl9\" (UniqueName: \"kubernetes.io/projected/770c7da0-75e9-4ec2-ad6f-8daa8e878aee-kube-api-access-84dl9\") pod \"klusterlet-addon-workmgr-f966467c6-xjd5r\" (UID: \"770c7da0-75e9-4ec2-ad6f-8daa8e878aee\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f966467c6-xjd5r"
Apr 28 19:16:53.118178 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:53.117998 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/770c7da0-75e9-4ec2-ad6f-8daa8e878aee-klusterlet-config\") pod \"klusterlet-addon-workmgr-f966467c6-xjd5r\" (UID: \"770c7da0-75e9-4ec2-ad6f-8daa8e878aee\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f966467c6-xjd5r"
Apr 28 19:16:53.218669 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:53.218630 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/770c7da0-75e9-4ec2-ad6f-8daa8e878aee-tmp\") pod \"klusterlet-addon-workmgr-f966467c6-xjd5r\" (UID: \"770c7da0-75e9-4ec2-ad6f-8daa8e878aee\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f966467c6-xjd5r"
Apr 28 19:16:53.218848 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:53.218680 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84dl9\" (UniqueName: \"kubernetes.io/projected/770c7da0-75e9-4ec2-ad6f-8daa8e878aee-kube-api-access-84dl9\") pod \"klusterlet-addon-workmgr-f966467c6-xjd5r\" (UID: \"770c7da0-75e9-4ec2-ad6f-8daa8e878aee\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f966467c6-xjd5r"
Apr 28 19:16:53.218848 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:53.218790 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/770c7da0-75e9-4ec2-ad6f-8daa8e878aee-klusterlet-config\") pod \"klusterlet-addon-workmgr-f966467c6-xjd5r\" (UID: \"770c7da0-75e9-4ec2-ad6f-8daa8e878aee\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f966467c6-xjd5r"
Apr 28 19:16:53.219107 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:53.219078 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/770c7da0-75e9-4ec2-ad6f-8daa8e878aee-tmp\") pod \"klusterlet-addon-workmgr-f966467c6-xjd5r\" (UID: \"770c7da0-75e9-4ec2-ad6f-8daa8e878aee\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f966467c6-xjd5r"
Apr 28 19:16:53.221639 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:53.221617 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/770c7da0-75e9-4ec2-ad6f-8daa8e878aee-klusterlet-config\") pod \"klusterlet-addon-workmgr-f966467c6-xjd5r\" (UID: \"770c7da0-75e9-4ec2-ad6f-8daa8e878aee\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f966467c6-xjd5r"
Apr 28 19:16:53.228496 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:53.228461 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84dl9\" (UniqueName: \"kubernetes.io/projected/770c7da0-75e9-4ec2-ad6f-8daa8e878aee-kube-api-access-84dl9\") pod \"klusterlet-addon-workmgr-f966467c6-xjd5r\" (UID: \"770c7da0-75e9-4ec2-ad6f-8daa8e878aee\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f966467c6-xjd5r"
Apr 28 19:16:53.340747 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:53.340305 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f966467c6-xjd5r"
Apr 28 19:16:53.519188 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:53.519161 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f966467c6-xjd5r"]
Apr 28 19:16:53.520723 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:16:53.520698 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod770c7da0_75e9_4ec2_ad6f_8daa8e878aee.slice/crio-1e15f28287fc3909f785034ea4084a443c25fe7055a85311ff8daa7a36fc8b4b WatchSource:0}: Error finding container 1e15f28287fc3909f785034ea4084a443c25fe7055a85311ff8daa7a36fc8b4b: Status 404 returned error can't find the container with id 1e15f28287fc3909f785034ea4084a443c25fe7055a85311ff8daa7a36fc8b4b
Apr 28 19:16:53.943835 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:53.943796 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f966467c6-xjd5r" event={"ID":"770c7da0-75e9-4ec2-ad6f-8daa8e878aee","Type":"ContainerStarted","Data":"1e15f28287fc3909f785034ea4084a443c25fe7055a85311ff8daa7a36fc8b4b"}
Apr 28 19:16:53.944950 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:53.944916 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fmthr" event={"ID":"789c39a9-aea8-4abd-b196-f303f6c6f063","Type":"ContainerStarted","Data":"d89807477768bac0ec4242420843f74472f6f6396c3d32d71c5f21aea8c0fe54"}
Apr 28 19:16:53.945085 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:53.945059 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-fmthr"
Apr 28 19:16:53.961362 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:53.961309 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-fmthr" podStartSLOduration=66.120044414 podStartE2EDuration="1m8.961295404s" podCreationTimestamp="2026-04-28 19:15:45 +0000 UTC" firstStartedPulling="2026-04-28 19:16:50.605585544 +0000 UTC m=+65.540630673" lastFinishedPulling="2026-04-28 19:16:53.446836531 +0000 UTC m=+68.381881663" observedRunningTime="2026-04-28 19:16:53.96049918 +0000 UTC m=+68.895544329" watchObservedRunningTime="2026-04-28 19:16:53.961295404 +0000 UTC m=+68.896340552"
Apr 28 19:16:57.954498 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:57.954463 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f966467c6-xjd5r" event={"ID":"770c7da0-75e9-4ec2-ad6f-8daa8e878aee","Type":"ContainerStarted","Data":"3ab8fac97e74b7dbf194dbb2e33e4307dcb91dd734809e125f43f10f613b1bfa"}
Apr 28 19:16:57.954922 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:57.954747 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f966467c6-xjd5r"
Apr 28 19:16:57.956417 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:57.956394 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f966467c6-xjd5r"
Apr 28 19:16:57.975359 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:16:57.975312 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f966467c6-xjd5r" podStartSLOduration=2.026967933 podStartE2EDuration="5.975299058s" podCreationTimestamp="2026-04-28 19:16:52 +0000 UTC" firstStartedPulling="2026-04-28 19:16:53.522477958 +0000 UTC m=+68.457523097" lastFinishedPulling="2026-04-28 19:16:57.470809091 +0000 UTC m=+72.405854222" observedRunningTime="2026-04-28 19:16:57.974844489 +0000 UTC m=+72.909889637" watchObservedRunningTime="2026-04-28 19:16:57.975299058 +0000 UTC m=+72.910344206"
Apr 28 19:17:22.328958 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:17:22.328913 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls\") pod \"dns-default-txhn5\" (UID: \"cf70a1bd-b873-41bc-a143-901d7c76665c\") " pod="openshift-dns/dns-default-txhn5"
Apr 28 19:17:22.328958 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:17:22.328961 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert\") pod \"ingress-canary-t6pm8\" (UID: \"d392608a-a370-4a06-8556-ce4952638d04\") " pod="openshift-ingress-canary/ingress-canary-t6pm8"
Apr 28 19:17:22.329472 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:17:22.329040 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 28 19:17:22.329472 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:17:22.329043 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 28 19:17:22.329472 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:17:22.329095 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert podName:d392608a-a370-4a06-8556-ce4952638d04 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:26.329078263 +0000 UTC m=+161.264123391 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert") pod "ingress-canary-t6pm8" (UID: "d392608a-a370-4a06-8556-ce4952638d04") : secret "canary-serving-cert" not found
Apr 28 19:17:22.329472 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:17:22.329107 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls podName:cf70a1bd-b873-41bc-a143-901d7c76665c nodeName:}" failed. No retries permitted until 2026-04-28 19:18:26.329101856 +0000 UTC m=+161.264146984 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls") pod "dns-default-txhn5" (UID: "cf70a1bd-b873-41bc-a143-901d7c76665c") : secret "dns-default-metrics-tls" not found
Apr 28 19:17:24.949747 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:17:24.949716 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-fmthr"
Apr 28 19:17:54.452499 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:17:54.452453 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs\") pod \"network-metrics-daemon-q2wj9\" (UID: \"ae2a816c-4f04-45d7-bb27-80786c738721\") " pod="openshift-multus/network-metrics-daemon-q2wj9"
Apr 28 19:17:54.453048 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:17:54.452600 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 28 19:17:54.453048 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:17:54.452675 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs podName:ae2a816c-4f04-45d7-bb27-80786c738721 nodeName:}" failed. No retries permitted until 2026-04-28 19:19:56.452657342 +0000 UTC m=+251.387702476 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs") pod "network-metrics-daemon-q2wj9" (UID: "ae2a816c-4f04-45d7-bb27-80786c738721") : secret "metrics-daemon-secret" not found
Apr 28 19:18:05.687881 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.687758 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-86dfbc9d5d-cmkg7"]
Apr 28 19:18:05.690392 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.690372 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7"
Apr 28 19:18:05.693369 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.693342 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 28 19:18:05.693503 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.693342 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 28 19:18:05.693503 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.693400 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 28 19:18:05.693783 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.693767 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 28 19:18:05.694532 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.694514 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 28 19:18:05.694532 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.694531 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 28 19:18:05.694656 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.694534 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-wx7c2\""
Apr 28 19:18:05.708497 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.708469 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-86dfbc9d5d-cmkg7"]
Apr 28 19:18:05.778725 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.778692 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-c49b984f5-fxfzl"]
Apr 28 19:18:05.781274 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.781249 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-c49b984f5-fxfzl"
Apr 28 19:18:05.782343 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.782319 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-sgd2m"]
Apr 28 19:18:05.784202 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.784182 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 28 19:18:05.784304 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.784211 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 28 19:18:05.784918 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.784881 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f2h6c"]
Apr 28 19:18:05.785084 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.785067 2582 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-sgd2m" Apr 28 19:18:05.785879 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.785856 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 28 19:18:05.785879 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.785877 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-lcrng\"" Apr 28 19:18:05.787418 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.787401 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f2h6c" Apr 28 19:18:05.792628 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.791128 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 28 19:18:05.792628 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.791283 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 28 19:18:05.792628 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.791621 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 28 19:18:05.792628 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.791885 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-ddxbq\"" Apr 28 19:18:05.792628 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.792231 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 28 19:18:05.792914 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.792870 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-47m6r\"" Apr 28 19:18:05.794495 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.794461 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 28 19:18:05.798518 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.798498 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-sgd2m"] Apr 28 19:18:05.802844 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.802819 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-c49b984f5-fxfzl"] Apr 28 19:18:05.828122 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.828088 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07abf802-d61b-4b5a-8138-9f569e8d18b7-metrics-certs\") pod \"router-default-86dfbc9d5d-cmkg7\" (UID: \"07abf802-d61b-4b5a-8138-9f569e8d18b7\") " pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:05.828122 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.828124 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07abf802-d61b-4b5a-8138-9f569e8d18b7-service-ca-bundle\") pod \"router-default-86dfbc9d5d-cmkg7\" (UID: \"07abf802-d61b-4b5a-8138-9f569e8d18b7\") " pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:05.828344 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.828157 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/07abf802-d61b-4b5a-8138-9f569e8d18b7-default-certificate\") pod \"router-default-86dfbc9d5d-cmkg7\" (UID: \"07abf802-d61b-4b5a-8138-9f569e8d18b7\") " pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:05.828344 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.828242 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/07abf802-d61b-4b5a-8138-9f569e8d18b7-stats-auth\") pod \"router-default-86dfbc9d5d-cmkg7\" (UID: \"07abf802-d61b-4b5a-8138-9f569e8d18b7\") " pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:05.828344 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.828279 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m98w4\" (UniqueName: \"kubernetes.io/projected/07abf802-d61b-4b5a-8138-9f569e8d18b7-kube-api-access-m98w4\") pod \"router-default-86dfbc9d5d-cmkg7\" (UID: \"07abf802-d61b-4b5a-8138-9f569e8d18b7\") " pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:05.830839 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.830816 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f2h6c"] Apr 28 19:18:05.929061 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.929025 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/07abf802-d61b-4b5a-8138-9f569e8d18b7-stats-auth\") pod \"router-default-86dfbc9d5d-cmkg7\" (UID: \"07abf802-d61b-4b5a-8138-9f569e8d18b7\") " pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:05.929061 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.929059 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-m98w4\" (UniqueName: \"kubernetes.io/projected/07abf802-d61b-4b5a-8138-9f569e8d18b7-kube-api-access-m98w4\") pod \"router-default-86dfbc9d5d-cmkg7\" (UID: \"07abf802-d61b-4b5a-8138-9f569e8d18b7\") " pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:05.929319 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.929082 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-tls\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:05.929319 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.929098 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18ad5e91-b88a-4b8c-88a2-c52f5fe8a9f7-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-f2h6c\" (UID: \"18ad5e91-b88a-4b8c-88a2-c52f5fe8a9f7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f2h6c" Apr 28 19:18:05.929319 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.929160 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq6zp\" (UniqueName: \"kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-kube-api-access-cq6zp\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:05.929319 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.929194 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18ad5e91-b88a-4b8c-88a2-c52f5fe8a9f7-serving-cert\") pod 
\"kube-storage-version-migrator-operator-6769c5d45-f2h6c\" (UID: \"18ad5e91-b88a-4b8c-88a2-c52f5fe8a9f7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f2h6c" Apr 28 19:18:05.929319 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.929273 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/27c4f3e0-e06d-4652-9e49-d180dcdb1901-ca-trust-extracted\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:05.929319 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.929298 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-bound-sa-token\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:05.929319 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.929323 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dxb5\" (UniqueName: \"kubernetes.io/projected/de46f190-a71a-4aa0-b9b0-54682ec8837f-kube-api-access-8dxb5\") pod \"network-check-source-8894fc9bd-sgd2m\" (UID: \"de46f190-a71a-4aa0-b9b0-54682ec8837f\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-sgd2m" Apr 28 19:18:05.929637 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.929348 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/27c4f3e0-e06d-4652-9e49-d180dcdb1901-image-registry-private-configuration\") pod \"image-registry-c49b984f5-fxfzl\" (UID: 
\"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:05.929637 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.929405 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07abf802-d61b-4b5a-8138-9f569e8d18b7-metrics-certs\") pod \"router-default-86dfbc9d5d-cmkg7\" (UID: \"07abf802-d61b-4b5a-8138-9f569e8d18b7\") " pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:05.929637 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.929431 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-certificates\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:05.929637 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.929468 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/27c4f3e0-e06d-4652-9e49-d180dcdb1901-installation-pull-secrets\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:05.929637 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:05.929479 2582 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 28 19:18:05.929637 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.929491 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07abf802-d61b-4b5a-8138-9f569e8d18b7-service-ca-bundle\") pod \"router-default-86dfbc9d5d-cmkg7\" (UID: 
\"07abf802-d61b-4b5a-8138-9f569e8d18b7\") " pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:05.929637 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.929521 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/07abf802-d61b-4b5a-8138-9f569e8d18b7-default-certificate\") pod \"router-default-86dfbc9d5d-cmkg7\" (UID: \"07abf802-d61b-4b5a-8138-9f569e8d18b7\") " pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:05.929637 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:05.929533 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07abf802-d61b-4b5a-8138-9f569e8d18b7-metrics-certs podName:07abf802-d61b-4b5a-8138-9f569e8d18b7 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:06.429518687 +0000 UTC m=+141.364563815 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07abf802-d61b-4b5a-8138-9f569e8d18b7-metrics-certs") pod "router-default-86dfbc9d5d-cmkg7" (UID: "07abf802-d61b-4b5a-8138-9f569e8d18b7") : secret "router-metrics-certs-default" not found Apr 28 19:18:05.929637 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:05.929582 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/07abf802-d61b-4b5a-8138-9f569e8d18b7-service-ca-bundle podName:07abf802-d61b-4b5a-8138-9f569e8d18b7 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:06.429560856 +0000 UTC m=+141.364605986 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/07abf802-d61b-4b5a-8138-9f569e8d18b7-service-ca-bundle") pod "router-default-86dfbc9d5d-cmkg7" (UID: "07abf802-d61b-4b5a-8138-9f569e8d18b7") : configmap references non-existent config key: service-ca.crt Apr 28 19:18:05.929637 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.929610 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/27c4f3e0-e06d-4652-9e49-d180dcdb1901-trusted-ca\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:05.929637 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.929636 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nv6q\" (UniqueName: \"kubernetes.io/projected/18ad5e91-b88a-4b8c-88a2-c52f5fe8a9f7-kube-api-access-5nv6q\") pod \"kube-storage-version-migrator-operator-6769c5d45-f2h6c\" (UID: \"18ad5e91-b88a-4b8c-88a2-c52f5fe8a9f7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f2h6c" Apr 28 19:18:05.931748 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.931731 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/07abf802-d61b-4b5a-8138-9f569e8d18b7-stats-auth\") pod \"router-default-86dfbc9d5d-cmkg7\" (UID: \"07abf802-d61b-4b5a-8138-9f569e8d18b7\") " pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:05.932037 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.932018 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/07abf802-d61b-4b5a-8138-9f569e8d18b7-default-certificate\") pod \"router-default-86dfbc9d5d-cmkg7\" (UID: 
\"07abf802-d61b-4b5a-8138-9f569e8d18b7\") " pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:05.938530 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:05.938459 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m98w4\" (UniqueName: \"kubernetes.io/projected/07abf802-d61b-4b5a-8138-9f569e8d18b7-kube-api-access-m98w4\") pod \"router-default-86dfbc9d5d-cmkg7\" (UID: \"07abf802-d61b-4b5a-8138-9f569e8d18b7\") " pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:06.030865 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.030826 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-certificates\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:06.030865 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.030868 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/27c4f3e0-e06d-4652-9e49-d180dcdb1901-installation-pull-secrets\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:06.031136 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.030938 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/27c4f3e0-e06d-4652-9e49-d180dcdb1901-trusted-ca\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:06.031136 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.030967 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5nv6q\" (UniqueName: \"kubernetes.io/projected/18ad5e91-b88a-4b8c-88a2-c52f5fe8a9f7-kube-api-access-5nv6q\") pod \"kube-storage-version-migrator-operator-6769c5d45-f2h6c\" (UID: \"18ad5e91-b88a-4b8c-88a2-c52f5fe8a9f7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f2h6c" Apr 28 19:18:06.031136 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.031111 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-tls\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:06.031285 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.031146 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18ad5e91-b88a-4b8c-88a2-c52f5fe8a9f7-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-f2h6c\" (UID: \"18ad5e91-b88a-4b8c-88a2-c52f5fe8a9f7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f2h6c" Apr 28 19:18:06.031285 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.031180 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cq6zp\" (UniqueName: \"kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-kube-api-access-cq6zp\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:06.031285 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:06.031245 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 28 19:18:06.031285 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:06.031268 2582 
projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c49b984f5-fxfzl: secret "image-registry-tls" not found Apr 28 19:18:06.031478 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.031307 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18ad5e91-b88a-4b8c-88a2-c52f5fe8a9f7-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-f2h6c\" (UID: \"18ad5e91-b88a-4b8c-88a2-c52f5fe8a9f7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f2h6c" Apr 28 19:18:06.031478 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:06.031339 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-tls podName:27c4f3e0-e06d-4652-9e49-d180dcdb1901 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:06.531323017 +0000 UTC m=+141.466368149 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-tls") pod "image-registry-c49b984f5-fxfzl" (UID: "27c4f3e0-e06d-4652-9e49-d180dcdb1901") : secret "image-registry-tls" not found Apr 28 19:18:06.031478 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.031391 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/27c4f3e0-e06d-4652-9e49-d180dcdb1901-ca-trust-extracted\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:06.031478 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.031410 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-bound-sa-token\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:06.031478 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.031427 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dxb5\" (UniqueName: \"kubernetes.io/projected/de46f190-a71a-4aa0-b9b0-54682ec8837f-kube-api-access-8dxb5\") pod \"network-check-source-8894fc9bd-sgd2m\" (UID: \"de46f190-a71a-4aa0-b9b0-54682ec8837f\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-sgd2m" Apr 28 19:18:06.031478 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.031459 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/27c4f3e0-e06d-4652-9e49-d180dcdb1901-image-registry-private-configuration\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " 
pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:06.031775 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.031722 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18ad5e91-b88a-4b8c-88a2-c52f5fe8a9f7-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-f2h6c\" (UID: \"18ad5e91-b88a-4b8c-88a2-c52f5fe8a9f7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f2h6c" Apr 28 19:18:06.031990 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.031969 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/27c4f3e0-e06d-4652-9e49-d180dcdb1901-ca-trust-extracted\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:06.032299 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.032277 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-certificates\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:06.032360 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.032337 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/27c4f3e0-e06d-4652-9e49-d180dcdb1901-trusted-ca\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:06.033777 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.033755 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/18ad5e91-b88a-4b8c-88a2-c52f5fe8a9f7-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-f2h6c\" (UID: \"18ad5e91-b88a-4b8c-88a2-c52f5fe8a9f7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f2h6c" Apr 28 19:18:06.033954 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.033935 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/27c4f3e0-e06d-4652-9e49-d180dcdb1901-installation-pull-secrets\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:06.034001 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.033940 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/27c4f3e0-e06d-4652-9e49-d180dcdb1901-image-registry-private-configuration\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:06.042803 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.042772 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-bound-sa-token\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:06.042929 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.042857 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq6zp\" (UniqueName: \"kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-kube-api-access-cq6zp\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " 
pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:06.043015 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.042998 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nv6q\" (UniqueName: \"kubernetes.io/projected/18ad5e91-b88a-4b8c-88a2-c52f5fe8a9f7-kube-api-access-5nv6q\") pod \"kube-storage-version-migrator-operator-6769c5d45-f2h6c\" (UID: \"18ad5e91-b88a-4b8c-88a2-c52f5fe8a9f7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f2h6c" Apr 28 19:18:06.043132 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.043115 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dxb5\" (UniqueName: \"kubernetes.io/projected/de46f190-a71a-4aa0-b9b0-54682ec8837f-kube-api-access-8dxb5\") pod \"network-check-source-8894fc9bd-sgd2m\" (UID: \"de46f190-a71a-4aa0-b9b0-54682ec8837f\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-sgd2m" Apr 28 19:18:06.099567 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.099534 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-sgd2m" Apr 28 19:18:06.107267 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.107240 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f2h6c" Apr 28 19:18:06.229094 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.229062 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-sgd2m"] Apr 28 19:18:06.232886 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:18:06.232855 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde46f190_a71a_4aa0_b9b0_54682ec8837f.slice/crio-919847f43635acc32f7b70a0ec06a38e579f4d068eea2d09d9c1de6da1dbae24 WatchSource:0}: Error finding container 919847f43635acc32f7b70a0ec06a38e579f4d068eea2d09d9c1de6da1dbae24: Status 404 returned error can't find the container with id 919847f43635acc32f7b70a0ec06a38e579f4d068eea2d09d9c1de6da1dbae24 Apr 28 19:18:06.247855 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.247828 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f2h6c"] Apr 28 19:18:06.252153 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:18:06.252123 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18ad5e91_b88a_4b8c_88a2_c52f5fe8a9f7.slice/crio-70503cd2f4d2bcb7d3ba7bfaa1164efd2de0e06e6c33d90fb4ace4421fbb7836 WatchSource:0}: Error finding container 70503cd2f4d2bcb7d3ba7bfaa1164efd2de0e06e6c33d90fb4ace4421fbb7836: Status 404 returned error can't find the container with id 70503cd2f4d2bcb7d3ba7bfaa1164efd2de0e06e6c33d90fb4ace4421fbb7836 Apr 28 19:18:06.434972 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.434933 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07abf802-d61b-4b5a-8138-9f569e8d18b7-metrics-certs\") pod 
\"router-default-86dfbc9d5d-cmkg7\" (UID: \"07abf802-d61b-4b5a-8138-9f569e8d18b7\") " pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:06.434972 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.434976 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07abf802-d61b-4b5a-8138-9f569e8d18b7-service-ca-bundle\") pod \"router-default-86dfbc9d5d-cmkg7\" (UID: \"07abf802-d61b-4b5a-8138-9f569e8d18b7\") " pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:06.435186 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:06.435084 2582 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 28 19:18:06.435186 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:06.435098 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/07abf802-d61b-4b5a-8138-9f569e8d18b7-service-ca-bundle podName:07abf802-d61b-4b5a-8138-9f569e8d18b7 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:07.435081705 +0000 UTC m=+142.370126840 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/07abf802-d61b-4b5a-8138-9f569e8d18b7-service-ca-bundle") pod "router-default-86dfbc9d5d-cmkg7" (UID: "07abf802-d61b-4b5a-8138-9f569e8d18b7") : configmap references non-existent config key: service-ca.crt Apr 28 19:18:06.435186 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:06.435137 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07abf802-d61b-4b5a-8138-9f569e8d18b7-metrics-certs podName:07abf802-d61b-4b5a-8138-9f569e8d18b7 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:07.435122173 +0000 UTC m=+142.370167301 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07abf802-d61b-4b5a-8138-9f569e8d18b7-metrics-certs") pod "router-default-86dfbc9d5d-cmkg7" (UID: "07abf802-d61b-4b5a-8138-9f569e8d18b7") : secret "router-metrics-certs-default" not found Apr 28 19:18:06.536128 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:06.536090 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-tls\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:06.536292 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:06.536243 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 28 19:18:06.536292 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:06.536262 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c49b984f5-fxfzl: secret "image-registry-tls" not found Apr 28 19:18:06.536364 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:06.536315 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-tls podName:27c4f3e0-e06d-4652-9e49-d180dcdb1901 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:07.536300084 +0000 UTC m=+142.471345213 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-tls") pod "image-registry-c49b984f5-fxfzl" (UID: "27c4f3e0-e06d-4652-9e49-d180dcdb1901") : secret "image-registry-tls" not found Apr 28 19:18:07.090978 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:07.090939 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f2h6c" event={"ID":"18ad5e91-b88a-4b8c-88a2-c52f5fe8a9f7","Type":"ContainerStarted","Data":"70503cd2f4d2bcb7d3ba7bfaa1164efd2de0e06e6c33d90fb4ace4421fbb7836"} Apr 28 19:18:07.092361 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:07.092328 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-sgd2m" event={"ID":"de46f190-a71a-4aa0-b9b0-54682ec8837f","Type":"ContainerStarted","Data":"c48678d20d6614ccc657bf8cd5d1438bfbb39fb8482df05db6f2c51b8c1da045"} Apr 28 19:18:07.092504 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:07.092365 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-sgd2m" event={"ID":"de46f190-a71a-4aa0-b9b0-54682ec8837f","Type":"ContainerStarted","Data":"919847f43635acc32f7b70a0ec06a38e579f4d068eea2d09d9c1de6da1dbae24"} Apr 28 19:18:07.115329 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:07.115276 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-sgd2m" podStartSLOduration=2.115254356 podStartE2EDuration="2.115254356s" podCreationTimestamp="2026-04-28 19:18:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:18:07.113301487 +0000 UTC m=+142.048346636" watchObservedRunningTime="2026-04-28 19:18:07.115254356 +0000 UTC 
m=+142.050299507" Apr 28 19:18:07.444335 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:07.444235 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07abf802-d61b-4b5a-8138-9f569e8d18b7-metrics-certs\") pod \"router-default-86dfbc9d5d-cmkg7\" (UID: \"07abf802-d61b-4b5a-8138-9f569e8d18b7\") " pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:07.444335 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:07.444280 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07abf802-d61b-4b5a-8138-9f569e8d18b7-service-ca-bundle\") pod \"router-default-86dfbc9d5d-cmkg7\" (UID: \"07abf802-d61b-4b5a-8138-9f569e8d18b7\") " pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:07.444649 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:07.444383 2582 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 28 19:18:07.444649 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:07.444442 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/07abf802-d61b-4b5a-8138-9f569e8d18b7-service-ca-bundle podName:07abf802-d61b-4b5a-8138-9f569e8d18b7 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:09.444421677 +0000 UTC m=+144.379466804 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/07abf802-d61b-4b5a-8138-9f569e8d18b7-service-ca-bundle") pod "router-default-86dfbc9d5d-cmkg7" (UID: "07abf802-d61b-4b5a-8138-9f569e8d18b7") : configmap references non-existent config key: service-ca.crt Apr 28 19:18:07.444649 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:07.444463 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07abf802-d61b-4b5a-8138-9f569e8d18b7-metrics-certs podName:07abf802-d61b-4b5a-8138-9f569e8d18b7 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:09.444453993 +0000 UTC m=+144.379499123 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07abf802-d61b-4b5a-8138-9f569e8d18b7-metrics-certs") pod "router-default-86dfbc9d5d-cmkg7" (UID: "07abf802-d61b-4b5a-8138-9f569e8d18b7") : secret "router-metrics-certs-default" not found Apr 28 19:18:07.545555 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:07.545518 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-tls\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:07.545726 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:07.545660 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 28 19:18:07.545726 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:07.545675 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c49b984f5-fxfzl: secret "image-registry-tls" not found Apr 28 19:18:07.545849 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:07.545730 2582 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-tls podName:27c4f3e0-e06d-4652-9e49-d180dcdb1901 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:09.545712812 +0000 UTC m=+144.480757939 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-tls") pod "image-registry-c49b984f5-fxfzl" (UID: "27c4f3e0-e06d-4652-9e49-d180dcdb1901") : secret "image-registry-tls" not found Apr 28 19:18:09.097143 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:09.097098 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f2h6c" event={"ID":"18ad5e91-b88a-4b8c-88a2-c52f5fe8a9f7","Type":"ContainerStarted","Data":"1252b06e00ade898a11caf7ec0ee9a73f8a2c9eb13942745d025cfb3d16f8007"} Apr 28 19:18:09.118865 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:09.118818 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f2h6c" podStartSLOduration=1.86084944 podStartE2EDuration="4.118804907s" podCreationTimestamp="2026-04-28 19:18:05 +0000 UTC" firstStartedPulling="2026-04-28 19:18:06.253698969 +0000 UTC m=+141.188744096" lastFinishedPulling="2026-04-28 19:18:08.511654436 +0000 UTC m=+143.446699563" observedRunningTime="2026-04-28 19:18:09.117093739 +0000 UTC m=+144.052138888" watchObservedRunningTime="2026-04-28 19:18:09.118804907 +0000 UTC m=+144.053850056" Apr 28 19:18:09.462219 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:09.462121 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07abf802-d61b-4b5a-8138-9f569e8d18b7-metrics-certs\") pod \"router-default-86dfbc9d5d-cmkg7\" (UID: 
\"07abf802-d61b-4b5a-8138-9f569e8d18b7\") " pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:09.462219 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:09.462176 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07abf802-d61b-4b5a-8138-9f569e8d18b7-service-ca-bundle\") pod \"router-default-86dfbc9d5d-cmkg7\" (UID: \"07abf802-d61b-4b5a-8138-9f569e8d18b7\") " pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:09.462415 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:09.462301 2582 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 28 19:18:09.462415 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:09.462340 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/07abf802-d61b-4b5a-8138-9f569e8d18b7-service-ca-bundle podName:07abf802-d61b-4b5a-8138-9f569e8d18b7 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:13.462324008 +0000 UTC m=+148.397369155 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/07abf802-d61b-4b5a-8138-9f569e8d18b7-service-ca-bundle") pod "router-default-86dfbc9d5d-cmkg7" (UID: "07abf802-d61b-4b5a-8138-9f569e8d18b7") : configmap references non-existent config key: service-ca.crt Apr 28 19:18:09.462415 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:09.462370 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07abf802-d61b-4b5a-8138-9f569e8d18b7-metrics-certs podName:07abf802-d61b-4b5a-8138-9f569e8d18b7 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:13.462351394 +0000 UTC m=+148.397396538 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07abf802-d61b-4b5a-8138-9f569e8d18b7-metrics-certs") pod "router-default-86dfbc9d5d-cmkg7" (UID: "07abf802-d61b-4b5a-8138-9f569e8d18b7") : secret "router-metrics-certs-default" not found Apr 28 19:18:09.563305 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:09.563257 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-tls\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:09.563476 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:09.563376 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 28 19:18:09.563476 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:09.563388 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c49b984f5-fxfzl: secret "image-registry-tls" not found Apr 28 19:18:09.563476 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:09.563437 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-tls podName:27c4f3e0-e06d-4652-9e49-d180dcdb1901 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:13.563423911 +0000 UTC m=+148.498469038 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-tls") pod "image-registry-c49b984f5-fxfzl" (UID: "27c4f3e0-e06d-4652-9e49-d180dcdb1901") : secret "image-registry-tls" not found Apr 28 19:18:12.784249 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:12.784222 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-dn855_51106027-8f90-4285-9257-0da036866696/dns-node-resolver/0.log" Apr 28 19:18:13.495979 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:13.495943 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07abf802-d61b-4b5a-8138-9f569e8d18b7-metrics-certs\") pod \"router-default-86dfbc9d5d-cmkg7\" (UID: \"07abf802-d61b-4b5a-8138-9f569e8d18b7\") " pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:13.495979 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:13.495986 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07abf802-d61b-4b5a-8138-9f569e8d18b7-service-ca-bundle\") pod \"router-default-86dfbc9d5d-cmkg7\" (UID: \"07abf802-d61b-4b5a-8138-9f569e8d18b7\") " pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:13.496202 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:13.496114 2582 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 28 19:18:13.496202 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:13.496135 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/07abf802-d61b-4b5a-8138-9f569e8d18b7-service-ca-bundle podName:07abf802-d61b-4b5a-8138-9f569e8d18b7 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:21.496117533 +0000 UTC m=+156.431162660 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/07abf802-d61b-4b5a-8138-9f569e8d18b7-service-ca-bundle") pod "router-default-86dfbc9d5d-cmkg7" (UID: "07abf802-d61b-4b5a-8138-9f569e8d18b7") : configmap references non-existent config key: service-ca.crt Apr 28 19:18:13.496202 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:13.496158 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07abf802-d61b-4b5a-8138-9f569e8d18b7-metrics-certs podName:07abf802-d61b-4b5a-8138-9f569e8d18b7 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:21.49614779 +0000 UTC m=+156.431192922 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07abf802-d61b-4b5a-8138-9f569e8d18b7-metrics-certs") pod "router-default-86dfbc9d5d-cmkg7" (UID: "07abf802-d61b-4b5a-8138-9f569e8d18b7") : secret "router-metrics-certs-default" not found Apr 28 19:18:13.596713 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:13.596676 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-tls\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:13.596984 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:13.596792 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 28 19:18:13.596984 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:13.596804 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c49b984f5-fxfzl: secret "image-registry-tls" not found Apr 28 19:18:13.596984 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:13.596851 2582 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-tls podName:27c4f3e0-e06d-4652-9e49-d180dcdb1901 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:21.596839156 +0000 UTC m=+156.531884283 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-tls") pod "image-registry-c49b984f5-fxfzl" (UID: "27c4f3e0-e06d-4652-9e49-d180dcdb1901") : secret "image-registry-tls" not found Apr 28 19:18:13.984459 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:13.984375 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qtt7w_1c539cc3-1090-4486-ab6c-9d184f87803d/node-ca/0.log" Apr 28 19:18:21.462881 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:21.462830 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-txhn5" podUID="cf70a1bd-b873-41bc-a143-901d7c76665c" Apr 28 19:18:21.477037 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:21.477002 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-t6pm8" podUID="d392608a-a370-4a06-8556-ce4952638d04" Apr 28 19:18:21.563954 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:21.563922 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07abf802-d61b-4b5a-8138-9f569e8d18b7-service-ca-bundle\") pod \"router-default-86dfbc9d5d-cmkg7\" (UID: \"07abf802-d61b-4b5a-8138-9f569e8d18b7\") " pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:21.564140 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:21.564031 2582 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07abf802-d61b-4b5a-8138-9f569e8d18b7-metrics-certs\") pod \"router-default-86dfbc9d5d-cmkg7\" (UID: \"07abf802-d61b-4b5a-8138-9f569e8d18b7\") " pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:21.564587 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:21.564562 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07abf802-d61b-4b5a-8138-9f569e8d18b7-service-ca-bundle\") pod \"router-default-86dfbc9d5d-cmkg7\" (UID: \"07abf802-d61b-4b5a-8138-9f569e8d18b7\") " pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:21.566347 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:21.566330 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07abf802-d61b-4b5a-8138-9f569e8d18b7-metrics-certs\") pod \"router-default-86dfbc9d5d-cmkg7\" (UID: \"07abf802-d61b-4b5a-8138-9f569e8d18b7\") " pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:21.599011 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:21.598977 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:21.657020 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:21.656974 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-q2wj9" podUID="ae2a816c-4f04-45d7-bb27-80786c738721" Apr 28 19:18:21.665212 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:21.665178 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-tls\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:21.667605 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:21.667586 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-tls\") pod \"image-registry-c49b984f5-fxfzl\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") " pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:21.692443 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:21.692412 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:21.730371 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:21.730292 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-86dfbc9d5d-cmkg7"] Apr 28 19:18:21.733156 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:18:21.733120 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07abf802_d61b_4b5a_8138_9f569e8d18b7.slice/crio-702515573ae4ea1e5eca5f471bf1e3dabba5d2a23a106ee32709c0294c184916 WatchSource:0}: Error finding container 702515573ae4ea1e5eca5f471bf1e3dabba5d2a23a106ee32709c0294c184916: Status 404 returned error can't find the container with id 702515573ae4ea1e5eca5f471bf1e3dabba5d2a23a106ee32709c0294c184916 Apr 28 19:18:21.820346 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:21.820317 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-c49b984f5-fxfzl"] Apr 28 19:18:21.823293 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:18:21.823262 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27c4f3e0_e06d_4652_9e49_d180dcdb1901.slice/crio-f9c8c0100c6b264bdb5e9d7c1d38467129dfb431fe884bb9e2d42b6e42e0b8dd WatchSource:0}: Error finding container f9c8c0100c6b264bdb5e9d7c1d38467129dfb431fe884bb9e2d42b6e42e0b8dd: Status 404 returned error can't find the container with id f9c8c0100c6b264bdb5e9d7c1d38467129dfb431fe884bb9e2d42b6e42e0b8dd Apr 28 19:18:22.126204 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:22.126168 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" event={"ID":"07abf802-d61b-4b5a-8138-9f569e8d18b7","Type":"ContainerStarted","Data":"ed0e2c48310b9685d0b3d7decfb3d26ce21b971cb53e3b5fc103efbc7e2f3485"} Apr 28 19:18:22.126204 ip-10-0-132-160 
kubenswrapper[2582]: I0428 19:18:22.126210 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" event={"ID":"07abf802-d61b-4b5a-8138-9f569e8d18b7","Type":"ContainerStarted","Data":"702515573ae4ea1e5eca5f471bf1e3dabba5d2a23a106ee32709c0294c184916"} Apr 28 19:18:22.127463 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:22.127430 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" event={"ID":"27c4f3e0-e06d-4652-9e49-d180dcdb1901","Type":"ContainerStarted","Data":"fe4a534551c6df2ae74dba15f3bdee827755fe655ff8a4529d6646a3845902be"} Apr 28 19:18:22.127580 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:22.127468 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" event={"ID":"27c4f3e0-e06d-4652-9e49-d180dcdb1901","Type":"ContainerStarted","Data":"f9c8c0100c6b264bdb5e9d7c1d38467129dfb431fe884bb9e2d42b6e42e0b8dd"} Apr 28 19:18:22.127580 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:22.127478 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-txhn5" Apr 28 19:18:22.127657 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:22.127467 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t6pm8" Apr 28 19:18:22.147755 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:22.147709 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" podStartSLOduration=17.147694214 podStartE2EDuration="17.147694214s" podCreationTimestamp="2026-04-28 19:18:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:18:22.147220565 +0000 UTC m=+157.082265714" watchObservedRunningTime="2026-04-28 19:18:22.147694214 +0000 UTC m=+157.082739362" Apr 28 19:18:22.170170 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:22.170127 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" podStartSLOduration=17.170113341 podStartE2EDuration="17.170113341s" podCreationTimestamp="2026-04-28 19:18:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:18:22.168784611 +0000 UTC m=+157.103829760" watchObservedRunningTime="2026-04-28 19:18:22.170113341 +0000 UTC m=+157.105158516" Apr 28 19:18:22.599213 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:22.599172 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:22.601703 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:22.601680 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7" Apr 28 19:18:23.131086 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:23.131051 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" Apr 28 19:18:23.131086 ip-10-0-132-160 kubenswrapper[2582]: I0428 
19:18:23.131092 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7"
Apr 28 19:18:23.132196 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:23.132173 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-86dfbc9d5d-cmkg7"
Apr 28 19:18:26.405738 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:26.405697 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert\") pod \"ingress-canary-t6pm8\" (UID: \"d392608a-a370-4a06-8556-ce4952638d04\") " pod="openshift-ingress-canary/ingress-canary-t6pm8"
Apr 28 19:18:26.406151 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:26.405814 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls\") pod \"dns-default-txhn5\" (UID: \"cf70a1bd-b873-41bc-a143-901d7c76665c\") " pod="openshift-dns/dns-default-txhn5"
Apr 28 19:18:26.408260 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:26.408223 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf70a1bd-b873-41bc-a143-901d7c76665c-metrics-tls\") pod \"dns-default-txhn5\" (UID: \"cf70a1bd-b873-41bc-a143-901d7c76665c\") " pod="openshift-dns/dns-default-txhn5"
Apr 28 19:18:26.408393 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:26.408280 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d392608a-a370-4a06-8556-ce4952638d04-cert\") pod \"ingress-canary-t6pm8\" (UID: \"d392608a-a370-4a06-8556-ce4952638d04\") " pod="openshift-ingress-canary/ingress-canary-t6pm8"
Apr 28 19:18:26.633634 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:26.633599 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qcg5d\""
Apr 28 19:18:26.633634 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:26.633599 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rkrph\""
Apr 28 19:18:26.639052 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:26.639029 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-txhn5"
Apr 28 19:18:26.639160 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:26.639100 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-t6pm8"
Apr 28 19:18:26.774511 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:26.774344 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-txhn5"]
Apr 28 19:18:26.776489 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:18:26.776453 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf70a1bd_b873_41bc_a143_901d7c76665c.slice/crio-e84c41dabcb674283d3ce7d144949f5766382fd838b9f5b22ab4b37b2e568b43 WatchSource:0}: Error finding container e84c41dabcb674283d3ce7d144949f5766382fd838b9f5b22ab4b37b2e568b43: Status 404 returned error can't find the container with id e84c41dabcb674283d3ce7d144949f5766382fd838b9f5b22ab4b37b2e568b43
Apr 28 19:18:26.789189 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:26.789157 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-t6pm8"]
Apr 28 19:18:26.792425 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:18:26.792383 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd392608a_a370_4a06_8556_ce4952638d04.slice/crio-45293aa7fdbbfefa86ba07faa7517b650f0a5ab5b3c52e53cbba9792307733c7 WatchSource:0}: Error finding container 45293aa7fdbbfefa86ba07faa7517b650f0a5ab5b3c52e53cbba9792307733c7: Status 404 returned error can't find the container with id 45293aa7fdbbfefa86ba07faa7517b650f0a5ab5b3c52e53cbba9792307733c7
Apr 28 19:18:27.140864 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:27.140825 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-t6pm8" event={"ID":"d392608a-a370-4a06-8556-ce4952638d04","Type":"ContainerStarted","Data":"45293aa7fdbbfefa86ba07faa7517b650f0a5ab5b3c52e53cbba9792307733c7"}
Apr 28 19:18:27.141754 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:27.141729 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-txhn5" event={"ID":"cf70a1bd-b873-41bc-a143-901d7c76665c","Type":"ContainerStarted","Data":"e84c41dabcb674283d3ce7d144949f5766382fd838b9f5b22ab4b37b2e568b43"}
Apr 28 19:18:29.149775 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:29.149738 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-t6pm8" event={"ID":"d392608a-a370-4a06-8556-ce4952638d04","Type":"ContainerStarted","Data":"2b03bbf8ac143513bb293ea763bc7a21c2c39257126c639954e042bca7fe8500"}
Apr 28 19:18:29.151414 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:29.151387 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-txhn5" event={"ID":"cf70a1bd-b873-41bc-a143-901d7c76665c","Type":"ContainerStarted","Data":"b03a327f59580b1729cae77dfb5bdc3c444cce0267a7e49366c980fe9e666c72"}
Apr 28 19:18:29.151524 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:29.151422 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-txhn5" event={"ID":"cf70a1bd-b873-41bc-a143-901d7c76665c","Type":"ContainerStarted","Data":"c8c37d8ce0b4ecffae486f5f741228d059162e044209b8398e770922107af2e9"}
Apr 28 19:18:29.151524 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:29.151512 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-txhn5"
Apr 28 19:18:29.166310 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:29.166254 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-t6pm8" podStartSLOduration=129.038114617 podStartE2EDuration="2m11.166240499s" podCreationTimestamp="2026-04-28 19:16:18 +0000 UTC" firstStartedPulling="2026-04-28 19:18:26.79466709 +0000 UTC m=+161.729712233" lastFinishedPulling="2026-04-28 19:18:28.922792987 +0000 UTC m=+163.857838115" observedRunningTime="2026-04-28 19:18:29.165999216 +0000 UTC m=+164.101044366" watchObservedRunningTime="2026-04-28 19:18:29.166240499 +0000 UTC m=+164.101285641"
Apr 28 19:18:29.184508 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:29.184296 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-txhn5" podStartSLOduration=129.804805845 podStartE2EDuration="2m11.184275845s" podCreationTimestamp="2026-04-28 19:16:18 +0000 UTC" firstStartedPulling="2026-04-28 19:18:26.77841455 +0000 UTC m=+161.713459680" lastFinishedPulling="2026-04-28 19:18:28.157884543 +0000 UTC m=+163.092929680" observedRunningTime="2026-04-28 19:18:29.183234643 +0000 UTC m=+164.118279804" watchObservedRunningTime="2026-04-28 19:18:29.184275845 +0000 UTC m=+164.119320988"
Apr 28 19:18:33.308678 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.308587 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-72v9l"]
Apr 28 19:18:33.312210 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.312179 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-72v9l"
Apr 28 19:18:33.315621 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.315594 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 28 19:18:33.316836 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.316810 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 28 19:18:33.316999 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.316839 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-ql9lz\""
Apr 28 19:18:33.316999 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.316862 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 28 19:18:33.317277 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.317259 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 28 19:18:33.332178 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.332148 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-72v9l"]
Apr 28 19:18:33.342232 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.342203 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-c49b984f5-fxfzl"]
Apr 28 19:18:33.460980 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.460944 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/17932199-2efe-457e-ab6b-f3d6f37d1c05-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-72v9l\" (UID: \"17932199-2efe-457e-ab6b-f3d6f37d1c05\") " pod="openshift-insights/insights-runtime-extractor-72v9l"
Apr 28 19:18:33.461151 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.460990 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mgmw\" (UniqueName: \"kubernetes.io/projected/17932199-2efe-457e-ab6b-f3d6f37d1c05-kube-api-access-8mgmw\") pod \"insights-runtime-extractor-72v9l\" (UID: \"17932199-2efe-457e-ab6b-f3d6f37d1c05\") " pod="openshift-insights/insights-runtime-extractor-72v9l"
Apr 28 19:18:33.461151 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.461103 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/17932199-2efe-457e-ab6b-f3d6f37d1c05-data-volume\") pod \"insights-runtime-extractor-72v9l\" (UID: \"17932199-2efe-457e-ab6b-f3d6f37d1c05\") " pod="openshift-insights/insights-runtime-extractor-72v9l"
Apr 28 19:18:33.461151 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.461138 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/17932199-2efe-457e-ab6b-f3d6f37d1c05-crio-socket\") pod \"insights-runtime-extractor-72v9l\" (UID: \"17932199-2efe-457e-ab6b-f3d6f37d1c05\") " pod="openshift-insights/insights-runtime-extractor-72v9l"
Apr 28 19:18:33.461251 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.461160 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/17932199-2efe-457e-ab6b-f3d6f37d1c05-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-72v9l\" (UID: \"17932199-2efe-457e-ab6b-f3d6f37d1c05\") " pod="openshift-insights/insights-runtime-extractor-72v9l"
Apr 28 19:18:33.562256 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.562167 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/17932199-2efe-457e-ab6b-f3d6f37d1c05-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-72v9l\" (UID: \"17932199-2efe-457e-ab6b-f3d6f37d1c05\") " pod="openshift-insights/insights-runtime-extractor-72v9l"
Apr 28 19:18:33.562256 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.562215 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mgmw\" (UniqueName: \"kubernetes.io/projected/17932199-2efe-457e-ab6b-f3d6f37d1c05-kube-api-access-8mgmw\") pod \"insights-runtime-extractor-72v9l\" (UID: \"17932199-2efe-457e-ab6b-f3d6f37d1c05\") " pod="openshift-insights/insights-runtime-extractor-72v9l"
Apr 28 19:18:33.562474 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.562269 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/17932199-2efe-457e-ab6b-f3d6f37d1c05-data-volume\") pod \"insights-runtime-extractor-72v9l\" (UID: \"17932199-2efe-457e-ab6b-f3d6f37d1c05\") " pod="openshift-insights/insights-runtime-extractor-72v9l"
Apr 28 19:18:33.562474 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.562287 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/17932199-2efe-457e-ab6b-f3d6f37d1c05-crio-socket\") pod \"insights-runtime-extractor-72v9l\" (UID: \"17932199-2efe-457e-ab6b-f3d6f37d1c05\") " pod="openshift-insights/insights-runtime-extractor-72v9l"
Apr 28 19:18:33.562474 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.562311 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/17932199-2efe-457e-ab6b-f3d6f37d1c05-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-72v9l\" (UID: \"17932199-2efe-457e-ab6b-f3d6f37d1c05\") " pod="openshift-insights/insights-runtime-extractor-72v9l"
Apr 28 19:18:33.562474 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.562419 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/17932199-2efe-457e-ab6b-f3d6f37d1c05-crio-socket\") pod \"insights-runtime-extractor-72v9l\" (UID: \"17932199-2efe-457e-ab6b-f3d6f37d1c05\") " pod="openshift-insights/insights-runtime-extractor-72v9l"
Apr 28 19:18:33.562673 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.562651 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/17932199-2efe-457e-ab6b-f3d6f37d1c05-data-volume\") pod \"insights-runtime-extractor-72v9l\" (UID: \"17932199-2efe-457e-ab6b-f3d6f37d1c05\") " pod="openshift-insights/insights-runtime-extractor-72v9l"
Apr 28 19:18:33.562752 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.562735 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/17932199-2efe-457e-ab6b-f3d6f37d1c05-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-72v9l\" (UID: \"17932199-2efe-457e-ab6b-f3d6f37d1c05\") " pod="openshift-insights/insights-runtime-extractor-72v9l"
Apr 28 19:18:33.564910 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.564865 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/17932199-2efe-457e-ab6b-f3d6f37d1c05-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-72v9l\" (UID: \"17932199-2efe-457e-ab6b-f3d6f37d1c05\") " pod="openshift-insights/insights-runtime-extractor-72v9l"
Apr 28 19:18:33.572562 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.572526 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mgmw\" (UniqueName: \"kubernetes.io/projected/17932199-2efe-457e-ab6b-f3d6f37d1c05-kube-api-access-8mgmw\") pod \"insights-runtime-extractor-72v9l\" (UID: \"17932199-2efe-457e-ab6b-f3d6f37d1c05\") " pod="openshift-insights/insights-runtime-extractor-72v9l"
Apr 28 19:18:33.622599 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.622563 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-72v9l"
Apr 28 19:18:33.766700 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:33.766672 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-72v9l"]
Apr 28 19:18:33.769844 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:18:33.769816 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17932199_2efe_457e_ab6b_f3d6f37d1c05.slice/crio-3874f3614eb3a60c6831b01564ad2b220da151a4397c7c8ed966185580f8a122 WatchSource:0}: Error finding container 3874f3614eb3a60c6831b01564ad2b220da151a4397c7c8ed966185580f8a122: Status 404 returned error can't find the container with id 3874f3614eb3a60c6831b01564ad2b220da151a4397c7c8ed966185580f8a122
Apr 28 19:18:34.166178 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:34.166090 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-72v9l" event={"ID":"17932199-2efe-457e-ab6b-f3d6f37d1c05","Type":"ContainerStarted","Data":"3628a1d3e5b06111664f6643fa85518d897d82d745d5650565b7cf10298a5ef7"}
Apr 28 19:18:34.166178 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:34.166135 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-72v9l" event={"ID":"17932199-2efe-457e-ab6b-f3d6f37d1c05","Type":"ContainerStarted","Data":"3874f3614eb3a60c6831b01564ad2b220da151a4397c7c8ed966185580f8a122"}
Apr 28 19:18:35.171364 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:35.171320 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-72v9l" event={"ID":"17932199-2efe-457e-ab6b-f3d6f37d1c05","Type":"ContainerStarted","Data":"31deaa303be03d8a992df310094cf0004a4728d1c0ae81d90a3bf4f0280b1ed0"}
Apr 28 19:18:35.639802 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:35.639749 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q2wj9"
Apr 28 19:18:36.176077 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:36.176038 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-72v9l" event={"ID":"17932199-2efe-457e-ab6b-f3d6f37d1c05","Type":"ContainerStarted","Data":"8f17e0cf45f5a0720452f0307163c491d9ae6460d744ebdd4e046a0138b1aab9"}
Apr 28 19:18:36.209549 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:36.209481 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-72v9l" podStartSLOduration=1.330333265 podStartE2EDuration="3.209465176s" podCreationTimestamp="2026-04-28 19:18:33 +0000 UTC" firstStartedPulling="2026-04-28 19:18:33.82321066 +0000 UTC m=+168.758255793" lastFinishedPulling="2026-04-28 19:18:35.702342574 +0000 UTC m=+170.637387704" observedRunningTime="2026-04-28 19:18:36.208252537 +0000 UTC m=+171.143297686" watchObservedRunningTime="2026-04-28 19:18:36.209465176 +0000 UTC m=+171.144510325"
Apr 28 19:18:39.158281 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:39.158244 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-txhn5"
Apr 28 19:18:40.986744 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:40.986710 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgdsv"]
Apr 28 19:18:40.990726 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:40.990709 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgdsv"
Apr 28 19:18:40.993349 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:40.993327 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 28 19:18:40.993489 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:40.993435 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-2tngh\""
Apr 28 19:18:41.001421 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:41.001392 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgdsv"]
Apr 28 19:18:41.016635 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:41.016603 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/057f39ec-8955-454a-9e94-9140f32a99bb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hgdsv\" (UID: \"057f39ec-8955-454a-9e94-9140f32a99bb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgdsv"
Apr 28 19:18:41.117154 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:41.117111 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/057f39ec-8955-454a-9e94-9140f32a99bb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hgdsv\" (UID: \"057f39ec-8955-454a-9e94-9140f32a99bb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgdsv"
Apr 28 19:18:41.117323 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:41.117234 2582 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 28 19:18:41.117323 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:41.117298 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/057f39ec-8955-454a-9e94-9140f32a99bb-tls-certificates podName:057f39ec-8955-454a-9e94-9140f32a99bb nodeName:}" failed. No retries permitted until 2026-04-28 19:18:41.617282613 +0000 UTC m=+176.552327739 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/057f39ec-8955-454a-9e94-9140f32a99bb-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-hgdsv" (UID: "057f39ec-8955-454a-9e94-9140f32a99bb") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 28 19:18:41.620585 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:41.620544 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/057f39ec-8955-454a-9e94-9140f32a99bb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hgdsv\" (UID: \"057f39ec-8955-454a-9e94-9140f32a99bb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgdsv"
Apr 28 19:18:41.623105 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:41.623075 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/057f39ec-8955-454a-9e94-9140f32a99bb-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hgdsv\" (UID: \"057f39ec-8955-454a-9e94-9140f32a99bb\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgdsv"
Apr 28 19:18:41.899769 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:41.899653 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgdsv"
Apr 28 19:18:42.015840 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:42.015804 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgdsv"]
Apr 28 19:18:42.019546 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:18:42.019520 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod057f39ec_8955_454a_9e94_9140f32a99bb.slice/crio-2b252fab59562c60697fdfa11411c3cf253650ed3f38a77003578af0ee642433 WatchSource:0}: Error finding container 2b252fab59562c60697fdfa11411c3cf253650ed3f38a77003578af0ee642433: Status 404 returned error can't find the container with id 2b252fab59562c60697fdfa11411c3cf253650ed3f38a77003578af0ee642433
Apr 28 19:18:42.193720 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:42.193638 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgdsv" event={"ID":"057f39ec-8955-454a-9e94-9140f32a99bb","Type":"ContainerStarted","Data":"2b252fab59562c60697fdfa11411c3cf253650ed3f38a77003578af0ee642433"}
Apr 28 19:18:43.197694 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:43.197611 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgdsv" event={"ID":"057f39ec-8955-454a-9e94-9140f32a99bb","Type":"ContainerStarted","Data":"1b6081c0e57f494d1fcd2e36cdeca5e190696b1946a964cc0e44818c4ba94b8f"}
Apr 28 19:18:43.198074 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:43.197850 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgdsv"
Apr 28 19:18:43.203405 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:43.203382 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgdsv"
Apr 28 19:18:43.214329 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:43.214284 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hgdsv" podStartSLOduration=2.342922788 podStartE2EDuration="3.214269941s" podCreationTimestamp="2026-04-28 19:18:40 +0000 UTC" firstStartedPulling="2026-04-28 19:18:42.021358272 +0000 UTC m=+176.956403399" lastFinishedPulling="2026-04-28 19:18:42.892705407 +0000 UTC m=+177.827750552" observedRunningTime="2026-04-28 19:18:43.213491217 +0000 UTC m=+178.148536412" watchObservedRunningTime="2026-04-28 19:18:43.214269941 +0000 UTC m=+178.149315089"
Apr 28 19:18:43.347402 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:43.347371 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-c49b984f5-fxfzl"
Apr 28 19:18:44.050668 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:44.050632 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9ljl5"]
Apr 28 19:18:44.052611 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:44.052594 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-9ljl5"
Apr 28 19:18:44.055622 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:44.055596 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 28 19:18:44.055740 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:44.055596 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 28 19:18:44.055740 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:44.055606 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 28 19:18:44.055859 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:44.055838 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 28 19:18:44.056891 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:44.056874 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-nzgfq\""
Apr 28 19:18:44.056891 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:44.056885 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 28 19:18:44.064123 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:44.064068 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9ljl5"]
Apr 28 19:18:44.138143 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:44.138103 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/915ecf94-aae8-4cde-b2a3-fa92ebde0d2c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9ljl5\" (UID: \"915ecf94-aae8-4cde-b2a3-fa92ebde0d2c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9ljl5"
Apr 28 19:18:44.138143 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:44.138141 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/915ecf94-aae8-4cde-b2a3-fa92ebde0d2c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9ljl5\" (UID: \"915ecf94-aae8-4cde-b2a3-fa92ebde0d2c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9ljl5"
Apr 28 19:18:44.138349 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:44.138196 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/915ecf94-aae8-4cde-b2a3-fa92ebde0d2c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9ljl5\" (UID: \"915ecf94-aae8-4cde-b2a3-fa92ebde0d2c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9ljl5"
Apr 28 19:18:44.138349 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:44.138281 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d7l5\" (UniqueName: \"kubernetes.io/projected/915ecf94-aae8-4cde-b2a3-fa92ebde0d2c-kube-api-access-5d7l5\") pod \"prometheus-operator-5676c8c784-9ljl5\" (UID: \"915ecf94-aae8-4cde-b2a3-fa92ebde0d2c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9ljl5"
Apr 28 19:18:44.238614 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:44.238579 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/915ecf94-aae8-4cde-b2a3-fa92ebde0d2c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9ljl5\" (UID: \"915ecf94-aae8-4cde-b2a3-fa92ebde0d2c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9ljl5"
Apr 28 19:18:44.239112 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:44.238646 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5d7l5\" (UniqueName: \"kubernetes.io/projected/915ecf94-aae8-4cde-b2a3-fa92ebde0d2c-kube-api-access-5d7l5\") pod \"prometheus-operator-5676c8c784-9ljl5\" (UID: \"915ecf94-aae8-4cde-b2a3-fa92ebde0d2c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9ljl5"
Apr 28 19:18:44.239112 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:44.238673 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/915ecf94-aae8-4cde-b2a3-fa92ebde0d2c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9ljl5\" (UID: \"915ecf94-aae8-4cde-b2a3-fa92ebde0d2c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9ljl5"
Apr 28 19:18:44.239112 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:44.238689 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/915ecf94-aae8-4cde-b2a3-fa92ebde0d2c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9ljl5\" (UID: \"915ecf94-aae8-4cde-b2a3-fa92ebde0d2c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9ljl5"
Apr 28 19:18:44.239395 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:44.239373 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/915ecf94-aae8-4cde-b2a3-fa92ebde0d2c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9ljl5\" (UID: \"915ecf94-aae8-4cde-b2a3-fa92ebde0d2c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9ljl5"
Apr 28 19:18:44.241201 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:44.241177 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/915ecf94-aae8-4cde-b2a3-fa92ebde0d2c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9ljl5\" (UID: \"915ecf94-aae8-4cde-b2a3-fa92ebde0d2c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9ljl5"
Apr 28 19:18:44.241297 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:44.241272 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/915ecf94-aae8-4cde-b2a3-fa92ebde0d2c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9ljl5\" (UID: \"915ecf94-aae8-4cde-b2a3-fa92ebde0d2c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9ljl5"
Apr 28 19:18:44.247619 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:44.247598 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d7l5\" (UniqueName: \"kubernetes.io/projected/915ecf94-aae8-4cde-b2a3-fa92ebde0d2c-kube-api-access-5d7l5\") pod \"prometheus-operator-5676c8c784-9ljl5\" (UID: \"915ecf94-aae8-4cde-b2a3-fa92ebde0d2c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9ljl5"
Apr 28 19:18:44.361865 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:44.361776 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-9ljl5"
Apr 28 19:18:44.481414 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:44.481385 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9ljl5"]
Apr 28 19:18:44.484698 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:18:44.484674 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod915ecf94_aae8_4cde_b2a3_fa92ebde0d2c.slice/crio-3bc437064f0dd347c48468b8a1ac80e2b54d14df36b9e807f335427753011959 WatchSource:0}: Error finding container 3bc437064f0dd347c48468b8a1ac80e2b54d14df36b9e807f335427753011959: Status 404 returned error can't find the container with id 3bc437064f0dd347c48468b8a1ac80e2b54d14df36b9e807f335427753011959
Apr 28 19:18:45.204452 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:45.204414 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9ljl5" event={"ID":"915ecf94-aae8-4cde-b2a3-fa92ebde0d2c","Type":"ContainerStarted","Data":"3bc437064f0dd347c48468b8a1ac80e2b54d14df36b9e807f335427753011959"}
Apr 28 19:18:46.208442 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:46.208395 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9ljl5" event={"ID":"915ecf94-aae8-4cde-b2a3-fa92ebde0d2c","Type":"ContainerStarted","Data":"34442f31a239494bfc267303773de308248e0f7f70370b66ade8a4ec31f6d648"}
Apr 28 19:18:46.208442 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:46.208437 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9ljl5" event={"ID":"915ecf94-aae8-4cde-b2a3-fa92ebde0d2c","Type":"ContainerStarted","Data":"df9e372ae6ff78284d0a864ba86a496ea8328848c66d7ad30c1d8f4c43273a05"}
Apr 28 19:18:46.227601 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:46.227546 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-9ljl5" podStartSLOduration=1.221382982 podStartE2EDuration="2.227530989s" podCreationTimestamp="2026-04-28 19:18:44 +0000 UTC" firstStartedPulling="2026-04-28 19:18:44.486486509 +0000 UTC m=+179.421531637" lastFinishedPulling="2026-04-28 19:18:45.492634517 +0000 UTC m=+180.427679644" observedRunningTime="2026-04-28 19:18:46.226330521 +0000 UTC m=+181.161375671" watchObservedRunningTime="2026-04-28 19:18:46.227530989 +0000 UTC m=+181.162576138"
Apr 28 19:18:48.420154 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.420114 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-qg8lf"]
Apr 28 19:18:48.422362 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.422340 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf"
Apr 28 19:18:48.425175 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.425146 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-z4qsk\""
Apr 28 19:18:48.425357 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.425295 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 28 19:18:48.425517 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.425492 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 28 19:18:48.426333 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.426311 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 28 19:18:48.435763 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.435740 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-nv9pk"]
Apr 28 19:18:48.437743 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.437725 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nv9pk"
Apr 28 19:18:48.440301 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.440278 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 28 19:18:48.440410 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.440278 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 28 19:18:48.440410 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.440360 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 28 19:18:48.440674 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.440657 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-xgvrs\""
Apr 28 19:18:48.441155 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.441133 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-qg8lf"]
Apr 28 19:18:48.470701 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.470669 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ead0a897-3d24-481a-8227-839c80a17804-node-exporter-tls\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk"
Apr 28 19:18:48.470701 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.470703 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ead0a897-3d24-481a-8227-839c80a17804-node-exporter-textfile\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk"
Apr 28 19:18:48.470950 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.470727 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3e9dbac1-8dfa-4366-ae62-c45f1598141a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-qg8lf\" (UID: \"3e9dbac1-8dfa-4366-ae62-c45f1598141a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf"
Apr 28 19:18:48.470950 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.470749 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/3e9dbac1-8dfa-4366-ae62-c45f1598141a-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-qg8lf\" (UID: \"3e9dbac1-8dfa-4366-ae62-c45f1598141a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf"
Apr 28 19:18:48.470950 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.470767 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ead0a897-3d24-481a-8227-839c80a17804-root\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk"
Apr 28 19:18:48.470950 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.470807 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3e9dbac1-8dfa-4366-ae62-c45f1598141a-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-qg8lf\" (UID:
\"3e9dbac1-8dfa-4366-ae62-c45f1598141a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf" Apr 28 19:18:48.470950 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.470874 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5zlf\" (UniqueName: \"kubernetes.io/projected/3e9dbac1-8dfa-4366-ae62-c45f1598141a-kube-api-access-g5zlf\") pod \"kube-state-metrics-69db897b98-qg8lf\" (UID: \"3e9dbac1-8dfa-4366-ae62-c45f1598141a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf" Apr 28 19:18:48.471204 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.470960 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4f5f\" (UniqueName: \"kubernetes.io/projected/ead0a897-3d24-481a-8227-839c80a17804-kube-api-access-d4f5f\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 19:18:48.471204 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.471019 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/3e9dbac1-8dfa-4366-ae62-c45f1598141a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-qg8lf\" (UID: \"3e9dbac1-8dfa-4366-ae62-c45f1598141a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf" Apr 28 19:18:48.471204 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.471076 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ead0a897-3d24-481a-8227-839c80a17804-node-exporter-wtmp\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 19:18:48.471204 ip-10-0-132-160 
kubenswrapper[2582]: I0428 19:18:48.471103 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e9dbac1-8dfa-4366-ae62-c45f1598141a-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-qg8lf\" (UID: \"3e9dbac1-8dfa-4366-ae62-c45f1598141a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf" Apr 28 19:18:48.471204 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.471145 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ead0a897-3d24-481a-8227-839c80a17804-node-exporter-accelerators-collector-config\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 19:18:48.471204 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.471178 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ead0a897-3d24-481a-8227-839c80a17804-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 19:18:48.471388 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.471203 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ead0a897-3d24-481a-8227-839c80a17804-sys\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 19:18:48.471388 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.471265 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/ead0a897-3d24-481a-8227-839c80a17804-metrics-client-ca\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 19:18:48.572076 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.572040 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ead0a897-3d24-481a-8227-839c80a17804-node-exporter-wtmp\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 19:18:48.572076 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.572077 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e9dbac1-8dfa-4366-ae62-c45f1598141a-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-qg8lf\" (UID: \"3e9dbac1-8dfa-4366-ae62-c45f1598141a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf" Apr 28 19:18:48.572331 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.572099 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ead0a897-3d24-481a-8227-839c80a17804-node-exporter-accelerators-collector-config\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 19:18:48.572331 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.572179 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ead0a897-3d24-481a-8227-839c80a17804-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 19:18:48.572331 
ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:48.572217 2582 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 28 19:18:48.572331 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.572236 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ead0a897-3d24-481a-8227-839c80a17804-node-exporter-wtmp\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 19:18:48.572331 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.572270 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ead0a897-3d24-481a-8227-839c80a17804-sys\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 19:18:48.572331 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.572223 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ead0a897-3d24-481a-8227-839c80a17804-sys\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 19:18:48.572331 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:48.572294 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e9dbac1-8dfa-4366-ae62-c45f1598141a-kube-state-metrics-tls podName:3e9dbac1-8dfa-4366-ae62-c45f1598141a nodeName:}" failed. No retries permitted until 2026-04-28 19:18:49.072278043 +0000 UTC m=+184.007323184 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/3e9dbac1-8dfa-4366-ae62-c45f1598141a-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-qg8lf" (UID: "3e9dbac1-8dfa-4366-ae62-c45f1598141a") : secret "kube-state-metrics-tls" not found Apr 28 19:18:48.572680 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.572339 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ead0a897-3d24-481a-8227-839c80a17804-metrics-client-ca\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 19:18:48.572680 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.572385 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ead0a897-3d24-481a-8227-839c80a17804-node-exporter-tls\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 19:18:48.572680 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.572414 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ead0a897-3d24-481a-8227-839c80a17804-node-exporter-textfile\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 19:18:48.572680 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.572450 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3e9dbac1-8dfa-4366-ae62-c45f1598141a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-qg8lf\" (UID: \"3e9dbac1-8dfa-4366-ae62-c45f1598141a\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf" Apr 28 19:18:48.572680 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:48.572508 2582 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 28 19:18:48.572680 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.572553 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/3e9dbac1-8dfa-4366-ae62-c45f1598141a-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-qg8lf\" (UID: \"3e9dbac1-8dfa-4366-ae62-c45f1598141a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf" Apr 28 19:18:48.572680 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.572588 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ead0a897-3d24-481a-8227-839c80a17804-root\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 19:18:48.572680 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.572625 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3e9dbac1-8dfa-4366-ae62-c45f1598141a-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-qg8lf\" (UID: \"3e9dbac1-8dfa-4366-ae62-c45f1598141a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf" Apr 28 19:18:48.572680 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.572676 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ead0a897-3d24-481a-8227-839c80a17804-node-exporter-accelerators-collector-config\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 
19:18:48.573088 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.572705 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ead0a897-3d24-481a-8227-839c80a17804-node-exporter-textfile\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 19:18:48.573088 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.572738 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ead0a897-3d24-481a-8227-839c80a17804-root\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 19:18:48.573088 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:48.572844 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ead0a897-3d24-481a-8227-839c80a17804-node-exporter-tls podName:ead0a897-3d24-481a-8227-839c80a17804 nodeName:}" failed. No retries permitted until 2026-04-28 19:18:49.072648651 +0000 UTC m=+184.007693798 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/ead0a897-3d24-481a-8227-839c80a17804-node-exporter-tls") pod "node-exporter-nv9pk" (UID: "ead0a897-3d24-481a-8227-839c80a17804") : secret "node-exporter-tls" not found Apr 28 19:18:48.573088 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.572887 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5zlf\" (UniqueName: \"kubernetes.io/projected/3e9dbac1-8dfa-4366-ae62-c45f1598141a-kube-api-access-g5zlf\") pod \"kube-state-metrics-69db897b98-qg8lf\" (UID: \"3e9dbac1-8dfa-4366-ae62-c45f1598141a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf" Apr 28 19:18:48.573088 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.572955 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4f5f\" (UniqueName: \"kubernetes.io/projected/ead0a897-3d24-481a-8227-839c80a17804-kube-api-access-d4f5f\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 19:18:48.573088 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.572973 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ead0a897-3d24-481a-8227-839c80a17804-metrics-client-ca\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 19:18:48.573088 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.572990 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/3e9dbac1-8dfa-4366-ae62-c45f1598141a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-qg8lf\" (UID: \"3e9dbac1-8dfa-4366-ae62-c45f1598141a\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf" Apr 28 19:18:48.573088 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.573001 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/3e9dbac1-8dfa-4366-ae62-c45f1598141a-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-qg8lf\" (UID: \"3e9dbac1-8dfa-4366-ae62-c45f1598141a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf" Apr 28 19:18:48.573408 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.573392 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3e9dbac1-8dfa-4366-ae62-c45f1598141a-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-qg8lf\" (UID: \"3e9dbac1-8dfa-4366-ae62-c45f1598141a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf" Apr 28 19:18:48.573504 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.573485 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/3e9dbac1-8dfa-4366-ae62-c45f1598141a-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-qg8lf\" (UID: \"3e9dbac1-8dfa-4366-ae62-c45f1598141a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf" Apr 28 19:18:48.575513 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.575488 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ead0a897-3d24-481a-8227-839c80a17804-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 19:18:48.575618 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.575565 2582 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3e9dbac1-8dfa-4366-ae62-c45f1598141a-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-qg8lf\" (UID: \"3e9dbac1-8dfa-4366-ae62-c45f1598141a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf" Apr 28 19:18:48.590350 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.590321 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5zlf\" (UniqueName: \"kubernetes.io/projected/3e9dbac1-8dfa-4366-ae62-c45f1598141a-kube-api-access-g5zlf\") pod \"kube-state-metrics-69db897b98-qg8lf\" (UID: \"3e9dbac1-8dfa-4366-ae62-c45f1598141a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf" Apr 28 19:18:48.590474 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:48.590420 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4f5f\" (UniqueName: \"kubernetes.io/projected/ead0a897-3d24-481a-8227-839c80a17804-kube-api-access-d4f5f\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 19:18:49.077830 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:49.077774 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e9dbac1-8dfa-4366-ae62-c45f1598141a-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-qg8lf\" (UID: \"3e9dbac1-8dfa-4366-ae62-c45f1598141a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf" Apr 28 19:18:49.078038 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:49.077856 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ead0a897-3d24-481a-8227-839c80a17804-node-exporter-tls\") pod \"node-exporter-nv9pk\" (UID: 
\"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 19:18:49.080324 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:49.080301 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ead0a897-3d24-481a-8227-839c80a17804-node-exporter-tls\") pod \"node-exporter-nv9pk\" (UID: \"ead0a897-3d24-481a-8227-839c80a17804\") " pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 19:18:49.080528 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:49.080506 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e9dbac1-8dfa-4366-ae62-c45f1598141a-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-qg8lf\" (UID: \"3e9dbac1-8dfa-4366-ae62-c45f1598141a\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf" Apr 28 19:18:49.335365 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:49.335262 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf" Apr 28 19:18:49.347824 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:49.347796 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-nv9pk" Apr 28 19:18:49.356206 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:18:49.356172 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podead0a897_3d24_481a_8227_839c80a17804.slice/crio-74985809d2a0550c7f458fda5e82f1408ddfbcacfe88bd9a370037803a7a8958 WatchSource:0}: Error finding container 74985809d2a0550c7f458fda5e82f1408ddfbcacfe88bd9a370037803a7a8958: Status 404 returned error can't find the container with id 74985809d2a0550c7f458fda5e82f1408ddfbcacfe88bd9a370037803a7a8958 Apr 28 19:18:49.478022 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:49.477989 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-qg8lf"] Apr 28 19:18:49.482856 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:18:49.482827 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e9dbac1_8dfa_4366_ae62_c45f1598141a.slice/crio-9139e703a0faffce93421dbb6e60633e54af75f501ac05b6554998ec4a498bfe WatchSource:0}: Error finding container 9139e703a0faffce93421dbb6e60633e54af75f501ac05b6554998ec4a498bfe: Status 404 returned error can't find the container with id 9139e703a0faffce93421dbb6e60633e54af75f501ac05b6554998ec4a498bfe Apr 28 19:18:50.221332 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.221301 2582 generic.go:358] "Generic (PLEG): container finished" podID="ead0a897-3d24-481a-8227-839c80a17804" containerID="2a7a905aaf126e5b8c1084dadf77e4207ffe48eadffed710580a46b9ee286f3b" exitCode=0 Apr 28 19:18:50.221522 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.221385 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nv9pk" event={"ID":"ead0a897-3d24-481a-8227-839c80a17804","Type":"ContainerDied","Data":"2a7a905aaf126e5b8c1084dadf77e4207ffe48eadffed710580a46b9ee286f3b"} 
Apr 28 19:18:50.221522 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.221433 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nv9pk" event={"ID":"ead0a897-3d24-481a-8227-839c80a17804","Type":"ContainerStarted","Data":"74985809d2a0550c7f458fda5e82f1408ddfbcacfe88bd9a370037803a7a8958"}
Apr 28 19:18:50.222844 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.222819 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf" event={"ID":"3e9dbac1-8dfa-4366-ae62-c45f1598141a","Type":"ContainerStarted","Data":"9139e703a0faffce93421dbb6e60633e54af75f501ac05b6554998ec4a498bfe"}
Apr 28 19:18:50.485097 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.485066 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-b9f9f476c-rlv57"]
Apr 28 19:18:50.487799 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.487780 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57"
Apr 28 19:18:50.494614 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.494551 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 28 19:18:50.494614 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.494556 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 28 19:18:50.494966 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.494944 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 28 19:18:50.495312 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.495197 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-evhc3skuljqu1\""
Apr 28 19:18:50.495312 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.495269 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 28 19:18:50.495431 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.495405 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-qhxkr\""
Apr 28 19:18:50.495616 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.495539 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 28 19:18:50.520300 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.520214 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-b9f9f476c-rlv57"]
Apr 28 19:18:50.591314 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.591286 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-b9f9f476c-rlv57\" (UID: \"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc\") " pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57"
Apr 28 19:18:50.591512 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.591340 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc-secret-thanos-querier-tls\") pod \"thanos-querier-b9f9f476c-rlv57\" (UID: \"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc\") " pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57"
Apr 28 19:18:50.591512 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.591380 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzlzs\" (UniqueName: \"kubernetes.io/projected/8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc-kube-api-access-gzlzs\") pod \"thanos-querier-b9f9f476c-rlv57\" (UID: \"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc\") " pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57"
Apr 28 19:18:50.591512 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.591402 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-b9f9f476c-rlv57\" (UID: \"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc\") " pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57"
Apr 28 19:18:50.591512 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.591453 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-b9f9f476c-rlv57\" (UID: \"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc\") " pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57"
Apr 28 19:18:50.591758 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.591538 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc-secret-grpc-tls\") pod \"thanos-querier-b9f9f476c-rlv57\" (UID: \"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc\") " pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57"
Apr 28 19:18:50.591758 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.591560 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-b9f9f476c-rlv57\" (UID: \"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc\") " pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57"
Apr 28 19:18:50.591758 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.591596 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc-metrics-client-ca\") pod \"thanos-querier-b9f9f476c-rlv57\" (UID: \"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc\") " pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57"
Apr 28 19:18:50.692122 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.692083 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-b9f9f476c-rlv57\" (UID: \"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc\") " pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57"
Apr 28 19:18:50.692243 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.692168 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc-secret-thanos-querier-tls\") pod \"thanos-querier-b9f9f476c-rlv57\" (UID: \"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc\") " pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57"
Apr 28 19:18:50.692243 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.692210 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzlzs\" (UniqueName: \"kubernetes.io/projected/8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc-kube-api-access-gzlzs\") pod \"thanos-querier-b9f9f476c-rlv57\" (UID: \"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc\") " pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57"
Apr 28 19:18:50.692415 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.692243 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-b9f9f476c-rlv57\" (UID: \"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc\") " pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57"
Apr 28 19:18:50.692415 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.692278 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-b9f9f476c-rlv57\" (UID: \"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc\") " pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57"
Apr 28 19:18:50.692553 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.692527 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc-secret-grpc-tls\") pod \"thanos-querier-b9f9f476c-rlv57\" (UID: \"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc\") " pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57"
Apr 28 19:18:50.692610 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.692566 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-b9f9f476c-rlv57\" (UID: \"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc\") " pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57"
Apr 28 19:18:50.692668 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.692607 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc-metrics-client-ca\") pod \"thanos-querier-b9f9f476c-rlv57\" (UID: \"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc\") " pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57"
Apr 28 19:18:50.693510 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.693480 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc-metrics-client-ca\") pod \"thanos-querier-b9f9f476c-rlv57\" (UID: \"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc\") " pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57"
Apr 28 19:18:50.695762 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.695717 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc-secret-grpc-tls\") pod \"thanos-querier-b9f9f476c-rlv57\" (UID: \"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc\") " pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57" Apr 28 19:18:50.696442 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.696310 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-b9f9f476c-rlv57\" (UID: \"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc\") " pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57" Apr 28 19:18:50.696442 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.696313 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-b9f9f476c-rlv57\" (UID: \"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc\") " pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57" Apr 28 19:18:50.696442 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.696402 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-b9f9f476c-rlv57\" (UID: \"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc\") " pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57" Apr 28 19:18:50.696644 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.696472 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc-secret-thanos-querier-kube-rbac-proxy-metrics\") pod 
\"thanos-querier-b9f9f476c-rlv57\" (UID: \"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc\") " pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57" Apr 28 19:18:50.696764 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.696746 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc-secret-thanos-querier-tls\") pod \"thanos-querier-b9f9f476c-rlv57\" (UID: \"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc\") " pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57" Apr 28 19:18:50.744173 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.743121 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzlzs\" (UniqueName: \"kubernetes.io/projected/8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc-kube-api-access-gzlzs\") pod \"thanos-querier-b9f9f476c-rlv57\" (UID: \"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc\") " pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57" Apr 28 19:18:50.798387 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.798348 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57"
Apr 28 19:18:50.986727 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:18:50.986695 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c1a742d_18af_40cf_b3cc_c0c9f7fb64cc.slice/crio-0bf3642b19479cd0d4a436461424237b1ece7f961ca3e67a5a04a9763328789c WatchSource:0}: Error finding container 0bf3642b19479cd0d4a436461424237b1ece7f961ca3e67a5a04a9763328789c: Status 404 returned error can't find the container with id 0bf3642b19479cd0d4a436461424237b1ece7f961ca3e67a5a04a9763328789c
Apr 28 19:18:50.987458 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:50.987413 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-b9f9f476c-rlv57"]
Apr 28 19:18:51.227000 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:51.226858 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57" event={"ID":"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc","Type":"ContainerStarted","Data":"0bf3642b19479cd0d4a436461424237b1ece7f961ca3e67a5a04a9763328789c"}
Apr 28 19:18:51.228741 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:51.228714 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nv9pk" event={"ID":"ead0a897-3d24-481a-8227-839c80a17804","Type":"ContainerStarted","Data":"9f3bb5c2ee6f43bcbdc5956df02d414f93137682cc892249167d7e0522147923"}
Apr 28 19:18:51.228876 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:51.228745 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nv9pk" event={"ID":"ead0a897-3d24-481a-8227-839c80a17804","Type":"ContainerStarted","Data":"9153b34d69ae624c26259fa0afdb9995eafb38dbfe76992d8eba863b4787c618"}
Apr 28 19:18:51.230614 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:51.230588 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf" event={"ID":"3e9dbac1-8dfa-4366-ae62-c45f1598141a","Type":"ContainerStarted","Data":"b9eebf4f732cf3c8373d0d85398e48e49e136af34e84314b154b6f6d6d1674bf"}
Apr 28 19:18:51.230698 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:51.230621 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf" event={"ID":"3e9dbac1-8dfa-4366-ae62-c45f1598141a","Type":"ContainerStarted","Data":"b96bd884cc9e1ed485018ad3acb88d2f76ce7b0f8e2e1d388c1be60f89d47d61"}
Apr 28 19:18:51.230698 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:51.230635 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf" event={"ID":"3e9dbac1-8dfa-4366-ae62-c45f1598141a","Type":"ContainerStarted","Data":"b852b33dd33b8d40eeb9ec628ea5ac41b650c1248ee76932e55ccbbf72917282"}
Apr 28 19:18:51.271531 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:51.271469 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-nv9pk" podStartSLOduration=2.618892626 podStartE2EDuration="3.27145132s" podCreationTimestamp="2026-04-28 19:18:48 +0000 UTC" firstStartedPulling="2026-04-28 19:18:49.358006805 +0000 UTC m=+184.293051932" lastFinishedPulling="2026-04-28 19:18:50.010565482 +0000 UTC m=+184.945610626" observedRunningTime="2026-04-28 19:18:51.270577027 +0000 UTC m=+186.205622175" watchObservedRunningTime="2026-04-28 19:18:51.27145132 +0000 UTC m=+186.206496467"
Apr 28 19:18:51.343132 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:51.343082 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-qg8lf" podStartSLOduration=2.18738819 podStartE2EDuration="3.34306088s" podCreationTimestamp="2026-04-28 19:18:48 +0000 UTC" firstStartedPulling="2026-04-28 19:18:49.484802942 +0000 UTC m=+184.419848069" lastFinishedPulling="2026-04-28 19:18:50.640475629 +0000 UTC m=+185.575520759" observedRunningTime="2026-04-28 19:18:51.342790715 +0000 UTC m=+186.277835879" watchObservedRunningTime="2026-04-28 19:18:51.34306088 +0000 UTC m=+186.278106043"
Apr 28 19:18:54.245465 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:54.245436 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57" event={"ID":"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc","Type":"ContainerStarted","Data":"3a5716ff4e0014883e89e6044cd0b40fa01ba9c2940952d01297c8c827cdb504"}
Apr 28 19:18:54.245754 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:54.245477 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57" event={"ID":"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc","Type":"ContainerStarted","Data":"e25fb3b35c54aeae351a4021b8a1059fb03653656f8d88ee7b16bf680c46bdcb"}
Apr 28 19:18:54.245754 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:54.245487 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57" event={"ID":"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc","Type":"ContainerStarted","Data":"536eb5454fed9d241a6fe56cd96bc39587a5a94bad34d01dfccf149b6efe6635"}
Apr 28 19:18:55.252038 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:55.251992 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57" event={"ID":"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc","Type":"ContainerStarted","Data":"24725e74e9a1ca50ab079fb2f681681611b82e46053c6866b12d7665b72942b3"}
Apr 28 19:18:55.252558 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:55.252529 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57" event={"ID":"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc","Type":"ContainerStarted","Data":"ef45885eead98cfd4a96091a4ec3f80c08748144c343cb3e16eefee495a27dbe"}
Apr 28 19:18:55.252558 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:55.252558 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57" event={"ID":"8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc","Type":"ContainerStarted","Data":"cb53874bf4a10eb1f0a658a4fd82d066a002e083df4081e00b630c82ae953623"}
Apr 28 19:18:55.252742 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:55.252704 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57"
Apr 28 19:18:55.284987 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:55.284927 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57" podStartSLOduration=2.055432001 podStartE2EDuration="5.28489035s" podCreationTimestamp="2026-04-28 19:18:50 +0000 UTC" firstStartedPulling="2026-04-28 19:18:50.988692597 +0000 UTC m=+185.923737724" lastFinishedPulling="2026-04-28 19:18:54.218150938 +0000 UTC m=+189.153196073" observedRunningTime="2026-04-28 19:18:55.283218518 +0000 UTC m=+190.218263668" watchObservedRunningTime="2026-04-28 19:18:55.28489035 +0000 UTC m=+190.219935500"
Apr 28 19:18:58.361974 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.361888 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" podUID="27c4f3e0-e06d-4652-9e49-d180dcdb1901" containerName="registry" containerID="cri-o://fe4a534551c6df2ae74dba15f3bdee827755fe655ff8a4529d6646a3845902be" gracePeriod=30
Apr 28 19:18:58.601416 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.601395 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-c49b984f5-fxfzl"
Apr 28 19:18:58.668216 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.668142 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-certificates\") pod \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") "
Apr 28 19:18:58.668216 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.668181 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq6zp\" (UniqueName: \"kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-kube-api-access-cq6zp\") pod \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") "
Apr 28 19:18:58.668216 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.668197 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-bound-sa-token\") pod \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") "
Apr 28 19:18:58.668479 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.668239 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/27c4f3e0-e06d-4652-9e49-d180dcdb1901-image-registry-private-configuration\") pod \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") "
Apr 28 19:18:58.668479 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.668279 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/27c4f3e0-e06d-4652-9e49-d180dcdb1901-ca-trust-extracted\") pod \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") "
Apr 28 19:18:58.668479 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.668305 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/27c4f3e0-e06d-4652-9e49-d180dcdb1901-installation-pull-secrets\") pod \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") "
Apr 28 19:18:58.668479 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.668330 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/27c4f3e0-e06d-4652-9e49-d180dcdb1901-trusted-ca\") pod \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") "
Apr 28 19:18:58.668479 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.668355 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-tls\") pod \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\" (UID: \"27c4f3e0-e06d-4652-9e49-d180dcdb1901\") "
Apr 28 19:18:58.668706 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.668622 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "27c4f3e0-e06d-4652-9e49-d180dcdb1901" (UID: "27c4f3e0-e06d-4652-9e49-d180dcdb1901"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:18:58.669323 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.669279 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27c4f3e0-e06d-4652-9e49-d180dcdb1901-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "27c4f3e0-e06d-4652-9e49-d180dcdb1901" (UID: "27c4f3e0-e06d-4652-9e49-d180dcdb1901"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:18:58.670790 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.670754 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27c4f3e0-e06d-4652-9e49-d180dcdb1901-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "27c4f3e0-e06d-4652-9e49-d180dcdb1901" (UID: "27c4f3e0-e06d-4652-9e49-d180dcdb1901"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:18:58.670932 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.670877 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-kube-api-access-cq6zp" (OuterVolumeSpecName: "kube-api-access-cq6zp") pod "27c4f3e0-e06d-4652-9e49-d180dcdb1901" (UID: "27c4f3e0-e06d-4652-9e49-d180dcdb1901"). InnerVolumeSpecName "kube-api-access-cq6zp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:18:58.671012 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.670984 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "27c4f3e0-e06d-4652-9e49-d180dcdb1901" (UID: "27c4f3e0-e06d-4652-9e49-d180dcdb1901"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:18:58.671142 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.671124 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27c4f3e0-e06d-4652-9e49-d180dcdb1901-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "27c4f3e0-e06d-4652-9e49-d180dcdb1901" (UID: "27c4f3e0-e06d-4652-9e49-d180dcdb1901"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:18:58.671396 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.671370 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "27c4f3e0-e06d-4652-9e49-d180dcdb1901" (UID: "27c4f3e0-e06d-4652-9e49-d180dcdb1901"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:18:58.677271 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.677246 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27c4f3e0-e06d-4652-9e49-d180dcdb1901-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "27c4f3e0-e06d-4652-9e49-d180dcdb1901" (UID: "27c4f3e0-e06d-4652-9e49-d180dcdb1901"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 28 19:18:58.769819 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.769785 2582 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-certificates\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 28 19:18:58.769819 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.769816 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cq6zp\" (UniqueName: \"kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-kube-api-access-cq6zp\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 28 19:18:58.769819 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.769829 2582 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-bound-sa-token\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 28 19:18:58.770074 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.769841 2582 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/27c4f3e0-e06d-4652-9e49-d180dcdb1901-image-registry-private-configuration\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 28 19:18:58.770074 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.769851 2582 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/27c4f3e0-e06d-4652-9e49-d180dcdb1901-ca-trust-extracted\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 28 19:18:58.770074 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.769860 2582 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/27c4f3e0-e06d-4652-9e49-d180dcdb1901-installation-pull-secrets\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 28 19:18:58.770074 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.769870 2582 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/27c4f3e0-e06d-4652-9e49-d180dcdb1901-trusted-ca\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 28 19:18:58.770074 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:58.769879 2582 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/27c4f3e0-e06d-4652-9e49-d180dcdb1901-registry-tls\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 28 19:18:59.193433 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:59.193398 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-hws42"]
Apr 28 19:18:59.193725 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:59.193712 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27c4f3e0-e06d-4652-9e49-d180dcdb1901" containerName="registry"
Apr 28 19:18:59.193783 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:59.193727 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="27c4f3e0-e06d-4652-9e49-d180dcdb1901" containerName="registry"
Apr 28 19:18:59.193783 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:59.193777 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="27c4f3e0-e06d-4652-9e49-d180dcdb1901" containerName="registry"
Apr 28 19:18:59.199492 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:59.199474 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-hws42"
Apr 28 19:18:59.203397 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:59.203370 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 28 19:18:59.203538 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:59.203423 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 28 19:18:59.203625 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:59.203538 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-26bkg\""
Apr 28 19:18:59.211315 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:59.211294 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-hws42"]
Apr 28 19:18:59.265076 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:59.265047 2582 generic.go:358] "Generic (PLEG): container finished" podID="27c4f3e0-e06d-4652-9e49-d180dcdb1901" containerID="fe4a534551c6df2ae74dba15f3bdee827755fe655ff8a4529d6646a3845902be" exitCode=0
Apr 28 19:18:59.265245 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:59.265116 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-c49b984f5-fxfzl"
Apr 28 19:18:59.265245 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:59.265138 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" event={"ID":"27c4f3e0-e06d-4652-9e49-d180dcdb1901","Type":"ContainerDied","Data":"fe4a534551c6df2ae74dba15f3bdee827755fe655ff8a4529d6646a3845902be"}
Apr 28 19:18:59.265245 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:59.265174 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c49b984f5-fxfzl" event={"ID":"27c4f3e0-e06d-4652-9e49-d180dcdb1901","Type":"ContainerDied","Data":"f9c8c0100c6b264bdb5e9d7c1d38467129dfb431fe884bb9e2d42b6e42e0b8dd"}
Apr 28 19:18:59.265245 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:59.265191 2582 scope.go:117] "RemoveContainer" containerID="fe4a534551c6df2ae74dba15f3bdee827755fe655ff8a4529d6646a3845902be"
Apr 28 19:18:59.273650 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:59.273628 2582 scope.go:117] "RemoveContainer" containerID="fe4a534551c6df2ae74dba15f3bdee827755fe655ff8a4529d6646a3845902be"
Apr 28 19:18:59.273807 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:59.273789 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vt6z\" (UniqueName: \"kubernetes.io/projected/3f6140f3-f0ea-4c6c-aa32-0e5ca135571f-kube-api-access-7vt6z\") pod \"downloads-6bcc868b7-hws42\" (UID: \"3f6140f3-f0ea-4c6c-aa32-0e5ca135571f\") " pod="openshift-console/downloads-6bcc868b7-hws42"
Apr 28 19:18:59.273970 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:18:59.273951 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe4a534551c6df2ae74dba15f3bdee827755fe655ff8a4529d6646a3845902be\": container with ID starting with fe4a534551c6df2ae74dba15f3bdee827755fe655ff8a4529d6646a3845902be not found: ID does not exist" containerID="fe4a534551c6df2ae74dba15f3bdee827755fe655ff8a4529d6646a3845902be"
Apr 28 19:18:59.274022 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:59.273979 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe4a534551c6df2ae74dba15f3bdee827755fe655ff8a4529d6646a3845902be"} err="failed to get container status \"fe4a534551c6df2ae74dba15f3bdee827755fe655ff8a4529d6646a3845902be\": rpc error: code = NotFound desc = could not find container \"fe4a534551c6df2ae74dba15f3bdee827755fe655ff8a4529d6646a3845902be\": container with ID starting with fe4a534551c6df2ae74dba15f3bdee827755fe655ff8a4529d6646a3845902be not found: ID does not exist"
Apr 28 19:18:59.326323 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:59.326283 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-c49b984f5-fxfzl"]
Apr 28 19:18:59.346095 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:59.346063 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-c49b984f5-fxfzl"]
Apr 28 19:18:59.375330 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:59.375303 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7vt6z\" (UniqueName: \"kubernetes.io/projected/3f6140f3-f0ea-4c6c-aa32-0e5ca135571f-kube-api-access-7vt6z\") pod \"downloads-6bcc868b7-hws42\" (UID: \"3f6140f3-f0ea-4c6c-aa32-0e5ca135571f\") " pod="openshift-console/downloads-6bcc868b7-hws42"
Apr 28 19:18:59.387442 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:59.387411 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vt6z\" (UniqueName: \"kubernetes.io/projected/3f6140f3-f0ea-4c6c-aa32-0e5ca135571f-kube-api-access-7vt6z\") pod \"downloads-6bcc868b7-hws42\" (UID: \"3f6140f3-f0ea-4c6c-aa32-0e5ca135571f\") " pod="openshift-console/downloads-6bcc868b7-hws42"
Apr 28 19:18:59.508663 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:59.508625 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-hws42"
Apr 28 19:18:59.641252 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:59.641213 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27c4f3e0-e06d-4652-9e49-d180dcdb1901" path="/var/lib/kubelet/pods/27c4f3e0-e06d-4652-9e49-d180dcdb1901/volumes"
Apr 28 19:18:59.648683 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:18:59.648641 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-hws42"]
Apr 28 19:18:59.651985 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:18:59.651953 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f6140f3_f0ea_4c6c_aa32_0e5ca135571f.slice/crio-224dc39f8466330d7bb9341efe15b2a2b20b44cb1981e265b166eab8586fc5b2 WatchSource:0}: Error finding container 224dc39f8466330d7bb9341efe15b2a2b20b44cb1981e265b166eab8586fc5b2: Status 404 returned error can't find the container with id 224dc39f8466330d7bb9341efe15b2a2b20b44cb1981e265b166eab8586fc5b2
Apr 28 19:19:00.269149 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:00.269100 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-hws42" event={"ID":"3f6140f3-f0ea-4c6c-aa32-0e5ca135571f","Type":"ContainerStarted","Data":"224dc39f8466330d7bb9341efe15b2a2b20b44cb1981e265b166eab8586fc5b2"}
Apr 28 19:19:01.261967 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:01.261939 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-b9f9f476c-rlv57"
Apr 28 19:19:10.578527 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.578492 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6b8bcbc46f-958sx"]
Apr 28 19:19:10.581331 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.581300 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b8bcbc46f-958sx"
Apr 28 19:19:10.583915 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.583873 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 28 19:19:10.584074 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.584032 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 28 19:19:10.585283 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.585253 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 28 19:19:10.585388 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.585284 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 28 19:19:10.585388 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.585286 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 28 19:19:10.585388 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.585258 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-9ztwl\""
Apr 28 19:19:10.596401 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.596372 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b8bcbc46f-958sx"]
Apr 28 19:19:10.681983 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.681938 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/07d09f95-65ab-472a-8050-5a461c515306-console-oauth-config\") pod \"console-6b8bcbc46f-958sx\" (UID: \"07d09f95-65ab-472a-8050-5a461c515306\") " pod="openshift-console/console-6b8bcbc46f-958sx"
Apr 28 19:19:10.682162 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.682021 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/07d09f95-65ab-472a-8050-5a461c515306-console-serving-cert\") pod \"console-6b8bcbc46f-958sx\" (UID: \"07d09f95-65ab-472a-8050-5a461c515306\") " pod="openshift-console/console-6b8bcbc46f-958sx"
Apr 28 19:19:10.682162 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.682150 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/07d09f95-65ab-472a-8050-5a461c515306-service-ca\") pod \"console-6b8bcbc46f-958sx\" (UID: \"07d09f95-65ab-472a-8050-5a461c515306\") " pod="openshift-console/console-6b8bcbc46f-958sx"
Apr 28 19:19:10.682264 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.682213 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9vjt\" (UniqueName: \"kubernetes.io/projected/07d09f95-65ab-472a-8050-5a461c515306-kube-api-access-s9vjt\") pod \"console-6b8bcbc46f-958sx\" (UID: \"07d09f95-65ab-472a-8050-5a461c515306\") " pod="openshift-console/console-6b8bcbc46f-958sx"
Apr 28 19:19:10.682264 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.682255 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/07d09f95-65ab-472a-8050-5a461c515306-console-config\") pod \"console-6b8bcbc46f-958sx\" (UID: \"07d09f95-65ab-472a-8050-5a461c515306\") " pod="openshift-console/console-6b8bcbc46f-958sx"
Apr 28 19:19:10.682369 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.682298 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/07d09f95-65ab-472a-8050-5a461c515306-oauth-serving-cert\") pod \"console-6b8bcbc46f-958sx\" (UID: \"07d09f95-65ab-472a-8050-5a461c515306\") " pod="openshift-console/console-6b8bcbc46f-958sx" Apr 28 19:19:10.783483 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.783451 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/07d09f95-65ab-472a-8050-5a461c515306-console-oauth-config\") pod \"console-6b8bcbc46f-958sx\" (UID: \"07d09f95-65ab-472a-8050-5a461c515306\") " pod="openshift-console/console-6b8bcbc46f-958sx" Apr 28 19:19:10.783661 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.783500 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/07d09f95-65ab-472a-8050-5a461c515306-console-serving-cert\") pod \"console-6b8bcbc46f-958sx\" (UID: \"07d09f95-65ab-472a-8050-5a461c515306\") " pod="openshift-console/console-6b8bcbc46f-958sx" Apr 28 19:19:10.783661 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.783549 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/07d09f95-65ab-472a-8050-5a461c515306-service-ca\") pod \"console-6b8bcbc46f-958sx\" (UID: \"07d09f95-65ab-472a-8050-5a461c515306\") " pod="openshift-console/console-6b8bcbc46f-958sx" Apr 28 19:19:10.783661 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.783588 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s9vjt\" (UniqueName: \"kubernetes.io/projected/07d09f95-65ab-472a-8050-5a461c515306-kube-api-access-s9vjt\") pod \"console-6b8bcbc46f-958sx\" (UID: \"07d09f95-65ab-472a-8050-5a461c515306\") " pod="openshift-console/console-6b8bcbc46f-958sx" Apr 28 19:19:10.783661 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.783621 2582 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/07d09f95-65ab-472a-8050-5a461c515306-console-config\") pod \"console-6b8bcbc46f-958sx\" (UID: \"07d09f95-65ab-472a-8050-5a461c515306\") " pod="openshift-console/console-6b8bcbc46f-958sx" Apr 28 19:19:10.783661 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.783658 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/07d09f95-65ab-472a-8050-5a461c515306-oauth-serving-cert\") pod \"console-6b8bcbc46f-958sx\" (UID: \"07d09f95-65ab-472a-8050-5a461c515306\") " pod="openshift-console/console-6b8bcbc46f-958sx" Apr 28 19:19:10.784734 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.784667 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/07d09f95-65ab-472a-8050-5a461c515306-service-ca\") pod \"console-6b8bcbc46f-958sx\" (UID: \"07d09f95-65ab-472a-8050-5a461c515306\") " pod="openshift-console/console-6b8bcbc46f-958sx" Apr 28 19:19:10.784945 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.784917 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/07d09f95-65ab-472a-8050-5a461c515306-oauth-serving-cert\") pod \"console-6b8bcbc46f-958sx\" (UID: \"07d09f95-65ab-472a-8050-5a461c515306\") " pod="openshift-console/console-6b8bcbc46f-958sx" Apr 28 19:19:10.785068 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.785010 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/07d09f95-65ab-472a-8050-5a461c515306-console-config\") pod \"console-6b8bcbc46f-958sx\" (UID: \"07d09f95-65ab-472a-8050-5a461c515306\") " pod="openshift-console/console-6b8bcbc46f-958sx" Apr 28 19:19:10.786332 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.786311 2582 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/07d09f95-65ab-472a-8050-5a461c515306-console-oauth-config\") pod \"console-6b8bcbc46f-958sx\" (UID: \"07d09f95-65ab-472a-8050-5a461c515306\") " pod="openshift-console/console-6b8bcbc46f-958sx" Apr 28 19:19:10.786753 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.786734 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/07d09f95-65ab-472a-8050-5a461c515306-console-serving-cert\") pod \"console-6b8bcbc46f-958sx\" (UID: \"07d09f95-65ab-472a-8050-5a461c515306\") " pod="openshift-console/console-6b8bcbc46f-958sx" Apr 28 19:19:10.792595 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.792573 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9vjt\" (UniqueName: \"kubernetes.io/projected/07d09f95-65ab-472a-8050-5a461c515306-kube-api-access-s9vjt\") pod \"console-6b8bcbc46f-958sx\" (UID: \"07d09f95-65ab-472a-8050-5a461c515306\") " pod="openshift-console/console-6b8bcbc46f-958sx" Apr 28 19:19:10.892390 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:10.892301 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b8bcbc46f-958sx" Apr 28 19:19:16.307977 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:16.307951 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6b8bcbc46f-958sx"] Apr 28 19:19:16.309825 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:19:16.309798 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07d09f95_65ab_472a_8050_5a461c515306.slice/crio-49c9f464e38a5afc72f915b9d2598a80a2c146c63fff6c0d37626d5b3df8f0c5 WatchSource:0}: Error finding container 49c9f464e38a5afc72f915b9d2598a80a2c146c63fff6c0d37626d5b3df8f0c5: Status 404 returned error can't find the container with id 49c9f464e38a5afc72f915b9d2598a80a2c146c63fff6c0d37626d5b3df8f0c5 Apr 28 19:19:16.318282 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:16.318254 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-hws42" event={"ID":"3f6140f3-f0ea-4c6c-aa32-0e5ca135571f","Type":"ContainerStarted","Data":"a4083e513d01d09818e76e80c834068c64b65831c4b368752540f1ffb4a2f9a5"} Apr 28 19:19:16.318426 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:16.318407 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-hws42" Apr 28 19:19:16.319468 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:16.319444 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b8bcbc46f-958sx" event={"ID":"07d09f95-65ab-472a-8050-5a461c515306","Type":"ContainerStarted","Data":"49c9f464e38a5afc72f915b9d2598a80a2c146c63fff6c0d37626d5b3df8f0c5"} Apr 28 19:19:16.319933 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:16.319875 2582 patch_prober.go:28] interesting pod/downloads-6bcc868b7-hws42 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.133.0.18:8080/\": dial tcp 
10.133.0.18:8080: connect: connection refused" start-of-body= Apr 28 19:19:16.320019 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:16.319951 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-6bcc868b7-hws42" podUID="3f6140f3-f0ea-4c6c-aa32-0e5ca135571f" containerName="download-server" probeResult="failure" output="Get \"http://10.133.0.18:8080/\": dial tcp 10.133.0.18:8080: connect: connection refused" Apr 28 19:19:16.357202 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:16.357145 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-hws42" podStartSLOduration=0.820312863 podStartE2EDuration="17.357129027s" podCreationTimestamp="2026-04-28 19:18:59 +0000 UTC" firstStartedPulling="2026-04-28 19:18:59.65394322 +0000 UTC m=+194.588988351" lastFinishedPulling="2026-04-28 19:19:16.190759385 +0000 UTC m=+211.125804515" observedRunningTime="2026-04-28 19:19:16.35495861 +0000 UTC m=+211.290003753" watchObservedRunningTime="2026-04-28 19:19:16.357129027 +0000 UTC m=+211.292174179" Apr 28 19:19:17.338816 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:17.338778 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-hws42" Apr 28 19:19:19.004271 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.004229 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-58886b789f-kcs45"] Apr 28 19:19:19.047295 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.047222 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58886b789f-kcs45"] Apr 28 19:19:19.047612 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.047575 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:19.057680 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.057653 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 28 19:19:19.164107 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.164070 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-trusted-ca-bundle\") pod \"console-58886b789f-kcs45\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") " pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:19.164349 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.164124 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-console-serving-cert\") pod \"console-58886b789f-kcs45\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") " pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:19.164349 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.164196 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-console-oauth-config\") pod \"console-58886b789f-kcs45\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") " pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:19.164349 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.164234 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-service-ca\") pod \"console-58886b789f-kcs45\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") " 
pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:19.164349 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.164291 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-console-config\") pod \"console-58886b789f-kcs45\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") " pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:19.164349 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.164321 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-847fb\" (UniqueName: \"kubernetes.io/projected/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-kube-api-access-847fb\") pod \"console-58886b789f-kcs45\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") " pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:19.164628 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.164355 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-oauth-serving-cert\") pod \"console-58886b789f-kcs45\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") " pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:19.265340 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.265268 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-trusted-ca-bundle\") pod \"console-58886b789f-kcs45\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") " pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:19.265340 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.265319 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-console-serving-cert\") pod \"console-58886b789f-kcs45\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") " pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:19.265671 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.265370 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-console-oauth-config\") pod \"console-58886b789f-kcs45\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") " pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:19.265671 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.265399 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-service-ca\") pod \"console-58886b789f-kcs45\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") " pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:19.265671 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.265444 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-console-config\") pod \"console-58886b789f-kcs45\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") " pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:19.265671 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.265471 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-847fb\" (UniqueName: \"kubernetes.io/projected/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-kube-api-access-847fb\") pod \"console-58886b789f-kcs45\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") " pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:19.265671 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.265499 2582 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-oauth-serving-cert\") pod \"console-58886b789f-kcs45\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") " pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:19.268269 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.266247 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-oauth-serving-cert\") pod \"console-58886b789f-kcs45\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") " pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:19.268269 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.267031 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-trusted-ca-bundle\") pod \"console-58886b789f-kcs45\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") " pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:19.268269 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.267544 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-service-ca\") pod \"console-58886b789f-kcs45\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") " pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:19.268269 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.268103 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-console-config\") pod \"console-58886b789f-kcs45\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") " pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:19.270003 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.269848 2582 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-console-serving-cert\") pod \"console-58886b789f-kcs45\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") " pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:19.270314 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.270293 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-console-oauth-config\") pod \"console-58886b789f-kcs45\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") " pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:19.280184 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.280114 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-847fb\" (UniqueName: \"kubernetes.io/projected/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-kube-api-access-847fb\") pod \"console-58886b789f-kcs45\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") " pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:19.361831 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.361329 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:19.711452 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:19.711416 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58886b789f-kcs45"] Apr 28 19:19:19.714657 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:19:19.714630 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbde4648b_b5b5_45ec_9f1e_c54a811ffbca.slice/crio-9cd231f5b291cba6bf2af9d2020e0a6524a82b204c6bc6fcb76ee8b811b7dc52 WatchSource:0}: Error finding container 9cd231f5b291cba6bf2af9d2020e0a6524a82b204c6bc6fcb76ee8b811b7dc52: Status 404 returned error can't find the container with id 9cd231f5b291cba6bf2af9d2020e0a6524a82b204c6bc6fcb76ee8b811b7dc52 Apr 28 19:19:20.345206 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:20.345174 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b8bcbc46f-958sx" event={"ID":"07d09f95-65ab-472a-8050-5a461c515306","Type":"ContainerStarted","Data":"81b4221335e99ff4d1ca7ce83f9180a4c78ec5d5803d2bb1891fa63eee4f35ac"} Apr 28 19:19:20.346616 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:20.346586 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58886b789f-kcs45" event={"ID":"bde4648b-b5b5-45ec-9f1e-c54a811ffbca","Type":"ContainerStarted","Data":"9cd231f5b291cba6bf2af9d2020e0a6524a82b204c6bc6fcb76ee8b811b7dc52"} Apr 28 19:19:20.370298 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:20.370253 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6b8bcbc46f-958sx" podStartSLOduration=6.673189692 podStartE2EDuration="10.370236517s" podCreationTimestamp="2026-04-28 19:19:10 +0000 UTC" firstStartedPulling="2026-04-28 19:19:16.311956298 +0000 UTC m=+211.247001439" lastFinishedPulling="2026-04-28 19:19:20.009003127 +0000 UTC m=+214.944048264" 
observedRunningTime="2026-04-28 19:19:20.368960311 +0000 UTC m=+215.304005462" watchObservedRunningTime="2026-04-28 19:19:20.370236517 +0000 UTC m=+215.305281667" Apr 28 19:19:20.893353 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:20.893319 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6b8bcbc46f-958sx" Apr 28 19:19:20.893353 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:20.893362 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6b8bcbc46f-958sx" Apr 28 19:19:20.899185 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:20.899159 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6b8bcbc46f-958sx" Apr 28 19:19:21.351217 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:21.351175 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58886b789f-kcs45" event={"ID":"bde4648b-b5b5-45ec-9f1e-c54a811ffbca","Type":"ContainerStarted","Data":"bfdae4700e43e1c27b6960a2bbd8e636e81d1bd4a212d2142de9bbd8f8b91507"} Apr 28 19:19:21.355743 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:21.355718 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6b8bcbc46f-958sx" Apr 28 19:19:21.397578 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:21.397520 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-58886b789f-kcs45" podStartSLOduration=2.839318796 podStartE2EDuration="3.39750327s" podCreationTimestamp="2026-04-28 19:19:18 +0000 UTC" firstStartedPulling="2026-04-28 19:19:19.716941696 +0000 UTC m=+214.651986822" lastFinishedPulling="2026-04-28 19:19:20.275126159 +0000 UTC m=+215.210171296" observedRunningTime="2026-04-28 19:19:21.396439532 +0000 UTC m=+216.331484681" watchObservedRunningTime="2026-04-28 19:19:21.39750327 +0000 UTC m=+216.332548422" Apr 28 19:19:29.362293 
ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:29.362256 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:29.362293 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:29.362297 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:29.366858 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:29.366836 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:29.379296 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:29.379275 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:19:29.473453 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:29.473417 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b8bcbc46f-958sx"] Apr 28 19:19:40.409634 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:40.409597 2582 generic.go:358] "Generic (PLEG): container finished" podID="18ad5e91-b88a-4b8c-88a2-c52f5fe8a9f7" containerID="1252b06e00ade898a11caf7ec0ee9a73f8a2c9eb13942745d025cfb3d16f8007" exitCode=0 Apr 28 19:19:40.410060 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:40.409673 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f2h6c" event={"ID":"18ad5e91-b88a-4b8c-88a2-c52f5fe8a9f7","Type":"ContainerDied","Data":"1252b06e00ade898a11caf7ec0ee9a73f8a2c9eb13942745d025cfb3d16f8007"} Apr 28 19:19:40.410060 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:40.410007 2582 scope.go:117] "RemoveContainer" containerID="1252b06e00ade898a11caf7ec0ee9a73f8a2c9eb13942745d025cfb3d16f8007" Apr 28 19:19:41.413807 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:41.413774 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f2h6c" event={"ID":"18ad5e91-b88a-4b8c-88a2-c52f5fe8a9f7","Type":"ContainerStarted","Data":"953bce2ff27942bf9e3bf4031b9c7ced54d8997e30461093aad9a87248615666"} Apr 28 19:19:54.495108 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:54.495040 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6b8bcbc46f-958sx" podUID="07d09f95-65ab-472a-8050-5a461c515306" containerName="console" containerID="cri-o://81b4221335e99ff4d1ca7ce83f9180a4c78ec5d5803d2bb1891fa63eee4f35ac" gracePeriod=15 Apr 28 19:19:54.768702 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:54.768674 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b8bcbc46f-958sx_07d09f95-65ab-472a-8050-5a461c515306/console/0.log" Apr 28 19:19:54.768828 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:54.768736 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6b8bcbc46f-958sx"
Apr 28 19:19:54.876830 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:54.876791 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/07d09f95-65ab-472a-8050-5a461c515306-console-serving-cert\") pod \"07d09f95-65ab-472a-8050-5a461c515306\" (UID: \"07d09f95-65ab-472a-8050-5a461c515306\") "
Apr 28 19:19:54.877047 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:54.876850 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/07d09f95-65ab-472a-8050-5a461c515306-service-ca\") pod \"07d09f95-65ab-472a-8050-5a461c515306\" (UID: \"07d09f95-65ab-472a-8050-5a461c515306\") "
Apr 28 19:19:54.877047 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:54.876914 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/07d09f95-65ab-472a-8050-5a461c515306-console-config\") pod \"07d09f95-65ab-472a-8050-5a461c515306\" (UID: \"07d09f95-65ab-472a-8050-5a461c515306\") "
Apr 28 19:19:54.877047 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:54.876947 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/07d09f95-65ab-472a-8050-5a461c515306-console-oauth-config\") pod \"07d09f95-65ab-472a-8050-5a461c515306\" (UID: \"07d09f95-65ab-472a-8050-5a461c515306\") "
Apr 28 19:19:54.877047 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:54.876977 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9vjt\" (UniqueName: \"kubernetes.io/projected/07d09f95-65ab-472a-8050-5a461c515306-kube-api-access-s9vjt\") pod \"07d09f95-65ab-472a-8050-5a461c515306\" (UID: \"07d09f95-65ab-472a-8050-5a461c515306\") "
Apr 28 19:19:54.877047 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:54.877002 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/07d09f95-65ab-472a-8050-5a461c515306-oauth-serving-cert\") pod \"07d09f95-65ab-472a-8050-5a461c515306\" (UID: \"07d09f95-65ab-472a-8050-5a461c515306\") "
Apr 28 19:19:54.877405 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:54.877373 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07d09f95-65ab-472a-8050-5a461c515306-service-ca" (OuterVolumeSpecName: "service-ca") pod "07d09f95-65ab-472a-8050-5a461c515306" (UID: "07d09f95-65ab-472a-8050-5a461c515306"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:19:54.877545 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:54.877401 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07d09f95-65ab-472a-8050-5a461c515306-console-config" (OuterVolumeSpecName: "console-config") pod "07d09f95-65ab-472a-8050-5a461c515306" (UID: "07d09f95-65ab-472a-8050-5a461c515306"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:19:54.877545 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:54.877473 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07d09f95-65ab-472a-8050-5a461c515306-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "07d09f95-65ab-472a-8050-5a461c515306" (UID: "07d09f95-65ab-472a-8050-5a461c515306"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:19:54.879328 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:54.879302 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d09f95-65ab-472a-8050-5a461c515306-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "07d09f95-65ab-472a-8050-5a461c515306" (UID: "07d09f95-65ab-472a-8050-5a461c515306"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:19:54.879423 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:54.879331 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d09f95-65ab-472a-8050-5a461c515306-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "07d09f95-65ab-472a-8050-5a461c515306" (UID: "07d09f95-65ab-472a-8050-5a461c515306"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 28 19:19:54.879423 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:54.879338 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07d09f95-65ab-472a-8050-5a461c515306-kube-api-access-s9vjt" (OuterVolumeSpecName: "kube-api-access-s9vjt") pod "07d09f95-65ab-472a-8050-5a461c515306" (UID: "07d09f95-65ab-472a-8050-5a461c515306"). InnerVolumeSpecName "kube-api-access-s9vjt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 28 19:19:54.978185 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:54.978144 2582 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/07d09f95-65ab-472a-8050-5a461c515306-console-oauth-config\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 28 19:19:54.978185 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:54.978180 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s9vjt\" (UniqueName: \"kubernetes.io/projected/07d09f95-65ab-472a-8050-5a461c515306-kube-api-access-s9vjt\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 28 19:19:54.978185 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:54.978196 2582 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/07d09f95-65ab-472a-8050-5a461c515306-oauth-serving-cert\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 28 19:19:54.978419 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:54.978208 2582 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/07d09f95-65ab-472a-8050-5a461c515306-console-serving-cert\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 28 19:19:54.978419 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:54.978221 2582 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/07d09f95-65ab-472a-8050-5a461c515306-service-ca\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 28 19:19:54.978419 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:54.978234 2582 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/07d09f95-65ab-472a-8050-5a461c515306-console-config\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\""
Apr 28 19:19:55.456785 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:55.456758 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6b8bcbc46f-958sx_07d09f95-65ab-472a-8050-5a461c515306/console/0.log"
Apr 28 19:19:55.457005 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:55.456798 2582 generic.go:358] "Generic (PLEG): container finished" podID="07d09f95-65ab-472a-8050-5a461c515306" containerID="81b4221335e99ff4d1ca7ce83f9180a4c78ec5d5803d2bb1891fa63eee4f35ac" exitCode=2
Apr 28 19:19:55.457005 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:55.456832 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b8bcbc46f-958sx" event={"ID":"07d09f95-65ab-472a-8050-5a461c515306","Type":"ContainerDied","Data":"81b4221335e99ff4d1ca7ce83f9180a4c78ec5d5803d2bb1891fa63eee4f35ac"}
Apr 28 19:19:55.457005 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:55.456872 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6b8bcbc46f-958sx" event={"ID":"07d09f95-65ab-472a-8050-5a461c515306","Type":"ContainerDied","Data":"49c9f464e38a5afc72f915b9d2598a80a2c146c63fff6c0d37626d5b3df8f0c5"}
Apr 28 19:19:55.457005 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:55.456881 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6b8bcbc46f-958sx"
Apr 28 19:19:55.457005 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:55.456888 2582 scope.go:117] "RemoveContainer" containerID="81b4221335e99ff4d1ca7ce83f9180a4c78ec5d5803d2bb1891fa63eee4f35ac"
Apr 28 19:19:55.465638 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:55.465620 2582 scope.go:117] "RemoveContainer" containerID="81b4221335e99ff4d1ca7ce83f9180a4c78ec5d5803d2bb1891fa63eee4f35ac"
Apr 28 19:19:55.465911 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:19:55.465878 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81b4221335e99ff4d1ca7ce83f9180a4c78ec5d5803d2bb1891fa63eee4f35ac\": container with ID starting with 81b4221335e99ff4d1ca7ce83f9180a4c78ec5d5803d2bb1891fa63eee4f35ac not found: ID does not exist" containerID="81b4221335e99ff4d1ca7ce83f9180a4c78ec5d5803d2bb1891fa63eee4f35ac"
Apr 28 19:19:55.465983 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:55.465931 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81b4221335e99ff4d1ca7ce83f9180a4c78ec5d5803d2bb1891fa63eee4f35ac"} err="failed to get container status \"81b4221335e99ff4d1ca7ce83f9180a4c78ec5d5803d2bb1891fa63eee4f35ac\": rpc error: code = NotFound desc = could not find container \"81b4221335e99ff4d1ca7ce83f9180a4c78ec5d5803d2bb1891fa63eee4f35ac\": container with ID starting with 81b4221335e99ff4d1ca7ce83f9180a4c78ec5d5803d2bb1891fa63eee4f35ac not found: ID does not exist"
Apr 28 19:19:55.480816 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:55.480790 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6b8bcbc46f-958sx"]
Apr 28 19:19:55.489657 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:55.489634 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6b8bcbc46f-958sx"]
Apr 28 19:19:55.636331 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:55.636299 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07d09f95-65ab-472a-8050-5a461c515306" path="/var/lib/kubelet/pods/07d09f95-65ab-472a-8050-5a461c515306/volumes"
Apr 28 19:19:56.492817 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:56.492780 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs\") pod \"network-metrics-daemon-q2wj9\" (UID: \"ae2a816c-4f04-45d7-bb27-80786c738721\") " pod="openshift-multus/network-metrics-daemon-q2wj9"
Apr 28 19:19:56.495132 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:56.495108 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ae2a816c-4f04-45d7-bb27-80786c738721-metrics-certs\") pod \"network-metrics-daemon-q2wj9\" (UID: \"ae2a816c-4f04-45d7-bb27-80786c738721\") " pod="openshift-multus/network-metrics-daemon-q2wj9"
Apr 28 19:19:56.643494 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:56.643465 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-djdhf\""
Apr 28 19:19:56.651018 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:56.651000 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-q2wj9"
Apr 28 19:19:56.777271 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:56.777176 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-q2wj9"]
Apr 28 19:19:56.780247 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:19:56.780218 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae2a816c_4f04_45d7_bb27_80786c738721.slice/crio-4c43c36a2a81a92c98b17a79906e1e3745af883a57ff662d35d1dd3ac9c1a236 WatchSource:0}: Error finding container 4c43c36a2a81a92c98b17a79906e1e3745af883a57ff662d35d1dd3ac9c1a236: Status 404 returned error can't find the container with id 4c43c36a2a81a92c98b17a79906e1e3745af883a57ff662d35d1dd3ac9c1a236
Apr 28 19:19:57.464591 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:57.464548 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q2wj9" event={"ID":"ae2a816c-4f04-45d7-bb27-80786c738721","Type":"ContainerStarted","Data":"4c43c36a2a81a92c98b17a79906e1e3745af883a57ff662d35d1dd3ac9c1a236"}
Apr 28 19:19:58.468811 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:58.468782 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q2wj9" event={"ID":"ae2a816c-4f04-45d7-bb27-80786c738721","Type":"ContainerStarted","Data":"01a9c63666d7cb04001f79ac525a0506d1e2e790f377b689a85c78c6e4b6d86e"}
Apr 28 19:19:59.473268 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:59.473223 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-q2wj9" event={"ID":"ae2a816c-4f04-45d7-bb27-80786c738721","Type":"ContainerStarted","Data":"2ab53689363a910bfc9a91cef6972e7364adbd0484606a05b29e78cfd4ddd81a"}
Apr 28 19:19:59.544342 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:19:59.544291 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-q2wj9" podStartSLOduration=253.162034885 podStartE2EDuration="4m14.544273608s" podCreationTimestamp="2026-04-28 19:15:45 +0000 UTC" firstStartedPulling="2026-04-28 19:19:56.782061296 +0000 UTC m=+251.717106426" lastFinishedPulling="2026-04-28 19:19:58.164300021 +0000 UTC m=+253.099345149" observedRunningTime="2026-04-28 19:19:59.544252594 +0000 UTC m=+254.479297743" watchObservedRunningTime="2026-04-28 19:19:59.544273608 +0000 UTC m=+254.479318761"
Apr 28 19:20:19.267182 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.267146 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-748c689b74-m628v"]
Apr 28 19:20:19.267642 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.267440 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07d09f95-65ab-472a-8050-5a461c515306" containerName="console"
Apr 28 19:20:19.267642 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.267452 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d09f95-65ab-472a-8050-5a461c515306" containerName="console"
Apr 28 19:20:19.267642 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.267517 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="07d09f95-65ab-472a-8050-5a461c515306" containerName="console"
Apr 28 19:20:19.270479 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.270461 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:19.285211 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.285182 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-748c689b74-m628v"]
Apr 28 19:20:19.372813 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.372780 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1aae57c4-15f1-4852-bce2-ca391f1b23c6-trusted-ca-bundle\") pod \"console-748c689b74-m628v\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:19.373029 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.372825 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1aae57c4-15f1-4852-bce2-ca391f1b23c6-console-oauth-config\") pod \"console-748c689b74-m628v\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:19.373029 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.372849 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmtg5\" (UniqueName: \"kubernetes.io/projected/1aae57c4-15f1-4852-bce2-ca391f1b23c6-kube-api-access-kmtg5\") pod \"console-748c689b74-m628v\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:19.373029 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.372964 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1aae57c4-15f1-4852-bce2-ca391f1b23c6-console-config\") pod \"console-748c689b74-m628v\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:19.373029 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.373015 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1aae57c4-15f1-4852-bce2-ca391f1b23c6-console-serving-cert\") pod \"console-748c689b74-m628v\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:19.373231 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.373049 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1aae57c4-15f1-4852-bce2-ca391f1b23c6-service-ca\") pod \"console-748c689b74-m628v\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:19.373231 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.373103 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1aae57c4-15f1-4852-bce2-ca391f1b23c6-oauth-serving-cert\") pod \"console-748c689b74-m628v\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:19.474221 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.474172 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1aae57c4-15f1-4852-bce2-ca391f1b23c6-console-config\") pod \"console-748c689b74-m628v\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:19.474221 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.474227 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1aae57c4-15f1-4852-bce2-ca391f1b23c6-console-serving-cert\") pod \"console-748c689b74-m628v\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:19.474438 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.474348 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1aae57c4-15f1-4852-bce2-ca391f1b23c6-service-ca\") pod \"console-748c689b74-m628v\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:19.474438 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.474422 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1aae57c4-15f1-4852-bce2-ca391f1b23c6-oauth-serving-cert\") pod \"console-748c689b74-m628v\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:19.474548 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.474499 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1aae57c4-15f1-4852-bce2-ca391f1b23c6-trusted-ca-bundle\") pod \"console-748c689b74-m628v\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:19.474602 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.474555 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1aae57c4-15f1-4852-bce2-ca391f1b23c6-console-oauth-config\") pod \"console-748c689b74-m628v\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:19.474602 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.474583 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kmtg5\" (UniqueName: \"kubernetes.io/projected/1aae57c4-15f1-4852-bce2-ca391f1b23c6-kube-api-access-kmtg5\") pod \"console-748c689b74-m628v\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:19.475084 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.475060 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1aae57c4-15f1-4852-bce2-ca391f1b23c6-console-config\") pod \"console-748c689b74-m628v\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:19.475084 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.475076 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1aae57c4-15f1-4852-bce2-ca391f1b23c6-oauth-serving-cert\") pod \"console-748c689b74-m628v\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:19.475217 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.475098 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1aae57c4-15f1-4852-bce2-ca391f1b23c6-service-ca\") pod \"console-748c689b74-m628v\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:19.475454 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.475436 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1aae57c4-15f1-4852-bce2-ca391f1b23c6-trusted-ca-bundle\") pod \"console-748c689b74-m628v\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:19.476800 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.476774 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1aae57c4-15f1-4852-bce2-ca391f1b23c6-console-serving-cert\") pod \"console-748c689b74-m628v\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:19.477158 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.477141 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1aae57c4-15f1-4852-bce2-ca391f1b23c6-console-oauth-config\") pod \"console-748c689b74-m628v\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:19.486300 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.486271 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmtg5\" (UniqueName: \"kubernetes.io/projected/1aae57c4-15f1-4852-bce2-ca391f1b23c6-kube-api-access-kmtg5\") pod \"console-748c689b74-m628v\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:19.580179 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.580086 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:19.715258 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:19.715228 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-748c689b74-m628v"]
Apr 28 19:20:19.717389 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:20:19.717360 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1aae57c4_15f1_4852_bce2_ca391f1b23c6.slice/crio-b69b996d7b7075e04e5f78f739cc9d3ce2652225d3f46aec8d02308878462160 WatchSource:0}: Error finding container b69b996d7b7075e04e5f78f739cc9d3ce2652225d3f46aec8d02308878462160: Status 404 returned error can't find the container with id b69b996d7b7075e04e5f78f739cc9d3ce2652225d3f46aec8d02308878462160
Apr 28 19:20:20.537563 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:20.537527 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-748c689b74-m628v" event={"ID":"1aae57c4-15f1-4852-bce2-ca391f1b23c6","Type":"ContainerStarted","Data":"bce5ff5bacbd3e8bd51d858014f1ff28981dc6bd4ad89eb92bab77165d11a0c4"}
Apr 28 19:20:20.537563 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:20.537563 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-748c689b74-m628v" event={"ID":"1aae57c4-15f1-4852-bce2-ca391f1b23c6","Type":"ContainerStarted","Data":"b69b996d7b7075e04e5f78f739cc9d3ce2652225d3f46aec8d02308878462160"}
Apr 28 19:20:20.558402 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:20.558355 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-748c689b74-m628v" podStartSLOduration=1.558340853 podStartE2EDuration="1.558340853s" podCreationTimestamp="2026-04-28 19:20:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:20:20.557548185 +0000 UTC m=+275.492593341" watchObservedRunningTime="2026-04-28 19:20:20.558340853 +0000 UTC m=+275.493386002"
Apr 28 19:20:29.580792 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:29.580757 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:29.581410 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:29.580853 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:29.585694 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:29.585670 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:30.569841 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:30.569809 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-748c689b74-m628v"
Apr 28 19:20:30.631471 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:30.631434 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58886b789f-kcs45"]
Apr 28 19:20:45.517640 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:45.517608 2582 kubelet.go:1628] "Image garbage collection succeeded"
Apr 28 19:20:48.440164 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:48.440123 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc"]
Apr 28 19:20:48.443392 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:48.443376 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc"
Apr 28 19:20:48.446404 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:48.446384 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 28 19:20:48.446404 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:48.446395 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 28 19:20:48.447543 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:48.447530 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-mb8qp\""
Apr 28 19:20:48.456999 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:48.456974 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc"]
Apr 28 19:20:48.505130 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:48.505094 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9fadc980-eb36-4e53-992d-3de6e831461e-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc\" (UID: \"9fadc980-eb36-4e53-992d-3de6e831461e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc"
Apr 28 19:20:48.505301 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:48.505140 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4cxr\" (UniqueName: \"kubernetes.io/projected/9fadc980-eb36-4e53-992d-3de6e831461e-kube-api-access-s4cxr\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc\" (UID: \"9fadc980-eb36-4e53-992d-3de6e831461e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc"
Apr 28 19:20:48.505301 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:48.505230 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9fadc980-eb36-4e53-992d-3de6e831461e-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc\" (UID: \"9fadc980-eb36-4e53-992d-3de6e831461e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc"
Apr 28 19:20:48.606195 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:48.606144 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9fadc980-eb36-4e53-992d-3de6e831461e-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc\" (UID: \"9fadc980-eb36-4e53-992d-3de6e831461e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc"
Apr 28 19:20:48.606343 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:48.606224 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9fadc980-eb36-4e53-992d-3de6e831461e-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc\" (UID: \"9fadc980-eb36-4e53-992d-3de6e831461e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc"
Apr 28 19:20:48.606343 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:48.606270 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4cxr\" (UniqueName: \"kubernetes.io/projected/9fadc980-eb36-4e53-992d-3de6e831461e-kube-api-access-s4cxr\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc\" (UID: \"9fadc980-eb36-4e53-992d-3de6e831461e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc"
Apr 28 19:20:48.606551 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:48.606530 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9fadc980-eb36-4e53-992d-3de6e831461e-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc\" (UID: \"9fadc980-eb36-4e53-992d-3de6e831461e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc"
Apr 28 19:20:48.606589 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:48.606558 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9fadc980-eb36-4e53-992d-3de6e831461e-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc\" (UID: \"9fadc980-eb36-4e53-992d-3de6e831461e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc"
Apr 28 19:20:48.616322 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:48.616284 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4cxr\" (UniqueName: \"kubernetes.io/projected/9fadc980-eb36-4e53-992d-3de6e831461e-kube-api-access-s4cxr\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc\" (UID: \"9fadc980-eb36-4e53-992d-3de6e831461e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc"
Apr 28 19:20:48.752842 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:48.752812 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc"
Apr 28 19:20:48.878581 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:48.878550 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc"]
Apr 28 19:20:48.881715 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:20:48.881682 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fadc980_eb36_4e53_992d_3de6e831461e.slice/crio-b4e89f13bc44434f6594e7d7df29f60255c134a0f26a3a2b7b221f6f9c84b839 WatchSource:0}: Error finding container b4e89f13bc44434f6594e7d7df29f60255c134a0f26a3a2b7b221f6f9c84b839: Status 404 returned error can't find the container with id b4e89f13bc44434f6594e7d7df29f60255c134a0f26a3a2b7b221f6f9c84b839
Apr 28 19:20:48.883795 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:48.883776 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 28 19:20:49.619681 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:49.619642 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc" event={"ID":"9fadc980-eb36-4e53-992d-3de6e831461e","Type":"ContainerStarted","Data":"b4e89f13bc44434f6594e7d7df29f60255c134a0f26a3a2b7b221f6f9c84b839"}
Apr 28 19:20:55.639909 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:55.639864 2582 generic.go:358] "Generic (PLEG): container finished" podID="9fadc980-eb36-4e53-992d-3de6e831461e" containerID="8061bc55cbdfc0f44f171695c1aecda15c3accf734e1b7e7e863563f71f1ced9" exitCode=0
Apr 28 19:20:55.640407 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:55.639927 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc" event={"ID":"9fadc980-eb36-4e53-992d-3de6e831461e","Type":"ContainerDied","Data":"8061bc55cbdfc0f44f171695c1aecda15c3accf734e1b7e7e863563f71f1ced9"}
Apr 28 19:20:55.656356 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:55.656326 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-58886b789f-kcs45" podUID="bde4648b-b5b5-45ec-9f1e-c54a811ffbca" containerName="console" containerID="cri-o://bfdae4700e43e1c27b6960a2bbd8e636e81d1bd4a212d2142de9bbd8f8b91507" gracePeriod=15
Apr 28 19:20:55.922144 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:55.922122 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58886b789f-kcs45_bde4648b-b5b5-45ec-9f1e-c54a811ffbca/console/0.log"
Apr 28 19:20:55.922261 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:55.922184 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58886b789f-kcs45"
Apr 28 19:20:56.074538 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.074492 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-console-oauth-config\") pod \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") "
Apr 28 19:20:56.074702 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.074602 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-console-serving-cert\") pod \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") "
Apr 28 19:20:56.074702 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.074622 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-service-ca\") pod \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") "
Apr 28 19:20:56.074702 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.074657 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-console-config\") pod \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") "
Apr 28 19:20:56.074702 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.074682 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-847fb\" (UniqueName: \"kubernetes.io/projected/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-kube-api-access-847fb\") pod \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") "
Apr 28 19:20:56.074702 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.074700 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-oauth-serving-cert\") pod \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") "
Apr 28 19:20:56.074963 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.074720 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-trusted-ca-bundle\") pod \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\" (UID: \"bde4648b-b5b5-45ec-9f1e-c54a811ffbca\") "
Apr 28 19:20:56.075164 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.075128 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-service-ca" (OuterVolumeSpecName: "service-ca") pod "bde4648b-b5b5-45ec-9f1e-c54a811ffbca" (UID: "bde4648b-b5b5-45ec-9f1e-c54a811ffbca"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:20:56.075287 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.075159 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-console-config" (OuterVolumeSpecName: "console-config") pod "bde4648b-b5b5-45ec-9f1e-c54a811ffbca" (UID: "bde4648b-b5b5-45ec-9f1e-c54a811ffbca"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:20:56.075287 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.075182 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bde4648b-b5b5-45ec-9f1e-c54a811ffbca" (UID: "bde4648b-b5b5-45ec-9f1e-c54a811ffbca"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:20:56.075287 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.075210 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bde4648b-b5b5-45ec-9f1e-c54a811ffbca" (UID: "bde4648b-b5b5-45ec-9f1e-c54a811ffbca"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 28 19:20:56.076862 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.076839 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bde4648b-b5b5-45ec-9f1e-c54a811ffbca" (UID: "bde4648b-b5b5-45ec-9f1e-c54a811ffbca"). InnerVolumeSpecName "console-oauth-config".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:20:56.076968 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.076857 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bde4648b-b5b5-45ec-9f1e-c54a811ffbca" (UID: "bde4648b-b5b5-45ec-9f1e-c54a811ffbca"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:20:56.077003 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.076981 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-kube-api-access-847fb" (OuterVolumeSpecName: "kube-api-access-847fb") pod "bde4648b-b5b5-45ec-9f1e-c54a811ffbca" (UID: "bde4648b-b5b5-45ec-9f1e-c54a811ffbca"). InnerVolumeSpecName "kube-api-access-847fb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:20:56.175719 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.175636 2582 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-console-config\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 28 19:20:56.175719 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.175665 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-847fb\" (UniqueName: \"kubernetes.io/projected/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-kube-api-access-847fb\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 28 19:20:56.175719 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.175685 2582 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-oauth-serving-cert\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 28 19:20:56.175719 
ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.175694 2582 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-trusted-ca-bundle\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 28 19:20:56.175719 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.175703 2582 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-console-oauth-config\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 28 19:20:56.175719 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.175711 2582 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-console-serving-cert\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 28 19:20:56.175719 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.175720 2582 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bde4648b-b5b5-45ec-9f1e-c54a811ffbca-service-ca\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 28 19:20:56.644016 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.643990 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58886b789f-kcs45_bde4648b-b5b5-45ec-9f1e-c54a811ffbca/console/0.log" Apr 28 19:20:56.644501 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.644029 2582 generic.go:358] "Generic (PLEG): container finished" podID="bde4648b-b5b5-45ec-9f1e-c54a811ffbca" containerID="bfdae4700e43e1c27b6960a2bbd8e636e81d1bd4a212d2142de9bbd8f8b91507" exitCode=2 Apr 28 19:20:56.644501 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.644061 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58886b789f-kcs45" 
event={"ID":"bde4648b-b5b5-45ec-9f1e-c54a811ffbca","Type":"ContainerDied","Data":"bfdae4700e43e1c27b6960a2bbd8e636e81d1bd4a212d2142de9bbd8f8b91507"} Apr 28 19:20:56.644501 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.644103 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58886b789f-kcs45" event={"ID":"bde4648b-b5b5-45ec-9f1e-c54a811ffbca","Type":"ContainerDied","Data":"9cd231f5b291cba6bf2af9d2020e0a6524a82b204c6bc6fcb76ee8b811b7dc52"} Apr 28 19:20:56.644501 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.644109 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58886b789f-kcs45" Apr 28 19:20:56.644501 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.644119 2582 scope.go:117] "RemoveContainer" containerID="bfdae4700e43e1c27b6960a2bbd8e636e81d1bd4a212d2142de9bbd8f8b91507" Apr 28 19:20:56.654117 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.654096 2582 scope.go:117] "RemoveContainer" containerID="bfdae4700e43e1c27b6960a2bbd8e636e81d1bd4a212d2142de9bbd8f8b91507" Apr 28 19:20:56.654404 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:20:56.654372 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfdae4700e43e1c27b6960a2bbd8e636e81d1bd4a212d2142de9bbd8f8b91507\": container with ID starting with bfdae4700e43e1c27b6960a2bbd8e636e81d1bd4a212d2142de9bbd8f8b91507 not found: ID does not exist" containerID="bfdae4700e43e1c27b6960a2bbd8e636e81d1bd4a212d2142de9bbd8f8b91507" Apr 28 19:20:56.654516 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.654408 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfdae4700e43e1c27b6960a2bbd8e636e81d1bd4a212d2142de9bbd8f8b91507"} err="failed to get container status \"bfdae4700e43e1c27b6960a2bbd8e636e81d1bd4a212d2142de9bbd8f8b91507\": rpc error: code = NotFound desc = could not find container 
\"bfdae4700e43e1c27b6960a2bbd8e636e81d1bd4a212d2142de9bbd8f8b91507\": container with ID starting with bfdae4700e43e1c27b6960a2bbd8e636e81d1bd4a212d2142de9bbd8f8b91507 not found: ID does not exist" Apr 28 19:20:56.668245 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.668176 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58886b789f-kcs45"] Apr 28 19:20:56.669878 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:56.669857 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-58886b789f-kcs45"] Apr 28 19:20:57.637421 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:57.637393 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bde4648b-b5b5-45ec-9f1e-c54a811ffbca" path="/var/lib/kubelet/pods/bde4648b-b5b5-45ec-9f1e-c54a811ffbca/volumes" Apr 28 19:20:58.654351 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:58.654313 2582 generic.go:358] "Generic (PLEG): container finished" podID="9fadc980-eb36-4e53-992d-3de6e831461e" containerID="19432cd40190f3e473badb55789978414a7c44553c33cfe119a2dab5017d8efc" exitCode=0 Apr 28 19:20:58.654806 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:20:58.654401 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc" event={"ID":"9fadc980-eb36-4e53-992d-3de6e831461e","Type":"ContainerDied","Data":"19432cd40190f3e473badb55789978414a7c44553c33cfe119a2dab5017d8efc"} Apr 28 19:21:05.679163 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:05.679113 2582 generic.go:358] "Generic (PLEG): container finished" podID="9fadc980-eb36-4e53-992d-3de6e831461e" containerID="09da283002eaa77f27e3fd61b5a2bdbe0b1b4f52952101632b112531c8de043f" exitCode=0 Apr 28 19:21:05.679557 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:05.679235 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc" 
event={"ID":"9fadc980-eb36-4e53-992d-3de6e831461e","Type":"ContainerDied","Data":"09da283002eaa77f27e3fd61b5a2bdbe0b1b4f52952101632b112531c8de043f"} Apr 28 19:21:06.802758 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:06.802735 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc" Apr 28 19:21:06.971962 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:06.971871 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9fadc980-eb36-4e53-992d-3de6e831461e-bundle\") pod \"9fadc980-eb36-4e53-992d-3de6e831461e\" (UID: \"9fadc980-eb36-4e53-992d-3de6e831461e\") " Apr 28 19:21:06.972114 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:06.971978 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4cxr\" (UniqueName: \"kubernetes.io/projected/9fadc980-eb36-4e53-992d-3de6e831461e-kube-api-access-s4cxr\") pod \"9fadc980-eb36-4e53-992d-3de6e831461e\" (UID: \"9fadc980-eb36-4e53-992d-3de6e831461e\") " Apr 28 19:21:06.972114 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:06.972021 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9fadc980-eb36-4e53-992d-3de6e831461e-util\") pod \"9fadc980-eb36-4e53-992d-3de6e831461e\" (UID: \"9fadc980-eb36-4e53-992d-3de6e831461e\") " Apr 28 19:21:06.972504 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:06.972475 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fadc980-eb36-4e53-992d-3de6e831461e-bundle" (OuterVolumeSpecName: "bundle") pod "9fadc980-eb36-4e53-992d-3de6e831461e" (UID: "9fadc980-eb36-4e53-992d-3de6e831461e"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:21:06.974348 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:06.974322 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fadc980-eb36-4e53-992d-3de6e831461e-kube-api-access-s4cxr" (OuterVolumeSpecName: "kube-api-access-s4cxr") pod "9fadc980-eb36-4e53-992d-3de6e831461e" (UID: "9fadc980-eb36-4e53-992d-3de6e831461e"). InnerVolumeSpecName "kube-api-access-s4cxr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:21:06.975928 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:06.975888 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fadc980-eb36-4e53-992d-3de6e831461e-util" (OuterVolumeSpecName: "util") pod "9fadc980-eb36-4e53-992d-3de6e831461e" (UID: "9fadc980-eb36-4e53-992d-3de6e831461e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 28 19:21:07.073069 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:07.073032 2582 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9fadc980-eb36-4e53-992d-3de6e831461e-util\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 28 19:21:07.073069 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:07.073061 2582 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9fadc980-eb36-4e53-992d-3de6e831461e-bundle\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 28 19:21:07.073069 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:07.073072 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s4cxr\" (UniqueName: \"kubernetes.io/projected/9fadc980-eb36-4e53-992d-3de6e831461e-kube-api-access-s4cxr\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 28 19:21:07.686201 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:07.686168 2582 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc" event={"ID":"9fadc980-eb36-4e53-992d-3de6e831461e","Type":"ContainerDied","Data":"b4e89f13bc44434f6594e7d7df29f60255c134a0f26a3a2b7b221f6f9c84b839"} Apr 28 19:21:07.686201 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:07.686205 2582 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4e89f13bc44434f6594e7d7df29f60255c134a0f26a3a2b7b221f6f9c84b839" Apr 28 19:21:07.686390 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:07.686176 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ccdjkc" Apr 28 19:21:11.294608 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:11.294578 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-r567d"] Apr 28 19:21:11.295110 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:11.295089 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9fadc980-eb36-4e53-992d-3de6e831461e" containerName="pull" Apr 28 19:21:11.295189 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:11.295113 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fadc980-eb36-4e53-992d-3de6e831461e" containerName="pull" Apr 28 19:21:11.295189 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:11.295131 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9fadc980-eb36-4e53-992d-3de6e831461e" containerName="extract" Apr 28 19:21:11.295189 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:11.295139 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fadc980-eb36-4e53-992d-3de6e831461e" containerName="extract" Apr 28 19:21:11.295189 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:11.295152 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="9fadc980-eb36-4e53-992d-3de6e831461e" containerName="util" Apr 28 19:21:11.295189 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:11.295162 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fadc980-eb36-4e53-992d-3de6e831461e" containerName="util" Apr 28 19:21:11.295189 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:11.295187 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bde4648b-b5b5-45ec-9f1e-c54a811ffbca" containerName="console" Apr 28 19:21:11.295460 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:11.295195 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde4648b-b5b5-45ec-9f1e-c54a811ffbca" containerName="console" Apr 28 19:21:11.295460 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:11.295266 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="bde4648b-b5b5-45ec-9f1e-c54a811ffbca" containerName="console" Apr 28 19:21:11.295460 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:11.295282 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="9fadc980-eb36-4e53-992d-3de6e831461e" containerName="extract" Apr 28 19:21:11.332780 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:11.332749 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-r567d"] Apr 28 19:21:11.332971 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:11.332880 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-r567d" Apr 28 19:21:11.337979 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:11.337956 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-56r6l\"" Apr 28 19:21:11.338287 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:11.338263 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 28 19:21:11.338287 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:11.338282 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 28 19:21:11.338578 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:11.338559 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 28 19:21:11.507768 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:11.507724 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpft8\" (UniqueName: \"kubernetes.io/projected/058298a8-9e32-472e-9aea-7308c899deb0-kube-api-access-fpft8\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-r567d\" (UID: \"058298a8-9e32-472e-9aea-7308c899deb0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-r567d" Apr 28 19:21:11.507977 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:11.507774 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/058298a8-9e32-472e-9aea-7308c899deb0-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-r567d\" (UID: \"058298a8-9e32-472e-9aea-7308c899deb0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-r567d" Apr 28 19:21:11.609068 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:11.608984 2582 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpft8\" (UniqueName: \"kubernetes.io/projected/058298a8-9e32-472e-9aea-7308c899deb0-kube-api-access-fpft8\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-r567d\" (UID: \"058298a8-9e32-472e-9aea-7308c899deb0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-r567d" Apr 28 19:21:11.609068 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:11.609020 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/058298a8-9e32-472e-9aea-7308c899deb0-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-r567d\" (UID: \"058298a8-9e32-472e-9aea-7308c899deb0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-r567d" Apr 28 19:21:11.611533 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:11.611512 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/058298a8-9e32-472e-9aea-7308c899deb0-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-r567d\" (UID: \"058298a8-9e32-472e-9aea-7308c899deb0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-r567d" Apr 28 19:21:11.619157 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:11.619133 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpft8\" (UniqueName: \"kubernetes.io/projected/058298a8-9e32-472e-9aea-7308c899deb0-kube-api-access-fpft8\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-r567d\" (UID: \"058298a8-9e32-472e-9aea-7308c899deb0\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-r567d" Apr 28 19:21:11.642871 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:11.642845 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-r567d" Apr 28 19:21:11.790756 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:11.790726 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-r567d"] Apr 28 19:21:11.794355 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:21:11.794312 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod058298a8_9e32_472e_9aea_7308c899deb0.slice/crio-0c5b86346206a4765bd68334aea81320d131dffa825120bf69cc561bb215221a WatchSource:0}: Error finding container 0c5b86346206a4765bd68334aea81320d131dffa825120bf69cc561bb215221a: Status 404 returned error can't find the container with id 0c5b86346206a4765bd68334aea81320d131dffa825120bf69cc561bb215221a Apr 28 19:21:12.702650 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:12.702612 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-r567d" event={"ID":"058298a8-9e32-472e-9aea-7308c899deb0","Type":"ContainerStarted","Data":"0c5b86346206a4765bd68334aea81320d131dffa825120bf69cc561bb215221a"} Apr 28 19:21:17.722401 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:17.722297 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-r567d" event={"ID":"058298a8-9e32-472e-9aea-7308c899deb0","Type":"ContainerStarted","Data":"23f933f3cdb8bb17f91b19492037dca390d0140a33c7d64cce8839a677b15408"} Apr 28 19:21:17.722864 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:17.722399 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-r567d" Apr 28 19:21:17.742586 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:17.742529 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-r567d" podStartSLOduration=1.149239502 podStartE2EDuration="6.742504645s" podCreationTimestamp="2026-04-28 19:21:11 +0000 UTC" firstStartedPulling="2026-04-28 19:21:11.796795526 +0000 UTC m=+326.731840653" lastFinishedPulling="2026-04-28 19:21:17.390060667 +0000 UTC m=+332.325105796" observedRunningTime="2026-04-28 19:21:17.740885006 +0000 UTC m=+332.675930154" watchObservedRunningTime="2026-04-28 19:21:17.742504645 +0000 UTC m=+332.677549796" Apr 28 19:21:17.914130 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:17.914098 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-f8bnx"] Apr 28 19:21:17.917477 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:17.917450 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-f8bnx" Apr 28 19:21:17.919990 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:17.919966 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 28 19:21:17.920283 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:17.920265 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-bsgzw\"" Apr 28 19:21:17.920485 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:17.920467 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 28 19:21:17.925562 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:17.925541 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-f8bnx"] Apr 28 19:21:17.959240 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:17.959207 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/efc6be91-b839-4819-be73-4af5759d4ba7-certificates\") pod \"keda-operator-ffbb595cb-f8bnx\" (UID: \"efc6be91-b839-4819-be73-4af5759d4ba7\") " pod="openshift-keda/keda-operator-ffbb595cb-f8bnx" Apr 28 19:21:17.959397 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:17.959268 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/efc6be91-b839-4819-be73-4af5759d4ba7-cabundle0\") pod \"keda-operator-ffbb595cb-f8bnx\" (UID: \"efc6be91-b839-4819-be73-4af5759d4ba7\") " pod="openshift-keda/keda-operator-ffbb595cb-f8bnx" Apr 28 19:21:17.959397 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:17.959321 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lgt5\" (UniqueName: \"kubernetes.io/projected/efc6be91-b839-4819-be73-4af5759d4ba7-kube-api-access-9lgt5\") pod \"keda-operator-ffbb595cb-f8bnx\" (UID: \"efc6be91-b839-4819-be73-4af5759d4ba7\") " pod="openshift-keda/keda-operator-ffbb595cb-f8bnx" Apr 28 19:21:18.060107 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.060071 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9lgt5\" (UniqueName: \"kubernetes.io/projected/efc6be91-b839-4819-be73-4af5759d4ba7-kube-api-access-9lgt5\") pod \"keda-operator-ffbb595cb-f8bnx\" (UID: \"efc6be91-b839-4819-be73-4af5759d4ba7\") " pod="openshift-keda/keda-operator-ffbb595cb-f8bnx" Apr 28 19:21:18.060334 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.060135 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/efc6be91-b839-4819-be73-4af5759d4ba7-certificates\") pod \"keda-operator-ffbb595cb-f8bnx\" (UID: \"efc6be91-b839-4819-be73-4af5759d4ba7\") " pod="openshift-keda/keda-operator-ffbb595cb-f8bnx" Apr 28 19:21:18.060334 ip-10-0-132-160 kubenswrapper[2582]: I0428 
19:21:18.060195 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/efc6be91-b839-4819-be73-4af5759d4ba7-cabundle0\") pod \"keda-operator-ffbb595cb-f8bnx\" (UID: \"efc6be91-b839-4819-be73-4af5759d4ba7\") " pod="openshift-keda/keda-operator-ffbb595cb-f8bnx"
Apr 28 19:21:18.060334 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:18.060226 2582 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found
Apr 28 19:21:18.060334 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:18.060249 2582 secret.go:281] references non-existent secret key: ca.crt
Apr 28 19:21:18.060334 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:18.060258 2582 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 28 19:21:18.060334 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:18.060271 2582 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-f8bnx: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 28 19:21:18.060644 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:18.060342 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efc6be91-b839-4819-be73-4af5759d4ba7-certificates podName:efc6be91-b839-4819-be73-4af5759d4ba7 nodeName:}" failed. No retries permitted until 2026-04-28 19:21:18.560324043 +0000 UTC m=+333.495369169 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/efc6be91-b839-4819-be73-4af5759d4ba7-certificates") pod "keda-operator-ffbb595cb-f8bnx" (UID: "efc6be91-b839-4819-be73-4af5759d4ba7") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 28 19:21:18.060944 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.060921 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/efc6be91-b839-4819-be73-4af5759d4ba7-cabundle0\") pod \"keda-operator-ffbb595cb-f8bnx\" (UID: \"efc6be91-b839-4819-be73-4af5759d4ba7\") " pod="openshift-keda/keda-operator-ffbb595cb-f8bnx"
Apr 28 19:21:18.069605 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.069580 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lgt5\" (UniqueName: \"kubernetes.io/projected/efc6be91-b839-4819-be73-4af5759d4ba7-kube-api-access-9lgt5\") pod \"keda-operator-ffbb595cb-f8bnx\" (UID: \"efc6be91-b839-4819-be73-4af5759d4ba7\") " pod="openshift-keda/keda-operator-ffbb595cb-f8bnx"
Apr 28 19:21:18.162099 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.162063 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7"]
Apr 28 19:21:18.165645 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.165621 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7"
Apr 28 19:21:18.168601 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.168580 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 28 19:21:18.173254 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.173226 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7"]
Apr 28 19:21:18.261411 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.261373 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkqdn\" (UniqueName: \"kubernetes.io/projected/fa6eccad-d94e-45c3-a985-d82fdb03fb57-kube-api-access-zkqdn\") pod \"keda-metrics-apiserver-7c9f485588-xf8b7\" (UID: \"fa6eccad-d94e-45c3-a985-d82fdb03fb57\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7"
Apr 28 19:21:18.261599 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.261445 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fa6eccad-d94e-45c3-a985-d82fdb03fb57-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xf8b7\" (UID: \"fa6eccad-d94e-45c3-a985-d82fdb03fb57\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7"
Apr 28 19:21:18.261599 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.261504 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/fa6eccad-d94e-45c3-a985-d82fdb03fb57-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-xf8b7\" (UID: \"fa6eccad-d94e-45c3-a985-d82fdb03fb57\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7"
Apr 28 19:21:18.362073 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.361976 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zkqdn\" (UniqueName: \"kubernetes.io/projected/fa6eccad-d94e-45c3-a985-d82fdb03fb57-kube-api-access-zkqdn\") pod \"keda-metrics-apiserver-7c9f485588-xf8b7\" (UID: \"fa6eccad-d94e-45c3-a985-d82fdb03fb57\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7"
Apr 28 19:21:18.362073 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.362051 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fa6eccad-d94e-45c3-a985-d82fdb03fb57-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xf8b7\" (UID: \"fa6eccad-d94e-45c3-a985-d82fdb03fb57\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7"
Apr 28 19:21:18.362300 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.362110 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/fa6eccad-d94e-45c3-a985-d82fdb03fb57-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-xf8b7\" (UID: \"fa6eccad-d94e-45c3-a985-d82fdb03fb57\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7"
Apr 28 19:21:18.362300 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:18.362264 2582 secret.go:281] references non-existent secret key: tls.crt
Apr 28 19:21:18.362300 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:18.362291 2582 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 28 19:21:18.362426 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:18.362328 2582 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7: references non-existent secret key: tls.crt
Apr 28 19:21:18.362426 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:18.362405 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa6eccad-d94e-45c3-a985-d82fdb03fb57-certificates podName:fa6eccad-d94e-45c3-a985-d82fdb03fb57 nodeName:}" failed. No retries permitted until 2026-04-28 19:21:18.862376103 +0000 UTC m=+333.797421234 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/fa6eccad-d94e-45c3-a985-d82fdb03fb57-certificates") pod "keda-metrics-apiserver-7c9f485588-xf8b7" (UID: "fa6eccad-d94e-45c3-a985-d82fdb03fb57") : references non-existent secret key: tls.crt
Apr 28 19:21:18.362567 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.362549 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/fa6eccad-d94e-45c3-a985-d82fdb03fb57-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-xf8b7\" (UID: \"fa6eccad-d94e-45c3-a985-d82fdb03fb57\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7"
Apr 28 19:21:18.373318 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.373288 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkqdn\" (UniqueName: \"kubernetes.io/projected/fa6eccad-d94e-45c3-a985-d82fdb03fb57-kube-api-access-zkqdn\") pod \"keda-metrics-apiserver-7c9f485588-xf8b7\" (UID: \"fa6eccad-d94e-45c3-a985-d82fdb03fb57\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7"
Apr 28 19:21:18.430139 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.430107 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-bns7f"]
Apr 28 19:21:18.433537 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.433511 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-bns7f"
Apr 28 19:21:18.438561 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.438540 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 28 19:21:18.444865 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.444838 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-bns7f"]
Apr 28 19:21:18.462761 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.462726 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj7jt\" (UniqueName: \"kubernetes.io/projected/783bae6b-ddb4-4e8b-86a8-54ec73a23d7e-kube-api-access-wj7jt\") pod \"keda-admission-cf49989db-bns7f\" (UID: \"783bae6b-ddb4-4e8b-86a8-54ec73a23d7e\") " pod="openshift-keda/keda-admission-cf49989db-bns7f"
Apr 28 19:21:18.462964 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.462864 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/783bae6b-ddb4-4e8b-86a8-54ec73a23d7e-certificates\") pod \"keda-admission-cf49989db-bns7f\" (UID: \"783bae6b-ddb4-4e8b-86a8-54ec73a23d7e\") " pod="openshift-keda/keda-admission-cf49989db-bns7f"
Apr 28 19:21:18.564157 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.564122 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wj7jt\" (UniqueName: \"kubernetes.io/projected/783bae6b-ddb4-4e8b-86a8-54ec73a23d7e-kube-api-access-wj7jt\") pod \"keda-admission-cf49989db-bns7f\" (UID: \"783bae6b-ddb4-4e8b-86a8-54ec73a23d7e\") " pod="openshift-keda/keda-admission-cf49989db-bns7f"
Apr 28 19:21:18.564355 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.564229 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/efc6be91-b839-4819-be73-4af5759d4ba7-certificates\") pod \"keda-operator-ffbb595cb-f8bnx\" (UID: \"efc6be91-b839-4819-be73-4af5759d4ba7\") " pod="openshift-keda/keda-operator-ffbb595cb-f8bnx"
Apr 28 19:21:18.564355 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.564292 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/783bae6b-ddb4-4e8b-86a8-54ec73a23d7e-certificates\") pod \"keda-admission-cf49989db-bns7f\" (UID: \"783bae6b-ddb4-4e8b-86a8-54ec73a23d7e\") " pod="openshift-keda/keda-admission-cf49989db-bns7f"
Apr 28 19:21:18.564477 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:18.564383 2582 secret.go:281] references non-existent secret key: ca.crt
Apr 28 19:21:18.564477 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:18.564406 2582 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 28 19:21:18.564477 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:18.564418 2582 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-f8bnx: references non-existent secret key: ca.crt
Apr 28 19:21:18.564477 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:18.564435 2582 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found
Apr 28 19:21:18.564477 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:18.564456 2582 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-bns7f: secret "keda-admission-webhooks-certs" not found
Apr 28 19:21:18.564727 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:18.564482 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efc6be91-b839-4819-be73-4af5759d4ba7-certificates podName:efc6be91-b839-4819-be73-4af5759d4ba7 nodeName:}" failed. No retries permitted until 2026-04-28 19:21:19.564463311 +0000 UTC m=+334.499508452 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/efc6be91-b839-4819-be73-4af5759d4ba7-certificates") pod "keda-operator-ffbb595cb-f8bnx" (UID: "efc6be91-b839-4819-be73-4af5759d4ba7") : references non-existent secret key: ca.crt
Apr 28 19:21:18.564727 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:18.564507 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/783bae6b-ddb4-4e8b-86a8-54ec73a23d7e-certificates podName:783bae6b-ddb4-4e8b-86a8-54ec73a23d7e nodeName:}" failed. No retries permitted until 2026-04-28 19:21:19.064491079 +0000 UTC m=+333.999536223 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/783bae6b-ddb4-4e8b-86a8-54ec73a23d7e-certificates") pod "keda-admission-cf49989db-bns7f" (UID: "783bae6b-ddb4-4e8b-86a8-54ec73a23d7e") : secret "keda-admission-webhooks-certs" not found
Apr 28 19:21:18.575574 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.575544 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj7jt\" (UniqueName: \"kubernetes.io/projected/783bae6b-ddb4-4e8b-86a8-54ec73a23d7e-kube-api-access-wj7jt\") pod \"keda-admission-cf49989db-bns7f\" (UID: \"783bae6b-ddb4-4e8b-86a8-54ec73a23d7e\") " pod="openshift-keda/keda-admission-cf49989db-bns7f"
Apr 28 19:21:18.867046 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:18.867008 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fa6eccad-d94e-45c3-a985-d82fdb03fb57-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xf8b7\" (UID: \"fa6eccad-d94e-45c3-a985-d82fdb03fb57\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7"
Apr 28 19:21:18.867521 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:18.867186 2582 secret.go:281] references non-existent secret key: tls.crt
Apr 28 19:21:18.867521 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:18.867211 2582 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 28 19:21:18.867521 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:18.867239 2582 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7: references non-existent secret key: tls.crt
Apr 28 19:21:18.867521 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:18.867312 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa6eccad-d94e-45c3-a985-d82fdb03fb57-certificates podName:fa6eccad-d94e-45c3-a985-d82fdb03fb57 nodeName:}" failed. No retries permitted until 2026-04-28 19:21:19.867291275 +0000 UTC m=+334.802336418 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/fa6eccad-d94e-45c3-a985-d82fdb03fb57-certificates") pod "keda-metrics-apiserver-7c9f485588-xf8b7" (UID: "fa6eccad-d94e-45c3-a985-d82fdb03fb57") : references non-existent secret key: tls.crt
Apr 28 19:21:19.068003 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:19.067965 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/783bae6b-ddb4-4e8b-86a8-54ec73a23d7e-certificates\") pod \"keda-admission-cf49989db-bns7f\" (UID: \"783bae6b-ddb4-4e8b-86a8-54ec73a23d7e\") " pod="openshift-keda/keda-admission-cf49989db-bns7f"
Apr 28 19:21:19.070682 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:19.070646 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/783bae6b-ddb4-4e8b-86a8-54ec73a23d7e-certificates\") pod \"keda-admission-cf49989db-bns7f\" (UID: \"783bae6b-ddb4-4e8b-86a8-54ec73a23d7e\") " pod="openshift-keda/keda-admission-cf49989db-bns7f"
Apr 28 19:21:19.343993 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:19.343954 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-bns7f"
Apr 28 19:21:19.475463 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:19.475403 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-bns7f"]
Apr 28 19:21:19.479363 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:21:19.479331 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod783bae6b_ddb4_4e8b_86a8_54ec73a23d7e.slice/crio-1dcbe88af25fafcf2a9ed6e4dff838a519d9f677fadfb164436f76f2870addfc WatchSource:0}: Error finding container 1dcbe88af25fafcf2a9ed6e4dff838a519d9f677fadfb164436f76f2870addfc: Status 404 returned error can't find the container with id 1dcbe88af25fafcf2a9ed6e4dff838a519d9f677fadfb164436f76f2870addfc
Apr 28 19:21:19.574542 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:19.574490 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/efc6be91-b839-4819-be73-4af5759d4ba7-certificates\") pod \"keda-operator-ffbb595cb-f8bnx\" (UID: \"efc6be91-b839-4819-be73-4af5759d4ba7\") " pod="openshift-keda/keda-operator-ffbb595cb-f8bnx"
Apr 28 19:21:19.574735 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:19.574678 2582 secret.go:281] references non-existent secret key: ca.crt
Apr 28 19:21:19.574735 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:19.574702 2582 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 28 19:21:19.574735 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:19.574713 2582 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-f8bnx: references non-existent secret key: ca.crt
Apr 28 19:21:19.574887 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:19.574775 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efc6be91-b839-4819-be73-4af5759d4ba7-certificates podName:efc6be91-b839-4819-be73-4af5759d4ba7 nodeName:}" failed. No retries permitted until 2026-04-28 19:21:21.574754771 +0000 UTC m=+336.509799916 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/efc6be91-b839-4819-be73-4af5759d4ba7-certificates") pod "keda-operator-ffbb595cb-f8bnx" (UID: "efc6be91-b839-4819-be73-4af5759d4ba7") : references non-existent secret key: ca.crt
Apr 28 19:21:19.730789 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:19.730703 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-bns7f" event={"ID":"783bae6b-ddb4-4e8b-86a8-54ec73a23d7e","Type":"ContainerStarted","Data":"1dcbe88af25fafcf2a9ed6e4dff838a519d9f677fadfb164436f76f2870addfc"}
Apr 28 19:21:19.877649 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:19.877604 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fa6eccad-d94e-45c3-a985-d82fdb03fb57-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xf8b7\" (UID: \"fa6eccad-d94e-45c3-a985-d82fdb03fb57\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7"
Apr 28 19:21:19.878163 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:19.877800 2582 secret.go:281] references non-existent secret key: tls.crt
Apr 28 19:21:19.878163 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:19.877825 2582 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 28 19:21:19.878163 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:19.877850 2582 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7: references non-existent secret key: tls.crt
Apr 28 19:21:19.878163 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:19.877947 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa6eccad-d94e-45c3-a985-d82fdb03fb57-certificates podName:fa6eccad-d94e-45c3-a985-d82fdb03fb57 nodeName:}" failed. No retries permitted until 2026-04-28 19:21:21.877925156 +0000 UTC m=+336.812970286 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/fa6eccad-d94e-45c3-a985-d82fdb03fb57-certificates") pod "keda-metrics-apiserver-7c9f485588-xf8b7" (UID: "fa6eccad-d94e-45c3-a985-d82fdb03fb57") : references non-existent secret key: tls.crt
Apr 28 19:21:21.593732 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:21.593690 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/efc6be91-b839-4819-be73-4af5759d4ba7-certificates\") pod \"keda-operator-ffbb595cb-f8bnx\" (UID: \"efc6be91-b839-4819-be73-4af5759d4ba7\") " pod="openshift-keda/keda-operator-ffbb595cb-f8bnx"
Apr 28 19:21:21.594159 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:21.593840 2582 secret.go:281] references non-existent secret key: ca.crt
Apr 28 19:21:21.594159 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:21.593859 2582 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 28 19:21:21.594159 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:21.593868 2582 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-f8bnx: references non-existent secret key: ca.crt
Apr 28 19:21:21.594159 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:21.593951 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/efc6be91-b839-4819-be73-4af5759d4ba7-certificates podName:efc6be91-b839-4819-be73-4af5759d4ba7 nodeName:}" failed. No retries permitted until 2026-04-28 19:21:25.593934655 +0000 UTC m=+340.528979785 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/efc6be91-b839-4819-be73-4af5759d4ba7-certificates") pod "keda-operator-ffbb595cb-f8bnx" (UID: "efc6be91-b839-4819-be73-4af5759d4ba7") : references non-existent secret key: ca.crt
Apr 28 19:21:21.738676 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:21.738635 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-bns7f" event={"ID":"783bae6b-ddb4-4e8b-86a8-54ec73a23d7e","Type":"ContainerStarted","Data":"aa4a3a89760ec358bd89c2d5fa05786624bd4eb0cf594d9dc736fdd0435f9bf3"}
Apr 28 19:21:21.738838 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:21.738756 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-bns7f"
Apr 28 19:21:21.896790 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:21.896700 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fa6eccad-d94e-45c3-a985-d82fdb03fb57-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xf8b7\" (UID: \"fa6eccad-d94e-45c3-a985-d82fdb03fb57\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7"
Apr 28 19:21:21.896962 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:21.896846 2582 secret.go:281] references non-existent secret key: tls.crt
Apr 28 19:21:21.896962 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:21.896868 2582 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 28 19:21:21.896962 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:21.896886 2582 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7: references non-existent secret key: tls.crt
Apr 28 19:21:21.896962 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:21:21.896961 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa6eccad-d94e-45c3-a985-d82fdb03fb57-certificates podName:fa6eccad-d94e-45c3-a985-d82fdb03fb57 nodeName:}" failed. No retries permitted until 2026-04-28 19:21:25.896947616 +0000 UTC m=+340.831992746 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/fa6eccad-d94e-45c3-a985-d82fdb03fb57-certificates") pod "keda-metrics-apiserver-7c9f485588-xf8b7" (UID: "fa6eccad-d94e-45c3-a985-d82fdb03fb57") : references non-existent secret key: tls.crt
Apr 28 19:21:25.628318 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:25.628285 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/efc6be91-b839-4819-be73-4af5759d4ba7-certificates\") pod \"keda-operator-ffbb595cb-f8bnx\" (UID: \"efc6be91-b839-4819-be73-4af5759d4ba7\") " pod="openshift-keda/keda-operator-ffbb595cb-f8bnx"
Apr 28 19:21:25.630856 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:25.630833 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/efc6be91-b839-4819-be73-4af5759d4ba7-certificates\") pod \"keda-operator-ffbb595cb-f8bnx\" (UID: \"efc6be91-b839-4819-be73-4af5759d4ba7\") " pod="openshift-keda/keda-operator-ffbb595cb-f8bnx"
Apr 28 19:21:25.728332 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:25.728296 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-f8bnx"
Apr 28 19:21:25.857552 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:25.857471 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-bns7f" podStartSLOduration=6.22517934 podStartE2EDuration="7.857452454s" podCreationTimestamp="2026-04-28 19:21:18 +0000 UTC" firstStartedPulling="2026-04-28 19:21:19.480581842 +0000 UTC m=+334.415626973" lastFinishedPulling="2026-04-28 19:21:21.112854956 +0000 UTC m=+336.047900087" observedRunningTime="2026-04-28 19:21:21.765349762 +0000 UTC m=+336.700394910" watchObservedRunningTime="2026-04-28 19:21:25.857452454 +0000 UTC m=+340.792497604"
Apr 28 19:21:25.859537 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:25.859511 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-f8bnx"]
Apr 28 19:21:25.861941 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:21:25.861911 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefc6be91_b839_4819_be73_4af5759d4ba7.slice/crio-fd29123ecddb6a662e6da96ed2053ccaae39e8178fb039408bea4bc86decc06f WatchSource:0}: Error finding container fd29123ecddb6a662e6da96ed2053ccaae39e8178fb039408bea4bc86decc06f: Status 404 returned error can't find the container with id fd29123ecddb6a662e6da96ed2053ccaae39e8178fb039408bea4bc86decc06f
Apr 28 19:21:25.930286 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:25.930195 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fa6eccad-d94e-45c3-a985-d82fdb03fb57-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xf8b7\" (UID: \"fa6eccad-d94e-45c3-a985-d82fdb03fb57\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7"
Apr 28 19:21:25.932984 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:25.932959 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fa6eccad-d94e-45c3-a985-d82fdb03fb57-certificates\") pod \"keda-metrics-apiserver-7c9f485588-xf8b7\" (UID: \"fa6eccad-d94e-45c3-a985-d82fdb03fb57\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7"
Apr 28 19:21:25.979846 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:25.979809 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7"
Apr 28 19:21:26.101877 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:26.101851 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7"]
Apr 28 19:21:26.103405 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:21:26.103376 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa6eccad_d94e_45c3_a985_d82fdb03fb57.slice/crio-0b69466508e8b67ea0af73193e40d809743f56af5ae90e132f1d02fcddb12c2d WatchSource:0}: Error finding container 0b69466508e8b67ea0af73193e40d809743f56af5ae90e132f1d02fcddb12c2d: Status 404 returned error can't find the container with id 0b69466508e8b67ea0af73193e40d809743f56af5ae90e132f1d02fcddb12c2d
Apr 28 19:21:26.756057 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:26.756018 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-f8bnx" event={"ID":"efc6be91-b839-4819-be73-4af5759d4ba7","Type":"ContainerStarted","Data":"fd29123ecddb6a662e6da96ed2053ccaae39e8178fb039408bea4bc86decc06f"}
Apr 28 19:21:26.757042 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:26.757012 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7" event={"ID":"fa6eccad-d94e-45c3-a985-d82fdb03fb57","Type":"ContainerStarted","Data":"0b69466508e8b67ea0af73193e40d809743f56af5ae90e132f1d02fcddb12c2d"}
Apr 28 19:21:30.771768 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:30.771726 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7" event={"ID":"fa6eccad-d94e-45c3-a985-d82fdb03fb57","Type":"ContainerStarted","Data":"f8fdac0ae3707ecdae0cb4ca1ff9d9562613c9b23631c5ec8149b60b28982d93"}
Apr 28 19:21:30.772258 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:30.771824 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7"
Apr 28 19:21:30.773074 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:30.773052 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-f8bnx" event={"ID":"efc6be91-b839-4819-be73-4af5759d4ba7","Type":"ContainerStarted","Data":"b47a2ccb8130ef2f3580c922b7ce08a481f1694905646dc6906adfc68ccba28f"}
Apr 28 19:21:30.773185 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:30.773173 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-f8bnx"
Apr 28 19:21:30.794128 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:30.794062 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7" podStartSLOduration=8.925668018 podStartE2EDuration="12.794048485s" podCreationTimestamp="2026-04-28 19:21:18 +0000 UTC" firstStartedPulling="2026-04-28 19:21:26.10479049 +0000 UTC m=+341.039835620" lastFinishedPulling="2026-04-28 19:21:29.973170955 +0000 UTC m=+344.908216087" observedRunningTime="2026-04-28 19:21:30.792085721 +0000 UTC m=+345.727130872" watchObservedRunningTime="2026-04-28 19:21:30.794048485 +0000 UTC m=+345.729093633"
Apr 28 19:21:30.814377 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:30.814318 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-f8bnx" podStartSLOduration=9.699438181 podStartE2EDuration="13.814300807s" podCreationTimestamp="2026-04-28 19:21:17 +0000 UTC" firstStartedPulling="2026-04-28 19:21:25.863292721 +0000 UTC m=+340.798337848" lastFinishedPulling="2026-04-28 19:21:29.978155343 +0000 UTC m=+344.913200474" observedRunningTime="2026-04-28 19:21:30.812596813 +0000 UTC m=+345.747641963" watchObservedRunningTime="2026-04-28 19:21:30.814300807 +0000 UTC m=+345.749345956"
Apr 28 19:21:38.729059 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:38.728965 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-r567d"
Apr 28 19:21:41.780785 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:41.780753 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-xf8b7"
Apr 28 19:21:42.743972 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:42.743941 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-bns7f"
Apr 28 19:21:51.779122 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:21:51.779086 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-f8bnx"
Apr 28 19:22:24.197565 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.197531 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-b85c69797-kkrdp"]
Apr 28 19:22:24.200707 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.200689 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-b85c69797-kkrdp"
Apr 28 19:22:24.203837 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.203816 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 28 19:22:24.203967 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.203857 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 28 19:22:24.205020 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.204998 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 28 19:22:24.205180 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.205153 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-xv4rv\""
Apr 28 19:22:24.207543 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.207521 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-gtkgz"]
Apr 28 19:22:24.211162 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.211131 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-gtkgz"
Apr 28 19:22:24.212127 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.212098 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-kkrdp"]
Apr 28 19:22:24.212201 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.212123 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gn42\" (UniqueName: \"kubernetes.io/projected/45bfe9aa-e833-4b70-a9cd-6d45727bb77a-kube-api-access-9gn42\") pod \"kserve-controller-manager-b85c69797-kkrdp\" (UID: \"45bfe9aa-e833-4b70-a9cd-6d45727bb77a\") " pod="kserve/kserve-controller-manager-b85c69797-kkrdp"
Apr 28 19:22:24.212201 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.212158 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45bfe9aa-e833-4b70-a9cd-6d45727bb77a-cert\") pod \"kserve-controller-manager-b85c69797-kkrdp\" (UID: \"45bfe9aa-e833-4b70-a9cd-6d45727bb77a\") " pod="kserve/kserve-controller-manager-b85c69797-kkrdp"
Apr 28 19:22:24.213642 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.213626 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-jn88v\""
Apr 28 19:22:24.214109 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.214089 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 28 19:22:24.226886 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.226860 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-gtkgz"]
Apr 28 19:22:24.312587 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.312553 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9gn42\" (UniqueName: \"kubernetes.io/projected/45bfe9aa-e833-4b70-a9cd-6d45727bb77a-kube-api-access-9gn42\") pod \"kserve-controller-manager-b85c69797-kkrdp\" (UID: \"45bfe9aa-e833-4b70-a9cd-6d45727bb77a\") " pod="kserve/kserve-controller-manager-b85c69797-kkrdp"
Apr 28 19:22:24.312767 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.312604 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45bfe9aa-e833-4b70-a9cd-6d45727bb77a-cert\") pod \"kserve-controller-manager-b85c69797-kkrdp\" (UID: \"45bfe9aa-e833-4b70-a9cd-6d45727bb77a\") " pod="kserve/kserve-controller-manager-b85c69797-kkrdp"
Apr 28 19:22:24.312767 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.312647 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2sbj\" (UniqueName: \"kubernetes.io/projected/a7b1603f-75cb-4411-a658-485536622575-kube-api-access-p2sbj\") pod \"llmisvc-controller-manager-68cc5db7c4-gtkgz\" (UID: \"a7b1603f-75cb-4411-a658-485536622575\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-gtkgz"
Apr 28 19:22:24.312767 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.312680 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7b1603f-75cb-4411-a658-485536622575-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-gtkgz\" (UID: \"a7b1603f-75cb-4411-a658-485536622575\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-gtkgz"
Apr 28 19:22:24.315192 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.315166 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45bfe9aa-e833-4b70-a9cd-6d45727bb77a-cert\") pod \"kserve-controller-manager-b85c69797-kkrdp\" (UID: \"45bfe9aa-e833-4b70-a9cd-6d45727bb77a\") " pod="kserve/kserve-controller-manager-b85c69797-kkrdp"
Apr 28 19:22:24.323137 ip-10-0-132-160
kubenswrapper[2582]: I0428 19:22:24.323111 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gn42\" (UniqueName: \"kubernetes.io/projected/45bfe9aa-e833-4b70-a9cd-6d45727bb77a-kube-api-access-9gn42\") pod \"kserve-controller-manager-b85c69797-kkrdp\" (UID: \"45bfe9aa-e833-4b70-a9cd-6d45727bb77a\") " pod="kserve/kserve-controller-manager-b85c69797-kkrdp" Apr 28 19:22:24.413622 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.413588 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7b1603f-75cb-4411-a658-485536622575-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-gtkgz\" (UID: \"a7b1603f-75cb-4411-a658-485536622575\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-gtkgz" Apr 28 19:22:24.413799 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.413666 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p2sbj\" (UniqueName: \"kubernetes.io/projected/a7b1603f-75cb-4411-a658-485536622575-kube-api-access-p2sbj\") pod \"llmisvc-controller-manager-68cc5db7c4-gtkgz\" (UID: \"a7b1603f-75cb-4411-a658-485536622575\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-gtkgz" Apr 28 19:22:24.413799 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:22:24.413754 2582 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 28 19:22:24.413877 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:22:24.413830 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7b1603f-75cb-4411-a658-485536622575-cert podName:a7b1603f-75cb-4411-a658-485536622575 nodeName:}" failed. No retries permitted until 2026-04-28 19:22:24.913811526 +0000 UTC m=+399.848856659 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a7b1603f-75cb-4411-a658-485536622575-cert") pod "llmisvc-controller-manager-68cc5db7c4-gtkgz" (UID: "a7b1603f-75cb-4411-a658-485536622575") : secret "llmisvc-webhook-server-cert" not found Apr 28 19:22:24.457088 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.457017 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2sbj\" (UniqueName: \"kubernetes.io/projected/a7b1603f-75cb-4411-a658-485536622575-kube-api-access-p2sbj\") pod \"llmisvc-controller-manager-68cc5db7c4-gtkgz\" (UID: \"a7b1603f-75cb-4411-a658-485536622575\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-gtkgz" Apr 28 19:22:24.513969 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.513935 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-b85c69797-kkrdp" Apr 28 19:22:24.667014 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.666989 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-kkrdp"] Apr 28 19:22:24.669739 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:22:24.669712 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45bfe9aa_e833_4b70_a9cd_6d45727bb77a.slice/crio-e9b4617ad9cf2e42998dcbadfc69b0679f3edc69e289ed58f4e3d226dce46d8b WatchSource:0}: Error finding container e9b4617ad9cf2e42998dcbadfc69b0679f3edc69e289ed58f4e3d226dce46d8b: Status 404 returned error can't find the container with id e9b4617ad9cf2e42998dcbadfc69b0679f3edc69e289ed58f4e3d226dce46d8b Apr 28 19:22:24.917983 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.917944 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7b1603f-75cb-4411-a658-485536622575-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-gtkgz\" (UID: 
\"a7b1603f-75cb-4411-a658-485536622575\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-gtkgz" Apr 28 19:22:24.920529 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.920507 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7b1603f-75cb-4411-a658-485536622575-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-gtkgz\" (UID: \"a7b1603f-75cb-4411-a658-485536622575\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-gtkgz" Apr 28 19:22:24.952570 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:24.952539 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b85c69797-kkrdp" event={"ID":"45bfe9aa-e833-4b70-a9cd-6d45727bb77a","Type":"ContainerStarted","Data":"e9b4617ad9cf2e42998dcbadfc69b0679f3edc69e289ed58f4e3d226dce46d8b"} Apr 28 19:22:25.121166 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:25.121133 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-gtkgz" Apr 28 19:22:25.268846 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:25.268820 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-gtkgz"] Apr 28 19:22:25.271693 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:22:25.271661 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda7b1603f_75cb_4411_a658_485536622575.slice/crio-ebdc30f7c494ab96b267fc7ffe4b0afeb6846adda7ab3fb063fad679d7527150 WatchSource:0}: Error finding container ebdc30f7c494ab96b267fc7ffe4b0afeb6846adda7ab3fb063fad679d7527150: Status 404 returned error can't find the container with id ebdc30f7c494ab96b267fc7ffe4b0afeb6846adda7ab3fb063fad679d7527150 Apr 28 19:22:25.957874 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:25.957842 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-gtkgz" 
event={"ID":"a7b1603f-75cb-4411-a658-485536622575","Type":"ContainerStarted","Data":"ebdc30f7c494ab96b267fc7ffe4b0afeb6846adda7ab3fb063fad679d7527150"} Apr 28 19:22:28.969746 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:28.969706 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-gtkgz" event={"ID":"a7b1603f-75cb-4411-a658-485536622575","Type":"ContainerStarted","Data":"497ffce093f692dbd8de00e8b07ad5a9c8ed608c940ce1957a2ebbaaf00b0da1"} Apr 28 19:22:28.970279 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:28.969866 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-gtkgz" Apr 28 19:22:28.971170 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:28.971148 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b85c69797-kkrdp" event={"ID":"45bfe9aa-e833-4b70-a9cd-6d45727bb77a","Type":"ContainerStarted","Data":"5849f43e5279c72e6dd697fe126ea49c1c2f8eabb1e752e58cb404d1e4ab839c"} Apr 28 19:22:28.971288 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:28.971248 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-b85c69797-kkrdp" Apr 28 19:22:28.988040 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:28.987998 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-gtkgz" podStartSLOduration=1.95888789 podStartE2EDuration="4.987986461s" podCreationTimestamp="2026-04-28 19:22:24 +0000 UTC" firstStartedPulling="2026-04-28 19:22:25.273325393 +0000 UTC m=+400.208370520" lastFinishedPulling="2026-04-28 19:22:28.30242395 +0000 UTC m=+403.237469091" observedRunningTime="2026-04-28 19:22:28.986495405 +0000 UTC m=+403.921540553" watchObservedRunningTime="2026-04-28 19:22:28.987986461 +0000 UTC m=+403.923031610" Apr 28 19:22:29.002985 ip-10-0-132-160 kubenswrapper[2582]: I0428 
19:22:29.002934 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-b85c69797-kkrdp" podStartSLOduration=1.426244225 podStartE2EDuration="5.002920566s" podCreationTimestamp="2026-04-28 19:22:24 +0000 UTC" firstStartedPulling="2026-04-28 19:22:24.671017782 +0000 UTC m=+399.606062909" lastFinishedPulling="2026-04-28 19:22:28.247694123 +0000 UTC m=+403.182739250" observedRunningTime="2026-04-28 19:22:29.002473848 +0000 UTC m=+403.937519000" watchObservedRunningTime="2026-04-28 19:22:29.002920566 +0000 UTC m=+403.937965705" Apr 28 19:22:59.976279 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:59.976248 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-gtkgz" Apr 28 19:22:59.979240 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:22:59.979222 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-b85c69797-kkrdp" Apr 28 19:23:01.141979 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:01.141943 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-kkrdp"] Apr 28 19:23:01.142374 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:01.142153 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-b85c69797-kkrdp" podUID="45bfe9aa-e833-4b70-a9cd-6d45727bb77a" containerName="manager" containerID="cri-o://5849f43e5279c72e6dd697fe126ea49c1c2f8eabb1e752e58cb404d1e4ab839c" gracePeriod=10 Apr 28 19:23:01.180747 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:01.180717 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-b85c69797-dtvcg"] Apr 28 19:23:01.184012 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:01.183995 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-b85c69797-dtvcg" Apr 28 19:23:01.194576 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:01.194547 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-dtvcg"] Apr 28 19:23:01.318228 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:01.318196 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9f024e6-fa41-4f14-a26b-e61136cd8e86-cert\") pod \"kserve-controller-manager-b85c69797-dtvcg\" (UID: \"e9f024e6-fa41-4f14-a26b-e61136cd8e86\") " pod="kserve/kserve-controller-manager-b85c69797-dtvcg" Apr 28 19:23:01.318371 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:01.318258 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfnx8\" (UniqueName: \"kubernetes.io/projected/e9f024e6-fa41-4f14-a26b-e61136cd8e86-kube-api-access-dfnx8\") pod \"kserve-controller-manager-b85c69797-dtvcg\" (UID: \"e9f024e6-fa41-4f14-a26b-e61136cd8e86\") " pod="kserve/kserve-controller-manager-b85c69797-dtvcg" Apr 28 19:23:01.379372 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:01.379349 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-b85c69797-kkrdp" Apr 28 19:23:01.418889 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:01.418808 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfnx8\" (UniqueName: \"kubernetes.io/projected/e9f024e6-fa41-4f14-a26b-e61136cd8e86-kube-api-access-dfnx8\") pod \"kserve-controller-manager-b85c69797-dtvcg\" (UID: \"e9f024e6-fa41-4f14-a26b-e61136cd8e86\") " pod="kserve/kserve-controller-manager-b85c69797-dtvcg" Apr 28 19:23:01.418889 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:01.418861 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9f024e6-fa41-4f14-a26b-e61136cd8e86-cert\") pod \"kserve-controller-manager-b85c69797-dtvcg\" (UID: \"e9f024e6-fa41-4f14-a26b-e61136cd8e86\") " pod="kserve/kserve-controller-manager-b85c69797-dtvcg" Apr 28 19:23:01.421490 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:01.421462 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9f024e6-fa41-4f14-a26b-e61136cd8e86-cert\") pod \"kserve-controller-manager-b85c69797-dtvcg\" (UID: \"e9f024e6-fa41-4f14-a26b-e61136cd8e86\") " pod="kserve/kserve-controller-manager-b85c69797-dtvcg" Apr 28 19:23:01.427858 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:01.427819 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfnx8\" (UniqueName: \"kubernetes.io/projected/e9f024e6-fa41-4f14-a26b-e61136cd8e86-kube-api-access-dfnx8\") pod \"kserve-controller-manager-b85c69797-dtvcg\" (UID: \"e9f024e6-fa41-4f14-a26b-e61136cd8e86\") " pod="kserve/kserve-controller-manager-b85c69797-dtvcg" Apr 28 19:23:01.519882 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:01.519838 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/45bfe9aa-e833-4b70-a9cd-6d45727bb77a-cert\") pod \"45bfe9aa-e833-4b70-a9cd-6d45727bb77a\" (UID: \"45bfe9aa-e833-4b70-a9cd-6d45727bb77a\") " Apr 28 19:23:01.519882 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:01.519892 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gn42\" (UniqueName: \"kubernetes.io/projected/45bfe9aa-e833-4b70-a9cd-6d45727bb77a-kube-api-access-9gn42\") pod \"45bfe9aa-e833-4b70-a9cd-6d45727bb77a\" (UID: \"45bfe9aa-e833-4b70-a9cd-6d45727bb77a\") " Apr 28 19:23:01.522206 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:01.522172 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45bfe9aa-e833-4b70-a9cd-6d45727bb77a-cert" (OuterVolumeSpecName: "cert") pod "45bfe9aa-e833-4b70-a9cd-6d45727bb77a" (UID: "45bfe9aa-e833-4b70-a9cd-6d45727bb77a"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:23:01.522326 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:01.522208 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45bfe9aa-e833-4b70-a9cd-6d45727bb77a-kube-api-access-9gn42" (OuterVolumeSpecName: "kube-api-access-9gn42") pod "45bfe9aa-e833-4b70-a9cd-6d45727bb77a" (UID: "45bfe9aa-e833-4b70-a9cd-6d45727bb77a"). InnerVolumeSpecName "kube-api-access-9gn42". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:23:01.537409 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:01.537370 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-b85c69797-dtvcg" Apr 28 19:23:01.621561 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:01.621531 2582 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45bfe9aa-e833-4b70-a9cd-6d45727bb77a-cert\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 28 19:23:01.621561 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:01.621563 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9gn42\" (UniqueName: \"kubernetes.io/projected/45bfe9aa-e833-4b70-a9cd-6d45727bb77a-kube-api-access-9gn42\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 28 19:23:01.662194 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:01.662143 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-dtvcg"] Apr 28 19:23:01.664483 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:23:01.664456 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9f024e6_fa41_4f14_a26b_e61136cd8e86.slice/crio-0499e95d5df541721febdf4259499b47c96bdde29888c37b6ca67c6b34efb8e2 WatchSource:0}: Error finding container 0499e95d5df541721febdf4259499b47c96bdde29888c37b6ca67c6b34efb8e2: Status 404 returned error can't find the container with id 0499e95d5df541721febdf4259499b47c96bdde29888c37b6ca67c6b34efb8e2 Apr 28 19:23:02.077674 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:02.077639 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b85c69797-dtvcg" event={"ID":"e9f024e6-fa41-4f14-a26b-e61136cd8e86","Type":"ContainerStarted","Data":"0499e95d5df541721febdf4259499b47c96bdde29888c37b6ca67c6b34efb8e2"} Apr 28 19:23:02.078754 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:02.078727 2582 generic.go:358] "Generic (PLEG): container finished" podID="45bfe9aa-e833-4b70-a9cd-6d45727bb77a" 
containerID="5849f43e5279c72e6dd697fe126ea49c1c2f8eabb1e752e58cb404d1e4ab839c" exitCode=0 Apr 28 19:23:02.078889 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:02.078788 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-b85c69797-kkrdp" Apr 28 19:23:02.078889 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:02.078813 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b85c69797-kkrdp" event={"ID":"45bfe9aa-e833-4b70-a9cd-6d45727bb77a","Type":"ContainerDied","Data":"5849f43e5279c72e6dd697fe126ea49c1c2f8eabb1e752e58cb404d1e4ab839c"} Apr 28 19:23:02.078889 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:02.078847 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b85c69797-kkrdp" event={"ID":"45bfe9aa-e833-4b70-a9cd-6d45727bb77a","Type":"ContainerDied","Data":"e9b4617ad9cf2e42998dcbadfc69b0679f3edc69e289ed58f4e3d226dce46d8b"} Apr 28 19:23:02.078889 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:02.078862 2582 scope.go:117] "RemoveContainer" containerID="5849f43e5279c72e6dd697fe126ea49c1c2f8eabb1e752e58cb404d1e4ab839c" Apr 28 19:23:02.102329 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:02.102298 2582 scope.go:117] "RemoveContainer" containerID="5849f43e5279c72e6dd697fe126ea49c1c2f8eabb1e752e58cb404d1e4ab839c" Apr 28 19:23:02.102676 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:23:02.102645 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5849f43e5279c72e6dd697fe126ea49c1c2f8eabb1e752e58cb404d1e4ab839c\": container with ID starting with 5849f43e5279c72e6dd697fe126ea49c1c2f8eabb1e752e58cb404d1e4ab839c not found: ID does not exist" containerID="5849f43e5279c72e6dd697fe126ea49c1c2f8eabb1e752e58cb404d1e4ab839c" Apr 28 19:23:02.102742 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:02.102678 2582 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5849f43e5279c72e6dd697fe126ea49c1c2f8eabb1e752e58cb404d1e4ab839c"} err="failed to get container status \"5849f43e5279c72e6dd697fe126ea49c1c2f8eabb1e752e58cb404d1e4ab839c\": rpc error: code = NotFound desc = could not find container \"5849f43e5279c72e6dd697fe126ea49c1c2f8eabb1e752e58cb404d1e4ab839c\": container with ID starting with 5849f43e5279c72e6dd697fe126ea49c1c2f8eabb1e752e58cb404d1e4ab839c not found: ID does not exist" Apr 28 19:23:02.112233 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:02.112207 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-kkrdp"] Apr 28 19:23:02.119062 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:02.119041 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-b85c69797-kkrdp"] Apr 28 19:23:03.083622 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:03.083538 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-b85c69797-dtvcg" event={"ID":"e9f024e6-fa41-4f14-a26b-e61136cd8e86","Type":"ContainerStarted","Data":"639545e766fbc970f172dc3abd28a78323f1e6ec99e3c907bca77c07ee879327"} Apr 28 19:23:03.084091 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:03.083706 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-b85c69797-dtvcg" Apr 28 19:23:03.101266 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:03.101218 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-b85c69797-dtvcg" podStartSLOduration=1.727512036 podStartE2EDuration="2.101205356s" podCreationTimestamp="2026-04-28 19:23:01 +0000 UTC" firstStartedPulling="2026-04-28 19:23:01.665651134 +0000 UTC m=+436.600696262" lastFinishedPulling="2026-04-28 19:23:02.039344454 +0000 UTC m=+436.974389582" observedRunningTime="2026-04-28 19:23:03.099517272 +0000 UTC 
m=+438.034562421" watchObservedRunningTime="2026-04-28 19:23:03.101205356 +0000 UTC m=+438.036250504" Apr 28 19:23:03.637759 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:03.637721 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45bfe9aa-e833-4b70-a9cd-6d45727bb77a" path="/var/lib/kubelet/pods/45bfe9aa-e833-4b70-a9cd-6d45727bb77a/volumes" Apr 28 19:23:34.093481 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:23:34.093451 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-b85c69797-dtvcg" Apr 28 19:24:15.429124 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.429089 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f4cb54d54-s4gfs"] Apr 28 19:24:15.429592 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.429442 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45bfe9aa-e833-4b70-a9cd-6d45727bb77a" containerName="manager" Apr 28 19:24:15.429592 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.429454 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="45bfe9aa-e833-4b70-a9cd-6d45727bb77a" containerName="manager" Apr 28 19:24:15.429592 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.429522 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="45bfe9aa-e833-4b70-a9cd-6d45727bb77a" containerName="manager" Apr 28 19:24:15.432508 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.432485 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:15.443444 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.443418 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f4cb54d54-s4gfs"] Apr 28 19:24:15.533658 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.533624 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ffd4cac-0219-4ab3-83fe-3f03b7a67175-console-config\") pod \"console-5f4cb54d54-s4gfs\" (UID: \"4ffd4cac-0219-4ab3-83fe-3f03b7a67175\") " pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:15.533830 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.533676 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ffd4cac-0219-4ab3-83fe-3f03b7a67175-console-serving-cert\") pod \"console-5f4cb54d54-s4gfs\" (UID: \"4ffd4cac-0219-4ab3-83fe-3f03b7a67175\") " pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:15.533830 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.533701 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ffd4cac-0219-4ab3-83fe-3f03b7a67175-console-oauth-config\") pod \"console-5f4cb54d54-s4gfs\" (UID: \"4ffd4cac-0219-4ab3-83fe-3f03b7a67175\") " pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:15.533830 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.533752 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ffd4cac-0219-4ab3-83fe-3f03b7a67175-service-ca\") pod \"console-5f4cb54d54-s4gfs\" (UID: \"4ffd4cac-0219-4ab3-83fe-3f03b7a67175\") " pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 
19:24:15.533830 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.533775 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ffd4cac-0219-4ab3-83fe-3f03b7a67175-trusted-ca-bundle\") pod \"console-5f4cb54d54-s4gfs\" (UID: \"4ffd4cac-0219-4ab3-83fe-3f03b7a67175\") " pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:15.534053 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.533878 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ffd4cac-0219-4ab3-83fe-3f03b7a67175-oauth-serving-cert\") pod \"console-5f4cb54d54-s4gfs\" (UID: \"4ffd4cac-0219-4ab3-83fe-3f03b7a67175\") " pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:15.534053 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.533939 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7kwn\" (UniqueName: \"kubernetes.io/projected/4ffd4cac-0219-4ab3-83fe-3f03b7a67175-kube-api-access-r7kwn\") pod \"console-5f4cb54d54-s4gfs\" (UID: \"4ffd4cac-0219-4ab3-83fe-3f03b7a67175\") " pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:15.634510 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.634470 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ffd4cac-0219-4ab3-83fe-3f03b7a67175-service-ca\") pod \"console-5f4cb54d54-s4gfs\" (UID: \"4ffd4cac-0219-4ab3-83fe-3f03b7a67175\") " pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:15.634510 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.634510 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ffd4cac-0219-4ab3-83fe-3f03b7a67175-trusted-ca-bundle\") pod 
\"console-5f4cb54d54-s4gfs\" (UID: \"4ffd4cac-0219-4ab3-83fe-3f03b7a67175\") " pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:15.634760 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.634563 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ffd4cac-0219-4ab3-83fe-3f03b7a67175-oauth-serving-cert\") pod \"console-5f4cb54d54-s4gfs\" (UID: \"4ffd4cac-0219-4ab3-83fe-3f03b7a67175\") " pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:15.634760 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.634591 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7kwn\" (UniqueName: \"kubernetes.io/projected/4ffd4cac-0219-4ab3-83fe-3f03b7a67175-kube-api-access-r7kwn\") pod \"console-5f4cb54d54-s4gfs\" (UID: \"4ffd4cac-0219-4ab3-83fe-3f03b7a67175\") " pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:15.634760 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.634631 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ffd4cac-0219-4ab3-83fe-3f03b7a67175-console-config\") pod \"console-5f4cb54d54-s4gfs\" (UID: \"4ffd4cac-0219-4ab3-83fe-3f03b7a67175\") " pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:15.634955 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.634797 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ffd4cac-0219-4ab3-83fe-3f03b7a67175-console-serving-cert\") pod \"console-5f4cb54d54-s4gfs\" (UID: \"4ffd4cac-0219-4ab3-83fe-3f03b7a67175\") " pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:15.634955 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.634849 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/4ffd4cac-0219-4ab3-83fe-3f03b7a67175-console-oauth-config\") pod \"console-5f4cb54d54-s4gfs\" (UID: \"4ffd4cac-0219-4ab3-83fe-3f03b7a67175\") " pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:15.635431 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.635382 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4ffd4cac-0219-4ab3-83fe-3f03b7a67175-console-config\") pod \"console-5f4cb54d54-s4gfs\" (UID: \"4ffd4cac-0219-4ab3-83fe-3f03b7a67175\") " pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:15.635431 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.635382 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4ffd4cac-0219-4ab3-83fe-3f03b7a67175-service-ca\") pod \"console-5f4cb54d54-s4gfs\" (UID: \"4ffd4cac-0219-4ab3-83fe-3f03b7a67175\") " pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:15.635892 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.635870 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4ffd4cac-0219-4ab3-83fe-3f03b7a67175-oauth-serving-cert\") pod \"console-5f4cb54d54-s4gfs\" (UID: \"4ffd4cac-0219-4ab3-83fe-3f03b7a67175\") " pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:15.636183 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.636160 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4ffd4cac-0219-4ab3-83fe-3f03b7a67175-trusted-ca-bundle\") pod \"console-5f4cb54d54-s4gfs\" (UID: \"4ffd4cac-0219-4ab3-83fe-3f03b7a67175\") " pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:15.637804 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.637776 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ffd4cac-0219-4ab3-83fe-3f03b7a67175-console-serving-cert\") pod \"console-5f4cb54d54-s4gfs\" (UID: \"4ffd4cac-0219-4ab3-83fe-3f03b7a67175\") " pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:15.638469 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.638437 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4ffd4cac-0219-4ab3-83fe-3f03b7a67175-console-oauth-config\") pod \"console-5f4cb54d54-s4gfs\" (UID: \"4ffd4cac-0219-4ab3-83fe-3f03b7a67175\") " pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:15.644504 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.644482 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7kwn\" (UniqueName: \"kubernetes.io/projected/4ffd4cac-0219-4ab3-83fe-3f03b7a67175-kube-api-access-r7kwn\") pod \"console-5f4cb54d54-s4gfs\" (UID: \"4ffd4cac-0219-4ab3-83fe-3f03b7a67175\") " pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:15.741821 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.741794 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:15.874468 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:15.874428 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f4cb54d54-s4gfs"] Apr 28 19:24:15.876987 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:24:15.876958 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ffd4cac_0219_4ab3_83fe_3f03b7a67175.slice/crio-1211fa60d5b13b634ff20ceed76b6e87ed3980542fc464af5a56c7590fedaeb3 WatchSource:0}: Error finding container 1211fa60d5b13b634ff20ceed76b6e87ed3980542fc464af5a56c7590fedaeb3: Status 404 returned error can't find the container with id 1211fa60d5b13b634ff20ceed76b6e87ed3980542fc464af5a56c7590fedaeb3 Apr 28 19:24:16.330515 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:16.330474 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f4cb54d54-s4gfs" event={"ID":"4ffd4cac-0219-4ab3-83fe-3f03b7a67175","Type":"ContainerStarted","Data":"2d816296f33eec9339be39ec8520c50bdcfa5b17f14324083ef803d27017897a"} Apr 28 19:24:16.330515 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:16.330519 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f4cb54d54-s4gfs" event={"ID":"4ffd4cac-0219-4ab3-83fe-3f03b7a67175","Type":"ContainerStarted","Data":"1211fa60d5b13b634ff20ceed76b6e87ed3980542fc464af5a56c7590fedaeb3"} Apr 28 19:24:16.350284 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:16.350241 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f4cb54d54-s4gfs" podStartSLOduration=1.350227738 podStartE2EDuration="1.350227738s" podCreationTimestamp="2026-04-28 19:24:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:24:16.348419595 +0000 UTC 
m=+511.283464745" watchObservedRunningTime="2026-04-28 19:24:16.350227738 +0000 UTC m=+511.285272886" Apr 28 19:24:25.742716 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:25.742681 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:25.743174 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:25.742753 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:25.747264 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:25.747244 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:26.367770 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:26.367745 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f4cb54d54-s4gfs" Apr 28 19:24:26.418292 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:26.418256 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-748c689b74-m628v"] Apr 28 19:24:51.438195 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:51.438086 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-748c689b74-m628v" podUID="1aae57c4-15f1-4852-bce2-ca391f1b23c6" containerName="console" containerID="cri-o://bce5ff5bacbd3e8bd51d858014f1ff28981dc6bd4ad89eb92bab77165d11a0c4" gracePeriod=15 Apr 28 19:24:51.681213 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:51.681191 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-748c689b74-m628v_1aae57c4-15f1-4852-bce2-ca391f1b23c6/console/0.log" Apr 28 19:24:51.681333 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:51.681252 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-748c689b74-m628v" Apr 28 19:24:51.744082 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:51.744052 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1aae57c4-15f1-4852-bce2-ca391f1b23c6-console-serving-cert\") pod \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " Apr 28 19:24:51.744270 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:51.744089 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1aae57c4-15f1-4852-bce2-ca391f1b23c6-service-ca\") pod \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " Apr 28 19:24:51.744270 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:51.744130 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmtg5\" (UniqueName: \"kubernetes.io/projected/1aae57c4-15f1-4852-bce2-ca391f1b23c6-kube-api-access-kmtg5\") pod \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " Apr 28 19:24:51.744270 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:51.744158 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1aae57c4-15f1-4852-bce2-ca391f1b23c6-trusted-ca-bundle\") pod \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " Apr 28 19:24:51.744270 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:51.744192 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1aae57c4-15f1-4852-bce2-ca391f1b23c6-oauth-serving-cert\") pod \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " Apr 28 19:24:51.744270 
ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:51.744217 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1aae57c4-15f1-4852-bce2-ca391f1b23c6-console-config\") pod \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " Apr 28 19:24:51.744270 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:51.744250 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1aae57c4-15f1-4852-bce2-ca391f1b23c6-console-oauth-config\") pod \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\" (UID: \"1aae57c4-15f1-4852-bce2-ca391f1b23c6\") " Apr 28 19:24:51.744630 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:51.744597 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aae57c4-15f1-4852-bce2-ca391f1b23c6-service-ca" (OuterVolumeSpecName: "service-ca") pod "1aae57c4-15f1-4852-bce2-ca391f1b23c6" (UID: "1aae57c4-15f1-4852-bce2-ca391f1b23c6"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:24:51.744630 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:51.744614 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aae57c4-15f1-4852-bce2-ca391f1b23c6-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1aae57c4-15f1-4852-bce2-ca391f1b23c6" (UID: "1aae57c4-15f1-4852-bce2-ca391f1b23c6"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:24:51.744741 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:51.744627 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aae57c4-15f1-4852-bce2-ca391f1b23c6-console-config" (OuterVolumeSpecName: "console-config") pod "1aae57c4-15f1-4852-bce2-ca391f1b23c6" (UID: "1aae57c4-15f1-4852-bce2-ca391f1b23c6"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:24:51.744741 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:51.744637 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aae57c4-15f1-4852-bce2-ca391f1b23c6-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1aae57c4-15f1-4852-bce2-ca391f1b23c6" (UID: "1aae57c4-15f1-4852-bce2-ca391f1b23c6"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:24:51.746491 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:51.746461 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aae57c4-15f1-4852-bce2-ca391f1b23c6-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1aae57c4-15f1-4852-bce2-ca391f1b23c6" (UID: "1aae57c4-15f1-4852-bce2-ca391f1b23c6"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:24:51.746595 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:51.746501 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aae57c4-15f1-4852-bce2-ca391f1b23c6-kube-api-access-kmtg5" (OuterVolumeSpecName: "kube-api-access-kmtg5") pod "1aae57c4-15f1-4852-bce2-ca391f1b23c6" (UID: "1aae57c4-15f1-4852-bce2-ca391f1b23c6"). InnerVolumeSpecName "kube-api-access-kmtg5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 28 19:24:51.746595 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:51.746545 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aae57c4-15f1-4852-bce2-ca391f1b23c6-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1aae57c4-15f1-4852-bce2-ca391f1b23c6" (UID: "1aae57c4-15f1-4852-bce2-ca391f1b23c6"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:24:51.845279 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:51.845246 2582 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1aae57c4-15f1-4852-bce2-ca391f1b23c6-oauth-serving-cert\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 28 19:24:51.845279 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:51.845278 2582 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1aae57c4-15f1-4852-bce2-ca391f1b23c6-console-config\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 28 19:24:51.845279 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:51.845289 2582 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1aae57c4-15f1-4852-bce2-ca391f1b23c6-console-oauth-config\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 28 19:24:51.845279 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:51.845298 2582 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1aae57c4-15f1-4852-bce2-ca391f1b23c6-console-serving-cert\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 28 19:24:51.845547 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:51.845307 2582 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/1aae57c4-15f1-4852-bce2-ca391f1b23c6-service-ca\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 28 19:24:51.845547 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:51.845316 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kmtg5\" (UniqueName: \"kubernetes.io/projected/1aae57c4-15f1-4852-bce2-ca391f1b23c6-kube-api-access-kmtg5\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 28 19:24:51.845547 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:51.845325 2582 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1aae57c4-15f1-4852-bce2-ca391f1b23c6-trusted-ca-bundle\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 28 19:24:52.447053 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:52.447022 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-748c689b74-m628v_1aae57c4-15f1-4852-bce2-ca391f1b23c6/console/0.log" Apr 28 19:24:52.447456 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:52.447067 2582 generic.go:358] "Generic (PLEG): container finished" podID="1aae57c4-15f1-4852-bce2-ca391f1b23c6" containerID="bce5ff5bacbd3e8bd51d858014f1ff28981dc6bd4ad89eb92bab77165d11a0c4" exitCode=2 Apr 28 19:24:52.447456 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:52.447144 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-748c689b74-m628v" Apr 28 19:24:52.447456 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:52.447160 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-748c689b74-m628v" event={"ID":"1aae57c4-15f1-4852-bce2-ca391f1b23c6","Type":"ContainerDied","Data":"bce5ff5bacbd3e8bd51d858014f1ff28981dc6bd4ad89eb92bab77165d11a0c4"} Apr 28 19:24:52.447456 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:52.447205 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-748c689b74-m628v" event={"ID":"1aae57c4-15f1-4852-bce2-ca391f1b23c6","Type":"ContainerDied","Data":"b69b996d7b7075e04e5f78f739cc9d3ce2652225d3f46aec8d02308878462160"} Apr 28 19:24:52.447456 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:52.447226 2582 scope.go:117] "RemoveContainer" containerID="bce5ff5bacbd3e8bd51d858014f1ff28981dc6bd4ad89eb92bab77165d11a0c4" Apr 28 19:24:52.456433 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:52.456416 2582 scope.go:117] "RemoveContainer" containerID="bce5ff5bacbd3e8bd51d858014f1ff28981dc6bd4ad89eb92bab77165d11a0c4" Apr 28 19:24:52.456707 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:24:52.456687 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bce5ff5bacbd3e8bd51d858014f1ff28981dc6bd4ad89eb92bab77165d11a0c4\": container with ID starting with bce5ff5bacbd3e8bd51d858014f1ff28981dc6bd4ad89eb92bab77165d11a0c4 not found: ID does not exist" containerID="bce5ff5bacbd3e8bd51d858014f1ff28981dc6bd4ad89eb92bab77165d11a0c4" Apr 28 19:24:52.456753 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:52.456716 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bce5ff5bacbd3e8bd51d858014f1ff28981dc6bd4ad89eb92bab77165d11a0c4"} err="failed to get container status \"bce5ff5bacbd3e8bd51d858014f1ff28981dc6bd4ad89eb92bab77165d11a0c4\": rpc error: code = 
NotFound desc = could not find container \"bce5ff5bacbd3e8bd51d858014f1ff28981dc6bd4ad89eb92bab77165d11a0c4\": container with ID starting with bce5ff5bacbd3e8bd51d858014f1ff28981dc6bd4ad89eb92bab77165d11a0c4 not found: ID does not exist" Apr 28 19:24:52.471993 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:52.471965 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-748c689b74-m628v"] Apr 28 19:24:52.476465 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:52.476440 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-748c689b74-m628v"] Apr 28 19:24:53.637100 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:24:53.637059 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aae57c4-15f1-4852-bce2-ca391f1b23c6" path="/var/lib/kubelet/pods/1aae57c4-15f1-4852-bce2-ca391f1b23c6/volumes" Apr 28 19:27:36.214417 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:36.214332 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m"] Apr 28 19:27:36.215110 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:36.214677 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1aae57c4-15f1-4852-bce2-ca391f1b23c6" containerName="console" Apr 28 19:27:36.215110 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:36.214689 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aae57c4-15f1-4852-bce2-ca391f1b23c6" containerName="console" Apr 28 19:27:36.215110 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:36.214759 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="1aae57c4-15f1-4852-bce2-ca391f1b23c6" containerName="console" Apr 28 19:27:36.217711 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:36.217695 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" Apr 28 19:27:36.220589 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:36.220489 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-357ae-serving-cert\"" Apr 28 19:27:36.220589 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:36.220503 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-357ae-kube-rbac-proxy-sar-config\"" Apr 28 19:27:36.220589 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:36.220521 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 28 19:27:36.220589 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:36.220530 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-5h9ft\"" Apr 28 19:27:36.227527 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:36.227502 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m"] Apr 28 19:27:36.316371 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:36.316331 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25e44e91-07da-4b0d-90a4-a4cb65b8d799-openshift-service-ca-bundle\") pod \"model-chainer-raw-357ae-5d6b99669d-w894m\" (UID: \"25e44e91-07da-4b0d-90a4-a4cb65b8d799\") " pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" Apr 28 19:27:36.316550 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:36.316388 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25e44e91-07da-4b0d-90a4-a4cb65b8d799-proxy-tls\") pod 
\"model-chainer-raw-357ae-5d6b99669d-w894m\" (UID: \"25e44e91-07da-4b0d-90a4-a4cb65b8d799\") " pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" Apr 28 19:27:36.417223 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:36.417185 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25e44e91-07da-4b0d-90a4-a4cb65b8d799-proxy-tls\") pod \"model-chainer-raw-357ae-5d6b99669d-w894m\" (UID: \"25e44e91-07da-4b0d-90a4-a4cb65b8d799\") " pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" Apr 28 19:27:36.417410 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:36.417258 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25e44e91-07da-4b0d-90a4-a4cb65b8d799-openshift-service-ca-bundle\") pod \"model-chainer-raw-357ae-5d6b99669d-w894m\" (UID: \"25e44e91-07da-4b0d-90a4-a4cb65b8d799\") " pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" Apr 28 19:27:36.417410 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:27:36.417331 2582 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-357ae-serving-cert: secret "model-chainer-raw-357ae-serving-cert" not found Apr 28 19:27:36.417410 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:27:36.417405 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25e44e91-07da-4b0d-90a4-a4cb65b8d799-proxy-tls podName:25e44e91-07da-4b0d-90a4-a4cb65b8d799 nodeName:}" failed. No retries permitted until 2026-04-28 19:27:36.917389302 +0000 UTC m=+711.852434434 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/25e44e91-07da-4b0d-90a4-a4cb65b8d799-proxy-tls") pod "model-chainer-raw-357ae-5d6b99669d-w894m" (UID: "25e44e91-07da-4b0d-90a4-a4cb65b8d799") : secret "model-chainer-raw-357ae-serving-cert" not found Apr 28 19:27:36.417826 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:36.417809 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25e44e91-07da-4b0d-90a4-a4cb65b8d799-openshift-service-ca-bundle\") pod \"model-chainer-raw-357ae-5d6b99669d-w894m\" (UID: \"25e44e91-07da-4b0d-90a4-a4cb65b8d799\") " pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" Apr 28 19:27:36.920752 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:36.920715 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25e44e91-07da-4b0d-90a4-a4cb65b8d799-proxy-tls\") pod \"model-chainer-raw-357ae-5d6b99669d-w894m\" (UID: \"25e44e91-07da-4b0d-90a4-a4cb65b8d799\") " pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" Apr 28 19:27:36.923364 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:36.923340 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25e44e91-07da-4b0d-90a4-a4cb65b8d799-proxy-tls\") pod \"model-chainer-raw-357ae-5d6b99669d-w894m\" (UID: \"25e44e91-07da-4b0d-90a4-a4cb65b8d799\") " pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" Apr 28 19:27:37.128848 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:37.128819 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" Apr 28 19:27:37.275940 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:37.275890 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m"] Apr 28 19:27:37.282052 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:27:37.282015 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25e44e91_07da_4b0d_90a4_a4cb65b8d799.slice/crio-9105d5a6990d3e28ad41520cf31ec6fab600959f33f0d07800a4e68f764ebabf WatchSource:0}: Error finding container 9105d5a6990d3e28ad41520cf31ec6fab600959f33f0d07800a4e68f764ebabf: Status 404 returned error can't find the container with id 9105d5a6990d3e28ad41520cf31ec6fab600959f33f0d07800a4e68f764ebabf Apr 28 19:27:37.284173 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:37.284153 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 28 19:27:38.015725 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:38.015689 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" event={"ID":"25e44e91-07da-4b0d-90a4-a4cb65b8d799","Type":"ContainerStarted","Data":"9105d5a6990d3e28ad41520cf31ec6fab600959f33f0d07800a4e68f764ebabf"} Apr 28 19:27:40.023520 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:40.023484 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" event={"ID":"25e44e91-07da-4b0d-90a4-a4cb65b8d799","Type":"ContainerStarted","Data":"c292cae335ca09ad4c1a9441cbab61c189fe16d1724b7bab75a0d4e3e79f053e"} Apr 28 19:27:40.023951 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:40.023581 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" Apr 28 19:27:40.054968 
ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:40.054858 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" podStartSLOduration=1.526304048 podStartE2EDuration="4.054841358s" podCreationTimestamp="2026-04-28 19:27:36 +0000 UTC" firstStartedPulling="2026-04-28 19:27:37.28431826 +0000 UTC m=+712.219363389" lastFinishedPulling="2026-04-28 19:27:39.812855573 +0000 UTC m=+714.747900699" observedRunningTime="2026-04-28 19:27:40.051111811 +0000 UTC m=+714.986156962" watchObservedRunningTime="2026-04-28 19:27:40.054841358 +0000 UTC m=+714.989886506" Apr 28 19:27:46.036584 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:46.036556 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" Apr 28 19:27:46.127241 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:46.127208 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m"] Apr 28 19:27:46.127456 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:46.127432 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" podUID="25e44e91-07da-4b0d-90a4-a4cb65b8d799" containerName="model-chainer-raw-357ae" containerID="cri-o://c292cae335ca09ad4c1a9441cbab61c189fe16d1724b7bab75a0d4e3e79f053e" gracePeriod=30 Apr 28 19:27:51.035016 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:51.034964 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" podUID="25e44e91-07da-4b0d-90a4-a4cb65b8d799" containerName="model-chainer-raw-357ae" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:27:56.034439 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:27:56.034401 2582 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" podUID="25e44e91-07da-4b0d-90a4-a4cb65b8d799" containerName="model-chainer-raw-357ae" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:28:01.034652 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:28:01.034615 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" podUID="25e44e91-07da-4b0d-90a4-a4cb65b8d799" containerName="model-chainer-raw-357ae" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:28:01.035091 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:28:01.034726 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" Apr 28 19:28:06.034010 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:28:06.033969 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" podUID="25e44e91-07da-4b0d-90a4-a4cb65b8d799" containerName="model-chainer-raw-357ae" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:28:11.034105 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:28:11.034058 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" podUID="25e44e91-07da-4b0d-90a4-a4cb65b8d799" containerName="model-chainer-raw-357ae" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:28:16.034253 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:28:16.034210 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" podUID="25e44e91-07da-4b0d-90a4-a4cb65b8d799" containerName="model-chainer-raw-357ae" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:28:16.145038 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:28:16.145010 2582 generic.go:358] 
"Generic (PLEG): container finished" podID="25e44e91-07da-4b0d-90a4-a4cb65b8d799" containerID="c292cae335ca09ad4c1a9441cbab61c189fe16d1724b7bab75a0d4e3e79f053e" exitCode=0 Apr 28 19:28:16.145150 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:28:16.145055 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" event={"ID":"25e44e91-07da-4b0d-90a4-a4cb65b8d799","Type":"ContainerDied","Data":"c292cae335ca09ad4c1a9441cbab61c189fe16d1724b7bab75a0d4e3e79f053e"} Apr 28 19:28:16.275966 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:28:16.275938 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" Apr 28 19:28:16.363510 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:28:16.363411 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25e44e91-07da-4b0d-90a4-a4cb65b8d799-openshift-service-ca-bundle\") pod \"25e44e91-07da-4b0d-90a4-a4cb65b8d799\" (UID: \"25e44e91-07da-4b0d-90a4-a4cb65b8d799\") " Apr 28 19:28:16.363689 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:28:16.363552 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25e44e91-07da-4b0d-90a4-a4cb65b8d799-proxy-tls\") pod \"25e44e91-07da-4b0d-90a4-a4cb65b8d799\" (UID: \"25e44e91-07da-4b0d-90a4-a4cb65b8d799\") " Apr 28 19:28:16.363813 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:28:16.363786 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e44e91-07da-4b0d-90a4-a4cb65b8d799-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "25e44e91-07da-4b0d-90a4-a4cb65b8d799" (UID: "25e44e91-07da-4b0d-90a4-a4cb65b8d799"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:28:16.365961 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:28:16.365930 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e44e91-07da-4b0d-90a4-a4cb65b8d799-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "25e44e91-07da-4b0d-90a4-a4cb65b8d799" (UID: "25e44e91-07da-4b0d-90a4-a4cb65b8d799"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:28:16.464998 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:28:16.464966 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/25e44e91-07da-4b0d-90a4-a4cb65b8d799-proxy-tls\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 28 19:28:16.464998 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:28:16.464996 2582 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25e44e91-07da-4b0d-90a4-a4cb65b8d799-openshift-service-ca-bundle\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 28 19:28:17.150216 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:28:17.150182 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" event={"ID":"25e44e91-07da-4b0d-90a4-a4cb65b8d799","Type":"ContainerDied","Data":"9105d5a6990d3e28ad41520cf31ec6fab600959f33f0d07800a4e68f764ebabf"} Apr 28 19:28:17.150216 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:28:17.150209 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m" Apr 28 19:28:17.150727 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:28:17.150233 2582 scope.go:117] "RemoveContainer" containerID="c292cae335ca09ad4c1a9441cbab61c189fe16d1724b7bab75a0d4e3e79f053e" Apr 28 19:28:17.172644 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:28:17.172611 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m"] Apr 28 19:28:17.176332 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:28:17.176300 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-357ae-5d6b99669d-w894m"] Apr 28 19:28:17.637573 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:28:17.637533 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e44e91-07da-4b0d-90a4-a4cb65b8d799" path="/var/lib/kubelet/pods/25e44e91-07da-4b0d-90a4-a4cb65b8d799/volumes" Apr 28 19:29:16.472242 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:16.472204 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p"] Apr 28 19:29:16.472730 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:16.472663 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25e44e91-07da-4b0d-90a4-a4cb65b8d799" containerName="model-chainer-raw-357ae" Apr 28 19:29:16.472730 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:16.472677 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="25e44e91-07da-4b0d-90a4-a4cb65b8d799" containerName="model-chainer-raw-357ae" Apr 28 19:29:16.472830 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:16.472764 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="25e44e91-07da-4b0d-90a4-a4cb65b8d799" containerName="model-chainer-raw-357ae" Apr 28 19:29:16.475880 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:16.475863 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p" Apr 28 19:29:16.478615 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:16.478590 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-5628a-kube-rbac-proxy-sar-config\"" Apr 28 19:29:16.478764 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:16.478626 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-5h9ft\"" Apr 28 19:29:16.478764 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:16.478626 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-5628a-serving-cert\"" Apr 28 19:29:16.479664 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:16.479648 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 28 19:29:16.486420 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:16.486395 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p"] Apr 28 19:29:16.554050 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:16.554016 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24891f2e-3b1e-4852-a49d-2450a35264ba-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-5628a-687c9b99db-wsj4p\" (UID: \"24891f2e-3b1e-4852-a49d-2450a35264ba\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p" Apr 28 19:29:16.554216 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:16.554054 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24891f2e-3b1e-4852-a49d-2450a35264ba-proxy-tls\") pod 
\"model-chainer-raw-hpa-5628a-687c9b99db-wsj4p\" (UID: \"24891f2e-3b1e-4852-a49d-2450a35264ba\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p" Apr 28 19:29:16.655208 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:16.655169 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24891f2e-3b1e-4852-a49d-2450a35264ba-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-5628a-687c9b99db-wsj4p\" (UID: \"24891f2e-3b1e-4852-a49d-2450a35264ba\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p" Apr 28 19:29:16.655208 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:16.655211 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24891f2e-3b1e-4852-a49d-2450a35264ba-proxy-tls\") pod \"model-chainer-raw-hpa-5628a-687c9b99db-wsj4p\" (UID: \"24891f2e-3b1e-4852-a49d-2450a35264ba\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p" Apr 28 19:29:16.655964 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:16.655940 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24891f2e-3b1e-4852-a49d-2450a35264ba-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-5628a-687c9b99db-wsj4p\" (UID: \"24891f2e-3b1e-4852-a49d-2450a35264ba\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p" Apr 28 19:29:16.657816 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:16.657790 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24891f2e-3b1e-4852-a49d-2450a35264ba-proxy-tls\") pod \"model-chainer-raw-hpa-5628a-687c9b99db-wsj4p\" (UID: \"24891f2e-3b1e-4852-a49d-2450a35264ba\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p" Apr 28 19:29:16.786533 
ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:16.786494 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p" Apr 28 19:29:16.918065 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:16.918037 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p"] Apr 28 19:29:16.920964 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:29:16.920930 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24891f2e_3b1e_4852_a49d_2450a35264ba.slice/crio-7c0d0665cb8ae3de6053aa77dd633e7df61c97f3e768857b10c39a87cc58d621 WatchSource:0}: Error finding container 7c0d0665cb8ae3de6053aa77dd633e7df61c97f3e768857b10c39a87cc58d621: Status 404 returned error can't find the container with id 7c0d0665cb8ae3de6053aa77dd633e7df61c97f3e768857b10c39a87cc58d621 Apr 28 19:29:17.353419 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:17.353376 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p" event={"ID":"24891f2e-3b1e-4852-a49d-2450a35264ba","Type":"ContainerStarted","Data":"d1bbe929a2b3d4cd1d8eaa8aac6674d7d22f6453fe192dc27c937f2be5f93284"} Apr 28 19:29:17.353419 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:17.353414 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p" event={"ID":"24891f2e-3b1e-4852-a49d-2450a35264ba","Type":"ContainerStarted","Data":"7c0d0665cb8ae3de6053aa77dd633e7df61c97f3e768857b10c39a87cc58d621"} Apr 28 19:29:17.353702 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:17.353445 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p" Apr 28 19:29:17.377474 ip-10-0-132-160 kubenswrapper[2582]: I0428 
19:29:17.377423 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p" podStartSLOduration=1.3774085409999999 podStartE2EDuration="1.377408541s" podCreationTimestamp="2026-04-28 19:29:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:29:17.376382535 +0000 UTC m=+812.311427696" watchObservedRunningTime="2026-04-28 19:29:17.377408541 +0000 UTC m=+812.312453689" Apr 28 19:29:23.361921 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:23.361877 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p" Apr 28 19:29:26.347987 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:26.347958 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p"] Apr 28 19:29:26.348365 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:26.348155 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p" podUID="24891f2e-3b1e-4852-a49d-2450a35264ba" containerName="model-chainer-raw-hpa-5628a" containerID="cri-o://d1bbe929a2b3d4cd1d8eaa8aac6674d7d22f6453fe192dc27c937f2be5f93284" gracePeriod=30 Apr 28 19:29:28.361409 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:28.361369 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p" podUID="24891f2e-3b1e-4852-a49d-2450a35264ba" containerName="model-chainer-raw-hpa-5628a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:29:33.361399 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:33.361356 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p" 
podUID="24891f2e-3b1e-4852-a49d-2450a35264ba" containerName="model-chainer-raw-hpa-5628a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:29:38.360962 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:38.360884 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p" podUID="24891f2e-3b1e-4852-a49d-2450a35264ba" containerName="model-chainer-raw-hpa-5628a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:29:38.361366 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:38.361060 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p" Apr 28 19:29:43.360937 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:43.360869 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p" podUID="24891f2e-3b1e-4852-a49d-2450a35264ba" containerName="model-chainer-raw-hpa-5628a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:29:48.360769 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:48.360721 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p" podUID="24891f2e-3b1e-4852-a49d-2450a35264ba" containerName="model-chainer-raw-hpa-5628a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:29:53.361173 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:53.361129 2582 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p" podUID="24891f2e-3b1e-4852-a49d-2450a35264ba" containerName="model-chainer-raw-hpa-5628a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 28 19:29:56.377808 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:29:56.377776 2582 cadvisor_stats_provider.go:525] "Partial 
failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24891f2e_3b1e_4852_a49d_2450a35264ba.slice/crio-conmon-d1bbe929a2b3d4cd1d8eaa8aac6674d7d22f6453fe192dc27c937f2be5f93284.scope\": RecentStats: unable to find data in memory cache]" Apr 28 19:29:56.378134 ip-10-0-132-160 kubenswrapper[2582]: E0428 19:29:56.377779 2582 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24891f2e_3b1e_4852_a49d_2450a35264ba.slice/crio-conmon-d1bbe929a2b3d4cd1d8eaa8aac6674d7d22f6453fe192dc27c937f2be5f93284.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24891f2e_3b1e_4852_a49d_2450a35264ba.slice/crio-d1bbe929a2b3d4cd1d8eaa8aac6674d7d22f6453fe192dc27c937f2be5f93284.scope\": RecentStats: unable to find data in memory cache]" Apr 28 19:29:56.489956 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:56.489878 2582 generic.go:358] "Generic (PLEG): container finished" podID="24891f2e-3b1e-4852-a49d-2450a35264ba" containerID="d1bbe929a2b3d4cd1d8eaa8aac6674d7d22f6453fe192dc27c937f2be5f93284" exitCode=0 Apr 28 19:29:56.490090 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:56.489950 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p" event={"ID":"24891f2e-3b1e-4852-a49d-2450a35264ba","Type":"ContainerDied","Data":"d1bbe929a2b3d4cd1d8eaa8aac6674d7d22f6453fe192dc27c937f2be5f93284"} Apr 28 19:29:56.490090 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:56.490008 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p" event={"ID":"24891f2e-3b1e-4852-a49d-2450a35264ba","Type":"ContainerDied","Data":"7c0d0665cb8ae3de6053aa77dd633e7df61c97f3e768857b10c39a87cc58d621"} Apr 28 19:29:56.490090 
ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:56.490026 2582 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c0d0665cb8ae3de6053aa77dd633e7df61c97f3e768857b10c39a87cc58d621" Apr 28 19:29:56.500921 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:56.500883 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p" Apr 28 19:29:56.595011 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:56.594974 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24891f2e-3b1e-4852-a49d-2450a35264ba-proxy-tls\") pod \"24891f2e-3b1e-4852-a49d-2450a35264ba\" (UID: \"24891f2e-3b1e-4852-a49d-2450a35264ba\") " Apr 28 19:29:56.595189 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:56.595087 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24891f2e-3b1e-4852-a49d-2450a35264ba-openshift-service-ca-bundle\") pod \"24891f2e-3b1e-4852-a49d-2450a35264ba\" (UID: \"24891f2e-3b1e-4852-a49d-2450a35264ba\") " Apr 28 19:29:56.595446 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:56.595423 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24891f2e-3b1e-4852-a49d-2450a35264ba-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "24891f2e-3b1e-4852-a49d-2450a35264ba" (UID: "24891f2e-3b1e-4852-a49d-2450a35264ba"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 28 19:29:56.597207 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:56.597181 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24891f2e-3b1e-4852-a49d-2450a35264ba-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "24891f2e-3b1e-4852-a49d-2450a35264ba" (UID: "24891f2e-3b1e-4852-a49d-2450a35264ba"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 28 19:29:56.696484 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:56.696442 2582 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24891f2e-3b1e-4852-a49d-2450a35264ba-openshift-service-ca-bundle\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 28 19:29:56.696484 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:56.696474 2582 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/24891f2e-3b1e-4852-a49d-2450a35264ba-proxy-tls\") on node \"ip-10-0-132-160.ec2.internal\" DevicePath \"\"" Apr 28 19:29:57.493201 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:57.493170 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p" Apr 28 19:29:57.516719 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:57.516690 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p"] Apr 28 19:29:57.520504 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:57.520479 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-5628a-687c9b99db-wsj4p"] Apr 28 19:29:57.639298 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:29:57.639264 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24891f2e-3b1e-4852-a49d-2450a35264ba" path="/var/lib/kubelet/pods/24891f2e-3b1e-4852-a49d-2450a35264ba/volumes" Apr 28 19:35:45.610293 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:35:45.610261 2582 scope.go:117] "RemoveContainer" containerID="d1bbe929a2b3d4cd1d8eaa8aac6674d7d22f6453fe192dc27c937f2be5f93284" Apr 28 19:38:26.517259 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:26.517232 2582 ???:1] "http: TLS handshake error from 10.0.134.36:34046: EOF" Apr 28 19:38:26.520493 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:26.520472 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-rb4pd_82e1e8da-199f-4f4a-b552-16b36c427bd1/global-pull-secret-syncer/0.log" Apr 28 19:38:26.606577 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:26.606542 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-h8xww_ad20da8d-206b-440c-8c98-5039db8e6f65/konnectivity-agent/0.log" Apr 28 19:38:26.657224 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:26.657194 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-160.ec2.internal_206ff994154571336dcc99880b36f4f2/haproxy/0.log" Apr 28 19:38:30.155652 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:30.155621 2582 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-qg8lf_3e9dbac1-8dfa-4366-ae62-c45f1598141a/kube-state-metrics/0.log" Apr 28 19:38:30.180205 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:30.180177 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-qg8lf_3e9dbac1-8dfa-4366-ae62-c45f1598141a/kube-rbac-proxy-main/0.log" Apr 28 19:38:30.203767 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:30.203739 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-qg8lf_3e9dbac1-8dfa-4366-ae62-c45f1598141a/kube-rbac-proxy-self/0.log" Apr 28 19:38:30.460401 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:30.460322 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nv9pk_ead0a897-3d24-481a-8227-839c80a17804/node-exporter/0.log" Apr 28 19:38:30.482267 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:30.482239 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nv9pk_ead0a897-3d24-481a-8227-839c80a17804/kube-rbac-proxy/0.log" Apr 28 19:38:30.504878 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:30.504853 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nv9pk_ead0a897-3d24-481a-8227-839c80a17804/init-textfile/0.log" Apr 28 19:38:30.754496 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:30.754466 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-9ljl5_915ecf94-aae8-4cde-b2a3-fa92ebde0d2c/prometheus-operator/0.log" Apr 28 19:38:30.771509 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:30.771479 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-9ljl5_915ecf94-aae8-4cde-b2a3-fa92ebde0d2c/kube-rbac-proxy/0.log" Apr 28 19:38:30.794865 
ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:30.794838 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-hgdsv_057f39ec-8955-454a-9e94-9140f32a99bb/prometheus-operator-admission-webhook/0.log" Apr 28 19:38:30.909363 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:30.909333 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b9f9f476c-rlv57_8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc/thanos-query/0.log" Apr 28 19:38:30.932579 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:30.932547 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b9f9f476c-rlv57_8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc/kube-rbac-proxy-web/0.log" Apr 28 19:38:30.955175 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:30.955145 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b9f9f476c-rlv57_8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc/kube-rbac-proxy/0.log" Apr 28 19:38:30.980354 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:30.980325 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b9f9f476c-rlv57_8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc/prom-label-proxy/0.log" Apr 28 19:38:31.001884 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:31.001857 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b9f9f476c-rlv57_8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc/kube-rbac-proxy-rules/0.log" Apr 28 19:38:31.027348 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:31.027269 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-b9f9f476c-rlv57_8c1a742d-18af-40cf-b3cc-c0c9f7fb64cc/kube-rbac-proxy-metrics/0.log" Apr 28 19:38:32.986432 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:32.986400 2582 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-5f4cb54d54-s4gfs_4ffd4cac-0219-4ab3-83fe-3f03b7a67175/console/0.log" Apr 28 19:38:33.011972 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.011941 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-hws42_3f6140f3-f0ea-4c6c-aa32-0e5ca135571f/download-server/0.log" Apr 28 19:38:33.558405 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.558361 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rqj59/perf-node-gather-daemonset-z96xk"] Apr 28 19:38:33.558848 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.558822 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24891f2e-3b1e-4852-a49d-2450a35264ba" containerName="model-chainer-raw-hpa-5628a" Apr 28 19:38:33.558848 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.558850 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="24891f2e-3b1e-4852-a49d-2450a35264ba" containerName="model-chainer-raw-hpa-5628a" Apr 28 19:38:33.559096 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.558952 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="24891f2e-3b1e-4852-a49d-2450a35264ba" containerName="model-chainer-raw-hpa-5628a" Apr 28 19:38:33.561980 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.561959 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-z96xk" Apr 28 19:38:33.564615 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.564595 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rqj59\"/\"kube-root-ca.crt\"" Apr 28 19:38:33.564740 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.564596 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-rqj59\"/\"openshift-service-ca.crt\"" Apr 28 19:38:33.565599 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.565580 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-rqj59\"/\"default-dockercfg-bqjfr\"" Apr 28 19:38:33.571798 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.571772 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rqj59/perf-node-gather-daemonset-z96xk"] Apr 28 19:38:33.699140 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.699102 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e82c6c78-aa63-4d8e-b91c-caceda4086d1-proc\") pod \"perf-node-gather-daemonset-z96xk\" (UID: \"e82c6c78-aa63-4d8e-b91c-caceda4086d1\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-z96xk" Apr 28 19:38:33.699334 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.699172 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e82c6c78-aa63-4d8e-b91c-caceda4086d1-sys\") pod \"perf-node-gather-daemonset-z96xk\" (UID: \"e82c6c78-aa63-4d8e-b91c-caceda4086d1\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-z96xk" Apr 28 19:38:33.699334 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.699191 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-cxdx2\" (UniqueName: \"kubernetes.io/projected/e82c6c78-aa63-4d8e-b91c-caceda4086d1-kube-api-access-cxdx2\") pod \"perf-node-gather-daemonset-z96xk\" (UID: \"e82c6c78-aa63-4d8e-b91c-caceda4086d1\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-z96xk" Apr 28 19:38:33.699334 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.699256 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e82c6c78-aa63-4d8e-b91c-caceda4086d1-lib-modules\") pod \"perf-node-gather-daemonset-z96xk\" (UID: \"e82c6c78-aa63-4d8e-b91c-caceda4086d1\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-z96xk" Apr 28 19:38:33.699334 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.699301 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e82c6c78-aa63-4d8e-b91c-caceda4086d1-podres\") pod \"perf-node-gather-daemonset-z96xk\" (UID: \"e82c6c78-aa63-4d8e-b91c-caceda4086d1\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-z96xk" Apr 28 19:38:33.800210 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.800171 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e82c6c78-aa63-4d8e-b91c-caceda4086d1-lib-modules\") pod \"perf-node-gather-daemonset-z96xk\" (UID: \"e82c6c78-aa63-4d8e-b91c-caceda4086d1\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-z96xk" Apr 28 19:38:33.800210 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.800217 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e82c6c78-aa63-4d8e-b91c-caceda4086d1-podres\") pod \"perf-node-gather-daemonset-z96xk\" (UID: \"e82c6c78-aa63-4d8e-b91c-caceda4086d1\") " 
pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-z96xk"
Apr 28 19:38:33.800433 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.800284 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e82c6c78-aa63-4d8e-b91c-caceda4086d1-proc\") pod \"perf-node-gather-daemonset-z96xk\" (UID: \"e82c6c78-aa63-4d8e-b91c-caceda4086d1\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-z96xk"
Apr 28 19:38:33.800433 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.800334 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e82c6c78-aa63-4d8e-b91c-caceda4086d1-sys\") pod \"perf-node-gather-daemonset-z96xk\" (UID: \"e82c6c78-aa63-4d8e-b91c-caceda4086d1\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-z96xk"
Apr 28 19:38:33.800433 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.800353 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e82c6c78-aa63-4d8e-b91c-caceda4086d1-lib-modules\") pod \"perf-node-gather-daemonset-z96xk\" (UID: \"e82c6c78-aa63-4d8e-b91c-caceda4086d1\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-z96xk"
Apr 28 19:38:33.800433 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.800359 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxdx2\" (UniqueName: \"kubernetes.io/projected/e82c6c78-aa63-4d8e-b91c-caceda4086d1-kube-api-access-cxdx2\") pod \"perf-node-gather-daemonset-z96xk\" (UID: \"e82c6c78-aa63-4d8e-b91c-caceda4086d1\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-z96xk"
Apr 28 19:38:33.800433 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.800377 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e82c6c78-aa63-4d8e-b91c-caceda4086d1-podres\") pod \"perf-node-gather-daemonset-z96xk\" (UID: \"e82c6c78-aa63-4d8e-b91c-caceda4086d1\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-z96xk"
Apr 28 19:38:33.800433 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.800419 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e82c6c78-aa63-4d8e-b91c-caceda4086d1-proc\") pod \"perf-node-gather-daemonset-z96xk\" (UID: \"e82c6c78-aa63-4d8e-b91c-caceda4086d1\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-z96xk"
Apr 28 19:38:33.800648 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.800447 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e82c6c78-aa63-4d8e-b91c-caceda4086d1-sys\") pod \"perf-node-gather-daemonset-z96xk\" (UID: \"e82c6c78-aa63-4d8e-b91c-caceda4086d1\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-z96xk"
Apr 28 19:38:33.807854 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.807824 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxdx2\" (UniqueName: \"kubernetes.io/projected/e82c6c78-aa63-4d8e-b91c-caceda4086d1-kube-api-access-cxdx2\") pod \"perf-node-gather-daemonset-z96xk\" (UID: \"e82c6c78-aa63-4d8e-b91c-caceda4086d1\") " pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-z96xk"
Apr 28 19:38:33.872472 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.872384 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-z96xk"
Apr 28 19:38:33.998288 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:33.998233 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rqj59/perf-node-gather-daemonset-z96xk"]
Apr 28 19:38:34.000550 ip-10-0-132-160 kubenswrapper[2582]: W0428 19:38:34.000517 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode82c6c78_aa63_4d8e_b91c_caceda4086d1.slice/crio-78d56c11da9ad0c01280ac4f9687159fa0739d6cd28f8cf72a347984732f03ee WatchSource:0}: Error finding container 78d56c11da9ad0c01280ac4f9687159fa0739d6cd28f8cf72a347984732f03ee: Status 404 returned error can't find the container with id 78d56c11da9ad0c01280ac4f9687159fa0739d6cd28f8cf72a347984732f03ee
Apr 28 19:38:34.002297 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:34.002276 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 28 19:38:34.083738 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:34.083708 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-txhn5_cf70a1bd-b873-41bc-a143-901d7c76665c/dns/0.log"
Apr 28 19:38:34.102259 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:34.102229 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-txhn5_cf70a1bd-b873-41bc-a143-901d7c76665c/kube-rbac-proxy/0.log"
Apr 28 19:38:34.174850 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:34.174777 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-dn855_51106027-8f90-4285-9257-0da036866696/dns-node-resolver/0.log"
Apr 28 19:38:34.262056 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:34.262014 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-z96xk" event={"ID":"e82c6c78-aa63-4d8e-b91c-caceda4086d1","Type":"ContainerStarted","Data":"03fb6c03076b8e04b091ebcfc2d15340c640aeb4f34872929e1c9bf337363e43"}
Apr 28 19:38:34.262056 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:34.262050 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-z96xk" event={"ID":"e82c6c78-aa63-4d8e-b91c-caceda4086d1","Type":"ContainerStarted","Data":"78d56c11da9ad0c01280ac4f9687159fa0739d6cd28f8cf72a347984732f03ee"}
Apr 28 19:38:34.262309 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:34.262149 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-z96xk"
Apr 28 19:38:34.278307 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:34.278258 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-z96xk" podStartSLOduration=1.278242878 podStartE2EDuration="1.278242878s" podCreationTimestamp="2026-04-28 19:38:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-28 19:38:34.276723477 +0000 UTC m=+1369.211768626" watchObservedRunningTime="2026-04-28 19:38:34.278242878 +0000 UTC m=+1369.213288025"
Apr 28 19:38:34.647519 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:34.647494 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qtt7w_1c539cc3-1090-4486-ab6c-9d184f87803d/node-ca/0.log"
Apr 28 19:38:35.362582 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:35.362543 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-86dfbc9d5d-cmkg7_07abf802-d61b-4b5a-8138-9f569e8d18b7/router/0.log"
Apr 28 19:38:35.706150 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:35.706068 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-t6pm8_d392608a-a370-4a06-8556-ce4952638d04/serve-healthcheck-canary/0.log"
Apr 28 19:38:36.175184 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:36.175153 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-72v9l_17932199-2efe-457e-ab6b-f3d6f37d1c05/kube-rbac-proxy/0.log"
Apr 28 19:38:36.193155 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:36.193128 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-72v9l_17932199-2efe-457e-ab6b-f3d6f37d1c05/exporter/0.log"
Apr 28 19:38:36.212379 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:36.212347 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-72v9l_17932199-2efe-457e-ab6b-f3d6f37d1c05/extractor/0.log"
Apr 28 19:38:38.443959 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:38.443923 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-b85c69797-dtvcg_e9f024e6-fa41-4f14-a26b-e61136cd8e86/manager/0.log"
Apr 28 19:38:38.473557 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:38.473534 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-gtkgz_a7b1603f-75cb-4411-a658-485536622575/manager/0.log"
Apr 28 19:38:40.276017 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:40.275985 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-rqj59/perf-node-gather-daemonset-z96xk"
Apr 28 19:38:44.085963 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:44.085927 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-f2h6c_18ad5e91-b88a-4b8c-88a2-c52f5fe8a9f7/kube-storage-version-migrator-operator/1.log"
Apr 28 19:38:44.086718 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:44.086700 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-f2h6c_18ad5e91-b88a-4b8c-88a2-c52f5fe8a9f7/kube-storage-version-migrator-operator/0.log"
Apr 28 19:38:45.053509 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:45.053478 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-89vjp_ae3db714-6a3b-402e-af99-fdeffaa6cdfa/kube-multus/0.log"
Apr 28 19:38:45.393852 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:45.393774 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qtgvt_d2c7c82c-bdc2-456a-b466-42dee787562e/kube-multus-additional-cni-plugins/0.log"
Apr 28 19:38:45.412569 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:45.412535 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qtgvt_d2c7c82c-bdc2-456a-b466-42dee787562e/egress-router-binary-copy/0.log"
Apr 28 19:38:45.430929 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:45.430886 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qtgvt_d2c7c82c-bdc2-456a-b466-42dee787562e/cni-plugins/0.log"
Apr 28 19:38:45.449455 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:45.449429 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qtgvt_d2c7c82c-bdc2-456a-b466-42dee787562e/bond-cni-plugin/0.log"
Apr 28 19:38:45.468254 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:45.468223 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qtgvt_d2c7c82c-bdc2-456a-b466-42dee787562e/routeoverride-cni/0.log"
Apr 28 19:38:45.487943 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:45.487890 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qtgvt_d2c7c82c-bdc2-456a-b466-42dee787562e/whereabouts-cni-bincopy/0.log"
Apr 28 19:38:45.508036 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:45.508008 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qtgvt_d2c7c82c-bdc2-456a-b466-42dee787562e/whereabouts-cni/0.log"
Apr 28 19:38:45.657790 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:45.657708 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-q2wj9_ae2a816c-4f04-45d7-bb27-80786c738721/network-metrics-daemon/0.log"
Apr 28 19:38:45.675456 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:45.675432 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-q2wj9_ae2a816c-4f04-45d7-bb27-80786c738721/kube-rbac-proxy/0.log"
Apr 28 19:38:46.371495 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:46.371453 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6xgsz_f830cd46-083b-489f-b13d-5a749b919ab7/ovn-controller/0.log"
Apr 28 19:38:46.393178 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:46.393151 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6xgsz_f830cd46-083b-489f-b13d-5a749b919ab7/ovn-acl-logging/0.log"
Apr 28 19:38:46.412936 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:46.412882 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6xgsz_f830cd46-083b-489f-b13d-5a749b919ab7/kube-rbac-proxy-node/0.log"
Apr 28 19:38:46.431239 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:46.431212 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6xgsz_f830cd46-083b-489f-b13d-5a749b919ab7/kube-rbac-proxy-ovn-metrics/0.log"
Apr 28 19:38:46.447779 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:46.447754 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6xgsz_f830cd46-083b-489f-b13d-5a749b919ab7/northd/0.log"
Apr 28 19:38:46.468461 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:46.468387 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6xgsz_f830cd46-083b-489f-b13d-5a749b919ab7/nbdb/0.log"
Apr 28 19:38:46.487069 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:46.487045 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6xgsz_f830cd46-083b-489f-b13d-5a749b919ab7/sbdb/0.log"
Apr 28 19:38:46.586265 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:46.586231 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6xgsz_f830cd46-083b-489f-b13d-5a749b919ab7/ovnkube-controller/0.log"
Apr 28 19:38:48.179712 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:48.179675 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-sgd2m_de46f190-a71a-4aa0-b9b0-54682ec8837f/check-endpoints/0.log"
Apr 28 19:38:48.221792 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:48.221765 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-fmthr_789c39a9-aea8-4abd-b196-f303f6c6f063/network-check-target-container/0.log"
Apr 28 19:38:49.178660 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:49.178633 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-nvcxv_13bd3061-759d-43b2-bf3d-0c09c0a62063/iptables-alerter/0.log"
Apr 28 19:38:49.837948 ip-10-0-132-160 kubenswrapper[2582]: I0428 19:38:49.837892 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-8l89d_5a509ee1-53b3-4bd7-822e-06cb6363beff/tuned/0.log"