Apr 21 04:21:15.779572 ip-10-0-139-26 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 21 04:21:15.779583 ip-10-0-139-26 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 21 04:21:15.779590 ip-10-0-139-26 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 21 04:21:15.779833 ip-10-0-139-26 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 21 04:21:25.803415 ip-10-0-139-26 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 21 04:21:25.803430 ip-10-0-139-26 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 253f2272c0694d12ad1896ec6cd356a9 --
Apr 21 04:23:51.508892 ip-10-0-139-26 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 04:23:51.919666 ip-10-0-139-26 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 04:23:51.919666 ip-10-0-139-26 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 04:23:51.919666 ip-10-0-139-26 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 04:23:51.919666 ip-10-0-139-26 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 04:23:51.919666 ip-10-0-139-26 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 04:23:51.921151 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.921059 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 21 04:23:51.925447 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925420 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 04:23:51.925447 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925443 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 04:23:51.925447 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925447 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 04:23:51.925447 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925452 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 04:23:51.925687 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925458 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 04:23:51.925687 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925462 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 04:23:51.925687 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925466 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 04:23:51.925687 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925469 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 04:23:51.925917 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925521 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 04:23:51.925917 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925710 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 04:23:51.925917 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925718 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 04:23:51.925917 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925724 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 04:23:51.925917 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925728 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 04:23:51.925917 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925733 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 04:23:51.925917 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925738 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 04:23:51.925917 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925747 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 04:23:51.925917 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925754 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 04:23:51.925917 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925760 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 04:23:51.925917 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925765 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 04:23:51.925917 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925770 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 04:23:51.925917 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925774 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 04:23:51.925917 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925778 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 04:23:51.925917 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925783 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 04:23:51.925917 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925787 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 04:23:51.925917 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925791 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 04:23:51.925917 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925795 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 04:23:51.925917 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925800 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 04:23:51.926658 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925804 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 04:23:51.926658 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925808 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 04:23:51.926658 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925813 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 04:23:51.926658 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925817 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 04:23:51.926658 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925821 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 04:23:51.926658 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925826 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 04:23:51.926658 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925830 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 04:23:51.926658 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925839 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 04:23:51.926658 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925844 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 04:23:51.926658 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925849 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 04:23:51.926658 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925853 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 04:23:51.926658 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925857 2574 feature_gate.go:328] unrecognized feature 
gate: GCPCustomAPIEndpointsInstall Apr 21 04:23:51.926658 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925861 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 04:23:51.926658 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925865 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 04:23:51.926658 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925872 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 04:23:51.926658 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925876 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 04:23:51.926658 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925880 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 04:23:51.926658 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925884 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 04:23:51.926658 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925889 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 04:23:51.926658 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925894 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 04:23:51.927237 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925899 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 04:23:51.927237 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925904 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 04:23:51.927237 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925909 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 04:23:51.927237 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925916 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 04:23:51.927237 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925921 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 04:23:51.927237 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925925 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 04:23:51.927237 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925930 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 04:23:51.927237 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925935 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 04:23:51.927237 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925939 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 04:23:51.927237 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925943 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 04:23:51.927237 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925948 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 04:23:51.927237 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925952 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 04:23:51.927237 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925957 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 04:23:51.927237 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925961 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 04:23:51.927237 ip-10-0-139-26 kubenswrapper[2574]: W0421 
04:23:51.925965 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 04:23:51.927237 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925970 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 04:23:51.927237 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925974 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 04:23:51.927237 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925978 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 04:23:51.927237 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925982 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 04:23:51.927237 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925988 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 21 04:23:51.928020 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925993 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 04:23:51.928020 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.925998 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 04:23:51.928020 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.926002 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 04:23:51.928020 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.926006 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 04:23:51.928020 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.926013 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 04:23:51.928020 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.926018 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 04:23:51.928020 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.926024 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 04:23:51.928020 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.926028 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 04:23:51.928020 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.926033 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 21 04:23:51.928020 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.926037 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 04:23:51.928020 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.926041 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 04:23:51.928020 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.926046 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 04:23:51.928020 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.926051 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 04:23:51.928020 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.926055 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 04:23:51.928020 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.926059 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 04:23:51.928020 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.926063 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 04:23:51.928020 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.926067 2574 feature_gate.go:328] 
unrecognized feature gate: AzureWorkloadIdentity Apr 21 04:23:51.928020 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.926071 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 04:23:51.928020 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.926075 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 04:23:51.928020 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927362 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 04:23:51.928511 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927372 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 04:23:51.928511 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927380 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 04:23:51.928511 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927385 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 04:23:51.928511 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927390 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 04:23:51.928511 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927396 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 04:23:51.928511 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927400 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 04:23:51.928511 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927405 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 04:23:51.928511 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927409 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 04:23:51.928511 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927414 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 04:23:51.928511 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927418 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 04:23:51.928511 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927422 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 04:23:51.928511 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927426 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 04:23:51.928511 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927430 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 04:23:51.928511 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927434 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 04:23:51.928511 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927438 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 04:23:51.928511 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927443 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 04:23:51.928511 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927447 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 04:23:51.928511 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927451 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 04:23:51.928511 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927457 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 
21 04:23:51.929001 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927461 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 04:23:51.929001 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927465 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 04:23:51.929001 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927469 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 04:23:51.929001 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927474 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 04:23:51.929001 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927479 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 04:23:51.929001 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927483 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 04:23:51.929001 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927487 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 04:23:51.929001 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927492 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 04:23:51.929001 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927496 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 04:23:51.929001 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927500 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 04:23:51.929001 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927504 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 04:23:51.929001 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927509 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 04:23:51.929001 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927514 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 04:23:51.929001 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927518 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 04:23:51.929001 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927522 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 04:23:51.929001 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927526 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 04:23:51.929001 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927530 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 04:23:51.929001 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927534 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 04:23:51.929001 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927538 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 04:23:51.929001 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927542 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 04:23:51.929811 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927546 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 04:23:51.929811 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927550 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 04:23:51.929811 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927556 2574 
feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 04:23:51.929811 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927560 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 04:23:51.929811 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927565 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 04:23:51.929811 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927569 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 04:23:51.929811 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927573 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 04:23:51.929811 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927577 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 04:23:51.929811 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927581 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 04:23:51.929811 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927586 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 04:23:51.929811 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927605 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 04:23:51.929811 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927611 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 04:23:51.929811 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927615 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 04:23:51.929811 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927619 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 04:23:51.929811 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927623 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 04:23:51.929811 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927627 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 04:23:51.929811 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927632 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 04:23:51.929811 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927637 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 04:23:51.929811 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927641 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 04:23:51.929811 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927646 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 21 04:23:51.930702 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927650 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 04:23:51.930702 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927654 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 04:23:51.930702 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927658 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 04:23:51.930702 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927662 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 04:23:51.930702 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927666 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 
21 04:23:51.930702 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927671 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 04:23:51.930702 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927675 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 04:23:51.930702 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927679 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 04:23:51.930702 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927683 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 04:23:51.930702 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927687 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 04:23:51.930702 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927692 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 04:23:51.930702 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927696 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 04:23:51.930702 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927700 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 04:23:51.930702 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927704 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 04:23:51.930702 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927709 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 04:23:51.930702 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927715 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 04:23:51.930702 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927719 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 04:23:51.930702 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927723 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 04:23:51.930702 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927727 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 04:23:51.931177 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927733 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 04:23:51.931177 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927739 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 04:23:51.931177 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927743 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 04:23:51.931177 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927748 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 04:23:51.931177 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927753 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 04:23:51.931177 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927758 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 04:23:51.931177 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.927762 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 04:23:51.931177 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929245 2574 flags.go:64] FLAG: --address="0.0.0.0" Apr 21 04:23:51.931177 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929262 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 21 04:23:51.931177 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929272 2574 flags.go:64] FLAG: --anonymous-auth="true" Apr 21 04:23:51.931177 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929279 2574 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 21 04:23:51.931177 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929290 2574 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 21 04:23:51.931177 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929295 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 21 04:23:51.931177 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929302 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 21 04:23:51.931177 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929310 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 21 04:23:51.931177 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929315 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 21 04:23:51.931177 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929320 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 21 04:23:51.931177 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929326 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 21 04:23:51.931177 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929331 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 21 04:23:51.931177 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929336 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 21 04:23:51.931177 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929341 2574 flags.go:64] FLAG: --cgroup-root="" Apr 21 04:23:51.931177 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929346 2574 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 21 04:23:51.931177 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929351 2574 flags.go:64] FLAG: --client-ca-file="" Apr 21 04:23:51.931858 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929355 2574 flags.go:64] FLAG: --cloud-config="" Apr 21 04:23:51.931858 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929360 2574 flags.go:64] FLAG: --cloud-provider="external" Apr 21 04:23:51.931858 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929365 2574 flags.go:64] FLAG: --cluster-dns="[]" Apr 21 04:23:51.931858 
ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929372 2574 flags.go:64] FLAG: --cluster-domain="" Apr 21 04:23:51.931858 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929377 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 21 04:23:51.931858 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929382 2574 flags.go:64] FLAG: --config-dir="" Apr 21 04:23:51.931858 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929388 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 21 04:23:51.931858 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929393 2574 flags.go:64] FLAG: --container-log-max-files="5" Apr 21 04:23:51.931858 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929399 2574 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 21 04:23:51.931858 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929404 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 21 04:23:51.931858 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929409 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 21 04:23:51.931858 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929415 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 21 04:23:51.931858 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929421 2574 flags.go:64] FLAG: --contention-profiling="false" Apr 21 04:23:51.931858 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929425 2574 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 21 04:23:51.931858 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929431 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 21 04:23:51.931858 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929436 2574 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 21 04:23:51.931858 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929441 2574 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 21 04:23:51.931858 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929448 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 21 04:23:51.931858 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929453 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 21 04:23:51.931858 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929458 2574 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 21 04:23:51.931858 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929463 2574 flags.go:64] FLAG: --enable-load-reader="false" Apr 21 04:23:51.931858 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929469 2574 flags.go:64] FLAG: --enable-server="true" Apr 21 04:23:51.931858 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929474 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 21 04:23:51.931858 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929480 2574 flags.go:64] FLAG: --event-burst="100" Apr 21 04:23:51.931858 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929485 2574 flags.go:64] FLAG: --event-qps="50" Apr 21 04:23:51.932796 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929490 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 21 04:23:51.932796 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929494 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 21 04:23:51.932796 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929500 2574 flags.go:64] FLAG: --eviction-hard="" Apr 21 04:23:51.932796 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929506 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 
21 04:23:51.932796 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929510 2574 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 21 04:23:51.932796 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929515 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 21 04:23:51.932796 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929520 2574 flags.go:64] FLAG: --eviction-soft="" Apr 21 04:23:51.932796 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929525 2574 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 21 04:23:51.932796 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929530 2574 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 21 04:23:51.932796 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929534 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 21 04:23:51.932796 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929539 2574 flags.go:64] FLAG: --experimental-mounter-path="" Apr 21 04:23:51.932796 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929544 2574 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 21 04:23:51.932796 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929548 2574 flags.go:64] FLAG: --fail-swap-on="true" Apr 21 04:23:51.932796 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929553 2574 flags.go:64] FLAG: --feature-gates="" Apr 21 04:23:51.932796 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929559 2574 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 04:23:51.932796 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929564 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 04:23:51.932796 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929569 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 04:23:51.932796 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929574 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 04:23:51.932796 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929580 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 21 04:23:51.932796 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929585 2574 flags.go:64] FLAG: --help="false" Apr 21 04:23:51.932796 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929605 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-139-26.ec2.internal" Apr 21 04:23:51.932796 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929611 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 04:23:51.932796 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929616 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 04:23:51.932796 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929621 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 04:23:51.933402 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929626 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 04:23:51.933402 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929632 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 04:23:51.933402 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929637 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 04:23:51.933402 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929642 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 04:23:51.933402 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929646 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 
04:23:51.933402 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929652 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 04:23:51.933402 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929657 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 04:23:51.933402 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929662 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 04:23:51.933402 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929667 2574 flags.go:64] FLAG: --kube-reserved="" Apr 21 04:23:51.933402 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929672 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 04:23:51.933402 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929676 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 04:23:51.933402 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929681 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 04:23:51.933402 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929686 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 04:23:51.933402 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929691 2574 flags.go:64] FLAG: --lock-file="" Apr 21 04:23:51.933402 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929696 2574 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 04:23:51.933402 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929701 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 04:23:51.933402 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929706 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 04:23:51.933402 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929715 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 04:23:51.933402 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929720 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 04:23:51.933402 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929725 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 04:23:51.933402 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929729 2574 flags.go:64] FLAG: --logging-format="text" Apr 21 04:23:51.933402 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929734 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 04:23:51.933402 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929740 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 04:23:51.933402 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929745 2574 flags.go:64] FLAG: --manifest-url="" Apr 21 04:23:51.934109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929749 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 21 04:23:51.934109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929756 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 04:23:51.934109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929761 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 04:23:51.934109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929768 2574 flags.go:64] FLAG: --max-pods="110" Apr 21 04:23:51.934109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929773 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 04:23:51.934109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929778 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 04:23:51.934109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929783 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 04:23:51.934109 
ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929788 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 04:23:51.934109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929793 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 04:23:51.934109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929799 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 04:23:51.934109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929803 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 04:23:51.934109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929815 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 04:23:51.934109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929819 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 04:23:51.934109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929824 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 04:23:51.934109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929830 2574 flags.go:64] FLAG: --pod-cidr="" Apr 21 04:23:51.934109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929835 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 04:23:51.934109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929843 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 04:23:51.934109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929848 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 04:23:51.934109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929853 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 21 04:23:51.934109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929858 2574 flags.go:64] FLAG: --port="10250" Apr 21 04:23:51.934109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929863 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 04:23:51.934109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929868 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-001763e57ea11661f" Apr 21 04:23:51.934109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929873 2574 flags.go:64] FLAG: --qos-reserved="" Apr 21 04:23:51.934109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929878 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929883 2574 flags.go:64] FLAG: --register-node="true" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929888 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929892 2574 flags.go:64] FLAG: --register-with-taints="" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929898 2574 flags.go:64] FLAG: --registry-burst="10" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929903 2574 flags.go:64] FLAG: --registry-qps="5" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929908 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929913 2574 flags.go:64] FLAG: --reserved-memory="" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929927 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929932 2574 flags.go:64] FLAG: 
--root-dir="/var/lib/kubelet" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929937 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929943 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929947 2574 flags.go:64] FLAG: --runonce="false" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929952 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929957 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929962 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929967 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929972 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929978 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929983 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929988 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929993 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.929997 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.930002 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.930008 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.930014 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 04:23:51.934735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.930019 2574 flags.go:64] FLAG: --system-cgroups="" Apr 21 04:23:51.935368 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.930024 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 04:23:51.935368 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.930033 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 04:23:51.935368 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.930038 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 21 04:23:51.935368 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.930042 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 04:23:51.935368 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.930049 2574 flags.go:64] FLAG: --tls-min-version="" Apr 21 04:23:51.935368 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.930054 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 04:23:51.935368 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.930059 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 04:23:51.935368 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.930064 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 04:23:51.935368 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.930068 2574 flags.go:64] 
FLAG: --topology-manager-scope="container" Apr 21 04:23:51.935368 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.930073 2574 flags.go:64] FLAG: --v="2" Apr 21 04:23:51.935368 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.930080 2574 flags.go:64] FLAG: --version="false" Apr 21 04:23:51.935368 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.930086 2574 flags.go:64] FLAG: --vmodule="" Apr 21 04:23:51.935368 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.930093 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 04:23:51.935368 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.930099 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 04:23:51.935368 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930251 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 04:23:51.935368 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930258 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 04:23:51.935368 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930263 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 04:23:51.935368 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930273 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 04:23:51.935368 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930277 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 04:23:51.935368 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930281 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 04:23:51.935368 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930285 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 04:23:51.935368 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930289 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 04:23:51.935368 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930294 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 04:23:51.935974 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930299 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 21 04:23:51.935974 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930303 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 04:23:51.935974 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930307 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 04:23:51.935974 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930312 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 04:23:51.935974 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930317 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 04:23:51.935974 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930321 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 04:23:51.935974 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930325 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 04:23:51.935974 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930331 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 04:23:51.935974 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930336 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 04:23:51.935974 ip-10-0-139-26 
kubenswrapper[2574]: W0421 04:23:51.930340 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 04:23:51.935974 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930345 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 04:23:51.935974 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930349 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 04:23:51.935974 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930353 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 04:23:51.935974 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930358 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 04:23:51.935974 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930362 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 04:23:51.935974 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930366 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 04:23:51.935974 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930370 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 04:23:51.935974 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930375 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 04:23:51.935974 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930379 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 04:23:51.935974 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930383 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 04:23:51.936481 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930387 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 04:23:51.936481 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930391 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 04:23:51.936481 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930397 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 04:23:51.936481 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930404 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 04:23:51.936481 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930408 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 04:23:51.936481 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930413 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 04:23:51.936481 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930420 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 04:23:51.936481 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930424 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 04:23:51.936481 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930428 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 04:23:51.936481 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930433 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 04:23:51.936481 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930437 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 04:23:51.936481 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930442 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 04:23:51.936481 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930446 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 04:23:51.936481 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930450 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 04:23:51.936481 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930454 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 04:23:51.936481 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930458 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 04:23:51.936481 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930462 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 04:23:51.936481 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930466 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 04:23:51.936481 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930471 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 04:23:51.937023 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930475 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 04:23:51.937023 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930480 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 04:23:51.937023 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930485 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 04:23:51.937023 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930489 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 04:23:51.937023 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930494 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 04:23:51.937023 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930498 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 04:23:51.937023 ip-10-0-139-26 kubenswrapper[2574]: 
W0421 04:23:51.930503 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 04:23:51.937023 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930507 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 04:23:51.937023 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930511 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 04:23:51.937023 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930516 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 04:23:51.937023 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930520 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 04:23:51.937023 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930524 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 04:23:51.937023 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930529 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 04:23:51.937023 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930533 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 04:23:51.937023 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930537 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 04:23:51.937023 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930541 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 04:23:51.937023 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930545 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 04:23:51.937023 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930550 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 04:23:51.937023 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930553 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 04:23:51.937023 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930560 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 04:23:51.937513 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930564 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 04:23:51.937513 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930569 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 04:23:51.937513 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930574 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 04:23:51.937513 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930578 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 04:23:51.937513 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930582 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 04:23:51.937513 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930586 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 04:23:51.937513 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930605 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 04:23:51.937513 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930610 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 04:23:51.937513 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930617 2574 feature_gate.go:351] Setting GA feature gate 
ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 21 04:23:51.937513 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930622 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 04:23:51.937513 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930627 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 04:23:51.937513 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930632 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 04:23:51.937513 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930637 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 04:23:51.937513 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930641 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 04:23:51.937513 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930646 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 04:23:51.937513 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930653 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 04:23:51.937513 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930657 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 04:23:51.937513 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.930661 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 04:23:51.937984 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.931322 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 04:23:51.938231 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.938207 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 21 04:23:51.938261 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.938232 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 21 04:23:51.938293 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938285 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 04:23:51.938293 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938293 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 04:23:51.938343 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938297 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 04:23:51.938343 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938300 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 21 04:23:51.938343 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938303 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 04:23:51.938343 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938306 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 04:23:51.938343 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938309 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 04:23:51.938343 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938312 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 04:23:51.938343 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938314 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 04:23:51.938343 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938317 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 04:23:51.938343 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938320 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 04:23:51.938343 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938322 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 04:23:51.938343 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938325 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 04:23:51.938343 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938327 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 04:23:51.938343 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938330 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 04:23:51.938343 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938333 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 04:23:51.938343 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938336 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 04:23:51.938343 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938338 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 04:23:51.938343 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938341 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 04:23:51.938343 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938343 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 04:23:51.938343 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938347 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 04:23:51.938343 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938349 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 04:23:51.938852 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938352 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 04:23:51.938852 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938355 2574 
feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 04:23:51.938852 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938358 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 04:23:51.938852 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938361 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 04:23:51.938852 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938363 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 04:23:51.938852 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938366 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 04:23:51.938852 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938368 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 04:23:51.938852 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938371 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 04:23:51.938852 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938373 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 04:23:51.938852 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938376 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 04:23:51.938852 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938379 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 04:23:51.938852 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938382 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 04:23:51.938852 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938384 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 04:23:51.938852 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938387 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 04:23:51.938852 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938389 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 04:23:51.938852 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938392 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 04:23:51.938852 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938394 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 04:23:51.938852 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938398 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 04:23:51.938852 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938401 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 04:23:51.939330 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938404 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 04:23:51.939330 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938406 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 04:23:51.939330 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938409 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 04:23:51.939330 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938411 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 04:23:51.939330 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938414 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 04:23:51.939330 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938416 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 04:23:51.939330 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938419 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 04:23:51.939330 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938421 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 04:23:51.939330 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938424 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 04:23:51.939330 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938426 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 04:23:51.939330 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938429 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 04:23:51.939330 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938432 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 04:23:51.939330 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938434 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 04:23:51.939330 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938436 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 04:23:51.939330 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938440 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 04:23:51.939330 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938443 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 04:23:51.939330 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938445 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 04:23:51.939330 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938448 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 04:23:51.939330 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938451 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 04:23:51.939330 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938453 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 04:23:51.939879 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938456 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 04:23:51.939879 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938458 2574 
feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 04:23:51.939879 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938461 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 04:23:51.939879 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938464 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 04:23:51.939879 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938466 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 04:23:51.939879 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938469 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 04:23:51.939879 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938472 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 04:23:51.939879 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938474 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 04:23:51.939879 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938477 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 04:23:51.939879 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938479 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 04:23:51.939879 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938481 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 04:23:51.939879 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938484 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 04:23:51.939879 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938487 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 04:23:51.939879 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938489 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 04:23:51.939879 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938492 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 04:23:51.939879 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938494 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 04:23:51.939879 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938497 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 04:23:51.939879 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938499 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 04:23:51.939879 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938502 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 04:23:51.939879 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938504 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 04:23:51.940384 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938507 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 04:23:51.940384 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938510 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 04:23:51.940384 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938512 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 04:23:51.940384 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938514 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 04:23:51.940384 ip-10-0-139-26 
kubenswrapper[2574]: W0421 04:23:51.938517 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 04:23:51.940384 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.938522 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 04:23:51.940384 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938643 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 04:23:51.940384 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938648 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 04:23:51.940384 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938651 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 04:23:51.940384 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938654 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 04:23:51.940384 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938657 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 04:23:51.940384 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938660 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 04:23:51.940384 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938663 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 04:23:51.940384 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938665 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 04:23:51.940384 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938668 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 04:23:51.940384 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938670 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 04:23:51.940812 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938673 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 04:23:51.940812 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938676 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 04:23:51.940812 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938679 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 04:23:51.940812 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938681 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 04:23:51.940812 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938684 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 04:23:51.940812 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938686 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 04:23:51.940812 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938689 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 04:23:51.940812 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938692 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 04:23:51.940812 ip-10-0-139-26 
kubenswrapper[2574]: W0421 04:23:51.938694 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 04:23:51.940812 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938696 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 04:23:51.940812 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938699 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 04:23:51.940812 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938702 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 04:23:51.940812 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938704 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 04:23:51.940812 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938707 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 04:23:51.940812 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938709 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 04:23:51.940812 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938712 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 04:23:51.940812 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938714 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 04:23:51.940812 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938717 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 04:23:51.940812 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938719 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 04:23:51.940812 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938721 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 04:23:51.941309 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938724 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 04:23:51.941309 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938726 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 04:23:51.941309 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938730 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 04:23:51.941309 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938733 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 04:23:51.941309 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938735 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 04:23:51.941309 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938738 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 04:23:51.941309 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938742 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 04:23:51.941309 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938745 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 04:23:51.941309 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938747 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 04:23:51.941309 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938750 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 04:23:51.941309 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938752 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 04:23:51.941309 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938755 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 04:23:51.941309 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938757 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 04:23:51.941309 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938761 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 04:23:51.941309 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938763 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 04:23:51.941309 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938766 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 04:23:51.941309 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938768 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 04:23:51.941309 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938771 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 04:23:51.941309 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938773 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 04:23:51.941309 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938776 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 04:23:51.941826 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938778 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 04:23:51.941826 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938781 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 04:23:51.941826 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938783 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 04:23:51.941826 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938786 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 04:23:51.941826 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938788 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 21 04:23:51.941826 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938791 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 04:23:51.941826 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938793 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 04:23:51.941826 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938797 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 04:23:51.941826 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938800 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 04:23:51.941826 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938803 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 04:23:51.941826 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938806 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 04:23:51.941826 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938809 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 04:23:51.941826 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938812 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 04:23:51.941826 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938814 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 04:23:51.941826 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938817 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 04:23:51.941826 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938820 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 04:23:51.941826 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938823 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 04:23:51.941826 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938825 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 04:23:51.941826 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938828 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 04:23:51.942279 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938830 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 04:23:51.942279 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938833 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 04:23:51.942279 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938835 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 04:23:51.942279 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938838 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 04:23:51.942279 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938840 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 04:23:51.942279 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938843 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 04:23:51.942279 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938845 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 04:23:51.942279 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938848 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 04:23:51.942279 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938851 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 04:23:51.942279 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938853 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 04:23:51.942279 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938856 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 04:23:51.942279 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938859 2574 
feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 04:23:51.942279 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938861 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 04:23:51.942279 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938864 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 04:23:51.942279 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938867 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 04:23:51.942279 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938869 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 04:23:51.942279 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:51.938871 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 04:23:51.942722 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.938876 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 04:23:51.942722 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.939551 2574 server.go:962] "Client rotation is on, will bootstrap in background" Apr 21 04:23:51.942722 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.942574 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 21 04:23:51.943573 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.943560 2574 server.go:1019] "Starting client certificate rotation" Apr 21 04:23:51.943698 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.943662 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 04:23:51.943734 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.943704 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 21 04:23:51.965802 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.965765 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 04:23:51.973342 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.973315 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 21 04:23:51.989125 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.989101 2574 log.go:25] "Validated CRI v1 runtime API" Apr 21 04:23:51.994638 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.994622 2574 log.go:25] "Validated CRI v1 image API" Apr 21 04:23:51.995767 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.995743 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 21 04:23:51.998519 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.998490 2574 fs.go:135] Filesystem UUIDs: map[0cf0bc96-e462-480f-8fdd-3d8e8280e0a3:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 95d51217-5bde-4e75-aae0-2f4841f522c4:/dev/nvme0n1p4] Apr 21 04:23:51.998673 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:51.998519 2574 fs.go:136] Filesystem partitions: 
map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 21 04:23:52.005351 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.005228 2574 manager.go:217] Machine: {Timestamp:2026-04-21 04:23:52.002895528 +0000 UTC m=+0.383998407 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098402 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec254d3b4b29dab6a94a84c00aeb1577 SystemUUID:ec254d3b-4b29-dab6-a94a-84c00aeb1577 BootID:253f2272-c069-4d12-ad18-96ec6cd356a9 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:03:9f:11:90:53 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:03:9f:11:90:53 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:1e:db:1e:88:24:65 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 21 04:23:52.005351 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.005343 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 21 04:23:52.005493 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.005430 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 21 04:23:52.008125 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.008098 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 21 04:23:52.008283 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.008125 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-26.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 21 04:23:52.008339 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.008289 2574 topology_manager.go:138] "Creating topology manager with none policy" Apr 21 04:23:52.008339 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.008297 2574 container_manager_linux.go:306] "Creating device plugin manager" Apr 21 04:23:52.008339 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.008310 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 04:23:52.009836 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.009814 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 04:23:52.010132 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.010115 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 04:23:52.011484 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.011473 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 21 04:23:52.011603 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.011583 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" 
Apr 21 04:23:52.014002 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.013992 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 21 04:23:52.014037 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.014010 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 04:23:52.014064 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.014043 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 04:23:52.014064 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.014053 2574 kubelet.go:397] "Adding apiserver pod source" Apr 21 04:23:52.014064 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.014062 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 21 04:23:52.015115 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.015103 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 04:23:52.015163 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.015121 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 04:23:52.017642 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.017626 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 04:23:52.019451 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.019436 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 04:23:52.020722 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.020710 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 04:23:52.020776 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.020728 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 04:23:52.020776 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.020734 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 04:23:52.020776 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.020740 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 04:23:52.020776 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.020746 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 04:23:52.020776 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.020753 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 04:23:52.020776 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.020762 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 04:23:52.020776 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.020770 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 04:23:52.020956 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.020781 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 04:23:52.020956 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.020788 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 04:23:52.020956 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.020805 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 04:23:52.020956 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.020814 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 04:23:52.022507 ip-10-0-139-26 
kubenswrapper[2574]: I0421 04:23:52.022488 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bxzkd" Apr 21 04:23:52.022585 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.022559 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 04:23:52.022585 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.022570 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 04:23:52.023834 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:52.023811 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-26.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 04:23:52.023932 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:52.023858 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 04:23:52.026107 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.026094 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 04:23:52.026162 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.026133 2574 server.go:1295] "Started kubelet" Apr 21 04:23:52.026258 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.026234 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 04:23:52.026375 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.026319 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 04:23:52.026471 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.026460 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 04:23:52.026702 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.026689 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-26.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 04:23:52.026982 ip-10-0-139-26 systemd[1]: Started Kubernetes Kubelet. 
Apr 21 04:23:52.028131 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.028098 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 04:23:52.028724 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.028708 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 21 04:23:52.029945 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.029925 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bxzkd" Apr 21 04:23:52.034346 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.034331 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 04:23:52.034410 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.034371 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 04:23:52.035018 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.035001 2574 factory.go:55] Registering systemd factory Apr 21 04:23:52.035018 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.035020 2574 factory.go:223] Registration of the systemd container factory successfully Apr 21 04:23:52.035196 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.035062 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 04:23:52.035196 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.035078 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 04:23:52.035298 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.035198 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 04:23:52.035298 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.035241 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 21 04:23:52.035298 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.035249 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 21 04:23:52.035466 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.035447 2574 factory.go:153] Registering CRI-O factory Apr 21 04:23:52.035537 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.035471 2574 factory.go:223] Registration of the crio container factory successfully Apr 21 04:23:52.035617 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:52.035533 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-26.ec2.internal\" not found" Apr 21 04:23:52.035617 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.035541 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 04:23:52.035617 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.035573 2574 factory.go:103] Registering Raw factory Apr 21 04:23:52.035617 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.035610 2574 manager.go:1196] Started watching for new ooms in manager Apr 21 04:23:52.035819 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:52.035615 2574 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 04:23:52.037028 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.036986 2574 manager.go:319] Starting recovery of all containers Apr 21 04:23:52.037642 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.037622 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 04:23:52.041172 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:52.041149 2574 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-26.ec2.internal\" not found" node="ip-10-0-139-26.ec2.internal" Apr 21 04:23:52.048156 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.047987 2574 manager.go:324] Recovery completed Apr 21 04:23:52.052526 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.052509 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:23:52.057374 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.057360 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-26.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:23:52.057436 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.057393 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-26.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:23:52.057436 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.057404 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-26.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:23:52.057832 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.057819 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 04:23:52.057832 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.057832 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 04:23:52.057949 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.057855 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 21 04:23:52.060804 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.060793 2574 policy_none.go:49] "None policy: Start" Apr 21 04:23:52.060839 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.060808 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 04:23:52.060839 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.060818 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 21 04:23:52.092848 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.092830 2574 manager.go:341] "Starting Device Plugin manager" Apr 21 04:23:52.092936 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:52.092865 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 04:23:52.092936 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.092878 2574 server.go:85] "Starting device plugin registration server" Apr 21 04:23:52.093133 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.093119 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 04:23:52.093197 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.093136 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 04:23:52.093349 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.093265 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 04:23:52.093349 ip-10-0-139-26 
kubenswrapper[2574]: I0421 04:23:52.093335 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 04:23:52.093349 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.093343 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 04:23:52.093841 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:52.093821 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 04:23:52.093942 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:52.093862 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-26.ec2.internal\" not found" Apr 21 04:23:52.170685 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.170612 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 04:23:52.172156 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.171841 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 04:23:52.172156 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.171867 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 04:23:52.172156 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.171887 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 21 04:23:52.172156 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.171896 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 04:23:52.172156 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:52.171937 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 04:23:52.175132 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.175112 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 04:23:52.194058 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.194039 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:23:52.195054 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.195026 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-26.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:23:52.195131 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.195072 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-26.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:23:52.195131 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.195087 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-26.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:23:52.195131 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.195120 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-26.ec2.internal" Apr 21 04:23:52.201630 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.201617 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-26.ec2.internal" Apr 21 04:23:52.201678 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:52.201637 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-26.ec2.internal\": node \"ip-10-0-139-26.ec2.internal\" not found" Apr 21 04:23:52.217992 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:52.217971 2574 kubelet_node_status.go:515] "Error getting the 
current node from lister" err="node \"ip-10-0-139-26.ec2.internal\" not found" Apr 21 04:23:52.272061 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.272025 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-139-26.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-26.ec2.internal"] Apr 21 04:23:52.272163 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.272125 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:23:52.273203 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.273188 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-26.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:23:52.273273 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.273219 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-26.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:23:52.273273 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.273232 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-26.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:23:52.274602 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.274580 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:23:52.274723 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.274700 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-26.ec2.internal" Apr 21 04:23:52.274764 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.274727 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:23:52.276850 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.276837 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-26.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:23:52.276923 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.276837 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-26.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:23:52.276923 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.276895 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-26.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:23:52.276923 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.276908 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-26.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:23:52.276923 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.276866 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-26.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:23:52.277051 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.276937 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-26.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:23:52.278180 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.278167 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-26.ec2.internal" Apr 21 04:23:52.278234 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.278190 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 04:23:52.278840 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.278820 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-26.ec2.internal" event="NodeHasSufficientMemory" Apr 21 04:23:52.278939 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.278849 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-26.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 04:23:52.278939 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.278860 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-26.ec2.internal" event="NodeHasSufficientPID" Apr 21 04:23:52.293189 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:52.293171 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-26.ec2.internal\" not found" node="ip-10-0-139-26.ec2.internal" Apr 21 04:23:52.297376 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:52.297362 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-26.ec2.internal\" not found" node="ip-10-0-139-26.ec2.internal" Apr 21 04:23:52.318429 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:52.318401 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-26.ec2.internal\" not found" Apr 21 04:23:52.338055 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.338028 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1bab162aca85481717598a6021c84d94-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-26.ec2.internal\" (UID: \"1bab162aca85481717598a6021c84d94\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-26.ec2.internal" Apr 21 04:23:52.338170 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.338059 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1bab162aca85481717598a6021c84d94-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-26.ec2.internal\" (UID: \"1bab162aca85481717598a6021c84d94\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-26.ec2.internal" Apr 21 04:23:52.338170 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.338080 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cd7c596238e57e8ba8682f24a2cccbe3-config\") pod \"kube-apiserver-proxy-ip-10-0-139-26.ec2.internal\" (UID: \"cd7c596238e57e8ba8682f24a2cccbe3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-26.ec2.internal" Apr 21 04:23:52.418835 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:52.418795 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-26.ec2.internal\" not found" Apr 21 04:23:52.439209 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.439140 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/1bab162aca85481717598a6021c84d94-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-26.ec2.internal\" (UID: \"1bab162aca85481717598a6021c84d94\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-26.ec2.internal" Apr 21 04:23:52.439209 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.439173 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1bab162aca85481717598a6021c84d94-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-26.ec2.internal\" (UID: \"1bab162aca85481717598a6021c84d94\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-26.ec2.internal" Apr 21 04:23:52.439209 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.439190 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cd7c596238e57e8ba8682f24a2cccbe3-config\") pod \"kube-apiserver-proxy-ip-10-0-139-26.ec2.internal\" (UID: \"cd7c596238e57e8ba8682f24a2cccbe3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-26.ec2.internal" Apr 21 04:23:52.439405 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.439234 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cd7c596238e57e8ba8682f24a2cccbe3-config\") pod \"kube-apiserver-proxy-ip-10-0-139-26.ec2.internal\" (UID: \"cd7c596238e57e8ba8682f24a2cccbe3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-26.ec2.internal" Apr 21 04:23:52.439405 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.439236 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1bab162aca85481717598a6021c84d94-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-26.ec2.internal\" (UID: \"1bab162aca85481717598a6021c84d94\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-26.ec2.internal" Apr 21 04:23:52.439405 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.439256 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1bab162aca85481717598a6021c84d94-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-26.ec2.internal\" (UID: \"1bab162aca85481717598a6021c84d94\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-26.ec2.internal" Apr 21 04:23:52.519614 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:52.519554 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-26.ec2.internal\" not found" Apr 21 04:23:52.595099 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.595063 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-26.ec2.internal" Apr 21 04:23:52.599580 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.599555 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-26.ec2.internal" Apr 21 04:23:52.620465 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:52.620444 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-26.ec2.internal\" not found" Apr 21 04:23:52.720995 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:52.720967 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-26.ec2.internal\" not found" Apr 21 04:23:52.821474 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:52.821449 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-26.ec2.internal\" not found" Apr 21 04:23:52.921983 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:52.921952 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-26.ec2.internal\" not found" Apr 21 04:23:52.943369 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.943344 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 21 04:23:52.943519 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.943490 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 04:23:52.943559 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.943516 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 04:23:52.985997 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:52.985938 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 04:23:53.023021 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:53.022991 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-26.ec2.internal\" not found" Apr 21 04:23:53.032229 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:53.032204 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 04:18:52 +0000 UTC" deadline="2027-10-02 06:20:42.264684402 +0000 UTC" Apr 21 04:23:53.032229 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:53.032227 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12697h56m49.232459925s" Apr 21 04:23:53.035400 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:53.035378 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 04:23:53.043492 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:53.043471 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 04:23:53.065034 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:53.065009 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-qfqwr" Apr 21 04:23:53.072530 ip-10-0-139-26 kubenswrapper[2574]: 
I0421 04:23:53.072511 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-qfqwr" Apr 21 04:23:53.103450 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:53.103421 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd7c596238e57e8ba8682f24a2cccbe3.slice/crio-445d92886773549b0f4a364d2d8871666ae55e426411a6ecd6f965f487c82c3d WatchSource:0}: Error finding container 445d92886773549b0f4a364d2d8871666ae55e426411a6ecd6f965f487c82c3d: Status 404 returned error can't find the container with id 445d92886773549b0f4a364d2d8871666ae55e426411a6ecd6f965f487c82c3d Apr 21 04:23:53.103855 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:53.103837 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bab162aca85481717598a6021c84d94.slice/crio-84e1979f14b36af7e5997d48bb668216d30772b9e3f549efd81795e66b087704 WatchSource:0}: Error finding container 84e1979f14b36af7e5997d48bb668216d30772b9e3f549efd81795e66b087704: Status 404 returned error can't find the container with id 84e1979f14b36af7e5997d48bb668216d30772b9e3f549efd81795e66b087704 Apr 21 04:23:53.108388 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:53.108365 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 04:23:53.123360 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:53.123337 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-26.ec2.internal\" not found" Apr 21 04:23:53.174748 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:53.174691 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-26.ec2.internal" event={"ID":"1bab162aca85481717598a6021c84d94","Type":"ContainerStarted","Data":"84e1979f14b36af7e5997d48bb668216d30772b9e3f549efd81795e66b087704"} Apr 21 04:23:53.175558 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:53.175535 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-26.ec2.internal" event={"ID":"cd7c596238e57e8ba8682f24a2cccbe3","Type":"ContainerStarted","Data":"445d92886773549b0f4a364d2d8871666ae55e426411a6ecd6f965f487c82c3d"} Apr 21 04:23:53.223728 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:53.223702 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-26.ec2.internal\" not found" Apr 21 04:23:53.324270 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:53.324186 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-26.ec2.internal\" not found" Apr 21 04:23:53.406406 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:53.406374 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 04:23:53.435216 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:53.435190 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-26.ec2.internal" Apr 21 04:23:53.444679 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:53.444652 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 04:23:53.445370 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:53.445359 2574 kubelet.go:3340] "Creating 
a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-26.ec2.internal" Apr 21 04:23:53.455667 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:53.455642 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 04:23:53.769989 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:53.769957 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 04:23:54.015764 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.015729 2574 apiserver.go:52] "Watching apiserver" Apr 21 04:23:54.021578 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.021508 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 04:23:54.022048 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.022020 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-139-26.ec2.internal","openshift-cluster-node-tuning-operator/tuned-vqt7j","openshift-image-registry/node-ca-rnwcg","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-26.ec2.internal","openshift-multus/network-metrics-daemon-g7q5r","openshift-network-diagnostics/network-check-target-77kmz","openshift-network-operator/iptables-alerter-5cbdt","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5","openshift-dns/node-resolver-hpx9f","openshift-multus/multus-8hxd5","openshift-multus/multus-additional-cni-plugins-d4kbl","openshift-ovn-kubernetes/ovnkube-node-tv4p6","kube-system/konnectivity-agent-h4mfk"] Apr 21 04:23:54.023866 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.023846 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.025953 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.025932 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.026070 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.025931 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 04:23:54.026070 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.026046 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 04:23:54.026367 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.026285 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 04:23:54.026367 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.026296 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-vfg2r\"" Apr 21 04:23:54.026499 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.026397 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 04:23:54.027064 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.027033 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rnwcg" Apr 21 04:23:54.028529 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.028297 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:23:54.028529 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:54.028360 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7q5r" podUID="d0b4a7b4-e6a1-4816-a96e-0792f47539d9" Apr 21 04:23:54.028529 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.028436 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 04:23:54.028745 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.028686 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-nl5wz\"" Apr 21 04:23:54.028745 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.028693 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 04:23:54.028986 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.028966 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 04:23:54.029242 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.029227 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 04:23:54.029349 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.029329 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-x5h7q\"" Apr 21 04:23:54.029436 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.029414 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:23:54.030323 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:54.029881 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77kmz" podUID="2e9d379b-34d6-4a3f-8e2b-addd13dce02f" Apr 21 04:23:54.030323 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.029944 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-5cbdt" Apr 21 04:23:54.030323 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.030047 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 04:23:54.032542 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.032474 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 04:23:54.032691 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.032674 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 04:23:54.032765 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.032702 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jl4d4\"" Apr 21 04:23:54.033333 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.032998 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 04:23:54.033333 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.033006 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" Apr 21 04:23:54.035037 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.035018 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 04:23:54.035553 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.035305 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 04:23:54.035553 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.035313 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-8jrn8\"" Apr 21 04:23:54.035553 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.035372 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 04:23:54.037119 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.036888 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hpx9f" Apr 21 04:23:54.037214 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.037146 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.039263 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.038770 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 04:23:54.039263 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.038824 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 04:23:54.039263 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.038899 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 04:23:54.039263 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.038951 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.039263 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.038963 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-7mcsw\"" Apr 21 04:23:54.039263 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.039103 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-27vpl\"" Apr 21 04:23:54.039263 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.039117 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 04:23:54.040136 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.040117 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-h4mfk" Apr 21 04:23:54.041792 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.041775 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 04:23:54.042181 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.042162 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 04:23:54.042181 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.042172 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-h5tb7\"" Apr 21 04:23:54.042344 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.042181 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 04:23:54.042344 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.042216 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 04:23:54.042344 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.042231 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 04:23:54.042344 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.042177 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 04:23:54.042668 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.042649 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-dqs72\"" Apr 21 04:23:54.042741 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.042693 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 04:23:54.042844 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.042827 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 04:23:54.047622 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.047563 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-sys\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.047793 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.047771 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-lib-modules\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.047894 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.047820 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/189fab48-4392-499e-b780-7994fa04cd1b-host-slash\") pod \"iptables-alerter-5cbdt\" (UID: \"189fab48-4392-499e-b780-7994fa04cd1b\") " pod="openshift-network-operator/iptables-alerter-5cbdt" Apr 21 04:23:54.047894 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.047847 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-hostroot\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.047894 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.047873 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-host-run-multus-certs\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.048038 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.047907 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-etc-sysctl-d\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.048038 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.047929 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-host\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.048038 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.047957 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c-socket-dir\") pod \"aws-ebs-csi-driver-node-sscd5\" (UID: \"b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" Apr 21 04:23:54.048038 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.047980 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c-registration-dir\") pod \"aws-ebs-csi-driver-node-sscd5\" (UID: \"b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" Apr 21 04:23:54.048038 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048029 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c-sys-fs\") pod 
\"aws-ebs-csi-driver-node-sscd5\" (UID: \"b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" Apr 21 04:23:54.048288 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048063 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmtqm\" (UniqueName: \"kubernetes.io/projected/50d19486-d861-4624-a572-7c1d8e897542-kube-api-access-gmtqm\") pod \"node-resolver-hpx9f\" (UID: \"50d19486-d861-4624-a572-7c1d8e897542\") " pod="openshift-dns/node-resolver-hpx9f" Apr 21 04:23:54.048288 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048118 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-system-cni-dir\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.048288 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048144 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-host-var-lib-cni-bin\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.048288 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048163 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-run\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.048288 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048186 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c-etc-selinux\") pod \"aws-ebs-csi-driver-node-sscd5\" (UID: \"b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" Apr 21 04:23:54.048288 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048209 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/50d19486-d861-4624-a572-7c1d8e897542-hosts-file\") pod \"node-resolver-hpx9f\" (UID: \"50d19486-d861-4624-a572-7c1d8e897542\") " pod="openshift-dns/node-resolver-hpx9f" Apr 21 04:23:54.048288 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048232 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8910acd9-f7d1-43e0-86a5-84c8a0670a16-multus-daemon-config\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.048288 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048262 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs\") pod \"network-metrics-daemon-g7q5r\" (UID: \"d0b4a7b4-e6a1-4816-a96e-0792f47539d9\") " pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:23:54.048288 ip-10-0-139-26 kubenswrapper[2574]: I0421 
04:23:54.048281 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-tmp\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.048660 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048323 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zwds\" (UniqueName: \"kubernetes.io/projected/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-kube-api-access-5zwds\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.048660 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048352 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq7gf\" (UniqueName: \"kubernetes.io/projected/189fab48-4392-499e-b780-7994fa04cd1b-kube-api-access-qq7gf\") pod \"iptables-alerter-5cbdt\" (UID: \"189fab48-4392-499e-b780-7994fa04cd1b\") " pod="openshift-network-operator/iptables-alerter-5cbdt" Apr 21 04:23:54.048660 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048378 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/50d19486-d861-4624-a572-7c1d8e897542-tmp-dir\") pod \"node-resolver-hpx9f\" (UID: \"50d19486-d861-4624-a572-7c1d8e897542\") " pod="openshift-dns/node-resolver-hpx9f" Apr 21 04:23:54.048660 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048399 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-etc-tuned\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.048660 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048419 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-etc-modprobe-d\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.048660 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048472 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-cnibin\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.048660 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048514 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-os-release\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.048660 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048551 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-multus-socket-dir-parent\") pod 
\"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.048660 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048576 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk4pl\" (UniqueName: \"kubernetes.io/projected/493c7d5b-0f42-40a4-ab37-19d6681834e3-kube-api-access-vk4pl\") pod \"node-ca-rnwcg\" (UID: \"493c7d5b-0f42-40a4-ab37-19d6681834e3\") " pod="openshift-image-registry/node-ca-rnwcg" Apr 21 04:23:54.048660 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048621 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs5wp\" (UniqueName: \"kubernetes.io/projected/2e9d379b-34d6-4a3f-8e2b-addd13dce02f-kube-api-access-hs5wp\") pod \"network-check-target-77kmz\" (UID: \"2e9d379b-34d6-4a3f-8e2b-addd13dce02f\") " pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:23:54.048660 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048649 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-etc-kubernetes\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.049079 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048699 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-multus-cni-dir\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.049079 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048715 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8910acd9-f7d1-43e0-86a5-84c8a0670a16-cni-binary-copy\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.049079 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048729 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-host-run-k8s-cni-cncf-io\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.049079 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048749 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-host-run-netns\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.049079 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048772 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-host-var-lib-kubelet\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.049079 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048810 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-multus-conf-dir\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.049079 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048850 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-etc-kubernetes\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.049079 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048874 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzcwb\" (UniqueName: \"kubernetes.io/projected/8910acd9-f7d1-43e0-86a5-84c8a0670a16-kube-api-access-mzcwb\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.049079 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048897 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/493c7d5b-0f42-40a4-ab37-19d6681834e3-serviceca\") pod \"node-ca-rnwcg\" (UID: \"493c7d5b-0f42-40a4-ab37-19d6681834e3\") " pod="openshift-image-registry/node-ca-rnwcg" Apr 21 04:23:54.049079 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048922 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-etc-sysconfig\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.049079 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048946 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-var-lib-kubelet\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.049079 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048968 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/189fab48-4392-499e-b780-7994fa04cd1b-iptables-alerter-script\") pod \"iptables-alerter-5cbdt\" (UID: \"189fab48-4392-499e-b780-7994fa04cd1b\") " pod="openshift-network-operator/iptables-alerter-5cbdt" Apr 21 04:23:54.049079 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.048996 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sscd5\" (UID: \"b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" Apr 21 04:23:54.049079 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.049038 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c-device-dir\") 
pod \"aws-ebs-csi-driver-node-sscd5\" (UID: \"b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" Apr 21 04:23:54.049079 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.049076 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnpxw\" (UniqueName: \"kubernetes.io/projected/b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c-kube-api-access-rnpxw\") pod \"aws-ebs-csi-driver-node-sscd5\" (UID: \"b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" Apr 21 04:23:54.049636 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.049112 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-host-var-lib-cni-multus\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.049636 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.049135 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/493c7d5b-0f42-40a4-ab37-19d6681834e3-host\") pod \"node-ca-rnwcg\" (UID: \"493c7d5b-0f42-40a4-ab37-19d6681834e3\") " pod="openshift-image-registry/node-ca-rnwcg" Apr 21 04:23:54.049636 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.049155 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxs7d\" (UniqueName: \"kubernetes.io/projected/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-kube-api-access-xxs7d\") pod \"network-metrics-daemon-g7q5r\" (UID: \"d0b4a7b4-e6a1-4816-a96e-0792f47539d9\") " pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:23:54.049636 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.049170 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-etc-sysctl-conf\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.049636 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.049249 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-etc-systemd\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.059417 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.059387 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 04:23:54.073175 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.073136 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 04:18:53 +0000 UTC" deadline="2027-12-30 03:01:31.428457314 +0000 UTC" Apr 21 04:23:54.073175 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.073171 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14830h37m37.355289315s" Apr 21 04:23:54.135996 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.135962 2574 desired_state_of_world_populator.go:158] 
"Finished populating initial desired state of world" Apr 21 04:23:54.149784 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.149751 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-host-var-lib-cni-multus\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.149933 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.149790 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/493c7d5b-0f42-40a4-ab37-19d6681834e3-host\") pod \"node-ca-rnwcg\" (UID: \"493c7d5b-0f42-40a4-ab37-19d6681834e3\") " pod="openshift-image-registry/node-ca-rnwcg" Apr 21 04:23:54.149933 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.149818 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxs7d\" (UniqueName: \"kubernetes.io/projected/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-kube-api-access-xxs7d\") pod \"network-metrics-daemon-g7q5r\" (UID: \"d0b4a7b4-e6a1-4816-a96e-0792f47539d9\") " pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:23:54.149933 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.149844 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-etc-systemd\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.149933 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.149855 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-host-var-lib-cni-multus\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.149933 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.149870 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-sys\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.149933 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.149892 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c-sys-fs\") pod \"aws-ebs-csi-driver-node-sscd5\" (UID: \"b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" Apr 21 04:23:54.149933 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.149905 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/493c7d5b-0f42-40a4-ab37-19d6681834e3-host\") pod \"node-ca-rnwcg\" (UID: \"493c7d5b-0f42-40a4-ab37-19d6681834e3\") " pod="openshift-image-registry/node-ca-rnwcg" Apr 21 04:23:54.149933 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.149918 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-systemd-units\") pod \"ovnkube-node-tv4p6\" (UID: 
\"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.150330 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.149984 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-etc-systemd\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.150330 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.149987 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-sys\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.150330 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150036 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c-sys-fs\") pod \"aws-ebs-csi-driver-node-sscd5\" (UID: \"b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" Apr 21 04:23:54.150330 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150042 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.150330 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150075 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-host-run-multus-certs\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.150330 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150115 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-host-run-multus-certs\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.150330 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150140 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-host\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.150330 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150181 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c-socket-dir\") pod \"aws-ebs-csi-driver-node-sscd5\" (UID: \"b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" Apr 21 04:23:54.150330 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150212 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-host\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.150330 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150246 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-ovnkube-config\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.150330 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150273 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3d78d6d3-1607-4a6a-88f3-3f0d7eeda73a-konnectivity-ca\") pod \"konnectivity-agent-h4mfk\" (UID: \"3d78d6d3-1607-4a6a-88f3-3f0d7eeda73a\") " pod="kube-system/konnectivity-agent-h4mfk" Apr 21 04:23:54.150330 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150294 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/50d19486-d861-4624-a572-7c1d8e897542-hosts-file\") pod \"node-resolver-hpx9f\" (UID: \"50d19486-d861-4624-a572-7c1d8e897542\") " pod="openshift-dns/node-resolver-hpx9f" Apr 21 04:23:54.150847 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150333 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-host-cni-bin\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.150847 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150379 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c-socket-dir\") pod \"aws-ebs-csi-driver-node-sscd5\" (UID: \"b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" Apr 21 04:23:54.150847 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150373 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/50d19486-d861-4624-a572-7c1d8e897542-hosts-file\") pod \"node-resolver-hpx9f\" (UID: \"50d19486-d861-4624-a572-7c1d8e897542\") " pod="openshift-dns/node-resolver-hpx9f" Apr 21 04:23:54.150847 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150403 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-ovn-node-metrics-cert\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.150847 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150437 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8910acd9-f7d1-43e0-86a5-84c8a0670a16-multus-daemon-config\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.150847 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150487 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs\") pod \"network-metrics-daemon-g7q5r\" (UID: \"d0b4a7b4-e6a1-4816-a96e-0792f47539d9\") " pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:23:54.150847 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150604 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5zwds\" (UniqueName: \"kubernetes.io/projected/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-kube-api-access-5zwds\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.150847 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:54.150609 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:23:54.150847 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150631 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1182efca-c6ba-4b0f-9492-7a32d77ea693-system-cni-dir\") pod \"multus-additional-cni-plugins-d4kbl\" (UID: \"1182efca-c6ba-4b0f-9492-7a32d77ea693\") " pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.150847 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150650 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn79b\" (UniqueName: \"kubernetes.io/projected/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-kube-api-access-vn79b\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.150847 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150669 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-etc-tuned\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.150847 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150697 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1182efca-c6ba-4b0f-9492-7a32d77ea693-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d4kbl\" (UID: \"1182efca-c6ba-4b0f-9492-7a32d77ea693\") " pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.150847 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:54.150736 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs podName:d0b4a7b4-e6a1-4816-a96e-0792f47539d9 nodeName:}" failed. No retries permitted until 2026-04-21 04:23:54.650700773 +0000 UTC m=+3.031803655 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs") pod "network-metrics-daemon-g7q5r" (UID: "d0b4a7b4-e6a1-4816-a96e-0792f47539d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:23:54.150847 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150813 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vjdj\" (UniqueName: \"kubernetes.io/projected/1182efca-c6ba-4b0f-9492-7a32d77ea693-kube-api-access-9vjdj\") pod \"multus-additional-cni-plugins-d4kbl\" (UID: \"1182efca-c6ba-4b0f-9492-7a32d77ea693\") " pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.150847 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150845 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-env-overrides\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.151703 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150889 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-etc-sysctl-conf\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.151703 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150932 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-lib-modules\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.151703 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150958 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/50d19486-d861-4624-a572-7c1d8e897542-tmp-dir\") pod \"node-resolver-hpx9f\" (UID: \"50d19486-d861-4624-a572-7c1d8e897542\") " pod="openshift-dns/node-resolver-hpx9f" Apr 21 04:23:54.151703 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150986 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-cnibin\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.151703 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.150991 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 04:23:54.151703 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151093 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-lib-modules\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.151703 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151097 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8910acd9-f7d1-43e0-86a5-84c8a0670a16-multus-daemon-config\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.151703 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151121 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-etc-sysctl-conf\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.151703 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151192 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-cnibin\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.151703 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151012 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vk4pl\" (UniqueName: \"kubernetes.io/projected/493c7d5b-0f42-40a4-ab37-19d6681834e3-kube-api-access-vk4pl\") pod \"node-ca-rnwcg\" (UID: \"493c7d5b-0f42-40a4-ab37-19d6681834e3\") " pod="openshift-image-registry/node-ca-rnwcg" Apr 21 04:23:54.151703 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151259 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-ovnkube-script-lib\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.151703 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151306 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-multus-cni-dir\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.151703 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151332 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8910acd9-f7d1-43e0-86a5-84c8a0670a16-cni-binary-copy\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.151703 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151340 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/50d19486-d861-4624-a572-7c1d8e897542-tmp-dir\") pod \"node-resolver-hpx9f\" (UID: 
\"50d19486-d861-4624-a572-7c1d8e897542\") " pod="openshift-dns/node-resolver-hpx9f" Apr 21 04:23:54.151703 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151373 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-host-run-k8s-cni-cncf-io\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.151703 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151408 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-host-run-k8s-cni-cncf-io\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.151703 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151413 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-host-run-netns\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.151703 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151439 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-host-var-lib-kubelet\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.152728 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151456 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-host-run-netns\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.152728 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151471 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-multus-cni-dir\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.152728 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151476 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-multus-conf-dir\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.152728 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151494 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-host-var-lib-kubelet\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.152728 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151501 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/493c7d5b-0f42-40a4-ab37-19d6681834e3-serviceca\") pod \"node-ca-rnwcg\" (UID: \"493c7d5b-0f42-40a4-ab37-19d6681834e3\") " 
pod="openshift-image-registry/node-ca-rnwcg" Apr 21 04:23:54.152728 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151525 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzcwb\" (UniqueName: \"kubernetes.io/projected/8910acd9-f7d1-43e0-86a5-84c8a0670a16-kube-api-access-mzcwb\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.152728 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151531 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-multus-conf-dir\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.152728 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151541 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-etc-sysconfig\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.152728 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151584 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-etc-sysconfig\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.152728 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151638 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c-device-dir\") pod \"aws-ebs-csi-driver-node-sscd5\" (UID: \"b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" Apr 21 04:23:54.152728 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151675 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rnpxw\" (UniqueName: \"kubernetes.io/projected/b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c-kube-api-access-rnpxw\") pod \"aws-ebs-csi-driver-node-sscd5\" (UID: \"b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" Apr 21 04:23:54.152728 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151703 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-host-kubelet\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.152728 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151733 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-run-systemd\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.152728 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151757 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-var-lib-openvswitch\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.152728 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151780 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-run-ovn\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.152728 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151810 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/189fab48-4392-499e-b780-7994fa04cd1b-host-slash\") pod \"iptables-alerter-5cbdt\" (UID: \"189fab48-4392-499e-b780-7994fa04cd1b\") " pod="openshift-network-operator/iptables-alerter-5cbdt" Apr 21 04:23:54.152728 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151835 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c-registration-dir\") pod \"aws-ebs-csi-driver-node-sscd5\" (UID: \"b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" Apr 21 04:23:54.153551 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151839 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c-device-dir\") pod \"aws-ebs-csi-driver-node-sscd5\" (UID: \"b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" Apr 21 04:23:54.153551 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151860 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1182efca-c6ba-4b0f-9492-7a32d77ea693-cni-binary-copy\") pod \"multus-additional-cni-plugins-d4kbl\" (UID: \"1182efca-c6ba-4b0f-9492-7a32d77ea693\") " pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.153551 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151861 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8910acd9-f7d1-43e0-86a5-84c8a0670a16-cni-binary-copy\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.153551 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151885 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-host-run-ovn-kubernetes\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.153551 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151912 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-hostroot\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.153551 ip-10-0-139-26 
kubenswrapper[2574]: I0421 04:23:54.151931 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/493c7d5b-0f42-40a4-ab37-19d6681834e3-serviceca\") pod \"node-ca-rnwcg\" (UID: \"493c7d5b-0f42-40a4-ab37-19d6681834e3\") " pod="openshift-image-registry/node-ca-rnwcg" Apr 21 04:23:54.153551 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151935 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-etc-sysctl-d\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.153551 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151960 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gmtqm\" (UniqueName: \"kubernetes.io/projected/50d19486-d861-4624-a572-7c1d8e897542-kube-api-access-gmtqm\") pod \"node-resolver-hpx9f\" (UID: \"50d19486-d861-4624-a572-7c1d8e897542\") " pod="openshift-dns/node-resolver-hpx9f" Apr 21 04:23:54.153551 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.151973 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c-registration-dir\") pod \"aws-ebs-csi-driver-node-sscd5\" (UID: \"b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" Apr 21 04:23:54.153551 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152026 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/189fab48-4392-499e-b780-7994fa04cd1b-host-slash\") pod \"iptables-alerter-5cbdt\" (UID: \"189fab48-4392-499e-b780-7994fa04cd1b\") " pod="openshift-network-operator/iptables-alerter-5cbdt" Apr 21 04:23:54.153551 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152025 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-hostroot\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.153551 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152031 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1182efca-c6ba-4b0f-9492-7a32d77ea693-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d4kbl\" (UID: \"1182efca-c6ba-4b0f-9492-7a32d77ea693\") " pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.153551 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152079 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-system-cni-dir\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.153551 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152110 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-host-var-lib-cni-bin\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " 
pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.153551 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152135 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-run\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.153551 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152141 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-etc-sysctl-d\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.153551 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152164 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c-etc-selinux\") pod \"aws-ebs-csi-driver-node-sscd5\" (UID: \"b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" Apr 21 04:23:54.154099 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152188 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-host-var-lib-cni-bin\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.154099 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152190 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-tmp\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.154099 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152230 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qq7gf\" (UniqueName: \"kubernetes.io/projected/189fab48-4392-499e-b780-7994fa04cd1b-kube-api-access-qq7gf\") pod \"iptables-alerter-5cbdt\" (UID: \"189fab48-4392-499e-b780-7994fa04cd1b\") " pod="openshift-network-operator/iptables-alerter-5cbdt" Apr 21 04:23:54.154099 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152261 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-run-openvswitch\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.154099 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152233 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-run\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.154099 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152366 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c-etc-selinux\") pod \"aws-ebs-csi-driver-node-sscd5\" (UID: 
\"b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" Apr 21 04:23:54.154099 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152415 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-system-cni-dir\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.154099 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152423 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-node-log\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.154099 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152456 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3d78d6d3-1607-4a6a-88f3-3f0d7eeda73a-agent-certs\") pod \"konnectivity-agent-h4mfk\" (UID: \"3d78d6d3-1607-4a6a-88f3-3f0d7eeda73a\") " pod="kube-system/konnectivity-agent-h4mfk" Apr 21 04:23:54.154099 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152482 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-host-cni-netd\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.154099 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152507 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1182efca-c6ba-4b0f-9492-7a32d77ea693-os-release\") pod \"multus-additional-cni-plugins-d4kbl\" (UID: \"1182efca-c6ba-4b0f-9492-7a32d77ea693\") " pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.154099 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152536 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-etc-modprobe-d\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.154099 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152560 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-os-release\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.154099 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152616 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-multus-socket-dir-parent\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.154099 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152641 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hs5wp\" (UniqueName: 
\"kubernetes.io/projected/2e9d379b-34d6-4a3f-8e2b-addd13dce02f-kube-api-access-hs5wp\") pod \"network-check-target-77kmz\" (UID: \"2e9d379b-34d6-4a3f-8e2b-addd13dce02f\") " pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:23:54.154099 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152663 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-etc-kubernetes\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.154099 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152688 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-etc-modprobe-d\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.154698 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152691 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-host-slash\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.154698 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152743 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-etc-kubernetes\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.154698 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152761 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-os-release\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.154698 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152810 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-multus-socket-dir-parent\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.154698 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152830 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8910acd9-f7d1-43e0-86a5-84c8a0670a16-etc-kubernetes\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.154698 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152880 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-etc-kubernetes\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.154698 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152772 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sscd5\" (UID: \"b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" Apr 21 04:23:54.154698 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152931 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1182efca-c6ba-4b0f-9492-7a32d77ea693-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-d4kbl\" (UID: \"1182efca-c6ba-4b0f-9492-7a32d77ea693\") " pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.154698 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.152971 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-host-run-netns\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.154698 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.153010 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-etc-openvswitch\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.154698 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.153032 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-log-socket\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.154698 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.153033 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sscd5\" (UID: \"b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" Apr 21 04:23:54.154698 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.153046 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1182efca-c6ba-4b0f-9492-7a32d77ea693-cnibin\") pod \"multus-additional-cni-plugins-d4kbl\" (UID: \"1182efca-c6ba-4b0f-9492-7a32d77ea693\") " pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.154698 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.153072 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-var-lib-kubelet\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.154698 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.153102 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/189fab48-4392-499e-b780-7994fa04cd1b-iptables-alerter-script\") pod \"iptables-alerter-5cbdt\" (UID: 
\"189fab48-4392-499e-b780-7994fa04cd1b\") " pod="openshift-network-operator/iptables-alerter-5cbdt" Apr 21 04:23:54.154698 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.153118 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-var-lib-kubelet\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.154698 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.153491 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/189fab48-4392-499e-b780-7994fa04cd1b-iptables-alerter-script\") pod \"iptables-alerter-5cbdt\" (UID: \"189fab48-4392-499e-b780-7994fa04cd1b\") " pod="openshift-network-operator/iptables-alerter-5cbdt" Apr 21 04:23:54.155218 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.154385 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-tmp\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.155218 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.154674 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-etc-tuned\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.157433 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.157412 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxs7d\" (UniqueName: \"kubernetes.io/projected/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-kube-api-access-xxs7d\") pod \"network-metrics-daemon-g7q5r\" (UID: \"d0b4a7b4-e6a1-4816-a96e-0792f47539d9\") " pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:23:54.158688 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:54.158532 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 04:23:54.158688 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:54.158554 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 04:23:54.158688 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:54.158568 2574 projected.go:194] Error preparing data for projected volume kube-api-access-hs5wp for pod openshift-network-diagnostics/network-check-target-77kmz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:23:54.158688 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:54.158661 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e9d379b-34d6-4a3f-8e2b-addd13dce02f-kube-api-access-hs5wp podName:2e9d379b-34d6-4a3f-8e2b-addd13dce02f nodeName:}" failed. No retries permitted until 2026-04-21 04:23:54.658641124 +0000 UTC m=+3.039744013 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hs5wp" (UniqueName: "kubernetes.io/projected/2e9d379b-34d6-4a3f-8e2b-addd13dce02f-kube-api-access-hs5wp") pod "network-check-target-77kmz" (UID: "2e9d379b-34d6-4a3f-8e2b-addd13dce02f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:23:54.159238 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.159217 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk4pl\" (UniqueName: \"kubernetes.io/projected/493c7d5b-0f42-40a4-ab37-19d6681834e3-kube-api-access-vk4pl\") pod \"node-ca-rnwcg\" (UID: \"493c7d5b-0f42-40a4-ab37-19d6681834e3\") " pod="openshift-image-registry/node-ca-rnwcg" Apr 21 04:23:54.160140 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.160116 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmtqm\" (UniqueName: \"kubernetes.io/projected/50d19486-d861-4624-a572-7c1d8e897542-kube-api-access-gmtqm\") pod \"node-resolver-hpx9f\" (UID: \"50d19486-d861-4624-a572-7c1d8e897542\") " pod="openshift-dns/node-resolver-hpx9f" Apr 21 04:23:54.160658 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.160634 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnpxw\" (UniqueName: \"kubernetes.io/projected/b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c-kube-api-access-rnpxw\") pod \"aws-ebs-csi-driver-node-sscd5\" (UID: \"b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" Apr 21 04:23:54.160938 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.160920 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq7gf\" (UniqueName: \"kubernetes.io/projected/189fab48-4392-499e-b780-7994fa04cd1b-kube-api-access-qq7gf\") pod \"iptables-alerter-5cbdt\" (UID: \"189fab48-4392-499e-b780-7994fa04cd1b\") " pod="openshift-network-operator/iptables-alerter-5cbdt" Apr 21 04:23:54.161219 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.161200 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzcwb\" (UniqueName: \"kubernetes.io/projected/8910acd9-f7d1-43e0-86a5-84c8a0670a16-kube-api-access-mzcwb\") pod \"multus-8hxd5\" (UID: \"8910acd9-f7d1-43e0-86a5-84c8a0670a16\") " pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.161345 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.161328 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zwds\" (UniqueName: \"kubernetes.io/projected/94a20e2f-d282-4a4b-8267-1c4c26dac5fe-kube-api-access-5zwds\") pod \"tuned-vqt7j\" (UID: \"94a20e2f-d282-4a4b-8267-1c4c26dac5fe\") " pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.254179 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254126 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-ovnkube-config\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.254179 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254179 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3d78d6d3-1607-4a6a-88f3-3f0d7eeda73a-konnectivity-ca\") pod 
\"konnectivity-agent-h4mfk\" (UID: \"3d78d6d3-1607-4a6a-88f3-3f0d7eeda73a\") " pod="kube-system/konnectivity-agent-h4mfk" Apr 21 04:23:54.254423 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254205 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-host-cni-bin\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.254423 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254243 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-ovn-node-metrics-cert\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.254423 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254293 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1182efca-c6ba-4b0f-9492-7a32d77ea693-system-cni-dir\") pod \"multus-additional-cni-plugins-d4kbl\" (UID: \"1182efca-c6ba-4b0f-9492-7a32d77ea693\") " pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.254423 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254317 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vn79b\" (UniqueName: \"kubernetes.io/projected/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-kube-api-access-vn79b\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.254423 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254358 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-host-cni-bin\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.254423 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254380 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1182efca-c6ba-4b0f-9492-7a32d77ea693-system-cni-dir\") pod \"multus-additional-cni-plugins-d4kbl\" (UID: \"1182efca-c6ba-4b0f-9492-7a32d77ea693\") " pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.254736 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254435 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1182efca-c6ba-4b0f-9492-7a32d77ea693-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d4kbl\" (UID: \"1182efca-c6ba-4b0f-9492-7a32d77ea693\") " pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.254736 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254468 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vjdj\" (UniqueName: \"kubernetes.io/projected/1182efca-c6ba-4b0f-9492-7a32d77ea693-kube-api-access-9vjdj\") pod \"multus-additional-cni-plugins-d4kbl\" (UID: \"1182efca-c6ba-4b0f-9492-7a32d77ea693\") " pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.254736 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254508 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-env-overrides\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.254736 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254540 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-ovnkube-script-lib\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.254736 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254576 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-host-kubelet\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.254736 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254586 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1182efca-c6ba-4b0f-9492-7a32d77ea693-tuning-conf-dir\") pod \"multus-additional-cni-plugins-d4kbl\" (UID: \"1182efca-c6ba-4b0f-9492-7a32d77ea693\") " pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.254736 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254615 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-run-systemd\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.254736 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254640 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-var-lib-openvswitch\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.254736 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254666 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-run-ovn\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.254736 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254692 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1182efca-c6ba-4b0f-9492-7a32d77ea693-cni-binary-copy\") pod \"multus-additional-cni-plugins-d4kbl\" (UID: \"1182efca-c6ba-4b0f-9492-7a32d77ea693\") " pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.254736 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254729 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3d78d6d3-1607-4a6a-88f3-3f0d7eeda73a-konnectivity-ca\") pod \"konnectivity-agent-h4mfk\" (UID: \"3d78d6d3-1607-4a6a-88f3-3f0d7eeda73a\") " 
pod="kube-system/konnectivity-agent-h4mfk" Apr 21 04:23:54.255272 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254721 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-host-run-ovn-kubernetes\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.255272 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254760 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-run-systemd\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.255272 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254786 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1182efca-c6ba-4b0f-9492-7a32d77ea693-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d4kbl\" (UID: \"1182efca-c6ba-4b0f-9492-7a32d77ea693\") " pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.255272 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254803 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-var-lib-openvswitch\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.255272 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254828 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-run-ovn\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.255272 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254830 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-run-openvswitch\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.255272 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254775 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-host-run-ovn-kubernetes\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.255272 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254947 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-run-openvswitch\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.255272 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.254984 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-host-kubelet\") pod 
\"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.255272 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.255013 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-node-log\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.255272 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.255053 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3d78d6d3-1607-4a6a-88f3-3f0d7eeda73a-agent-certs\") pod \"konnectivity-agent-h4mfk\" (UID: \"3d78d6d3-1607-4a6a-88f3-3f0d7eeda73a\") " pod="kube-system/konnectivity-agent-h4mfk" Apr 21 04:23:54.255272 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.255089 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-node-log\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.255828 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.255639 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1182efca-c6ba-4b0f-9492-7a32d77ea693-cni-binary-copy\") pod \"multus-additional-cni-plugins-d4kbl\" (UID: \"1182efca-c6ba-4b0f-9492-7a32d77ea693\") " pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.256504 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.256131 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-ovnkube-script-lib\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.256504 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.256302 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1182efca-c6ba-4b0f-9492-7a32d77ea693-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-d4kbl\" (UID: \"1182efca-c6ba-4b0f-9492-7a32d77ea693\") " pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.256504 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.256361 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-host-cni-netd\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.256504 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.256465 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-env-overrides\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.256783 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.256609 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-ovnkube-config\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.256783 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.256748 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-host-cni-netd\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.256783 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.256760 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1182efca-c6ba-4b0f-9492-7a32d77ea693-os-release\") pod \"multus-additional-cni-plugins-d4kbl\" (UID: \"1182efca-c6ba-4b0f-9492-7a32d77ea693\") " pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.256926 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.256854 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-host-slash\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.256926 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.256894 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1182efca-c6ba-4b0f-9492-7a32d77ea693-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-d4kbl\" (UID: \"1182efca-c6ba-4b0f-9492-7a32d77ea693\") " pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.257049 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.256931 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-host-run-netns\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.257049 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.256961 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-etc-openvswitch\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.257049 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.256973 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-host-slash\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.257049 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.256995 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-log-socket\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.257049 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.257028 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-host-run-netns\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.257049 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.257032 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-ovn-node-metrics-cert\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.257360 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.257052 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-etc-openvswitch\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.257360 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.257100 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1182efca-c6ba-4b0f-9492-7a32d77ea693-cnibin\") pod \"multus-additional-cni-plugins-d4kbl\" (UID: \"1182efca-c6ba-4b0f-9492-7a32d77ea693\") " pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.257360 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.257143 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-log-socket\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.257360 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.257146 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-systemd-units\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.257360 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.257185 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1182efca-c6ba-4b0f-9492-7a32d77ea693-cnibin\") pod \"multus-additional-cni-plugins-d4kbl\" (UID: \"1182efca-c6ba-4b0f-9492-7a32d77ea693\") " pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.257360 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.257194 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.257360 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.257187 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-systemd-units\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.257360 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.257282 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.257360 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.257287 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1182efca-c6ba-4b0f-9492-7a32d77ea693-os-release\") pod \"multus-additional-cni-plugins-d4kbl\" (UID: \"1182efca-c6ba-4b0f-9492-7a32d77ea693\") " pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.257749 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.257463 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1182efca-c6ba-4b0f-9492-7a32d77ea693-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-d4kbl\" (UID: \"1182efca-c6ba-4b0f-9492-7a32d77ea693\") " pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.259889 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.259840 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3d78d6d3-1607-4a6a-88f3-3f0d7eeda73a-agent-certs\") pod \"konnectivity-agent-h4mfk\" (UID: \"3d78d6d3-1607-4a6a-88f3-3f0d7eeda73a\") " pod="kube-system/konnectivity-agent-h4mfk" Apr 21 04:23:54.263181 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.263158 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn79b\" (UniqueName: \"kubernetes.io/projected/1b413c11-6c0e-410d-bffc-fdd6ba8e6689-kube-api-access-vn79b\") pod \"ovnkube-node-tv4p6\" (UID: \"1b413c11-6c0e-410d-bffc-fdd6ba8e6689\") " pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.263181 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.263175 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vjdj\" (UniqueName: \"kubernetes.io/projected/1182efca-c6ba-4b0f-9492-7a32d77ea693-kube-api-access-9vjdj\") pod \"multus-additional-cni-plugins-d4kbl\" (UID: \"1182efca-c6ba-4b0f-9492-7a32d77ea693\") " pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.336169 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.336080 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8hxd5" Apr 21 04:23:54.343927 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.343901 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" Apr 21 04:23:54.351525 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.351497 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rnwcg" Apr 21 04:23:54.357166 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.357142 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5cbdt" Apr 21 04:23:54.363754 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.363730 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" Apr 21 04:23:54.371290 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.371268 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hpx9f" Apr 21 04:23:54.378864 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.378843 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-d4kbl" Apr 21 04:23:54.384465 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.384446 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:23:54.389040 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.389022 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-h4mfk" Apr 21 04:23:54.660125 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.660035 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hs5wp\" (UniqueName: \"kubernetes.io/projected/2e9d379b-34d6-4a3f-8e2b-addd13dce02f-kube-api-access-hs5wp\") pod \"network-check-target-77kmz\" (UID: \"2e9d379b-34d6-4a3f-8e2b-addd13dce02f\") " pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:23:54.660125 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:54.660098 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs\") pod \"network-metrics-daemon-g7q5r\" (UID: \"d0b4a7b4-e6a1-4816-a96e-0792f47539d9\") " pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:23:54.660351 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:54.660214 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:23:54.660351 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:54.660276 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 04:23:54.660351 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:54.660302 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 04:23:54.660351 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:54.660315 2574 projected.go:194] Error preparing data for projected volume kube-api-access-hs5wp for pod openshift-network-diagnostics/network-check-target-77kmz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:23:54.660351 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:54.660281 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs podName:d0b4a7b4-e6a1-4816-a96e-0792f47539d9 nodeName:}" failed. No retries permitted until 2026-04-21 04:23:55.660260833 +0000 UTC m=+4.041363713 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs") pod "network-metrics-daemon-g7q5r" (UID: "d0b4a7b4-e6a1-4816-a96e-0792f47539d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:23:54.660587 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:54.660377 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e9d379b-34d6-4a3f-8e2b-addd13dce02f-kube-api-access-hs5wp podName:2e9d379b-34d6-4a3f-8e2b-addd13dce02f nodeName:}" failed. No retries permitted until 2026-04-21 04:23:55.660359602 +0000 UTC m=+4.041462468 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-hs5wp" (UniqueName: "kubernetes.io/projected/2e9d379b-34d6-4a3f-8e2b-addd13dce02f-kube-api-access-hs5wp") pod "network-check-target-77kmz" (UID: "2e9d379b-34d6-4a3f-8e2b-addd13dce02f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:23:54.727539 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:54.727502 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8910acd9_f7d1_43e0_86a5_84c8a0670a16.slice/crio-aab6480c297f049e3ddea8289a86dd9505fa815033797dad550bbb971d2c5459 WatchSource:0}: Error finding container aab6480c297f049e3ddea8289a86dd9505fa815033797dad550bbb971d2c5459: Status 404 returned error can't find the container with id aab6480c297f049e3ddea8289a86dd9505fa815033797dad550bbb971d2c5459 Apr 21 04:23:54.729087 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:54.729047 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b413c11_6c0e_410d_bffc_fdd6ba8e6689.slice/crio-424b9c8891a2ff5038f4bb2d55b9892c0c52f32d33e36ed5302e6004c17c6f7d WatchSource:0}: Error finding container 424b9c8891a2ff5038f4bb2d55b9892c0c52f32d33e36ed5302e6004c17c6f7d: Status 404 returned error can't find the container with id 424b9c8891a2ff5038f4bb2d55b9892c0c52f32d33e36ed5302e6004c17c6f7d Apr 21 04:23:54.732717 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:54.732690 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod189fab48_4392_499e_b780_7994fa04cd1b.slice/crio-e1eb3c35f5530b89b7126c39904ede85b5e185ebcb8457823ea4f6bde6d67cf0 WatchSource:0}: Error finding container e1eb3c35f5530b89b7126c39904ede85b5e185ebcb8457823ea4f6bde6d67cf0: Status 404 returned error can't find the container with id e1eb3c35f5530b89b7126c39904ede85b5e185ebcb8457823ea4f6bde6d67cf0 Apr 21 04:23:54.734164 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:54.734142 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d78d6d3_1607_4a6a_88f3_3f0d7eeda73a.slice/crio-c9f782c41b975429b70566ea0e09e4dfe7935979754696b04ad220bf961963bc WatchSource:0}: Error finding container c9f782c41b975429b70566ea0e09e4dfe7935979754696b04ad220bf961963bc: Status 404 returned error can't find the container with id c9f782c41b975429b70566ea0e09e4dfe7935979754696b04ad220bf961963bc Apr 21 04:23:54.734917 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:54.734894 2574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50d19486_d861_4624_a572_7c1d8e897542.slice/crio-d5ab95819b4acb565b99b4f5c0cca2e9b78dc9d60b6d73f83517e2fdb651b1c0 WatchSource:0}: Error finding container d5ab95819b4acb565b99b4f5c0cca2e9b78dc9d60b6d73f83517e2fdb651b1c0: Status 404 returned error can't find the container with id d5ab95819b4acb565b99b4f5c0cca2e9b78dc9d60b6d73f83517e2fdb651b1c0 Apr 21 04:23:54.736233 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:54.736196 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1182efca_c6ba_4b0f_9492_7a32d77ea693.slice/crio-c89b93cdc91ff6cab10c4882b704f9b87e342c0a286a33be66259b169b1105c3 WatchSource:0}: Error finding container c89b93cdc91ff6cab10c4882b704f9b87e342c0a286a33be66259b169b1105c3: Status 404 returned error can't find the container with id c89b93cdc91ff6cab10c4882b704f9b87e342c0a286a33be66259b169b1105c3 Apr 21 04:23:54.736988 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:54.736920 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod493c7d5b_0f42_40a4_ab37_19d6681834e3.slice/crio-349c0e2aced674c8333d67caa27d38e7e244ec5bd6ec93eb9f621e381f592c0b WatchSource:0}: Error finding container 349c0e2aced674c8333d67caa27d38e7e244ec5bd6ec93eb9f621e381f592c0b: Status 404 returned error can't find the container with id 349c0e2aced674c8333d67caa27d38e7e244ec5bd6ec93eb9f621e381f592c0b Apr 21 04:23:54.737773 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:23:54.737745 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8dd3fa2_68e7_4a44_8df7_7f2f28ca3e5c.slice/crio-fd5e2991042bdd355ae2e8e921acd60f88a1dc69e09d946ab6e432681643d7c8 WatchSource:0}: Error finding container fd5e2991042bdd355ae2e8e921acd60f88a1dc69e09d946ab6e432681643d7c8: Status 404 returned error can't find the container with id fd5e2991042bdd355ae2e8e921acd60f88a1dc69e09d946ab6e432681643d7c8 Apr 21 04:23:55.074435 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:55.074208 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 04:18:53 +0000 UTC" deadline="2027-12-13 06:14:09.603739499 +0000 UTC" Apr 21 04:23:55.075092 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:55.074443 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14425h50m14.529303848s" Apr 21 04:23:55.174639 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:55.172982 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:23:55.174639 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:55.173122 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g7q5r" podUID="d0b4a7b4-e6a1-4816-a96e-0792f47539d9" Apr 21 04:23:55.194622 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:55.194553 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-26.ec2.internal" event={"ID":"cd7c596238e57e8ba8682f24a2cccbe3","Type":"ContainerStarted","Data":"619ed9f2cf3d7b51a8e4b45513473abc6c965928f8670c4bb17b4f8f981891b1"} Apr 21 04:23:55.210610 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:55.210300 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-26.ec2.internal" podStartSLOduration=2.21028104 podStartE2EDuration="2.21028104s" podCreationTimestamp="2026-04-21 04:23:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:23:55.209419994 +0000 UTC m=+3.590522883" watchObservedRunningTime="2026-04-21 04:23:55.21028104 +0000 UTC m=+3.591383929" Apr 21 04:23:55.220498 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:55.220458 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" event={"ID":"94a20e2f-d282-4a4b-8267-1c4c26dac5fe","Type":"ContainerStarted","Data":"1d23524784a153bd557c1134bf5a032602ced8cc77fe52c3ffaa4cbe158f6b05"} Apr 21 04:23:55.250621 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:55.250519 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" event={"ID":"b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c","Type":"ContainerStarted","Data":"fd5e2991042bdd355ae2e8e921acd60f88a1dc69e09d946ab6e432681643d7c8"} Apr 21 04:23:55.257696 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:55.257579 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rnwcg" event={"ID":"493c7d5b-0f42-40a4-ab37-19d6681834e3","Type":"ContainerStarted","Data":"349c0e2aced674c8333d67caa27d38e7e244ec5bd6ec93eb9f621e381f592c0b"} Apr 21 04:23:55.266686 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:55.266633 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4kbl" event={"ID":"1182efca-c6ba-4b0f-9492-7a32d77ea693","Type":"ContainerStarted","Data":"c89b93cdc91ff6cab10c4882b704f9b87e342c0a286a33be66259b169b1105c3"} Apr 21 04:23:55.276885 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:55.276845 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5cbdt" event={"ID":"189fab48-4392-499e-b780-7994fa04cd1b","Type":"ContainerStarted","Data":"e1eb3c35f5530b89b7126c39904ede85b5e185ebcb8457823ea4f6bde6d67cf0"} Apr 21 04:23:55.278884 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:55.278854 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8hxd5" event={"ID":"8910acd9-f7d1-43e0-86a5-84c8a0670a16","Type":"ContainerStarted","Data":"aab6480c297f049e3ddea8289a86dd9505fa815033797dad550bbb971d2c5459"} Apr 21 04:23:55.281525 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:55.281457 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hpx9f" event={"ID":"50d19486-d861-4624-a572-7c1d8e897542","Type":"ContainerStarted","Data":"d5ab95819b4acb565b99b4f5c0cca2e9b78dc9d60b6d73f83517e2fdb651b1c0"} Apr 21 04:23:55.290464 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:55.290408 2574 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kube-system/konnectivity-agent-h4mfk" event={"ID":"3d78d6d3-1607-4a6a-88f3-3f0d7eeda73a","Type":"ContainerStarted","Data":"c9f782c41b975429b70566ea0e09e4dfe7935979754696b04ad220bf961963bc"} Apr 21 04:23:55.292889 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:55.292857 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" event={"ID":"1b413c11-6c0e-410d-bffc-fdd6ba8e6689","Type":"ContainerStarted","Data":"424b9c8891a2ff5038f4bb2d55b9892c0c52f32d33e36ed5302e6004c17c6f7d"} Apr 21 04:23:55.671864 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:55.671829 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs\") pod \"network-metrics-daemon-g7q5r\" (UID: \"d0b4a7b4-e6a1-4816-a96e-0792f47539d9\") " pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:23:55.672013 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:55.671916 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hs5wp\" (UniqueName: \"kubernetes.io/projected/2e9d379b-34d6-4a3f-8e2b-addd13dce02f-kube-api-access-hs5wp\") pod \"network-check-target-77kmz\" (UID: \"2e9d379b-34d6-4a3f-8e2b-addd13dce02f\") " pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:23:55.672078 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:55.672061 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 04:23:55.672127 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:55.672080 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 04:23:55.672127 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:55.672096 2574 projected.go:194] Error preparing data for projected volume kube-api-access-hs5wp for pod openshift-network-diagnostics/network-check-target-77kmz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:23:55.672221 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:55.672155 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e9d379b-34d6-4a3f-8e2b-addd13dce02f-kube-api-access-hs5wp podName:2e9d379b-34d6-4a3f-8e2b-addd13dce02f nodeName:}" failed. No retries permitted until 2026-04-21 04:23:57.672135043 +0000 UTC m=+6.053237926 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hs5wp" (UniqueName: "kubernetes.io/projected/2e9d379b-34d6-4a3f-8e2b-addd13dce02f-kube-api-access-hs5wp") pod "network-check-target-77kmz" (UID: "2e9d379b-34d6-4a3f-8e2b-addd13dce02f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:23:55.672702 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:55.672561 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:23:55.672702 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:55.672632 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs podName:d0b4a7b4-e6a1-4816-a96e-0792f47539d9 nodeName:}" failed. No retries permitted until 2026-04-21 04:23:57.672615148 +0000 UTC m=+6.053718026 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs") pod "network-metrics-daemon-g7q5r" (UID: "d0b4a7b4-e6a1-4816-a96e-0792f47539d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:23:56.172802 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:56.172762 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:23:56.173476 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:56.172914 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77kmz" podUID="2e9d379b-34d6-4a3f-8e2b-addd13dce02f" Apr 21 04:23:56.312932 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:56.311569 2574 generic.go:358] "Generic (PLEG): container finished" podID="1bab162aca85481717598a6021c84d94" containerID="bb147f25da269562534a0273fc8a7ef955f298b5b68e67b9e6d66a2ab1252c18" exitCode=0 Apr 21 04:23:56.312932 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:56.312889 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-26.ec2.internal" event={"ID":"1bab162aca85481717598a6021c84d94","Type":"ContainerDied","Data":"bb147f25da269562534a0273fc8a7ef955f298b5b68e67b9e6d66a2ab1252c18"} Apr 21 04:23:57.172387 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:57.172350 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:23:57.172572 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:57.172497 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g7q5r" podUID="d0b4a7b4-e6a1-4816-a96e-0792f47539d9" Apr 21 04:23:57.317609 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:57.317554 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-26.ec2.internal" event={"ID":"1bab162aca85481717598a6021c84d94","Type":"ContainerStarted","Data":"34e52f7b16ac31aa480aa38621034e5bcdd9d03c0db8d667b7d32ed904bd7783"} Apr 21 04:23:57.688541 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:57.688471 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs\") pod \"network-metrics-daemon-g7q5r\" (UID: \"d0b4a7b4-e6a1-4816-a96e-0792f47539d9\") " pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:23:57.688734 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:57.688553 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hs5wp\" (UniqueName: \"kubernetes.io/projected/2e9d379b-34d6-4a3f-8e2b-addd13dce02f-kube-api-access-hs5wp\") pod \"network-check-target-77kmz\" (UID: \"2e9d379b-34d6-4a3f-8e2b-addd13dce02f\") " pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:23:57.688734 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:57.688707 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 04:23:57.688734 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:57.688726 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 04:23:57.688897 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:57.688739 2574 projected.go:194] Error preparing data for projected volume kube-api-access-hs5wp for pod openshift-network-diagnostics/network-check-target-77kmz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:23:57.688897 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:57.688799 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e9d379b-34d6-4a3f-8e2b-addd13dce02f-kube-api-access-hs5wp podName:2e9d379b-34d6-4a3f-8e2b-addd13dce02f nodeName:}" failed. No retries permitted until 2026-04-21 04:24:01.688779792 +0000 UTC m=+10.069882667 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-hs5wp" (UniqueName: "kubernetes.io/projected/2e9d379b-34d6-4a3f-8e2b-addd13dce02f-kube-api-access-hs5wp") pod "network-check-target-77kmz" (UID: "2e9d379b-34d6-4a3f-8e2b-addd13dce02f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:23:57.688897 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:57.688874 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:23:57.689048 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:57.688907 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs podName:d0b4a7b4-e6a1-4816-a96e-0792f47539d9 nodeName:}" failed. 
No retries permitted until 2026-04-21 04:24:01.688896033 +0000 UTC m=+10.069998903 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs") pod "network-metrics-daemon-g7q5r" (UID: "d0b4a7b4-e6a1-4816-a96e-0792f47539d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:23:58.172556 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:58.172518 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:23:58.172757 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:58.172732 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77kmz" podUID="2e9d379b-34d6-4a3f-8e2b-addd13dce02f" Apr 21 04:23:59.172368 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:23:59.172326 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:23:59.172861 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:23:59.172553 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7q5r" podUID="d0b4a7b4-e6a1-4816-a96e-0792f47539d9" Apr 21 04:24:00.172861 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:00.172710 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:24:00.173558 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:00.172840 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77kmz" podUID="2e9d379b-34d6-4a3f-8e2b-addd13dce02f" Apr 21 04:24:01.172748 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:01.172680 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:24:01.172908 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:01.172850 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g7q5r" podUID="d0b4a7b4-e6a1-4816-a96e-0792f47539d9" Apr 21 04:24:01.721181 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:01.721135 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hs5wp\" (UniqueName: \"kubernetes.io/projected/2e9d379b-34d6-4a3f-8e2b-addd13dce02f-kube-api-access-hs5wp\") pod \"network-check-target-77kmz\" (UID: \"2e9d379b-34d6-4a3f-8e2b-addd13dce02f\") " pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:24:01.721363 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:01.721205 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs\") pod \"network-metrics-daemon-g7q5r\" (UID: \"d0b4a7b4-e6a1-4816-a96e-0792f47539d9\") " pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:24:01.721363 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:01.721331 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:24:01.721495 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:01.721399 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs podName:d0b4a7b4-e6a1-4816-a96e-0792f47539d9 nodeName:}" failed. No retries permitted until 2026-04-21 04:24:09.721378795 +0000 UTC m=+18.102481678 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs") pod "network-metrics-daemon-g7q5r" (UID: "d0b4a7b4-e6a1-4816-a96e-0792f47539d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:24:01.721847 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:01.721824 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 04:24:01.721946 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:01.721852 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 04:24:01.721946 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:01.721865 2574 projected.go:194] Error preparing data for projected volume kube-api-access-hs5wp for pod openshift-network-diagnostics/network-check-target-77kmz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:24:01.722044 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:01.722017 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e9d379b-34d6-4a3f-8e2b-addd13dce02f-kube-api-access-hs5wp podName:2e9d379b-34d6-4a3f-8e2b-addd13dce02f nodeName:}" failed. No retries permitted until 2026-04-21 04:24:09.721998627 +0000 UTC m=+18.103101512 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-hs5wp" (UniqueName: "kubernetes.io/projected/2e9d379b-34d6-4a3f-8e2b-addd13dce02f-kube-api-access-hs5wp") pod "network-check-target-77kmz" (UID: "2e9d379b-34d6-4a3f-8e2b-addd13dce02f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:24:02.172784 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:02.172726 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:24:02.174280 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:02.173866 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77kmz" podUID="2e9d379b-34d6-4a3f-8e2b-addd13dce02f" Apr 21 04:24:03.172732 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:03.172695 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:24:03.172908 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:03.172856 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7q5r" podUID="d0b4a7b4-e6a1-4816-a96e-0792f47539d9" Apr 21 04:24:04.176389 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:04.176307 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:24:04.176838 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:04.176439 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77kmz" podUID="2e9d379b-34d6-4a3f-8e2b-addd13dce02f" Apr 21 04:24:05.173095 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:05.173050 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:24:05.173267 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:05.173185 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7q5r" podUID="d0b4a7b4-e6a1-4816-a96e-0792f47539d9" Apr 21 04:24:06.172555 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:06.172518 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:24:06.172977 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:06.172669 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77kmz" podUID="2e9d379b-34d6-4a3f-8e2b-addd13dce02f" Apr 21 04:24:07.172841 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:07.172803 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:24:07.173252 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:07.172928 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7q5r" podUID="d0b4a7b4-e6a1-4816-a96e-0792f47539d9" Apr 21 04:24:08.172171 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:08.172134 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:24:08.172343 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:08.172251 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77kmz" podUID="2e9d379b-34d6-4a3f-8e2b-addd13dce02f" Apr 21 04:24:08.683465 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:08.683403 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-26.ec2.internal" podStartSLOduration=15.683382346 podStartE2EDuration="15.683382346s" podCreationTimestamp="2026-04-21 04:23:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:23:57.330742395 +0000 UTC m=+5.711845290" watchObservedRunningTime="2026-04-21 04:24:08.683382346 +0000 UTC m=+17.064485235" Apr 21 04:24:08.683934 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:08.683916 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-qbdpv"] Apr 21 04:24:08.693202 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:08.693165 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:08.693327 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:08.693258 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-qbdpv" podUID="3d73e541-4e3f-47bf-8031-d49890b7f8d2" Apr 21 04:24:08.773627 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:08.773572 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3d73e541-4e3f-47bf-8031-d49890b7f8d2-dbus\") pod \"global-pull-secret-syncer-qbdpv\" (UID: \"3d73e541-4e3f-47bf-8031-d49890b7f8d2\") " pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:08.773797 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:08.773669 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3d73e541-4e3f-47bf-8031-d49890b7f8d2-kubelet-config\") pod \"global-pull-secret-syncer-qbdpv\" (UID: \"3d73e541-4e3f-47bf-8031-d49890b7f8d2\") " pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:08.773797 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:08.773697 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3d73e541-4e3f-47bf-8031-d49890b7f8d2-original-pull-secret\") pod \"global-pull-secret-syncer-qbdpv\" (UID: \"3d73e541-4e3f-47bf-8031-d49890b7f8d2\") " pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:08.874842 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:08.874803 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3d73e541-4e3f-47bf-8031-d49890b7f8d2-kubelet-config\") pod \"global-pull-secret-syncer-qbdpv\" (UID: \"3d73e541-4e3f-47bf-8031-d49890b7f8d2\") " pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:08.875027 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:08.874920 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3d73e541-4e3f-47bf-8031-d49890b7f8d2-kubelet-config\") pod \"global-pull-secret-syncer-qbdpv\" (UID: \"3d73e541-4e3f-47bf-8031-d49890b7f8d2\") " pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:08.875027 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:08.874948 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3d73e541-4e3f-47bf-8031-d49890b7f8d2-original-pull-secret\") pod \"global-pull-secret-syncer-qbdpv\" (UID: \"3d73e541-4e3f-47bf-8031-d49890b7f8d2\") " pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:08.875156 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:08.875045 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3d73e541-4e3f-47bf-8031-d49890b7f8d2-dbus\") pod \"global-pull-secret-syncer-qbdpv\" (UID: \"3d73e541-4e3f-47bf-8031-d49890b7f8d2\") " pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:08.875156 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:08.875047 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 04:24:08.875230 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:08.875155 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d73e541-4e3f-47bf-8031-d49890b7f8d2-original-pull-secret podName:3d73e541-4e3f-47bf-8031-d49890b7f8d2 nodeName:}" failed. 
No retries permitted until 2026-04-21 04:24:09.375137789 +0000 UTC m=+17.756240660 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3d73e541-4e3f-47bf-8031-d49890b7f8d2-original-pull-secret") pod "global-pull-secret-syncer-qbdpv" (UID: "3d73e541-4e3f-47bf-8031-d49890b7f8d2") : object "kube-system"/"original-pull-secret" not registered Apr 21 04:24:08.875230 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:08.875185 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3d73e541-4e3f-47bf-8031-d49890b7f8d2-dbus\") pod \"global-pull-secret-syncer-qbdpv\" (UID: \"3d73e541-4e3f-47bf-8031-d49890b7f8d2\") " pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:09.173105 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:09.173067 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:24:09.173319 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:09.173196 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7q5r" podUID="d0b4a7b4-e6a1-4816-a96e-0792f47539d9" Apr 21 04:24:09.378472 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:09.378417 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3d73e541-4e3f-47bf-8031-d49890b7f8d2-original-pull-secret\") pod \"global-pull-secret-syncer-qbdpv\" (UID: \"3d73e541-4e3f-47bf-8031-d49890b7f8d2\") " pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:09.378667 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:09.378577 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 04:24:09.378735 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:09.378679 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d73e541-4e3f-47bf-8031-d49890b7f8d2-original-pull-secret podName:3d73e541-4e3f-47bf-8031-d49890b7f8d2 nodeName:}" failed. No retries permitted until 2026-04-21 04:24:10.378655789 +0000 UTC m=+18.759758659 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3d73e541-4e3f-47bf-8031-d49890b7f8d2-original-pull-secret") pod "global-pull-secret-syncer-qbdpv" (UID: "3d73e541-4e3f-47bf-8031-d49890b7f8d2") : object "kube-system"/"original-pull-secret" not registered Apr 21 04:24:09.781122 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:09.781078 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hs5wp\" (UniqueName: \"kubernetes.io/projected/2e9d379b-34d6-4a3f-8e2b-addd13dce02f-kube-api-access-hs5wp\") pod \"network-check-target-77kmz\" (UID: \"2e9d379b-34d6-4a3f-8e2b-addd13dce02f\") " pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:24:09.781514 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:09.781157 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs\") pod \"network-metrics-daemon-g7q5r\" (UID: \"d0b4a7b4-e6a1-4816-a96e-0792f47539d9\") " pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:24:09.781514 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:09.781234 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 04:24:09.781514 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:09.781258 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 04:24:09.781514 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:09.781263 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:24:09.781514 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:09.781271 2574 projected.go:194] Error preparing data for projected volume kube-api-access-hs5wp for pod openshift-network-diagnostics/network-check-target-77kmz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:24:09.781514 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:09.781328 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e9d379b-34d6-4a3f-8e2b-addd13dce02f-kube-api-access-hs5wp podName:2e9d379b-34d6-4a3f-8e2b-addd13dce02f nodeName:}" failed. No retries permitted until 2026-04-21 04:24:25.781308097 +0000 UTC m=+34.162410966 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-hs5wp" (UniqueName: "kubernetes.io/projected/2e9d379b-34d6-4a3f-8e2b-addd13dce02f-kube-api-access-hs5wp") pod "network-check-target-77kmz" (UID: "2e9d379b-34d6-4a3f-8e2b-addd13dce02f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:24:09.781514 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:09.781351 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs podName:d0b4a7b4-e6a1-4816-a96e-0792f47539d9 nodeName:}" failed. No retries permitted until 2026-04-21 04:24:25.781340519 +0000 UTC m=+34.162443385 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs") pod "network-metrics-daemon-g7q5r" (UID: "d0b4a7b4-e6a1-4816-a96e-0792f47539d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:24:10.172202 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:10.172126 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:10.172363 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:10.172126 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:24:10.172363 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:10.172257 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qbdpv" podUID="3d73e541-4e3f-47bf-8031-d49890b7f8d2" Apr 21 04:24:10.172363 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:10.172299 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77kmz" podUID="2e9d379b-34d6-4a3f-8e2b-addd13dce02f" Apr 21 04:24:10.384728 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:10.384690 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3d73e541-4e3f-47bf-8031-d49890b7f8d2-original-pull-secret\") pod \"global-pull-secret-syncer-qbdpv\" (UID: \"3d73e541-4e3f-47bf-8031-d49890b7f8d2\") " pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:10.384900 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:10.384809 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 04:24:10.384900 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:10.384870 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d73e541-4e3f-47bf-8031-d49890b7f8d2-original-pull-secret podName:3d73e541-4e3f-47bf-8031-d49890b7f8d2 nodeName:}" failed. No retries permitted until 2026-04-21 04:24:12.384851955 +0000 UTC m=+20.765954820 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3d73e541-4e3f-47bf-8031-d49890b7f8d2-original-pull-secret") pod "global-pull-secret-syncer-qbdpv" (UID: "3d73e541-4e3f-47bf-8031-d49890b7f8d2") : object "kube-system"/"original-pull-secret" not registered Apr 21 04:24:11.172135 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:11.172103 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:24:11.172552 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:11.172227 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7q5r" podUID="d0b4a7b4-e6a1-4816-a96e-0792f47539d9" Apr 21 04:24:12.174040 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:12.173827 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:12.174382 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:12.174135 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qbdpv" podUID="3d73e541-4e3f-47bf-8031-d49890b7f8d2" Apr 21 04:24:12.174382 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:12.173827 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:24:12.174382 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:12.174270 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-77kmz" podUID="2e9d379b-34d6-4a3f-8e2b-addd13dce02f" Apr 21 04:24:12.346237 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:12.346202 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" event={"ID":"94a20e2f-d282-4a4b-8267-1c4c26dac5fe","Type":"ContainerStarted","Data":"2aa3d17c647c34e1b3a8065d9787c3e7aa1d9622d1f444c3c64ba48a05c5e470"} Apr 21 04:24:12.347434 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:12.347407 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" event={"ID":"b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c","Type":"ContainerStarted","Data":"cd4e1b556dbc516f926c5ee8728ada04d10e1222dee3d1991df79f41e48bd44b"} Apr 21 04:24:12.348801 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:12.348776 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rnwcg" event={"ID":"493c7d5b-0f42-40a4-ab37-19d6681834e3","Type":"ContainerStarted","Data":"b118a8bfefd50e1303916ad75f6614cd3bd6c45af95a705f77b9ed20c5b38f97"} Apr 21 04:24:12.350185 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:12.350154 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4kbl" event={"ID":"1182efca-c6ba-4b0f-9492-7a32d77ea693","Type":"ContainerStarted","Data":"bdcb6d0419bd1b4d9620424e23221fb46bde84e0454230efbc15795edbc37e3b"} Apr 21 04:24:12.351636 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:12.351570 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8hxd5" event={"ID":"8910acd9-f7d1-43e0-86a5-84c8a0670a16","Type":"ContainerStarted","Data":"c020f2680a12ba06225e228bc942b575e3b32930e0b48a3d790636ed56faa31f"} Apr 21 04:24:12.352944 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:12.352924 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hpx9f" event={"ID":"50d19486-d861-4624-a572-7c1d8e897542","Type":"ContainerStarted","Data":"3b553c6654773b5c727f4e24bac470c82cf383fbcdffc8a57525a5e8960f8a83"} Apr 21 04:24:12.356288 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:12.356266 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-h4mfk" event={"ID":"3d78d6d3-1607-4a6a-88f3-3f0d7eeda73a","Type":"ContainerStarted","Data":"5125908a8742e56821d5703c00285e78afe5a60f62471b602dbd7a03faef1ac1"} Apr 21 04:24:12.357695 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:12.357677 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" event={"ID":"1b413c11-6c0e-410d-bffc-fdd6ba8e6689","Type":"ContainerStarted","Data":"e270b79415c82e18c7c4554e55724590afff1834216d2dba6e8ae0f70918e229"} Apr 21 04:24:12.357776 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:12.357699 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" event={"ID":"1b413c11-6c0e-410d-bffc-fdd6ba8e6689","Type":"ContainerStarted","Data":"759604a3f6ccae0c5de1dd5a42828171794615144204c232f13abcf4950b0de0"} Apr 21 04:24:12.363795 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:12.363755 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-vqt7j" podStartSLOduration=3.247008161 podStartE2EDuration="20.363744497s" podCreationTimestamp="2026-04-21 04:23:52 +0000 UTC" firstStartedPulling="2026-04-21 04:23:54.742923946 +0000 UTC 
m=+3.124026814" lastFinishedPulling="2026-04-21 04:24:11.859660271 +0000 UTC m=+20.240763150" observedRunningTime="2026-04-21 04:24:12.363615888 +0000 UTC m=+20.744718770" watchObservedRunningTime="2026-04-21 04:24:12.363744497 +0000 UTC m=+20.744847385" Apr 21 04:24:12.375337 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:12.375297 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rnwcg" podStartSLOduration=11.395504064 podStartE2EDuration="20.375286413s" podCreationTimestamp="2026-04-21 04:23:52 +0000 UTC" firstStartedPulling="2026-04-21 04:23:54.740086895 +0000 UTC m=+3.121189798" lastFinishedPulling="2026-04-21 04:24:03.719869265 +0000 UTC m=+12.100972147" observedRunningTime="2026-04-21 04:24:12.374886515 +0000 UTC m=+20.755989404" watchObservedRunningTime="2026-04-21 04:24:12.375286413 +0000 UTC m=+20.756389301" Apr 21 04:24:12.386959 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:12.386917 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hpx9f" podStartSLOduration=3.627223091 podStartE2EDuration="20.38690378s" podCreationTimestamp="2026-04-21 04:23:52 +0000 UTC" firstStartedPulling="2026-04-21 04:23:54.736558203 +0000 UTC m=+3.117661070" lastFinishedPulling="2026-04-21 04:24:11.496238883 +0000 UTC m=+19.877341759" observedRunningTime="2026-04-21 04:24:12.386704614 +0000 UTC m=+20.767807501" watchObservedRunningTime="2026-04-21 04:24:12.38690378 +0000 UTC m=+20.768006667" Apr 21 04:24:12.400845 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:12.400674 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3d73e541-4e3f-47bf-8031-d49890b7f8d2-original-pull-secret\") pod \"global-pull-secret-syncer-qbdpv\" (UID: \"3d73e541-4e3f-47bf-8031-d49890b7f8d2\") " pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:12.400845 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:12.400787 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 04:24:12.401025 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:12.400865 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d73e541-4e3f-47bf-8031-d49890b7f8d2-original-pull-secret podName:3d73e541-4e3f-47bf-8031-d49890b7f8d2 nodeName:}" failed. No retries permitted until 2026-04-21 04:24:16.400845763 +0000 UTC m=+24.781948630 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3d73e541-4e3f-47bf-8031-d49890b7f8d2-original-pull-secret") pod "global-pull-secret-syncer-qbdpv" (UID: "3d73e541-4e3f-47bf-8031-d49890b7f8d2") : object "kube-system"/"original-pull-secret" not registered Apr 21 04:24:12.402336 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:12.401765 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8hxd5" podStartSLOduration=3.264530958 podStartE2EDuration="20.401752383s" podCreationTimestamp="2026-04-21 04:23:52 +0000 UTC" firstStartedPulling="2026-04-21 04:23:54.729708826 +0000 UTC m=+3.110811707" lastFinishedPulling="2026-04-21 04:24:11.866930252 +0000 UTC m=+20.248033132" observedRunningTime="2026-04-21 04:24:12.401412026 +0000 UTC m=+20.782514916" watchObservedRunningTime="2026-04-21 04:24:12.401752383 +0000 UTC m=+20.782855271" Apr 21 04:24:12.433986 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:12.433938 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-h4mfk" podStartSLOduration=3.673573039 podStartE2EDuration="20.433923977s" podCreationTimestamp="2026-04-21 04:23:52 +0000 UTC" firstStartedPulling="2026-04-21 04:23:54.735889621 +0000 UTC m=+3.116992501" lastFinishedPulling="2026-04-21 04:24:11.496240572 +0000 UTC m=+19.877343439" observedRunningTime="2026-04-21 04:24:12.43353574 +0000 UTC m=+20.814638627" watchObservedRunningTime="2026-04-21 04:24:12.433923977 +0000 UTC m=+20.815026906" Apr 21 04:24:12.866890 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:12.866653 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-h4mfk" Apr 21 04:24:12.867286 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:12.867269 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-h4mfk" Apr 21 04:24:13.173063 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:13.172988 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:24:13.173196 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:13.173100 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g7q5r" podUID="d0b4a7b4-e6a1-4816-a96e-0792f47539d9" Apr 21 04:24:13.360291 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:13.360258 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5cbdt" event={"ID":"189fab48-4392-499e-b780-7994fa04cd1b","Type":"ContainerStarted","Data":"924578d62f964c9f3a5e8516bf0ca43e340e8d91f7e56d67f075445cfb114613"} Apr 21 04:24:13.362632 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:13.362616 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tv4p6_1b413c11-6c0e-410d-bffc-fdd6ba8e6689/ovn-acl-logging/0.log" Apr 21 04:24:13.362913 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:13.362892 2574 generic.go:358] "Generic (PLEG): container finished" podID="1b413c11-6c0e-410d-bffc-fdd6ba8e6689" containerID="e270b79415c82e18c7c4554e55724590afff1834216d2dba6e8ae0f70918e229" exitCode=1 Apr 21 04:24:13.362988 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:13.362963 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" event={"ID":"1b413c11-6c0e-410d-bffc-fdd6ba8e6689","Type":"ContainerDied","Data":"e270b79415c82e18c7c4554e55724590afff1834216d2dba6e8ae0f70918e229"} Apr 21 04:24:13.363029 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:13.363002 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" event={"ID":"1b413c11-6c0e-410d-bffc-fdd6ba8e6689","Type":"ContainerStarted","Data":"8177f38c5e10a2643629de7915acc88f74fb4d23e047aec708f56900e1ce1f40"} Apr 21 04:24:13.363083 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:13.363043 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" event={"ID":"1b413c11-6c0e-410d-bffc-fdd6ba8e6689","Type":"ContainerStarted","Data":"2068e173645517a37ea8e114f13f4515f272d114fb6c07bc8c41dad21767fd0a"} Apr 21 04:24:13.363083 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:13.363056 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" event={"ID":"1b413c11-6c0e-410d-bffc-fdd6ba8e6689","Type":"ContainerStarted","Data":"af68b9cc1ad837294c36337682e8ae1c8f3f0881aeac0f42a81e411017f2cdce"} Apr 21 04:24:13.363174 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:13.363069 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" event={"ID":"1b413c11-6c0e-410d-bffc-fdd6ba8e6689","Type":"ContainerStarted","Data":"5a6bd96ea46d756389a858587e9033c1389e8e83c809aa133f97edea61062ed1"} Apr 21 04:24:13.364182 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:13.364163 2574 generic.go:358] "Generic (PLEG): container finished" podID="1182efca-c6ba-4b0f-9492-7a32d77ea693" containerID="bdcb6d0419bd1b4d9620424e23221fb46bde84e0454230efbc15795edbc37e3b" exitCode=0 Apr 21 04:24:13.364266 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:13.364248 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4kbl" event={"ID":"1182efca-c6ba-4b0f-9492-7a32d77ea693","Type":"ContainerDied","Data":"bdcb6d0419bd1b4d9620424e23221fb46bde84e0454230efbc15795edbc37e3b"} Apr 21 04:24:13.364947 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:13.364927 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-h4mfk" Apr 21 04:24:13.365266 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:13.365250 
2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-h4mfk" Apr 21 04:24:13.373024 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:13.372989 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-5cbdt" podStartSLOduration=4.611309065 podStartE2EDuration="21.372978393s" podCreationTimestamp="2026-04-21 04:23:52 +0000 UTC" firstStartedPulling="2026-04-21 04:23:54.73457254 +0000 UTC m=+3.115675405" lastFinishedPulling="2026-04-21 04:24:11.496241861 +0000 UTC m=+19.877344733" observedRunningTime="2026-04-21 04:24:13.372827818 +0000 UTC m=+21.753930707" watchObservedRunningTime="2026-04-21 04:24:13.372978393 +0000 UTC m=+21.754081280" Apr 21 04:24:13.611727 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:13.611691 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 04:24:14.105166 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:14.104967 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T04:24:13.611723594Z","UUID":"e0542a8a-f80f-452a-b480-be18af9e3689","Handler":null,"Name":"","Endpoint":""} Apr 21 04:24:14.108272 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:14.108241 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 04:24:14.108272 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:14.108273 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 04:24:14.172716 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:14.172682 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:24:14.172901 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:14.172723 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:14.172901 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:14.172821 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77kmz" podUID="2e9d379b-34d6-4a3f-8e2b-addd13dce02f" Apr 21 04:24:14.173003 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:14.172939 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-qbdpv" podUID="3d73e541-4e3f-47bf-8031-d49890b7f8d2" Apr 21 04:24:14.368692 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:14.368600 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" event={"ID":"b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c","Type":"ContainerStarted","Data":"40abc4e8919d4acce298186260147cec2c51099d842ad08dbea521e96183a64b"} Apr 21 04:24:15.173167 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:15.172911 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:24:15.173307 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:15.173200 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7q5r" podUID="d0b4a7b4-e6a1-4816-a96e-0792f47539d9" Apr 21 04:24:15.373731 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:15.373700 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tv4p6_1b413c11-6c0e-410d-bffc-fdd6ba8e6689/ovn-acl-logging/0.log" Apr 21 04:24:15.374144 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:15.374098 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" event={"ID":"1b413c11-6c0e-410d-bffc-fdd6ba8e6689","Type":"ContainerStarted","Data":"46215a641636f1b47a1545fc0232deb12179ef36d174d19f6ac2fd7a1cecdd80"} Apr 21 04:24:15.376357 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:15.376321 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" event={"ID":"b8dd3fa2-68e7-4a44-8df7-7f2f28ca3e5c","Type":"ContainerStarted","Data":"0c04fbace2ee881b7837dd0cb6beac8c3259f9e8ef78f0b511c52fe8153489b9"} Apr 21 04:24:15.391670 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:15.391619 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sscd5" podStartSLOduration=3.301434965 podStartE2EDuration="23.391586248s" podCreationTimestamp="2026-04-21 04:23:52 +0000 UTC" firstStartedPulling="2026-04-21 04:23:54.741788084 +0000 UTC m=+3.122890967" lastFinishedPulling="2026-04-21 04:24:14.831939383 +0000 UTC m=+23.213042250" observedRunningTime="2026-04-21 04:24:15.391437809 +0000 UTC m=+23.772540702" watchObservedRunningTime="2026-04-21 04:24:15.391586248 +0000 UTC m=+23.772689135" Apr 21 04:24:16.172338 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:16.172221 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:24:16.172338 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:16.172239 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:16.172652 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:16.172353 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-77kmz" podUID="2e9d379b-34d6-4a3f-8e2b-addd13dce02f" Apr 21 04:24:16.172652 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:16.172491 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qbdpv" podUID="3d73e541-4e3f-47bf-8031-d49890b7f8d2" Apr 21 04:24:16.438412 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:16.438328 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3d73e541-4e3f-47bf-8031-d49890b7f8d2-original-pull-secret\") pod \"global-pull-secret-syncer-qbdpv\" (UID: \"3d73e541-4e3f-47bf-8031-d49890b7f8d2\") " pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:16.438821 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:16.438477 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 04:24:16.438821 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:16.438554 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d73e541-4e3f-47bf-8031-d49890b7f8d2-original-pull-secret podName:3d73e541-4e3f-47bf-8031-d49890b7f8d2 nodeName:}" failed. No retries permitted until 2026-04-21 04:24:24.438534023 +0000 UTC m=+32.819636890 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3d73e541-4e3f-47bf-8031-d49890b7f8d2-original-pull-secret") pod "global-pull-secret-syncer-qbdpv" (UID: "3d73e541-4e3f-47bf-8031-d49890b7f8d2") : object "kube-system"/"original-pull-secret" not registered Apr 21 04:24:17.172907 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:17.172876 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:24:17.173097 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:17.173008 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7q5r" podUID="d0b4a7b4-e6a1-4816-a96e-0792f47539d9" Apr 21 04:24:18.172274 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:18.172082 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:24:18.173177 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:18.172082 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:18.173177 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:18.172347 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-77kmz" podUID="2e9d379b-34d6-4a3f-8e2b-addd13dce02f" Apr 21 04:24:18.173177 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:18.172423 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qbdpv" podUID="3d73e541-4e3f-47bf-8031-d49890b7f8d2" Apr 21 04:24:18.387853 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:18.387826 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tv4p6_1b413c11-6c0e-410d-bffc-fdd6ba8e6689/ovn-acl-logging/0.log" Apr 21 04:24:18.388230 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:18.388182 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" event={"ID":"1b413c11-6c0e-410d-bffc-fdd6ba8e6689","Type":"ContainerStarted","Data":"9174f54ba59e5f0478dffd17b2886d49bee77240d3a2b33772cc10e016daa315"} Apr 21 04:24:18.388787 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:18.388635 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:24:18.388787 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:18.388664 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:24:18.388787 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:18.388761 2574 scope.go:117] "RemoveContainer" containerID="e270b79415c82e18c7c4554e55724590afff1834216d2dba6e8ae0f70918e229" Apr 21 04:24:18.408645 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:18.408615 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:24:19.172213 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:19.172184 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:24:19.172404 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:19.172288 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g7q5r" podUID="d0b4a7b4-e6a1-4816-a96e-0792f47539d9" Apr 21 04:24:19.391535 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:19.391503 2574 generic.go:358] "Generic (PLEG): container finished" podID="1182efca-c6ba-4b0f-9492-7a32d77ea693" containerID="06c7021d5fd2232f221160d8b121bb42c8f4795c391a971a9e4c4143df21fa60" exitCode=0 Apr 21 04:24:19.391745 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:19.391579 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4kbl" event={"ID":"1182efca-c6ba-4b0f-9492-7a32d77ea693","Type":"ContainerDied","Data":"06c7021d5fd2232f221160d8b121bb42c8f4795c391a971a9e4c4143df21fa60"} Apr 21 04:24:19.399338 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:19.399317 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tv4p6_1b413c11-6c0e-410d-bffc-fdd6ba8e6689/ovn-acl-logging/0.log" Apr 21 04:24:19.399664 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:19.399641 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" event={"ID":"1b413c11-6c0e-410d-bffc-fdd6ba8e6689","Type":"ContainerStarted","Data":"979bd163d925f3554b2deee27c36bfcc96c4d510d30c5ffbc247445084e75536"} Apr 21 04:24:19.399971 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:19.399954 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:24:19.426438 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:19.426404 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:24:19.435264 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:19.435213 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" podStartSLOduration=10.247902348 podStartE2EDuration="27.435197176s" podCreationTimestamp="2026-04-21 04:23:52 +0000 UTC" firstStartedPulling="2026-04-21 04:23:54.731500362 +0000 UTC m=+3.112603228" lastFinishedPulling="2026-04-21 04:24:11.918795176 +0000 UTC m=+20.299898056" observedRunningTime="2026-04-21 04:24:19.433739811 +0000 UTC m=+27.814842699" watchObservedRunningTime="2026-04-21 04:24:19.435197176 +0000 UTC m=+27.816300063" Apr 21 04:24:19.985380 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:19.985349 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-g7q5r"] Apr 21 04:24:19.985538 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:19.985463 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:24:19.985606 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:19.985552 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g7q5r" podUID="d0b4a7b4-e6a1-4816-a96e-0792f47539d9" Apr 21 04:24:19.988692 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:19.988657 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-77kmz"] Apr 21 04:24:19.988851 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:19.988769 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:24:19.988911 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:19.988849 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77kmz" podUID="2e9d379b-34d6-4a3f-8e2b-addd13dce02f" Apr 21 04:24:19.989269 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:19.989248 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qbdpv"] Apr 21 04:24:19.989430 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:19.989351 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:19.989494 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:19.989448 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qbdpv" podUID="3d73e541-4e3f-47bf-8031-d49890b7f8d2" Apr 21 04:24:21.172560 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:21.172523 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:21.173039 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:21.172657 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qbdpv" podUID="3d73e541-4e3f-47bf-8031-d49890b7f8d2" Apr 21 04:24:21.405888 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:21.405846 2574 generic.go:358] "Generic (PLEG): container finished" podID="1182efca-c6ba-4b0f-9492-7a32d77ea693" containerID="caae27a09741f6f6fa1b09f4a4ab4b1c4ba0c0e4071b4fcb9d0eed56e72c1bd0" exitCode=0 Apr 21 04:24:21.408781 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:21.405903 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4kbl" event={"ID":"1182efca-c6ba-4b0f-9492-7a32d77ea693","Type":"ContainerDied","Data":"caae27a09741f6f6fa1b09f4a4ab4b1c4ba0c0e4071b4fcb9d0eed56e72c1bd0"} Apr 21 04:24:22.173748 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:22.173551 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:24:22.174156 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:22.173646 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:24:22.174156 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:22.173935 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77kmz" podUID="2e9d379b-34d6-4a3f-8e2b-addd13dce02f" Apr 21 04:24:22.174156 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:22.173817 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7q5r" podUID="d0b4a7b4-e6a1-4816-a96e-0792f47539d9" Apr 21 04:24:23.172351 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:23.172320 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:23.172535 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:23.172452 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qbdpv" podUID="3d73e541-4e3f-47bf-8031-d49890b7f8d2" Apr 21 04:24:23.412544 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:23.412452 2574 generic.go:358] "Generic (PLEG): container finished" podID="1182efca-c6ba-4b0f-9492-7a32d77ea693" containerID="6faf3f4260cf40e5e191dffc41fb2068559dbe549dda31ca4cb75b097402eb04" exitCode=0 Apr 21 04:24:23.412544 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:23.412508 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4kbl" event={"ID":"1182efca-c6ba-4b0f-9492-7a32d77ea693","Type":"ContainerDied","Data":"6faf3f4260cf40e5e191dffc41fb2068559dbe549dda31ca4cb75b097402eb04"} Apr 21 04:24:24.172844 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:24.172806 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:24:24.173017 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:24.172865 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:24:24.173017 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:24.172961 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g7q5r" podUID="d0b4a7b4-e6a1-4816-a96e-0792f47539d9" Apr 21 04:24:24.173142 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:24.173091 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-77kmz" podUID="2e9d379b-34d6-4a3f-8e2b-addd13dce02f" Apr 21 04:24:24.501093 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:24.501044 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3d73e541-4e3f-47bf-8031-d49890b7f8d2-original-pull-secret\") pod \"global-pull-secret-syncer-qbdpv\" (UID: \"3d73e541-4e3f-47bf-8031-d49890b7f8d2\") " pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:24.501698 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:24.501224 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 04:24:24.501698 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:24.501312 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d73e541-4e3f-47bf-8031-d49890b7f8d2-original-pull-secret podName:3d73e541-4e3f-47bf-8031-d49890b7f8d2 nodeName:}" failed. No retries permitted until 2026-04-21 04:24:40.501289409 +0000 UTC m=+48.882392288 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3d73e541-4e3f-47bf-8031-d49890b7f8d2-original-pull-secret") pod "global-pull-secret-syncer-qbdpv" (UID: "3d73e541-4e3f-47bf-8031-d49890b7f8d2") : object "kube-system"/"original-pull-secret" not registered Apr 21 04:24:24.960654 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:24.960558 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-26.ec2.internal" event="NodeReady" Apr 21 04:24:24.960856 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:24.960745 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 04:24:25.002521 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.002486 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rn9gk"] Apr 21 04:24:25.021805 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.021770 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rqbs9"] Apr 21 04:24:25.022152 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.021977 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rn9gk" Apr 21 04:24:25.024295 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.024117 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vk74d\"" Apr 21 04:24:25.024295 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.024135 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 04:24:25.024475 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.024447 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 04:24:25.034372 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.034340 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rn9gk"] Apr 21 04:24:25.034372 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.034379 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rqbs9"] Apr 21 04:24:25.034571 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.034525 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rqbs9" Apr 21 04:24:25.037305 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.037275 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 04:24:25.037437 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.037379 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l78gs\"" Apr 21 04:24:25.037637 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.037556 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 04:24:25.037770 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.037683 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 04:24:25.172814 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.172772 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:25.175198 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.175170 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 21 04:24:25.205811 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.205777 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0a79ec68-2e66-4417-8b48-1d40b7272c91-tmp-dir\") pod \"dns-default-rn9gk\" (UID: \"0a79ec68-2e66-4417-8b48-1d40b7272c91\") " pod="openshift-dns/dns-default-rn9gk" Apr 21 04:24:25.205989 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.205823 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vrxx\" (UniqueName: \"kubernetes.io/projected/0a79ec68-2e66-4417-8b48-1d40b7272c91-kube-api-access-4vrxx\") pod \"dns-default-rn9gk\" (UID: \"0a79ec68-2e66-4417-8b48-1d40b7272c91\") " pod="openshift-dns/dns-default-rn9gk" Apr 21 04:24:25.205989 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.205880 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-455jb\" (UniqueName: \"kubernetes.io/projected/6252963e-17c6-4a33-86f2-a83c646d8f7c-kube-api-access-455jb\") pod \"ingress-canary-rqbs9\" (UID: \"6252963e-17c6-4a33-86f2-a83c646d8f7c\") " pod="openshift-ingress-canary/ingress-canary-rqbs9" Apr 21 04:24:25.205989 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.205916 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls\") pod \"dns-default-rn9gk\" (UID: \"0a79ec68-2e66-4417-8b48-1d40b7272c91\") " pod="openshift-dns/dns-default-rn9gk" Apr 21 04:24:25.206116 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.205986 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert\") pod \"ingress-canary-rqbs9\" (UID: \"6252963e-17c6-4a33-86f2-a83c646d8f7c\") " pod="openshift-ingress-canary/ingress-canary-rqbs9" Apr 21 04:24:25.206116 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.206050 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a79ec68-2e66-4417-8b48-1d40b7272c91-config-volume\") pod \"dns-default-rn9gk\" (UID: \"0a79ec68-2e66-4417-8b48-1d40b7272c91\") " pod="openshift-dns/dns-default-rn9gk" Apr 21 04:24:25.307209 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.307164 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vrxx\" (UniqueName: \"kubernetes.io/projected/0a79ec68-2e66-4417-8b48-1d40b7272c91-kube-api-access-4vrxx\") pod \"dns-default-rn9gk\" (UID: \"0a79ec68-2e66-4417-8b48-1d40b7272c91\") " pod="openshift-dns/dns-default-rn9gk" Apr 21 04:24:25.307395 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.307249 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-455jb\" (UniqueName: \"kubernetes.io/projected/6252963e-17c6-4a33-86f2-a83c646d8f7c-kube-api-access-455jb\") pod \"ingress-canary-rqbs9\" (UID: \"6252963e-17c6-4a33-86f2-a83c646d8f7c\") " 
pod="openshift-ingress-canary/ingress-canary-rqbs9" Apr 21 04:24:25.307395 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.307283 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls\") pod \"dns-default-rn9gk\" (UID: \"0a79ec68-2e66-4417-8b48-1d40b7272c91\") " pod="openshift-dns/dns-default-rn9gk" Apr 21 04:24:25.307395 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.307315 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert\") pod \"ingress-canary-rqbs9\" (UID: \"6252963e-17c6-4a33-86f2-a83c646d8f7c\") " pod="openshift-ingress-canary/ingress-canary-rqbs9" Apr 21 04:24:25.307563 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:25.307408 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:24:25.307563 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.307458 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a79ec68-2e66-4417-8b48-1d40b7272c91-config-volume\") pod \"dns-default-rn9gk\" (UID: \"0a79ec68-2e66-4417-8b48-1d40b7272c91\") " pod="openshift-dns/dns-default-rn9gk" Apr 21 04:24:25.307563 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:25.307470 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:24:25.307563 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:25.307475 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert podName:6252963e-17c6-4a33-86f2-a83c646d8f7c nodeName:}" failed. No retries permitted until 2026-04-21 04:24:25.807456128 +0000 UTC m=+34.188559001 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert") pod "ingress-canary-rqbs9" (UID: "6252963e-17c6-4a33-86f2-a83c646d8f7c") : secret "canary-serving-cert" not found Apr 21 04:24:25.307563 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:25.307552 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls podName:0a79ec68-2e66-4417-8b48-1d40b7272c91 nodeName:}" failed. No retries permitted until 2026-04-21 04:24:25.80753424 +0000 UTC m=+34.188637116 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls") pod "dns-default-rn9gk" (UID: "0a79ec68-2e66-4417-8b48-1d40b7272c91") : secret "dns-default-metrics-tls" not found Apr 21 04:24:25.307838 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.307646 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0a79ec68-2e66-4417-8b48-1d40b7272c91-tmp-dir\") pod \"dns-default-rn9gk\" (UID: \"0a79ec68-2e66-4417-8b48-1d40b7272c91\") " pod="openshift-dns/dns-default-rn9gk" Apr 21 04:24:25.308049 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.307993 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0a79ec68-2e66-4417-8b48-1d40b7272c91-config-volume\") pod \"dns-default-rn9gk\" (UID: \"0a79ec68-2e66-4417-8b48-1d40b7272c91\") " pod="openshift-dns/dns-default-rn9gk" Apr 21 04:24:25.308049 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.308021 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0a79ec68-2e66-4417-8b48-1d40b7272c91-tmp-dir\") pod \"dns-default-rn9gk\" (UID: \"0a79ec68-2e66-4417-8b48-1d40b7272c91\") " pod="openshift-dns/dns-default-rn9gk" Apr 21 04:24:25.318445 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.318408 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vrxx\" (UniqueName: \"kubernetes.io/projected/0a79ec68-2e66-4417-8b48-1d40b7272c91-kube-api-access-4vrxx\") pod \"dns-default-rn9gk\" (UID: \"0a79ec68-2e66-4417-8b48-1d40b7272c91\") " pod="openshift-dns/dns-default-rn9gk" Apr 21 04:24:25.318632 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.318531 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-455jb\" (UniqueName: \"kubernetes.io/projected/6252963e-17c6-4a33-86f2-a83c646d8f7c-kube-api-access-455jb\") pod \"ingress-canary-rqbs9\" (UID: \"6252963e-17c6-4a33-86f2-a83c646d8f7c\") " pod="openshift-ingress-canary/ingress-canary-rqbs9" Apr 21 04:24:25.811195 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.811159 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hs5wp\" (UniqueName: \"kubernetes.io/projected/2e9d379b-34d6-4a3f-8e2b-addd13dce02f-kube-api-access-hs5wp\") pod \"network-check-target-77kmz\" (UID: \"2e9d379b-34d6-4a3f-8e2b-addd13dce02f\") " pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:24:25.811906 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.811206 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls\") pod \"dns-default-rn9gk\" (UID: \"0a79ec68-2e66-4417-8b48-1d40b7272c91\") " pod="openshift-dns/dns-default-rn9gk" Apr 21 04:24:25.811906 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.811231 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert\") pod \"ingress-canary-rqbs9\" (UID: \"6252963e-17c6-4a33-86f2-a83c646d8f7c\") " pod="openshift-ingress-canary/ingress-canary-rqbs9" Apr 21 04:24:25.811906 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:25.811252 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs\") pod \"network-metrics-daemon-g7q5r\" (UID: \"d0b4a7b4-e6a1-4816-a96e-0792f47539d9\") " pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:24:25.811906 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:25.811347 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:24:25.811906 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:25.811354 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:24:25.811906 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:25.811356 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:24:25.811906 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:25.811406 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs podName:d0b4a7b4-e6a1-4816-a96e-0792f47539d9 nodeName:}" failed. No retries permitted until 2026-04-21 04:24:57.811387702 +0000 UTC m=+66.192490584 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs") pod "network-metrics-daemon-g7q5r" (UID: "d0b4a7b4-e6a1-4816-a96e-0792f47539d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 04:24:25.811906 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:25.811411 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 04:24:25.811906 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:25.811428 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert podName:6252963e-17c6-4a33-86f2-a83c646d8f7c nodeName:}" failed. No retries permitted until 2026-04-21 04:24:26.811417119 +0000 UTC m=+35.192519997 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert") pod "ingress-canary-rqbs9" (UID: "6252963e-17c6-4a33-86f2-a83c646d8f7c") : secret "canary-serving-cert" not found Apr 21 04:24:25.811906 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:25.811429 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 04:24:25.811906 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:25.811444 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls podName:0a79ec68-2e66-4417-8b48-1d40b7272c91 nodeName:}" failed. No retries permitted until 2026-04-21 04:24:26.81143643 +0000 UTC m=+35.192539296 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls") pod "dns-default-rn9gk" (UID: "0a79ec68-2e66-4417-8b48-1d40b7272c91") : secret "dns-default-metrics-tls" not found Apr 21 04:24:25.811906 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:25.811454 2574 projected.go:194] Error preparing data for projected volume kube-api-access-hs5wp for pod openshift-network-diagnostics/network-check-target-77kmz: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:24:25.811906 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:25.811509 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2e9d379b-34d6-4a3f-8e2b-addd13dce02f-kube-api-access-hs5wp podName:2e9d379b-34d6-4a3f-8e2b-addd13dce02f nodeName:}" failed. No retries permitted until 2026-04-21 04:24:57.811489981 +0000 UTC m=+66.192592847 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-hs5wp" (UniqueName: "kubernetes.io/projected/2e9d379b-34d6-4a3f-8e2b-addd13dce02f-kube-api-access-hs5wp") pod "network-check-target-77kmz" (UID: "2e9d379b-34d6-4a3f-8e2b-addd13dce02f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 04:24:26.172496 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:26.172408 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:24:26.172496 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:26.172450 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:24:26.175491 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:26.175454 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 04:24:26.176187 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:26.175896 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 04:24:26.176187 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:26.176023 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tdp6m\"" Apr 21 04:24:26.176585 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:26.176403 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 04:24:26.179272 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:26.177107 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9c9js\"" Apr 21 04:24:26.819458 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:26.819417 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls\") pod \"dns-default-rn9gk\" (UID: \"0a79ec68-2e66-4417-8b48-1d40b7272c91\") " pod="openshift-dns/dns-default-rn9gk" Apr 21 04:24:26.819458 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:26.819462 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert\") pod \"ingress-canary-rqbs9\" (UID: \"6252963e-17c6-4a33-86f2-a83c646d8f7c\") " pod="openshift-ingress-canary/ingress-canary-rqbs9" Apr 21 04:24:26.820008 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:26.819571 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:24:26.820008 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:26.819574 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:24:26.820008 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:26.819657 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert podName:6252963e-17c6-4a33-86f2-a83c646d8f7c nodeName:}" failed. No retries permitted until 2026-04-21 04:24:28.819634761 +0000 UTC m=+37.200737630 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert") pod "ingress-canary-rqbs9" (UID: "6252963e-17c6-4a33-86f2-a83c646d8f7c") : secret "canary-serving-cert" not found Apr 21 04:24:26.820008 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:26.819676 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls podName:0a79ec68-2e66-4417-8b48-1d40b7272c91 nodeName:}" failed. No retries permitted until 2026-04-21 04:24:28.819667357 +0000 UTC m=+37.200770222 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls") pod "dns-default-rn9gk" (UID: "0a79ec68-2e66-4417-8b48-1d40b7272c91") : secret "dns-default-metrics-tls" not found Apr 21 04:24:28.834907 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:28.834857 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls\") pod \"dns-default-rn9gk\" (UID: \"0a79ec68-2e66-4417-8b48-1d40b7272c91\") " pod="openshift-dns/dns-default-rn9gk" Apr 21 04:24:28.835485 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:28.834918 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert\") pod \"ingress-canary-rqbs9\" (UID: \"6252963e-17c6-4a33-86f2-a83c646d8f7c\") " pod="openshift-ingress-canary/ingress-canary-rqbs9" Apr 21 04:24:28.835485 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:28.834999 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:24:28.835485 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:28.835049 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:24:28.835485 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:28.835091 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls podName:0a79ec68-2e66-4417-8b48-1d40b7272c91 nodeName:}" failed. No retries permitted until 2026-04-21 04:24:32.835067647 +0000 UTC m=+41.216170532 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls") pod "dns-default-rn9gk" (UID: "0a79ec68-2e66-4417-8b48-1d40b7272c91") : secret "dns-default-metrics-tls" not found Apr 21 04:24:28.835485 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:28.835113 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert podName:6252963e-17c6-4a33-86f2-a83c646d8f7c nodeName:}" failed. No retries permitted until 2026-04-21 04:24:32.835103917 +0000 UTC m=+41.216206787 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert") pod "ingress-canary-rqbs9" (UID: "6252963e-17c6-4a33-86f2-a83c646d8f7c") : secret "canary-serving-cert" not found Apr 21 04:24:29.429333 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:29.429299 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4kbl" event={"ID":"1182efca-c6ba-4b0f-9492-7a32d77ea693","Type":"ContainerStarted","Data":"4783fe1dfd92917f140af34753a806476d77845f6b35907ca9a32c2347dc5557"} Apr 21 04:24:30.433929 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:30.433889 2574 generic.go:358] "Generic (PLEG): container finished" podID="1182efca-c6ba-4b0f-9492-7a32d77ea693" containerID="4783fe1dfd92917f140af34753a806476d77845f6b35907ca9a32c2347dc5557" exitCode=0 Apr 21 04:24:30.434304 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:30.433969 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4kbl" event={"ID":"1182efca-c6ba-4b0f-9492-7a32d77ea693","Type":"ContainerDied","Data":"4783fe1dfd92917f140af34753a806476d77845f6b35907ca9a32c2347dc5557"} Apr 21 04:24:31.437746 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:31.437703 2574 generic.go:358] "Generic (PLEG): container finished" podID="1182efca-c6ba-4b0f-9492-7a32d77ea693" containerID="e9aad7007f166eb092293502bc2fbcc899afb4197e1ff830aac0ab015cf93925" exitCode=0 Apr 21 04:24:31.438131 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:31.437771 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4kbl" event={"ID":"1182efca-c6ba-4b0f-9492-7a32d77ea693","Type":"ContainerDied","Data":"e9aad7007f166eb092293502bc2fbcc899afb4197e1ff830aac0ab015cf93925"} Apr 21 04:24:32.442364 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:32.442330 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-d4kbl" event={"ID":"1182efca-c6ba-4b0f-9492-7a32d77ea693","Type":"ContainerStarted","Data":"94fb4601786d963a091d486f0c00083b7ad4a118aa8a2b772728384e33db21d0"} Apr 21 04:24:32.462909 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:32.462845 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-d4kbl" podStartSLOduration=5.92105336 podStartE2EDuration="40.462830093s" podCreationTimestamp="2026-04-21 04:23:52 +0000 UTC" firstStartedPulling="2026-04-21 04:23:54.73786749 +0000 UTC m=+3.118970362" lastFinishedPulling="2026-04-21 04:24:29.27964423 +0000 UTC m=+37.660747095" observedRunningTime="2026-04-21 04:24:32.461493011 +0000 UTC m=+40.842595900" watchObservedRunningTime="2026-04-21 04:24:32.462830093 +0000 UTC m=+40.843932981" Apr 21 04:24:32.864986 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:32.864949 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls\") pod \"dns-default-rn9gk\" (UID: \"0a79ec68-2e66-4417-8b48-1d40b7272c91\") " pod="openshift-dns/dns-default-rn9gk" Apr 21 04:24:32.864986 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:32.864993 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert\") pod \"ingress-canary-rqbs9\" (UID: \"6252963e-17c6-4a33-86f2-a83c646d8f7c\") " 
pod="openshift-ingress-canary/ingress-canary-rqbs9" Apr 21 04:24:32.865206 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:32.865110 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:24:32.865206 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:32.865162 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:24:32.865206 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:32.865179 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls podName:0a79ec68-2e66-4417-8b48-1d40b7272c91 nodeName:}" failed. No retries permitted until 2026-04-21 04:24:40.865164223 +0000 UTC m=+49.246267093 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls") pod "dns-default-rn9gk" (UID: "0a79ec68-2e66-4417-8b48-1d40b7272c91") : secret "dns-default-metrics-tls" not found Apr 21 04:24:32.865312 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:32.865215 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert podName:6252963e-17c6-4a33-86f2-a83c646d8f7c nodeName:}" failed. No retries permitted until 2026-04-21 04:24:40.865202877 +0000 UTC m=+49.246305744 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert") pod "ingress-canary-rqbs9" (UID: "6252963e-17c6-4a33-86f2-a83c646d8f7c") : secret "canary-serving-cert" not found Apr 21 04:24:40.520972 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:40.520928 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3d73e541-4e3f-47bf-8031-d49890b7f8d2-original-pull-secret\") pod \"global-pull-secret-syncer-qbdpv\" (UID: \"3d73e541-4e3f-47bf-8031-d49890b7f8d2\") " pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:40.523495 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:40.523467 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3d73e541-4e3f-47bf-8031-d49890b7f8d2-original-pull-secret\") pod \"global-pull-secret-syncer-qbdpv\" (UID: \"3d73e541-4e3f-47bf-8031-d49890b7f8d2\") " pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:40.783445 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:40.783359 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qbdpv" Apr 21 04:24:40.923428 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:40.923374 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls\") pod \"dns-default-rn9gk\" (UID: \"0a79ec68-2e66-4417-8b48-1d40b7272c91\") " pod="openshift-dns/dns-default-rn9gk" Apr 21 04:24:40.923428 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:40.923431 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert\") pod \"ingress-canary-rqbs9\" (UID: \"6252963e-17c6-4a33-86f2-a83c646d8f7c\") " pod="openshift-ingress-canary/ingress-canary-rqbs9" Apr 21 04:24:40.923678 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:40.923535 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:24:40.923678 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:40.923536 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:24:40.923678 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:40.923612 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert podName:6252963e-17c6-4a33-86f2-a83c646d8f7c nodeName:}" failed. No retries permitted until 2026-04-21 04:24:56.923572539 +0000 UTC m=+65.304675404 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert") pod "ingress-canary-rqbs9" (UID: "6252963e-17c6-4a33-86f2-a83c646d8f7c") : secret "canary-serving-cert" not found Apr 21 04:24:40.923678 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:40.923630 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls podName:0a79ec68-2e66-4417-8b48-1d40b7272c91 nodeName:}" failed. No retries permitted until 2026-04-21 04:24:56.923621166 +0000 UTC m=+65.304724033 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls") pod "dns-default-rn9gk" (UID: "0a79ec68-2e66-4417-8b48-1d40b7272c91") : secret "dns-default-metrics-tls" not found Apr 21 04:24:40.934037 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:40.934008 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qbdpv"] Apr 21 04:24:41.464968 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:41.464933 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qbdpv" event={"ID":"3d73e541-4e3f-47bf-8031-d49890b7f8d2","Type":"ContainerStarted","Data":"49ae70cdb89e13e37ade07d47358c52716767387f37a558ae8f704ca6a3c65ad"} Apr 21 04:24:45.474209 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:45.474177 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qbdpv" event={"ID":"3d73e541-4e3f-47bf-8031-d49890b7f8d2","Type":"ContainerStarted","Data":"980ac53248623f337fc24194287dac1a598ee154c8201c240069c07902e7053b"} Apr 21 04:24:45.488775 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:45.488716 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-qbdpv" podStartSLOduration=33.414359618 podStartE2EDuration="37.488696174s" podCreationTimestamp="2026-04-21 04:24:08 +0000 UTC" firstStartedPulling="2026-04-21 04:24:40.939715338 +0000 UTC m=+49.320818204" lastFinishedPulling="2026-04-21 04:24:45.014051893 +0000 UTC m=+53.395154760" observedRunningTime="2026-04-21 04:24:45.487815111 +0000 UTC m=+53.868917997" watchObservedRunningTime="2026-04-21 04:24:45.488696174 +0000 UTC m=+53.869799061" Apr 21 04:24:51.420786 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:51.420757 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tv4p6" Apr 21 04:24:56.935175 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:56.935124 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert\") pod \"ingress-canary-rqbs9\" (UID: \"6252963e-17c6-4a33-86f2-a83c646d8f7c\") " pod="openshift-ingress-canary/ingress-canary-rqbs9" Apr 21 04:24:56.935569 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:56.935222 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls\") pod \"dns-default-rn9gk\" (UID: \"0a79ec68-2e66-4417-8b48-1d40b7272c91\") " pod="openshift-dns/dns-default-rn9gk" Apr 21 04:24:56.935569 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:56.935288 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:24:56.935569 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:56.935319 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:24:56.935569 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:56.935357 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert podName:6252963e-17c6-4a33-86f2-a83c646d8f7c nodeName:}" failed. No retries permitted until 2026-04-21 04:25:28.935341726 +0000 UTC m=+97.316444593 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert") pod "ingress-canary-rqbs9" (UID: "6252963e-17c6-4a33-86f2-a83c646d8f7c") : secret "canary-serving-cert" not found Apr 21 04:24:56.935569 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:56.935373 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls podName:0a79ec68-2e66-4417-8b48-1d40b7272c91 nodeName:}" failed. No retries permitted until 2026-04-21 04:25:28.935366989 +0000 UTC m=+97.316469855 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls") pod "dns-default-rn9gk" (UID: "0a79ec68-2e66-4417-8b48-1d40b7272c91") : secret "dns-default-metrics-tls" not found Apr 21 04:24:57.842343 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:57.842289 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hs5wp\" (UniqueName: \"kubernetes.io/projected/2e9d379b-34d6-4a3f-8e2b-addd13dce02f-kube-api-access-hs5wp\") pod \"network-check-target-77kmz\" (UID: \"2e9d379b-34d6-4a3f-8e2b-addd13dce02f\") " pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:24:57.842343 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:57.842350 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs\") pod \"network-metrics-daemon-g7q5r\" (UID: \"d0b4a7b4-e6a1-4816-a96e-0792f47539d9\") " pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:24:57.844951 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:57.844926 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 04:24:57.845051 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:57.845033 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 04:24:57.852847 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:57.852823 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 04:24:57.852902 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:24:57.852892 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs podName:d0b4a7b4-e6a1-4816-a96e-0792f47539d9 nodeName:}" failed. No retries permitted until 2026-04-21 04:26:01.852875673 +0000 UTC m=+130.233978539 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs") pod "network-metrics-daemon-g7q5r" (UID: "d0b4a7b4-e6a1-4816-a96e-0792f47539d9") : secret "metrics-daemon-secret" not found Apr 21 04:24:57.854529 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:57.854506 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 04:24:57.865388 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:57.865353 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs5wp\" (UniqueName: \"kubernetes.io/projected/2e9d379b-34d6-4a3f-8e2b-addd13dce02f-kube-api-access-hs5wp\") pod \"network-check-target-77kmz\" (UID: \"2e9d379b-34d6-4a3f-8e2b-addd13dce02f\") " pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:24:57.997947 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:57.997916 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-tdp6m\"" Apr 21 04:24:58.005734 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:58.005696 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:24:58.123582 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:58.123502 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-77kmz"] Apr 21 04:24:58.129391 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:24:58.129342 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e9d379b_34d6_4a3f_8e2b_addd13dce02f.slice/crio-52c438db624795e78a260214c52fc4d4ef78116d5e7fb05f839c4c74f789a152 WatchSource:0}: Error finding container 52c438db624795e78a260214c52fc4d4ef78116d5e7fb05f839c4c74f789a152: Status 404 returned error can't find the container with id 52c438db624795e78a260214c52fc4d4ef78116d5e7fb05f839c4c74f789a152 Apr 21 04:24:58.500941 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:24:58.500901 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-77kmz" event={"ID":"2e9d379b-34d6-4a3f-8e2b-addd13dce02f","Type":"ContainerStarted","Data":"52c438db624795e78a260214c52fc4d4ef78116d5e7fb05f839c4c74f789a152"} Apr 21 04:25:01.508060 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:01.508022 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-77kmz" event={"ID":"2e9d379b-34d6-4a3f-8e2b-addd13dce02f","Type":"ContainerStarted","Data":"60fd0b681346c63f5b6066445bf212ea995d7afc7291c1c2f94e5e45e2855135"} Apr 21 04:25:01.508453 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:01.508139 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:25:01.522125 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:01.522064 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-77kmz" podStartSLOduration=66.970958234 podStartE2EDuration="1m9.522045485s" podCreationTimestamp="2026-04-21 04:23:52 +0000 UTC" firstStartedPulling="2026-04-21 04:24:58.132528383 +0000 UTC m=+66.513631250" lastFinishedPulling="2026-04-21 04:25:00.683615504 +0000 UTC m=+69.064718501" 
observedRunningTime="2026-04-21 04:25:01.521648003 +0000 UTC m=+69.902750890" watchObservedRunningTime="2026-04-21 04:25:01.522045485 +0000 UTC m=+69.903148376" Apr 21 04:25:28.970499 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:28.970354 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls\") pod \"dns-default-rn9gk\" (UID: \"0a79ec68-2e66-4417-8b48-1d40b7272c91\") " pod="openshift-dns/dns-default-rn9gk" Apr 21 04:25:28.970499 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:28.970414 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert\") pod \"ingress-canary-rqbs9\" (UID: \"6252963e-17c6-4a33-86f2-a83c646d8f7c\") " pod="openshift-ingress-canary/ingress-canary-rqbs9" Apr 21 04:25:28.970499 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:25:28.970503 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 04:25:28.971068 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:25:28.970539 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 04:25:28.971068 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:25:28.970571 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls podName:0a79ec68-2e66-4417-8b48-1d40b7272c91 nodeName:}" failed. No retries permitted until 2026-04-21 04:26:32.970553971 +0000 UTC m=+161.351656837 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls") pod "dns-default-rn9gk" (UID: "0a79ec68-2e66-4417-8b48-1d40b7272c91") : secret "dns-default-metrics-tls" not found Apr 21 04:25:28.971068 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:25:28.970617 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert podName:6252963e-17c6-4a33-86f2-a83c646d8f7c nodeName:}" failed. No retries permitted until 2026-04-21 04:26:32.970582533 +0000 UTC m=+161.351685412 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert") pod "ingress-canary-rqbs9" (UID: "6252963e-17c6-4a33-86f2-a83c646d8f7c") : secret "canary-serving-cert" not found Apr 21 04:25:32.513173 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:32.513139 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-77kmz" Apr 21 04:25:56.410770 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.410730 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m5z67"] Apr 21 04:25:56.414506 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.414482 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m5z67" Apr 21 04:25:56.416814 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.416770 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wfrb6"] Apr 21 04:25:56.416944 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.416886 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-8ncj8\"" Apr 21 04:25:56.416944 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.416935 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 21 04:25:56.417066 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.416938 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 21 04:25:56.417066 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.417036 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 21 04:25:56.419930 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.419911 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wfrb6" Apr 21 04:25:56.420574 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.420546 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m5z67"] Apr 21 04:25:56.422305 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.422285 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 21 04:25:56.422411 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.422326 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 21 04:25:56.422512 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.422495 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-52r9p\"" Apr 21 04:25:56.422577 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.422552 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 21 04:25:56.422716 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.422697 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 21 04:25:56.429691 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.429661 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wfrb6"] Apr 21 04:25:56.521726 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.521680 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vklr2"] Apr 21 04:25:56.524878 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.524859 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6btnv"] Apr 21 04:25:56.525022 ip-10-0-139-26 kubenswrapper[2574]: I0421 
04:25:56.525005 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vklr2" Apr 21 04:25:56.527332 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.527308 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 21 04:25:56.527459 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.527336 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-xnpk7\"" Apr 21 04:25:56.527459 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.527388 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 21 04:25:56.527459 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.527337 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 21 04:25:56.527459 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.527458 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 21 04:25:56.527851 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.527832 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6btnv" Apr 21 04:25:56.529838 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.529818 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 21 04:25:56.529838 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.529837 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-snhg9\"" Apr 21 04:25:56.530031 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.529826 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 21 04:25:56.533764 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.533746 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6btnv"] Apr 21 04:25:56.534632 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.534611 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vklr2"] Apr 21 04:25:56.561151 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.561118 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fb0b924-ea2b-484a-bf95-2438903d8e1b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-m5z67\" (UID: \"2fb0b924-ea2b-484a-bf95-2438903d8e1b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m5z67" Apr 21 04:25:56.561318 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.561248 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghkp7\" (UniqueName: 
\"kubernetes.io/projected/2fb0b924-ea2b-484a-bf95-2438903d8e1b-kube-api-access-ghkp7\") pod \"cluster-samples-operator-6dc5bdb6b4-m5z67\" (UID: \"2fb0b924-ea2b-484a-bf95-2438903d8e1b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m5z67" Apr 21 04:25:56.561318 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.561286 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f8c49b5-891d-4368-a08c-06834d2dff98-serving-cert\") pod \"service-ca-operator-d6fc45fc5-wfrb6\" (UID: \"5f8c49b5-891d-4368-a08c-06834d2dff98\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wfrb6" Apr 21 04:25:56.561318 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.561308 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f8c49b5-891d-4368-a08c-06834d2dff98-config\") pod \"service-ca-operator-d6fc45fc5-wfrb6\" (UID: \"5f8c49b5-891d-4368-a08c-06834d2dff98\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wfrb6" Apr 21 04:25:56.561425 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.561331 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpph5\" (UniqueName: \"kubernetes.io/projected/5f8c49b5-891d-4368-a08c-06834d2dff98-kube-api-access-bpph5\") pod \"service-ca-operator-d6fc45fc5-wfrb6\" (UID: \"5f8c49b5-891d-4368-a08c-06834d2dff98\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wfrb6" Apr 21 04:25:56.661943 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.661839 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fb0b924-ea2b-484a-bf95-2438903d8e1b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-m5z67\" (UID: \"2fb0b924-ea2b-484a-bf95-2438903d8e1b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m5z67" Apr 21 04:25:56.661943 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.661901 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs59w\" (UniqueName: \"kubernetes.io/projected/64eb89e0-020c-47da-9bd6-7413a505a504-kube-api-access-bs59w\") pod \"volume-data-source-validator-7c6cbb6c87-6btnv\" (UID: \"64eb89e0-020c-47da-9bd6-7413a505a504\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6btnv" Apr 21 04:25:56.661943 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.661930 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8d6m\" (UniqueName: \"kubernetes.io/projected/0bb93254-dca8-4e59-9bd6-90ac24699232-kube-api-access-l8d6m\") pod \"kube-storage-version-migrator-operator-6769c5d45-vklr2\" (UID: \"0bb93254-dca8-4e59-9bd6-90ac24699232\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vklr2" Apr 21 04:25:56.662222 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.661972 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghkp7\" (UniqueName: \"kubernetes.io/projected/2fb0b924-ea2b-484a-bf95-2438903d8e1b-kube-api-access-ghkp7\") pod \"cluster-samples-operator-6dc5bdb6b4-m5z67\" (UID: \"2fb0b924-ea2b-484a-bf95-2438903d8e1b\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m5z67" Apr 21 04:25:56.662222 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:25:56.661983 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 04:25:56.662222 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:25:56.662052 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fb0b924-ea2b-484a-bf95-2438903d8e1b-samples-operator-tls podName:2fb0b924-ea2b-484a-bf95-2438903d8e1b nodeName:}" failed. No retries permitted until 2026-04-21 04:25:57.162035096 +0000 UTC m=+125.543137967 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2fb0b924-ea2b-484a-bf95-2438903d8e1b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-m5z67" (UID: "2fb0b924-ea2b-484a-bf95-2438903d8e1b") : secret "samples-operator-tls" not found Apr 21 04:25:56.662222 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.661990 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bb93254-dca8-4e59-9bd6-90ac24699232-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-vklr2\" (UID: \"0bb93254-dca8-4e59-9bd6-90ac24699232\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vklr2" Apr 21 04:25:56.662222 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.662135 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f8c49b5-891d-4368-a08c-06834d2dff98-serving-cert\") pod \"service-ca-operator-d6fc45fc5-wfrb6\" (UID: \"5f8c49b5-891d-4368-a08c-06834d2dff98\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wfrb6" Apr 21 04:25:56.662222 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.662175 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f8c49b5-891d-4368-a08c-06834d2dff98-config\") pod \"service-ca-operator-d6fc45fc5-wfrb6\" (UID: \"5f8c49b5-891d-4368-a08c-06834d2dff98\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wfrb6" Apr 21 04:25:56.662222 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.662215 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bb93254-dca8-4e59-9bd6-90ac24699232-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-vklr2\" (UID: \"0bb93254-dca8-4e59-9bd6-90ac24699232\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vklr2" Apr 21 04:25:56.662557 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.662250 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bpph5\" (UniqueName: \"kubernetes.io/projected/5f8c49b5-891d-4368-a08c-06834d2dff98-kube-api-access-bpph5\") pod \"service-ca-operator-d6fc45fc5-wfrb6\" (UID: \"5f8c49b5-891d-4368-a08c-06834d2dff98\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wfrb6" Apr 21 04:25:56.663403 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.663378 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5f8c49b5-891d-4368-a08c-06834d2dff98-config\") pod \"service-ca-operator-d6fc45fc5-wfrb6\" (UID: \"5f8c49b5-891d-4368-a08c-06834d2dff98\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wfrb6" Apr 21 04:25:56.664436 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.664420 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f8c49b5-891d-4368-a08c-06834d2dff98-serving-cert\") pod \"service-ca-operator-d6fc45fc5-wfrb6\" (UID: \"5f8c49b5-891d-4368-a08c-06834d2dff98\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wfrb6" Apr 21 04:25:56.672360 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.672335 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpph5\" (UniqueName: \"kubernetes.io/projected/5f8c49b5-891d-4368-a08c-06834d2dff98-kube-api-access-bpph5\") pod \"service-ca-operator-d6fc45fc5-wfrb6\" (UID: \"5f8c49b5-891d-4368-a08c-06834d2dff98\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wfrb6" Apr 21 04:25:56.672465 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.672401 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghkp7\" (UniqueName: \"kubernetes.io/projected/2fb0b924-ea2b-484a-bf95-2438903d8e1b-kube-api-access-ghkp7\") pod \"cluster-samples-operator-6dc5bdb6b4-m5z67\" (UID: \"2fb0b924-ea2b-484a-bf95-2438903d8e1b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m5z67" Apr 21 04:25:56.731532 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.731496 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wfrb6" Apr 21 04:25:56.763176 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.763128 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bb93254-dca8-4e59-9bd6-90ac24699232-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-vklr2\" (UID: \"0bb93254-dca8-4e59-9bd6-90ac24699232\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vklr2" Apr 21 04:25:56.763371 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.763271 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bs59w\" (UniqueName: \"kubernetes.io/projected/64eb89e0-020c-47da-9bd6-7413a505a504-kube-api-access-bs59w\") pod \"volume-data-source-validator-7c6cbb6c87-6btnv\" (UID: \"64eb89e0-020c-47da-9bd6-7413a505a504\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6btnv" Apr 21 04:25:56.763371 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.763301 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8d6m\" (UniqueName: \"kubernetes.io/projected/0bb93254-dca8-4e59-9bd6-90ac24699232-kube-api-access-l8d6m\") pod \"kube-storage-version-migrator-operator-6769c5d45-vklr2\" (UID: \"0bb93254-dca8-4e59-9bd6-90ac24699232\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vklr2" Apr 21 04:25:56.763371 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.763344 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bb93254-dca8-4e59-9bd6-90ac24699232-serving-cert\") pod 
\"kube-storage-version-migrator-operator-6769c5d45-vklr2\" (UID: \"0bb93254-dca8-4e59-9bd6-90ac24699232\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vklr2" Apr 21 04:25:56.763732 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.763712 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bb93254-dca8-4e59-9bd6-90ac24699232-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-vklr2\" (UID: \"0bb93254-dca8-4e59-9bd6-90ac24699232\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vklr2" Apr 21 04:25:56.765577 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.765550 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bb93254-dca8-4e59-9bd6-90ac24699232-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-vklr2\" (UID: \"0bb93254-dca8-4e59-9bd6-90ac24699232\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vklr2" Apr 21 04:25:56.770719 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.770693 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8d6m\" (UniqueName: \"kubernetes.io/projected/0bb93254-dca8-4e59-9bd6-90ac24699232-kube-api-access-l8d6m\") pod \"kube-storage-version-migrator-operator-6769c5d45-vklr2\" (UID: \"0bb93254-dca8-4e59-9bd6-90ac24699232\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vklr2" Apr 21 04:25:56.770856 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.770780 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs59w\" (UniqueName: \"kubernetes.io/projected/64eb89e0-020c-47da-9bd6-7413a505a504-kube-api-access-bs59w\") pod \"volume-data-source-validator-7c6cbb6c87-6btnv\" (UID: \"64eb89e0-020c-47da-9bd6-7413a505a504\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6btnv" Apr 21 04:25:56.835179 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.835141 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vklr2" Apr 21 04:25:56.840927 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.840899 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6btnv" Apr 21 04:25:56.844483 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.844459 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wfrb6"] Apr 21 04:25:56.848037 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:25:56.848008 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f8c49b5_891d_4368_a08c_06834d2dff98.slice/crio-63cafea5681c9eaa9bb911420e5afde7c4773b62870e9246c61a64241fe3e584 WatchSource:0}: Error finding container 63cafea5681c9eaa9bb911420e5afde7c4773b62870e9246c61a64241fe3e584: Status 404 returned error can't find the container with id 63cafea5681c9eaa9bb911420e5afde7c4773b62870e9246c61a64241fe3e584 Apr 21 04:25:56.962395 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.962362 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6btnv"] Apr 21 04:25:56.965677 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:25:56.965651 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64eb89e0_020c_47da_9bd6_7413a505a504.slice/crio-f3d3638d366261e6225b8e9c3a8738ef337e786aecab88747f835a61e3ae0fbd WatchSource:0}: Error finding container f3d3638d366261e6225b8e9c3a8738ef337e786aecab88747f835a61e3ae0fbd: Status 404 returned error can't find the container with id f3d3638d366261e6225b8e9c3a8738ef337e786aecab88747f835a61e3ae0fbd Apr 21 04:25:56.977503 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:56.977471 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vklr2"] Apr 21 04:25:56.981101 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:25:56.981071 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bb93254_dca8_4e59_9bd6_90ac24699232.slice/crio-d91a263fa22a5a8c245b48665fec81ae544489e657ac6ef1d8c2506bbe524fd5 WatchSource:0}: Error finding container d91a263fa22a5a8c245b48665fec81ae544489e657ac6ef1d8c2506bbe524fd5: Status 404 returned error can't find the container with id d91a263fa22a5a8c245b48665fec81ae544489e657ac6ef1d8c2506bbe524fd5 Apr 21 04:25:57.166817 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:57.166768 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fb0b924-ea2b-484a-bf95-2438903d8e1b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-m5z67\" (UID: \"2fb0b924-ea2b-484a-bf95-2438903d8e1b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m5z67" Apr 21 04:25:57.167012 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:25:57.166906 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 04:25:57.167012 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:25:57.166961 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fb0b924-ea2b-484a-bf95-2438903d8e1b-samples-operator-tls podName:2fb0b924-ea2b-484a-bf95-2438903d8e1b nodeName:}" failed. No retries permitted until 2026-04-21 04:25:58.166946288 +0000 UTC m=+126.548049153 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2fb0b924-ea2b-484a-bf95-2438903d8e1b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-m5z67" (UID: "2fb0b924-ea2b-484a-bf95-2438903d8e1b") : secret "samples-operator-tls" not found Apr 21 04:25:57.619015 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:57.618942 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6btnv" event={"ID":"64eb89e0-020c-47da-9bd6-7413a505a504","Type":"ContainerStarted","Data":"f3d3638d366261e6225b8e9c3a8738ef337e786aecab88747f835a61e3ae0fbd"} Apr 21 04:25:57.620731 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:57.620702 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wfrb6" event={"ID":"5f8c49b5-891d-4368-a08c-06834d2dff98","Type":"ContainerStarted","Data":"63cafea5681c9eaa9bb911420e5afde7c4773b62870e9246c61a64241fe3e584"} Apr 21 04:25:57.622428 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:57.622399 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vklr2" event={"ID":"0bb93254-dca8-4e59-9bd6-90ac24699232","Type":"ContainerStarted","Data":"d91a263fa22a5a8c245b48665fec81ae544489e657ac6ef1d8c2506bbe524fd5"} Apr 21 04:25:58.175939 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:58.175892 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fb0b924-ea2b-484a-bf95-2438903d8e1b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-m5z67\" (UID: \"2fb0b924-ea2b-484a-bf95-2438903d8e1b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m5z67" Apr 21 04:25:58.176168 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:25:58.176076 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 04:25:58.176168 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:25:58.176153 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fb0b924-ea2b-484a-bf95-2438903d8e1b-samples-operator-tls podName:2fb0b924-ea2b-484a-bf95-2438903d8e1b nodeName:}" failed. No retries permitted until 2026-04-21 04:26:00.176131993 +0000 UTC m=+128.557234863 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2fb0b924-ea2b-484a-bf95-2438903d8e1b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-m5z67" (UID: "2fb0b924-ea2b-484a-bf95-2438903d8e1b") : secret "samples-operator-tls" not found Apr 21 04:25:59.628080 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:59.627983 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6btnv" event={"ID":"64eb89e0-020c-47da-9bd6-7413a505a504","Type":"ContainerStarted","Data":"f282df903b0eff10442d8d2e1023aab21c705f3f71269f78788f839ef8e229c6"} Apr 21 04:25:59.629322 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:59.629290 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wfrb6" event={"ID":"5f8c49b5-891d-4368-a08c-06834d2dff98","Type":"ContainerStarted","Data":"6368c5962dbb0b5ff2d4ea7005401bdd05c357aa8260b1eb20074aa0ba9f6023"} Apr 21 04:25:59.630431 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:59.630409 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vklr2" event={"ID":"0bb93254-dca8-4e59-9bd6-90ac24699232","Type":"ContainerStarted","Data":"238b4610742ced37a868111e8a70641f03ed7a85895ec14dd70b4593b5c0bb99"} Apr 21 04:25:59.642720 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:59.642669 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-6btnv" podStartSLOduration=1.8210659040000001 podStartE2EDuration="3.642654742s" podCreationTimestamp="2026-04-21 04:25:56 +0000 UTC" firstStartedPulling="2026-04-21 04:25:56.967564406 +0000 UTC m=+125.348667274" lastFinishedPulling="2026-04-21 04:25:58.789153246 +0000 UTC m=+127.170256112" observedRunningTime="2026-04-21 04:25:59.642077672 +0000 UTC m=+128.023180561" watchObservedRunningTime="2026-04-21 04:25:59.642654742 +0000 UTC m=+128.023757650" Apr 21 04:25:59.658315 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:59.658267 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vklr2" podStartSLOduration=1.346546914 podStartE2EDuration="3.658256243s" podCreationTimestamp="2026-04-21 04:25:56 +0000 UTC" firstStartedPulling="2026-04-21 04:25:56.98273283 +0000 UTC m=+125.363835697" lastFinishedPulling="2026-04-21 04:25:59.294442153 +0000 UTC m=+127.675545026" observedRunningTime="2026-04-21 04:25:59.657696559 +0000 UTC m=+128.038799445" watchObservedRunningTime="2026-04-21 04:25:59.658256243 +0000 UTC m=+128.039359130" Apr 21 04:25:59.671169 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:25:59.671119 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wfrb6" podStartSLOduration=1.22899835 podStartE2EDuration="3.671107088s" podCreationTimestamp="2026-04-21 04:25:56 +0000 UTC" firstStartedPulling="2026-04-21 04:25:56.849987899 +0000 UTC m=+125.231090765" lastFinishedPulling="2026-04-21 04:25:59.292096636 +0000 UTC m=+127.673199503" observedRunningTime="2026-04-21 04:25:59.671066156 +0000 UTC m=+128.052169047" watchObservedRunningTime="2026-04-21 04:25:59.671107088 +0000 UTC m=+128.052209958" Apr 21 04:26:00.193828 ip-10-0-139-26 kubenswrapper[2574]: I0421 
04:26:00.193780 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fb0b924-ea2b-484a-bf95-2438903d8e1b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-m5z67\" (UID: \"2fb0b924-ea2b-484a-bf95-2438903d8e1b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m5z67" Apr 21 04:26:00.194131 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:26:00.193923 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 04:26:00.194131 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:26:00.193991 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fb0b924-ea2b-484a-bf95-2438903d8e1b-samples-operator-tls podName:2fb0b924-ea2b-484a-bf95-2438903d8e1b nodeName:}" failed. No retries permitted until 2026-04-21 04:26:04.193975116 +0000 UTC m=+132.575077982 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2fb0b924-ea2b-484a-bf95-2438903d8e1b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-m5z67" (UID: "2fb0b924-ea2b-484a-bf95-2438903d8e1b") : secret "samples-operator-tls" not found Apr 21 04:26:00.564816 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:00.564778 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-8t9hm"] Apr 21 04:26:00.568205 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:00.568183 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8t9hm" Apr 21 04:26:00.570205 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:00.570187 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-2dzwp\"" Apr 21 04:26:00.573534 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:00.573511 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-8t9hm"] Apr 21 04:26:00.698791 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:00.698751 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pt7n\" (UniqueName: \"kubernetes.io/projected/ba58cd2a-0cf9-40c9-8464-6a303aafd0c8-kube-api-access-2pt7n\") pod \"network-check-source-8894fc9bd-8t9hm\" (UID: \"ba58cd2a-0cf9-40c9-8464-6a303aafd0c8\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8t9hm" Apr 21 04:26:00.799585 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:00.799548 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2pt7n\" (UniqueName: \"kubernetes.io/projected/ba58cd2a-0cf9-40c9-8464-6a303aafd0c8-kube-api-access-2pt7n\") pod \"network-check-source-8894fc9bd-8t9hm\" (UID: \"ba58cd2a-0cf9-40c9-8464-6a303aafd0c8\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8t9hm" Apr 21 04:26:00.816652 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:00.816562 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pt7n\" (UniqueName: \"kubernetes.io/projected/ba58cd2a-0cf9-40c9-8464-6a303aafd0c8-kube-api-access-2pt7n\") pod \"network-check-source-8894fc9bd-8t9hm\" (UID: \"ba58cd2a-0cf9-40c9-8464-6a303aafd0c8\") " 
pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8t9hm" Apr 21 04:26:00.870951 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:00.870907 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-jwcj4"] Apr 21 04:26:00.874078 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:00.874052 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jwcj4" Apr 21 04:26:00.876400 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:00.876371 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 21 04:26:00.876583 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:00.876373 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-j86bq\"" Apr 21 04:26:00.877073 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:00.877054 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 21 04:26:00.877752 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:00.877733 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8t9hm" Apr 21 04:26:00.885127 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:00.885086 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-jwcj4"] Apr 21 04:26:01.001479 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:01.001430 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frmds\" (UniqueName: \"kubernetes.io/projected/1dc5b891-b82e-43cb-9d7b-5575fff9501b-kube-api-access-frmds\") pod \"migrator-74bb7799d9-jwcj4\" (UID: \"1dc5b891-b82e-43cb-9d7b-5575fff9501b\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jwcj4" Apr 21 04:26:01.025684 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:01.025651 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-8t9hm"] Apr 21 04:26:01.028921 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:26:01.028888 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba58cd2a_0cf9_40c9_8464_6a303aafd0c8.slice/crio-3196f6640ba8f70531103c6c1c867359b7d85b82fda2ddd0ab1ef29d0004e632 WatchSource:0}: Error finding container 3196f6640ba8f70531103c6c1c867359b7d85b82fda2ddd0ab1ef29d0004e632: Status 404 returned error can't find the container with id 3196f6640ba8f70531103c6c1c867359b7d85b82fda2ddd0ab1ef29d0004e632 Apr 21 04:26:01.102257 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:01.101946 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frmds\" (UniqueName: \"kubernetes.io/projected/1dc5b891-b82e-43cb-9d7b-5575fff9501b-kube-api-access-frmds\") pod \"migrator-74bb7799d9-jwcj4\" (UID: \"1dc5b891-b82e-43cb-9d7b-5575fff9501b\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jwcj4" Apr 21 04:26:01.110424 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:01.110378 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frmds\" (UniqueName: 
\"kubernetes.io/projected/1dc5b891-b82e-43cb-9d7b-5575fff9501b-kube-api-access-frmds\") pod \"migrator-74bb7799d9-jwcj4\" (UID: \"1dc5b891-b82e-43cb-9d7b-5575fff9501b\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jwcj4" Apr 21 04:26:01.187674 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:01.187642 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jwcj4" Apr 21 04:26:01.304704 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:01.304671 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-jwcj4"] Apr 21 04:26:01.308943 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:26:01.308914 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dc5b891_b82e_43cb_9d7b_5575fff9501b.slice/crio-f754f2e6cce53c8b70482a107f7bfd50063eb7a1d1203ed4be1b9849588f87de WatchSource:0}: Error finding container f754f2e6cce53c8b70482a107f7bfd50063eb7a1d1203ed4be1b9849588f87de: Status 404 returned error can't find the container with id f754f2e6cce53c8b70482a107f7bfd50063eb7a1d1203ed4be1b9849588f87de Apr 21 04:26:01.636649 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:01.636612 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jwcj4" event={"ID":"1dc5b891-b82e-43cb-9d7b-5575fff9501b","Type":"ContainerStarted","Data":"f754f2e6cce53c8b70482a107f7bfd50063eb7a1d1203ed4be1b9849588f87de"} Apr 21 04:26:01.637774 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:01.637751 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8t9hm" event={"ID":"ba58cd2a-0cf9-40c9-8464-6a303aafd0c8","Type":"ContainerStarted","Data":"75e7630a41cd81e67f6aa79e0cf1fc22200174d5aec1f647898b28a2f3be666d"} Apr 21 04:26:01.637915 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:01.637779 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8t9hm" event={"ID":"ba58cd2a-0cf9-40c9-8464-6a303aafd0c8","Type":"ContainerStarted","Data":"3196f6640ba8f70531103c6c1c867359b7d85b82fda2ddd0ab1ef29d0004e632"} Apr 21 04:26:01.650950 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:01.650833 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-8t9hm" podStartSLOduration=1.650817611 podStartE2EDuration="1.650817611s" podCreationTimestamp="2026-04-21 04:26:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:26:01.650508691 +0000 UTC m=+130.031611578" watchObservedRunningTime="2026-04-21 04:26:01.650817611 +0000 UTC m=+130.031920501" Apr 21 04:26:01.909622 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:01.907988 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs\") pod \"network-metrics-daemon-g7q5r\" (UID: \"d0b4a7b4-e6a1-4816-a96e-0792f47539d9\") " pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:26:01.909622 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:26:01.908341 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 
04:26:01.909622 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:26:01.908415 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs podName:d0b4a7b4-e6a1-4816-a96e-0792f47539d9 nodeName:}" failed. No retries permitted until 2026-04-21 04:28:03.908395071 +0000 UTC m=+252.289497952 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs") pod "network-metrics-daemon-g7q5r" (UID: "d0b4a7b4-e6a1-4816-a96e-0792f47539d9") : secret "metrics-daemon-secret" not found Apr 21 04:26:01.962996 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:01.962966 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hpx9f_50d19486-d861-4624-a572-7c1d8e897542/dns-node-resolver/0.log" Apr 21 04:26:02.642513 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:02.642479 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jwcj4" event={"ID":"1dc5b891-b82e-43cb-9d7b-5575fff9501b","Type":"ContainerStarted","Data":"80ed8d1c13915526deb7abd233502d948a024b01edb5c2229a3fc25a3ff93bde"} Apr 21 04:26:02.762186 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:02.762160 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rnwcg_493c7d5b-0f42-40a4-ab37-19d6681834e3/node-ca/0.log" Apr 21 04:26:03.647846 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:03.647799 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jwcj4" event={"ID":"1dc5b891-b82e-43cb-9d7b-5575fff9501b","Type":"ContainerStarted","Data":"af2c6e18ce714d93d2ecc4b182eef44fc7421b53e9380323c0c5b6aba632b3e1"} Apr 21 04:26:03.661522 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:03.661468 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-jwcj4" podStartSLOduration=2.424052139 podStartE2EDuration="3.661453729s" podCreationTimestamp="2026-04-21 04:26:00 +0000 UTC" firstStartedPulling="2026-04-21 04:26:01.31075423 +0000 UTC m=+129.691857108" lastFinishedPulling="2026-04-21 04:26:02.548155819 +0000 UTC m=+130.929258698" observedRunningTime="2026-04-21 04:26:03.660994696 +0000 UTC m=+132.042097584" watchObservedRunningTime="2026-04-21 04:26:03.661453729 +0000 UTC m=+132.042556616" Apr 21 04:26:04.224963 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:04.224917 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fb0b924-ea2b-484a-bf95-2438903d8e1b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-m5z67\" (UID: \"2fb0b924-ea2b-484a-bf95-2438903d8e1b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m5z67" Apr 21 04:26:04.225184 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:26:04.225072 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 04:26:04.225184 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:26:04.225146 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2fb0b924-ea2b-484a-bf95-2438903d8e1b-samples-operator-tls podName:2fb0b924-ea2b-484a-bf95-2438903d8e1b nodeName:}" failed. 
No retries permitted until 2026-04-21 04:26:12.225125897 +0000 UTC m=+140.606228766 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2fb0b924-ea2b-484a-bf95-2438903d8e1b-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-m5z67" (UID: "2fb0b924-ea2b-484a-bf95-2438903d8e1b") : secret "samples-operator-tls" not found Apr 21 04:26:12.281499 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:12.281455 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fb0b924-ea2b-484a-bf95-2438903d8e1b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-m5z67\" (UID: \"2fb0b924-ea2b-484a-bf95-2438903d8e1b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m5z67" Apr 21 04:26:12.283850 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:12.283818 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2fb0b924-ea2b-484a-bf95-2438903d8e1b-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-m5z67\" (UID: \"2fb0b924-ea2b-484a-bf95-2438903d8e1b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m5z67" Apr 21 04:26:12.324754 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:12.324700 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m5z67" Apr 21 04:26:12.464730 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:12.464698 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m5z67"] Apr 21 04:26:12.669653 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:12.669556 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m5z67" event={"ID":"2fb0b924-ea2b-484a-bf95-2438903d8e1b","Type":"ContainerStarted","Data":"0a84523059076a9f40de09d05d84fce71ddda45206cf85ef2218efd985300672"} Apr 21 04:26:14.675805 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:14.675769 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m5z67" event={"ID":"2fb0b924-ea2b-484a-bf95-2438903d8e1b","Type":"ContainerStarted","Data":"ebfe3e34fbb920875988b8323fe5921f2fa8760db6250bff1ecfb1f9db4d4891"} Apr 21 04:26:14.675805 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:14.675807 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m5z67" event={"ID":"2fb0b924-ea2b-484a-bf95-2438903d8e1b","Type":"ContainerStarted","Data":"aac58823ba6d76852b02078821ca6a8f58fe463a1519a400cdf198072a10daf2"} Apr 21 04:26:14.691330 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:14.691277 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-m5z67" podStartSLOduration=17.011176082 podStartE2EDuration="18.691261775s" podCreationTimestamp="2026-04-21 04:25:56 +0000 UTC" firstStartedPulling="2026-04-21 04:26:12.505543696 +0000 UTC m=+140.886646563" lastFinishedPulling="2026-04-21 04:26:14.18562939 +0000 UTC m=+142.566732256" observedRunningTime="2026-04-21 04:26:14.690924587 +0000 UTC m=+143.072027478" watchObservedRunningTime="2026-04-21 
04:26:14.691261775 +0000 UTC m=+143.072364663" Apr 21 04:26:23.442010 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:23.441973 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-425bd"] Apr 21 04:26:23.478245 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:23.478210 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-425bd"] Apr 21 04:26:23.478406 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:23.478359 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-425bd" Apr 21 04:26:23.481298 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:23.481276 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 04:26:23.481298 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:23.481290 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 04:26:23.482079 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:23.482059 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qm4pr\"" Apr 21 04:26:23.482079 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:23.482077 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 04:26:23.482290 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:23.482096 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 04:26:23.565643 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:23.565586 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0abaf46b-78e1-4455-b661-112f2e50f6af-data-volume\") pod \"insights-runtime-extractor-425bd\" (UID: \"0abaf46b-78e1-4455-b661-112f2e50f6af\") " pod="openshift-insights/insights-runtime-extractor-425bd" Apr 21 04:26:23.565817 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:23.565650 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfw5b\" (UniqueName: \"kubernetes.io/projected/0abaf46b-78e1-4455-b661-112f2e50f6af-kube-api-access-gfw5b\") pod \"insights-runtime-extractor-425bd\" (UID: \"0abaf46b-78e1-4455-b661-112f2e50f6af\") " pod="openshift-insights/insights-runtime-extractor-425bd" Apr 21 04:26:23.565817 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:23.565742 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0abaf46b-78e1-4455-b661-112f2e50f6af-crio-socket\") pod \"insights-runtime-extractor-425bd\" (UID: \"0abaf46b-78e1-4455-b661-112f2e50f6af\") " pod="openshift-insights/insights-runtime-extractor-425bd" Apr 21 04:26:23.565817 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:23.565771 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0abaf46b-78e1-4455-b661-112f2e50f6af-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-425bd\" (UID: \"0abaf46b-78e1-4455-b661-112f2e50f6af\") " pod="openshift-insights/insights-runtime-extractor-425bd" Apr 21 04:26:23.565817 ip-10-0-139-26 
kubenswrapper[2574]: I0421 04:26:23.565791 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0abaf46b-78e1-4455-b661-112f2e50f6af-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-425bd\" (UID: \"0abaf46b-78e1-4455-b661-112f2e50f6af\") " pod="openshift-insights/insights-runtime-extractor-425bd" Apr 21 04:26:23.666666 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:23.666617 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0abaf46b-78e1-4455-b661-112f2e50f6af-crio-socket\") pod \"insights-runtime-extractor-425bd\" (UID: \"0abaf46b-78e1-4455-b661-112f2e50f6af\") " pod="openshift-insights/insights-runtime-extractor-425bd" Apr 21 04:26:23.666666 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:23.666660 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0abaf46b-78e1-4455-b661-112f2e50f6af-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-425bd\" (UID: \"0abaf46b-78e1-4455-b661-112f2e50f6af\") " pod="openshift-insights/insights-runtime-extractor-425bd" Apr 21 04:26:23.666934 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:23.666684 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0abaf46b-78e1-4455-b661-112f2e50f6af-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-425bd\" (UID: \"0abaf46b-78e1-4455-b661-112f2e50f6af\") " pod="openshift-insights/insights-runtime-extractor-425bd" Apr 21 04:26:23.666934 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:23.666773 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/0abaf46b-78e1-4455-b661-112f2e50f6af-crio-socket\") pod \"insights-runtime-extractor-425bd\" (UID: \"0abaf46b-78e1-4455-b661-112f2e50f6af\") " pod="openshift-insights/insights-runtime-extractor-425bd" Apr 21 04:26:23.666934 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:23.666798 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0abaf46b-78e1-4455-b661-112f2e50f6af-data-volume\") pod \"insights-runtime-extractor-425bd\" (UID: \"0abaf46b-78e1-4455-b661-112f2e50f6af\") " pod="openshift-insights/insights-runtime-extractor-425bd" Apr 21 04:26:23.666934 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:23.666844 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gfw5b\" (UniqueName: \"kubernetes.io/projected/0abaf46b-78e1-4455-b661-112f2e50f6af-kube-api-access-gfw5b\") pod \"insights-runtime-extractor-425bd\" (UID: \"0abaf46b-78e1-4455-b661-112f2e50f6af\") " pod="openshift-insights/insights-runtime-extractor-425bd" Apr 21 04:26:23.667119 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:23.667096 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/0abaf46b-78e1-4455-b661-112f2e50f6af-data-volume\") pod \"insights-runtime-extractor-425bd\" (UID: \"0abaf46b-78e1-4455-b661-112f2e50f6af\") " pod="openshift-insights/insights-runtime-extractor-425bd" Apr 21 04:26:23.667361 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:23.667323 2574 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/0abaf46b-78e1-4455-b661-112f2e50f6af-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-425bd\" (UID: \"0abaf46b-78e1-4455-b661-112f2e50f6af\") " pod="openshift-insights/insights-runtime-extractor-425bd" Apr 21 04:26:23.669197 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:23.669174 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/0abaf46b-78e1-4455-b661-112f2e50f6af-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-425bd\" (UID: \"0abaf46b-78e1-4455-b661-112f2e50f6af\") " pod="openshift-insights/insights-runtime-extractor-425bd" Apr 21 04:26:23.677173 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:23.677147 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfw5b\" (UniqueName: \"kubernetes.io/projected/0abaf46b-78e1-4455-b661-112f2e50f6af-kube-api-access-gfw5b\") pod \"insights-runtime-extractor-425bd\" (UID: \"0abaf46b-78e1-4455-b661-112f2e50f6af\") " pod="openshift-insights/insights-runtime-extractor-425bd" Apr 21 04:26:23.787114 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:23.787079 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-425bd" Apr 21 04:26:23.908989 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:23.908955 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-425bd"] Apr 21 04:26:23.912023 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:26:23.911991 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0abaf46b_78e1_4455_b661_112f2e50f6af.slice/crio-3184be5356768a48f57f34cfeeba38e98c4c4b6ed5dc215146f439762bbfe3a5 WatchSource:0}: Error finding container 3184be5356768a48f57f34cfeeba38e98c4c4b6ed5dc215146f439762bbfe3a5: Status 404 returned error can't find the container with id 3184be5356768a48f57f34cfeeba38e98c4c4b6ed5dc215146f439762bbfe3a5 Apr 21 04:26:24.702169 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:24.702137 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-425bd" event={"ID":"0abaf46b-78e1-4455-b661-112f2e50f6af","Type":"ContainerStarted","Data":"345baa4f89056385f7afb26b06383a3607dfa1bef71119df8173cbf3368bb419"} Apr 21 04:26:24.702610 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:24.702177 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-425bd" event={"ID":"0abaf46b-78e1-4455-b661-112f2e50f6af","Type":"ContainerStarted","Data":"3184be5356768a48f57f34cfeeba38e98c4c4b6ed5dc215146f439762bbfe3a5"} Apr 21 04:26:25.707167 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:25.707131 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-425bd" event={"ID":"0abaf46b-78e1-4455-b661-112f2e50f6af","Type":"ContainerStarted","Data":"3abdfa67a4169f30d23f12143a56f1faf0d8779ac6b53fcd048954b03383c6ab"} Apr 21 04:26:26.711355 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:26.711317 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-425bd" event={"ID":"0abaf46b-78e1-4455-b661-112f2e50f6af","Type":"ContainerStarted","Data":"e1610aa36a0cbcf8965491128103a21281b62c3fb74edb96ad401a52338eb2c3"} Apr 21 04:26:26.728233 ip-10-0-139-26 
kubenswrapper[2574]: I0421 04:26:26.728131 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-425bd" podStartSLOduration=1.312712109 podStartE2EDuration="3.728117648s" podCreationTimestamp="2026-04-21 04:26:23 +0000 UTC" firstStartedPulling="2026-04-21 04:26:24.051771499 +0000 UTC m=+152.432874365" lastFinishedPulling="2026-04-21 04:26:26.467177037 +0000 UTC m=+154.848279904" observedRunningTime="2026-04-21 04:26:26.727130141 +0000 UTC m=+155.108233071" watchObservedRunningTime="2026-04-21 04:26:26.728117648 +0000 UTC m=+155.109220536" Apr 21 04:26:28.034133 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:26:28.034085 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-rn9gk" podUID="0a79ec68-2e66-4417-8b48-1d40b7272c91" Apr 21 04:26:28.045295 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:26:28.045259 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-rqbs9" podUID="6252963e-17c6-4a33-86f2-a83c646d8f7c" Apr 21 04:26:28.716029 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:28.715998 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rn9gk" Apr 21 04:26:29.188771 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:26:29.188674 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-g7q5r" podUID="d0b4a7b4-e6a1-4816-a96e-0792f47539d9" Apr 21 04:26:30.857891 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:30.857858 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ncwvv"] Apr 21 04:26:30.861054 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:30.861030 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ncwvv" Apr 21 04:26:30.863376 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:30.863347 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 21 04:26:30.863496 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:30.863418 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-6m4s5\"" Apr 21 04:26:30.868064 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:30.867937 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ncwvv"] Apr 21 04:26:30.922350 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:30.922311 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/63181619-2d3e-4acb-842b-6eb60832feee-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-ncwvv\" (UID: \"63181619-2d3e-4acb-842b-6eb60832feee\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ncwvv" Apr 21 04:26:31.022914 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:31.022880 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/63181619-2d3e-4acb-842b-6eb60832feee-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-ncwvv\" (UID: \"63181619-2d3e-4acb-842b-6eb60832feee\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ncwvv" Apr 21 04:26:31.023078 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:26:31.023037 2574 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 21 04:26:31.023138 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:26:31.023101 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63181619-2d3e-4acb-842b-6eb60832feee-tls-certificates podName:63181619-2d3e-4acb-842b-6eb60832feee nodeName:}" failed. No retries permitted until 2026-04-21 04:26:31.523084081 +0000 UTC m=+159.904186950 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/63181619-2d3e-4acb-842b-6eb60832feee-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-ncwvv" (UID: "63181619-2d3e-4acb-842b-6eb60832feee") : secret "prometheus-operator-admission-webhook-tls" not found Apr 21 04:26:31.525794 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:31.525756 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/63181619-2d3e-4acb-842b-6eb60832feee-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-ncwvv\" (UID: \"63181619-2d3e-4acb-842b-6eb60832feee\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ncwvv" Apr 21 04:26:31.528222 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:31.528193 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/63181619-2d3e-4acb-842b-6eb60832feee-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-ncwvv\" (UID: \"63181619-2d3e-4acb-842b-6eb60832feee\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ncwvv" Apr 21 04:26:31.770384 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:31.770333 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ncwvv" Apr 21 04:26:31.891994 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:31.891961 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ncwvv"] Apr 21 04:26:31.894959 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:26:31.894927 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63181619_2d3e_4acb_842b_6eb60832feee.slice/crio-d665ad119a881b31761e79a071eae7bf5f2121493aa761dc6dc691896df0b4f6 WatchSource:0}: Error finding container d665ad119a881b31761e79a071eae7bf5f2121493aa761dc6dc691896df0b4f6: Status 404 returned error can't find the container with id d665ad119a881b31761e79a071eae7bf5f2121493aa761dc6dc691896df0b4f6 Apr 21 04:26:32.727788 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:32.727700 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ncwvv" event={"ID":"63181619-2d3e-4acb-842b-6eb60832feee","Type":"ContainerStarted","Data":"d665ad119a881b31761e79a071eae7bf5f2121493aa761dc6dc691896df0b4f6"} Apr 21 04:26:33.039772 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:33.039739 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls\") pod \"dns-default-rn9gk\" (UID: \"0a79ec68-2e66-4417-8b48-1d40b7272c91\") " pod="openshift-dns/dns-default-rn9gk" Apr 21 04:26:33.040161 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:33.039788 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert\") pod \"ingress-canary-rqbs9\" (UID: \"6252963e-17c6-4a33-86f2-a83c646d8f7c\") " pod="openshift-ingress-canary/ingress-canary-rqbs9" Apr 21 04:26:33.042079 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:33.042051 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/6252963e-17c6-4a33-86f2-a83c646d8f7c-cert\") pod \"ingress-canary-rqbs9\" (UID: \"6252963e-17c6-4a33-86f2-a83c646d8f7c\") " pod="openshift-ingress-canary/ingress-canary-rqbs9" Apr 21 04:26:33.042411 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:33.042392 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a79ec68-2e66-4417-8b48-1d40b7272c91-metrics-tls\") pod \"dns-default-rn9gk\" (UID: \"0a79ec68-2e66-4417-8b48-1d40b7272c91\") " pod="openshift-dns/dns-default-rn9gk" Apr 21 04:26:33.219666 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:33.219630 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vk74d\"" Apr 21 04:26:33.228368 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:33.228331 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rn9gk" Apr 21 04:26:33.346199 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:33.346175 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rn9gk"] Apr 21 04:26:33.348404 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:26:33.348374 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a79ec68_2e66_4417_8b48_1d40b7272c91.slice/crio-8cbba515873a0f5d8503ca6db56c63f611aaf8b8e3c82dafe8fbe64008bdcc01 WatchSource:0}: Error finding container 8cbba515873a0f5d8503ca6db56c63f611aaf8b8e3c82dafe8fbe64008bdcc01: Status 404 returned error can't find the container with id 8cbba515873a0f5d8503ca6db56c63f611aaf8b8e3c82dafe8fbe64008bdcc01 Apr 21 04:26:33.731292 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:33.731258 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ncwvv" event={"ID":"63181619-2d3e-4acb-842b-6eb60832feee","Type":"ContainerStarted","Data":"064358d7b66f21521669b81f9d24b081e6eaaea3165fa33887fe38596965499b"} Apr 21 04:26:33.731480 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:33.731453 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ncwvv" Apr 21 04:26:33.732337 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:33.732312 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rn9gk" event={"ID":"0a79ec68-2e66-4417-8b48-1d40b7272c91","Type":"ContainerStarted","Data":"8cbba515873a0f5d8503ca6db56c63f611aaf8b8e3c82dafe8fbe64008bdcc01"} Apr 21 04:26:33.736263 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:33.736243 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ncwvv" Apr 21 04:26:33.746604 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:33.746541 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-ncwvv" podStartSLOduration=2.668218614 podStartE2EDuration="3.746527188s" podCreationTimestamp="2026-04-21 04:26:30 +0000 UTC" firstStartedPulling="2026-04-21 04:26:31.896953779 +0000 UTC m=+160.278056646" lastFinishedPulling="2026-04-21 04:26:32.975262351 +0000 UTC m=+161.356365220" observedRunningTime="2026-04-21 04:26:33.74585671 +0000 UTC m=+162.126959601" watchObservedRunningTime="2026-04-21 04:26:33.746527188 
+0000 UTC m=+162.127630077" Apr 21 04:26:33.917771 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:33.917731 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-2frq4"] Apr 21 04:26:33.920817 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:33.920792 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-2frq4" Apr 21 04:26:33.923041 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:33.923011 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 21 04:26:33.923041 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:33.923036 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 04:26:33.923223 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:33.923209 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 21 04:26:33.923279 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:33.923214 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-zqhwj\"" Apr 21 04:26:33.923389 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:33.923373 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 04:26:33.923584 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:33.923564 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 04:26:33.927958 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:33.927882 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-2frq4"] Apr 21 04:26:34.048282 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:34.048201 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqz68\" (UniqueName: \"kubernetes.io/projected/bc41b2e4-073e-4fce-a850-7e5fa29c9da4-kube-api-access-fqz68\") pod \"prometheus-operator-5676c8c784-2frq4\" (UID: \"bc41b2e4-073e-4fce-a850-7e5fa29c9da4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2frq4" Apr 21 04:26:34.048282 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:34.048253 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bc41b2e4-073e-4fce-a850-7e5fa29c9da4-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-2frq4\" (UID: \"bc41b2e4-073e-4fce-a850-7e5fa29c9da4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2frq4" Apr 21 04:26:34.048734 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:34.048286 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc41b2e4-073e-4fce-a850-7e5fa29c9da4-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-2frq4\" (UID: \"bc41b2e4-073e-4fce-a850-7e5fa29c9da4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2frq4" Apr 21 04:26:34.048734 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:34.048359 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc41b2e4-073e-4fce-a850-7e5fa29c9da4-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-2frq4\" (UID: \"bc41b2e4-073e-4fce-a850-7e5fa29c9da4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2frq4" Apr 21 04:26:34.149542 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:34.149503 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc41b2e4-073e-4fce-a850-7e5fa29c9da4-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-2frq4\" (UID: \"bc41b2e4-073e-4fce-a850-7e5fa29c9da4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2frq4" Apr 21 04:26:34.149738 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:34.149584 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fqz68\" (UniqueName: \"kubernetes.io/projected/bc41b2e4-073e-4fce-a850-7e5fa29c9da4-kube-api-access-fqz68\") pod \"prometheus-operator-5676c8c784-2frq4\" (UID: \"bc41b2e4-073e-4fce-a850-7e5fa29c9da4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2frq4" Apr 21 04:26:34.149738 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:34.149642 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bc41b2e4-073e-4fce-a850-7e5fa29c9da4-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-2frq4\" (UID: \"bc41b2e4-073e-4fce-a850-7e5fa29c9da4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2frq4" Apr 21 04:26:34.149738 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:34.149670 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc41b2e4-073e-4fce-a850-7e5fa29c9da4-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-2frq4\" (UID: \"bc41b2e4-073e-4fce-a850-7e5fa29c9da4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2frq4" Apr 21 04:26:34.150522 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:34.150493 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc41b2e4-073e-4fce-a850-7e5fa29c9da4-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-2frq4\" (UID: \"bc41b2e4-073e-4fce-a850-7e5fa29c9da4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2frq4" Apr 21 04:26:34.152488 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:34.152451 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bc41b2e4-073e-4fce-a850-7e5fa29c9da4-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-2frq4\" (UID: \"bc41b2e4-073e-4fce-a850-7e5fa29c9da4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2frq4" Apr 21 04:26:34.152641 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:34.152490 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc41b2e4-073e-4fce-a850-7e5fa29c9da4-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-2frq4\" (UID: \"bc41b2e4-073e-4fce-a850-7e5fa29c9da4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2frq4" Apr 21 04:26:34.158244 ip-10-0-139-26 kubenswrapper[2574]: I0421 
04:26:34.158203 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqz68\" (UniqueName: \"kubernetes.io/projected/bc41b2e4-073e-4fce-a850-7e5fa29c9da4-kube-api-access-fqz68\") pod \"prometheus-operator-5676c8c784-2frq4\" (UID: \"bc41b2e4-073e-4fce-a850-7e5fa29c9da4\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2frq4" Apr 21 04:26:34.230242 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:34.230132 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-2frq4" Apr 21 04:26:34.360320 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:34.360104 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-2frq4"] Apr 21 04:26:34.617329 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:26:34.617260 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc41b2e4_073e_4fce_a850_7e5fa29c9da4.slice/crio-11b7d621afe09ecd3c47ad517963beaef3eb616cf7a8f8df4b568e398e5e27af WatchSource:0}: Error finding container 11b7d621afe09ecd3c47ad517963beaef3eb616cf7a8f8df4b568e398e5e27af: Status 404 returned error can't find the container with id 11b7d621afe09ecd3c47ad517963beaef3eb616cf7a8f8df4b568e398e5e27af Apr 21 04:26:34.736230 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:34.736203 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-2frq4" event={"ID":"bc41b2e4-073e-4fce-a850-7e5fa29c9da4","Type":"ContainerStarted","Data":"11b7d621afe09ecd3c47ad517963beaef3eb616cf7a8f8df4b568e398e5e27af"} Apr 21 04:26:35.742048 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:35.742006 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rn9gk" event={"ID":"0a79ec68-2e66-4417-8b48-1d40b7272c91","Type":"ContainerStarted","Data":"f6d157fd2d80f9e7fd3bb3f025d399407837ad22e8643cae105db936656bbefe"} Apr 21 04:26:35.742048 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:35.742056 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rn9gk" event={"ID":"0a79ec68-2e66-4417-8b48-1d40b7272c91","Type":"ContainerStarted","Data":"c84a66702758fbc540dbb6efc52d59689fa89e8440b6c2dda590c5b5711e8814"} Apr 21 04:26:35.742623 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:35.742171 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-rn9gk" Apr 21 04:26:35.758514 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:35.758443 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rn9gk" podStartSLOduration=130.444952905 podStartE2EDuration="2m11.758422374s" podCreationTimestamp="2026-04-21 04:24:24 +0000 UTC" firstStartedPulling="2026-04-21 04:26:33.350276196 +0000 UTC m=+161.731379062" lastFinishedPulling="2026-04-21 04:26:34.663745665 +0000 UTC m=+163.044848531" observedRunningTime="2026-04-21 04:26:35.757574343 +0000 UTC m=+164.138677234" watchObservedRunningTime="2026-04-21 04:26:35.758422374 +0000 UTC m=+164.139525267" Apr 21 04:26:36.746174 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:36.746137 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-2frq4" event={"ID":"bc41b2e4-073e-4fce-a850-7e5fa29c9da4","Type":"ContainerStarted","Data":"516633416bac46394093ca6834b6402fd04046202192c82e50128de3a748a44f"} Apr 21 
04:26:36.746174 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:36.746184 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-2frq4" event={"ID":"bc41b2e4-073e-4fce-a850-7e5fa29c9da4","Type":"ContainerStarted","Data":"3e2f0d1753e25996020791e912cc37951786ac5aa645fdeec8c2cc66ea9eaa63"} Apr 21 04:26:36.764853 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:36.764799 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-2frq4" podStartSLOduration=2.4961200359999998 podStartE2EDuration="3.764782976s" podCreationTimestamp="2026-04-21 04:26:33 +0000 UTC" firstStartedPulling="2026-04-21 04:26:34.619038736 +0000 UTC m=+163.000141603" lastFinishedPulling="2026-04-21 04:26:35.887701674 +0000 UTC m=+164.268804543" observedRunningTime="2026-04-21 04:26:36.76386647 +0000 UTC m=+165.144969360" watchObservedRunningTime="2026-04-21 04:26:36.764782976 +0000 UTC m=+165.145885864" Apr 21 04:26:38.295458 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.295417 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-m5jgk"] Apr 21 04:26:38.299153 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.299124 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-6xx4x"] Apr 21 04:26:38.299334 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.299312 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-m5jgk" Apr 21 04:26:38.302101 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.302080 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.302310 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.302267 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 21 04:26:38.302310 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.302273 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 21 04:26:38.302484 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.302358 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-7xfr6\"" Apr 21 04:26:38.302572 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.302556 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 21 04:26:38.304166 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.304144 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 04:26:38.304166 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.304154 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-bm2fq\"" Apr 21 04:26:38.304315 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.304192 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 04:26:38.304315 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.304221 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 04:26:38.308292 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.308271 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-m5jgk"] Apr 21 04:26:38.384348 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.384307 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dc428b9d-35a8-45e5-b688-c59395e673af-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-m5jgk\" (UID: \"dc428b9d-35a8-45e5-b688-c59395e673af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m5jgk" Apr 21 04:26:38.384535 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.384359 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b807a52f-f037-480f-8383-561de6752c10-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.384535 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.384412 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b807a52f-f037-480f-8383-561de6752c10-node-exporter-textfile\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.384535 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.384436 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b807a52f-f037-480f-8383-561de6752c10-node-exporter-wtmp\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.384535 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.384456 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b807a52f-f037-480f-8383-561de6752c10-metrics-client-ca\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.384535 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.384494 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/dc428b9d-35a8-45e5-b688-c59395e673af-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-m5jgk\" (UID: \"dc428b9d-35a8-45e5-b688-c59395e673af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m5jgk" Apr 21 04:26:38.384782 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.384544 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/dc428b9d-35a8-45e5-b688-c59395e673af-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-m5jgk\" (UID: \"dc428b9d-35a8-45e5-b688-c59395e673af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m5jgk" Apr 21 04:26:38.384782 
ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.384606 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b807a52f-f037-480f-8383-561de6752c10-sys\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.384782 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.384641 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/dc428b9d-35a8-45e5-b688-c59395e673af-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-m5jgk\" (UID: \"dc428b9d-35a8-45e5-b688-c59395e673af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m5jgk" Apr 21 04:26:38.384782 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.384669 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b807a52f-f037-480f-8383-561de6752c10-node-exporter-accelerators-collector-config\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.384782 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.384710 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dc428b9d-35a8-45e5-b688-c59395e673af-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-m5jgk\" (UID: \"dc428b9d-35a8-45e5-b688-c59395e673af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m5jgk" Apr 21 04:26:38.384782 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.384726 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b807a52f-f037-480f-8383-561de6752c10-node-exporter-tls\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.384782 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.384745 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svlp5\" (UniqueName: \"kubernetes.io/projected/dc428b9d-35a8-45e5-b688-c59395e673af-kube-api-access-svlp5\") pod \"kube-state-metrics-69db897b98-m5jgk\" (UID: \"dc428b9d-35a8-45e5-b688-c59395e673af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m5jgk" Apr 21 04:26:38.384782 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.384775 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b807a52f-f037-480f-8383-561de6752c10-root\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.385013 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.384792 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nnv7\" (UniqueName: \"kubernetes.io/projected/b807a52f-f037-480f-8383-561de6752c10-kube-api-access-9nnv7\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.485386 ip-10-0-139-26 kubenswrapper[2574]: 
I0421 04:26:38.485346 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/dc428b9d-35a8-45e5-b688-c59395e673af-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-m5jgk\" (UID: \"dc428b9d-35a8-45e5-b688-c59395e673af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m5jgk" Apr 21 04:26:38.485386 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.485390 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/dc428b9d-35a8-45e5-b688-c59395e673af-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-m5jgk\" (UID: \"dc428b9d-35a8-45e5-b688-c59395e673af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m5jgk" Apr 21 04:26:38.485656 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.485413 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b807a52f-f037-480f-8383-561de6752c10-sys\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.485656 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.485433 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/dc428b9d-35a8-45e5-b688-c59395e673af-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-m5jgk\" (UID: \"dc428b9d-35a8-45e5-b688-c59395e673af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m5jgk" Apr 21 04:26:38.485656 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.485496 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b807a52f-f037-480f-8383-561de6752c10-sys\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.485656 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.485564 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b807a52f-f037-480f-8383-561de6752c10-node-exporter-accelerators-collector-config\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.485656 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.485636 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dc428b9d-35a8-45e5-b688-c59395e673af-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-m5jgk\" (UID: \"dc428b9d-35a8-45e5-b688-c59395e673af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m5jgk" Apr 21 04:26:38.485887 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.485666 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b807a52f-f037-480f-8383-561de6752c10-node-exporter-tls\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.485887 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.485694 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-svlp5\" (UniqueName: \"kubernetes.io/projected/dc428b9d-35a8-45e5-b688-c59395e673af-kube-api-access-svlp5\") pod \"kube-state-metrics-69db897b98-m5jgk\" (UID: \"dc428b9d-35a8-45e5-b688-c59395e673af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m5jgk" Apr 21 04:26:38.485887 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.485741 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b807a52f-f037-480f-8383-561de6752c10-root\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.485887 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.485770 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9nnv7\" (UniqueName: \"kubernetes.io/projected/b807a52f-f037-480f-8383-561de6752c10-kube-api-access-9nnv7\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.485887 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.485808 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dc428b9d-35a8-45e5-b688-c59395e673af-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-m5jgk\" (UID: \"dc428b9d-35a8-45e5-b688-c59395e673af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m5jgk" Apr 21 04:26:38.485887 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:26:38.485838 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 21 04:26:38.485887 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.485875 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/dc428b9d-35a8-45e5-b688-c59395e673af-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-m5jgk\" (UID: \"dc428b9d-35a8-45e5-b688-c59395e673af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m5jgk" Apr 21 04:26:38.486234 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:26:38.485923 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b807a52f-f037-480f-8383-561de6752c10-node-exporter-tls podName:b807a52f-f037-480f-8383-561de6752c10 nodeName:}" failed. No retries permitted until 2026-04-21 04:26:38.985899311 +0000 UTC m=+167.367002183 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/b807a52f-f037-480f-8383-561de6752c10-node-exporter-tls") pod "node-exporter-6xx4x" (UID: "b807a52f-f037-480f-8383-561de6752c10") : secret "node-exporter-tls" not found Apr 21 04:26:38.486234 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.486184 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b807a52f-f037-480f-8383-561de6752c10-node-exporter-accelerators-collector-config\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.486353 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.486238 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/dc428b9d-35a8-45e5-b688-c59395e673af-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-m5jgk\" (UID: \"dc428b9d-35a8-45e5-b688-c59395e673af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m5jgk" Apr 21 04:26:38.486410 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.486393 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dc428b9d-35a8-45e5-b688-c59395e673af-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-m5jgk\" (UID: \"dc428b9d-35a8-45e5-b688-c59395e673af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m5jgk" Apr 21 04:26:38.486637 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.486618 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b807a52f-f037-480f-8383-561de6752c10-root\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.486761 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.486741 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b807a52f-f037-480f-8383-561de6752c10-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.486855 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.486834 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b807a52f-f037-480f-8383-561de6752c10-node-exporter-textfile\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.486926 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.486884 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b807a52f-f037-480f-8383-561de6752c10-node-exporter-wtmp\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.486926 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.486908 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b807a52f-f037-480f-8383-561de6752c10-metrics-client-ca\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.487102 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.487076 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b807a52f-f037-480f-8383-561de6752c10-node-exporter-wtmp\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.487180 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.487088 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b807a52f-f037-480f-8383-561de6752c10-node-exporter-textfile\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.487441 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.487421 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b807a52f-f037-480f-8383-561de6752c10-metrics-client-ca\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.488150 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.488125 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/dc428b9d-35a8-45e5-b688-c59395e673af-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-m5jgk\" (UID: \"dc428b9d-35a8-45e5-b688-c59395e673af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m5jgk" Apr 21 04:26:38.488980 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.488959 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b807a52f-f037-480f-8383-561de6752c10-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.489083 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.489058 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dc428b9d-35a8-45e5-b688-c59395e673af-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-m5jgk\" (UID: \"dc428b9d-35a8-45e5-b688-c59395e673af\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-m5jgk" Apr 21 04:26:38.499482 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.499453 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nnv7\" (UniqueName: \"kubernetes.io/projected/b807a52f-f037-480f-8383-561de6752c10-kube-api-access-9nnv7\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.499729 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.499712 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svlp5\" (UniqueName: \"kubernetes.io/projected/dc428b9d-35a8-45e5-b688-c59395e673af-kube-api-access-svlp5\") pod \"kube-state-metrics-69db897b98-m5jgk\" (UID: \"dc428b9d-35a8-45e5-b688-c59395e673af\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-m5jgk" Apr 21 04:26:38.610296 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.610207 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-m5jgk" Apr 21 04:26:38.728587 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.728548 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-m5jgk"] Apr 21 04:26:38.731945 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:26:38.731911 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc428b9d_35a8_45e5_b688_c59395e673af.slice/crio-e29001edb1cc2d4acda97b918fb21bc2812fc20e88736d60223e030cae506dd3 WatchSource:0}: Error finding container e29001edb1cc2d4acda97b918fb21bc2812fc20e88736d60223e030cae506dd3: Status 404 returned error can't find the container with id e29001edb1cc2d4acda97b918fb21bc2812fc20e88736d60223e030cae506dd3 Apr 21 04:26:38.752850 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.752820 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-m5jgk" event={"ID":"dc428b9d-35a8-45e5-b688-c59395e673af","Type":"ContainerStarted","Data":"e29001edb1cc2d4acda97b918fb21bc2812fc20e88736d60223e030cae506dd3"} Apr 21 04:26:38.991943 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.991911 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b807a52f-f037-480f-8383-561de6752c10-node-exporter-tls\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:38.994620 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:38.994582 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b807a52f-f037-480f-8383-561de6752c10-node-exporter-tls\") pod \"node-exporter-6xx4x\" (UID: \"b807a52f-f037-480f-8383-561de6752c10\") " pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:39.215982 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:39.215952 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-6xx4x" Apr 21 04:26:39.224449 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:26:39.224416 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb807a52f_f037_480f_8383_561de6752c10.slice/crio-58da8e10ff422d6a8b2df445cd87af6fd9e759f9f1743f33fc04865d3808663c WatchSource:0}: Error finding container 58da8e10ff422d6a8b2df445cd87af6fd9e759f9f1743f33fc04865d3808663c: Status 404 returned error can't find the container with id 58da8e10ff422d6a8b2df445cd87af6fd9e759f9f1743f33fc04865d3808663c Apr 21 04:26:39.757552 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:39.757515 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6xx4x" event={"ID":"b807a52f-f037-480f-8383-561de6752c10","Type":"ContainerStarted","Data":"58da8e10ff422d6a8b2df445cd87af6fd9e759f9f1743f33fc04865d3808663c"} Apr 21 04:26:40.176432 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:40.176407 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:26:40.762638 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:40.762522 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-m5jgk" event={"ID":"dc428b9d-35a8-45e5-b688-c59395e673af","Type":"ContainerStarted","Data":"2a961af822833f06135ac6d32c00f6bf20c2237b09e52334b5376482eb8e7026"} Apr 21 04:26:40.762638 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:40.762567 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-m5jgk" event={"ID":"dc428b9d-35a8-45e5-b688-c59395e673af","Type":"ContainerStarted","Data":"46637fcb5cbbd19e88946207cffa7bfdc443810bd5d9e237d6f3667a1b7efe6a"} Apr 21 04:26:40.762638 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:40.762578 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-m5jgk" event={"ID":"dc428b9d-35a8-45e5-b688-c59395e673af","Type":"ContainerStarted","Data":"b2c9d68bf85c0864353eeb49e44ea50b3d6b2cf9f7215b7ee1e341318c9f55fd"} Apr 21 04:26:40.763910 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:40.763886 2574 generic.go:358] "Generic (PLEG): container finished" podID="b807a52f-f037-480f-8383-561de6752c10" containerID="225c2be50dbe46cb34fd0f3bb04a892f50f9570096ce2d546357004086d2fc5f" exitCode=0 Apr 21 04:26:40.764016 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:40.763943 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6xx4x" event={"ID":"b807a52f-f037-480f-8383-561de6752c10","Type":"ContainerDied","Data":"225c2be50dbe46cb34fd0f3bb04a892f50f9570096ce2d546357004086d2fc5f"} Apr 21 04:26:40.779756 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:40.779712 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-m5jgk" podStartSLOduration=1.341932962 podStartE2EDuration="2.779697759s" podCreationTimestamp="2026-04-21 04:26:38 +0000 UTC" firstStartedPulling="2026-04-21 04:26:38.733771144 +0000 UTC m=+167.114874013" lastFinishedPulling="2026-04-21 04:26:40.171535939 +0000 UTC m=+168.552638810" observedRunningTime="2026-04-21 04:26:40.778637074 +0000 UTC m=+169.159739962" watchObservedRunningTime="2026-04-21 04:26:40.779697759 +0000 UTC m=+169.160800647" Apr 21 04:26:41.769606 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:41.769568 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6xx4x" event={"ID":"b807a52f-f037-480f-8383-561de6752c10","Type":"ContainerStarted","Data":"433119c5f256289b810bf88aa6df162a452d69cf0866170dc224f0ce73e94b8c"} Apr 21 04:26:41.769985 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:41.769618 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-6xx4x" event={"ID":"b807a52f-f037-480f-8383-561de6752c10","Type":"ContainerStarted","Data":"77a69beb78de444ed080cc61a5aa27c46a490a7503fe401b55090fc040229188"} Apr 21 04:26:41.797402 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:41.797345 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-6xx4x" podStartSLOduration=2.850111831 podStartE2EDuration="3.797328927s" podCreationTimestamp="2026-04-21 04:26:38 +0000 UTC" firstStartedPulling="2026-04-21 04:26:39.226080084 +0000 UTC m=+167.607182953" lastFinishedPulling="2026-04-21 04:26:40.173297183 +0000 UTC m=+168.554400049" 
observedRunningTime="2026-04-21 04:26:41.795289294 +0000 UTC m=+170.176392204" watchObservedRunningTime="2026-04-21 04:26:41.797328927 +0000 UTC m=+170.178431814" Apr 21 04:26:42.173806 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.173716 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rqbs9" Apr 21 04:26:42.176399 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.176375 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l78gs\"" Apr 21 04:26:42.184119 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.184096 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rqbs9" Apr 21 04:26:42.305198 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.305163 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rqbs9"] Apr 21 04:26:42.308052 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:26:42.308024 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6252963e_17c6_4a33_86f2_a83c646d8f7c.slice/crio-91d511dd48b6c52aa2f316dd08cbef0b66e789cea1b7ee565550d08dfc02e221 WatchSource:0}: Error finding container 91d511dd48b6c52aa2f316dd08cbef0b66e789cea1b7ee565550d08dfc02e221: Status 404 returned error can't find the container with id 91d511dd48b6c52aa2f316dd08cbef0b66e789cea1b7ee565550d08dfc02e221 Apr 21 04:26:42.571413 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.571379 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-57f8b777bc-fmfdt"] Apr 21 04:26:42.575862 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.575837 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:26:42.578482 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.578449 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 21 04:26:42.578482 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.578478 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 21 04:26:42.578711 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.578519 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 21 04:26:42.579199 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.579180 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 04:26:42.579199 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.579196 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-f4sgw\"" Apr 21 04:26:42.579373 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.579295 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-9nabjsg7folp7\"" Apr 21 04:26:42.590275 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.590248 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-57f8b777bc-fmfdt"] Apr 21 04:26:42.726719 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.726656 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e6975186-f2b2-47eb-9ec1-e32179e2d5b9-secret-metrics-server-client-certs\") pod \"metrics-server-57f8b777bc-fmfdt\" (UID: \"e6975186-f2b2-47eb-9ec1-e32179e2d5b9\") " pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:26:42.726905 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.726756 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e6975186-f2b2-47eb-9ec1-e32179e2d5b9-secret-metrics-server-tls\") pod \"metrics-server-57f8b777bc-fmfdt\" (UID: \"e6975186-f2b2-47eb-9ec1-e32179e2d5b9\") " pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:26:42.726905 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.726836 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e6975186-f2b2-47eb-9ec1-e32179e2d5b9-audit-log\") pod \"metrics-server-57f8b777bc-fmfdt\" (UID: \"e6975186-f2b2-47eb-9ec1-e32179e2d5b9\") " pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:26:42.726905 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.726875 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq8dr\" (UniqueName: \"kubernetes.io/projected/e6975186-f2b2-47eb-9ec1-e32179e2d5b9-kube-api-access-kq8dr\") pod \"metrics-server-57f8b777bc-fmfdt\" (UID: \"e6975186-f2b2-47eb-9ec1-e32179e2d5b9\") " pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:26:42.727082 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.727005 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e6975186-f2b2-47eb-9ec1-e32179e2d5b9-metrics-server-audit-profiles\") pod \"metrics-server-57f8b777bc-fmfdt\" (UID: \"e6975186-f2b2-47eb-9ec1-e32179e2d5b9\") " pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:26:42.727082 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.727063 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6975186-f2b2-47eb-9ec1-e32179e2d5b9-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-57f8b777bc-fmfdt\" (UID: \"e6975186-f2b2-47eb-9ec1-e32179e2d5b9\") " pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:26:42.727194 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.727124 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6975186-f2b2-47eb-9ec1-e32179e2d5b9-client-ca-bundle\") pod \"metrics-server-57f8b777bc-fmfdt\" (UID: \"e6975186-f2b2-47eb-9ec1-e32179e2d5b9\") " pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:26:42.773656 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.773613 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rqbs9" event={"ID":"6252963e-17c6-4a33-86f2-a83c646d8f7c","Type":"ContainerStarted","Data":"91d511dd48b6c52aa2f316dd08cbef0b66e789cea1b7ee565550d08dfc02e221"} Apr 21 04:26:42.827674 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.827570 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e6975186-f2b2-47eb-9ec1-e32179e2d5b9-secret-metrics-server-tls\") pod \"metrics-server-57f8b777bc-fmfdt\" (UID: \"e6975186-f2b2-47eb-9ec1-e32179e2d5b9\") " pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:26:42.827856 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.827731 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e6975186-f2b2-47eb-9ec1-e32179e2d5b9-audit-log\") pod \"metrics-server-57f8b777bc-fmfdt\" (UID: \"e6975186-f2b2-47eb-9ec1-e32179e2d5b9\") " pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:26:42.827856 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.827765 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kq8dr\" (UniqueName: \"kubernetes.io/projected/e6975186-f2b2-47eb-9ec1-e32179e2d5b9-kube-api-access-kq8dr\") pod \"metrics-server-57f8b777bc-fmfdt\" (UID: \"e6975186-f2b2-47eb-9ec1-e32179e2d5b9\") " pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:26:42.827856 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.827842 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e6975186-f2b2-47eb-9ec1-e32179e2d5b9-metrics-server-audit-profiles\") pod \"metrics-server-57f8b777bc-fmfdt\" (UID: \"e6975186-f2b2-47eb-9ec1-e32179e2d5b9\") " pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:26:42.828006 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.827892 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6975186-f2b2-47eb-9ec1-e32179e2d5b9-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-57f8b777bc-fmfdt\" (UID: \"e6975186-f2b2-47eb-9ec1-e32179e2d5b9\") " pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:26:42.828006 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.827946 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6975186-f2b2-47eb-9ec1-e32179e2d5b9-client-ca-bundle\") pod \"metrics-server-57f8b777bc-fmfdt\" (UID: \"e6975186-f2b2-47eb-9ec1-e32179e2d5b9\") " pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:26:42.828098 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.828012 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e6975186-f2b2-47eb-9ec1-e32179e2d5b9-secret-metrics-server-client-certs\") pod \"metrics-server-57f8b777bc-fmfdt\" (UID: \"e6975186-f2b2-47eb-9ec1-e32179e2d5b9\") " pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:26:42.828581 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.828203 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e6975186-f2b2-47eb-9ec1-e32179e2d5b9-audit-log\") pod \"metrics-server-57f8b777bc-fmfdt\" (UID: \"e6975186-f2b2-47eb-9ec1-e32179e2d5b9\") " pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:26:42.828788 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.828762 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6975186-f2b2-47eb-9ec1-e32179e2d5b9-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-57f8b777bc-fmfdt\" (UID: \"e6975186-f2b2-47eb-9ec1-e32179e2d5b9\") " pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:26:42.829040 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.828991 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e6975186-f2b2-47eb-9ec1-e32179e2d5b9-metrics-server-audit-profiles\") pod \"metrics-server-57f8b777bc-fmfdt\" (UID: \"e6975186-f2b2-47eb-9ec1-e32179e2d5b9\") " pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:26:42.830726 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.830683 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e6975186-f2b2-47eb-9ec1-e32179e2d5b9-secret-metrics-server-client-certs\") pod \"metrics-server-57f8b777bc-fmfdt\" (UID: \"e6975186-f2b2-47eb-9ec1-e32179e2d5b9\") " pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:26:42.830927 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.830906 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e6975186-f2b2-47eb-9ec1-e32179e2d5b9-secret-metrics-server-tls\") pod \"metrics-server-57f8b777bc-fmfdt\" (UID: \"e6975186-f2b2-47eb-9ec1-e32179e2d5b9\") " pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:26:42.831065 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.831043 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6975186-f2b2-47eb-9ec1-e32179e2d5b9-client-ca-bundle\") pod \"metrics-server-57f8b777bc-fmfdt\" (UID: \"e6975186-f2b2-47eb-9ec1-e32179e2d5b9\") " pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:26:42.835548 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.835524 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq8dr\" (UniqueName: \"kubernetes.io/projected/e6975186-f2b2-47eb-9ec1-e32179e2d5b9-kube-api-access-kq8dr\") pod \"metrics-server-57f8b777bc-fmfdt\" (UID: \"e6975186-f2b2-47eb-9ec1-e32179e2d5b9\") " pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:26:42.885707 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:42.885661 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:26:43.037450 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:43.037358 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-57f8b777bc-fmfdt"] Apr 21 04:26:43.040464 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:26:43.040429 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6975186_f2b2_47eb_9ec1_e32179e2d5b9.slice/crio-911ee36f6d8affe10ac9de91ae640fe517feecf2e2ccdd873d096bde547c5fcf WatchSource:0}: Error finding container 911ee36f6d8affe10ac9de91ae640fe517feecf2e2ccdd873d096bde547c5fcf: Status 404 returned error can't find the container with id 911ee36f6d8affe10ac9de91ae640fe517feecf2e2ccdd873d096bde547c5fcf Apr 21 04:26:43.042959 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:43.042931 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-nj9bs"] Apr 21 04:26:43.047930 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:43.047898 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nj9bs" Apr 21 04:26:43.050567 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:43.050493 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 21 04:26:43.050701 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:43.050631 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-dcwhh\"" Apr 21 04:26:43.052416 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:43.052391 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-nj9bs"] Apr 21 04:26:43.131093 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:43.131005 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ae39b787-f2c0-425c-87ca-a4a8a1f8e0df-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-nj9bs\" (UID: \"ae39b787-f2c0-425c-87ca-a4a8a1f8e0df\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nj9bs" Apr 21 04:26:43.232127 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:43.232083 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ae39b787-f2c0-425c-87ca-a4a8a1f8e0df-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-nj9bs\" (UID: \"ae39b787-f2c0-425c-87ca-a4a8a1f8e0df\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nj9bs" Apr 21 04:26:43.232301 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:26:43.232282 2574 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 21 04:26:43.232385 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:26:43.232371 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae39b787-f2c0-425c-87ca-a4a8a1f8e0df-monitoring-plugin-cert podName:ae39b787-f2c0-425c-87ca-a4a8a1f8e0df nodeName:}" failed. No retries permitted until 2026-04-21 04:26:43.732348819 +0000 UTC m=+172.113451711 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/ae39b787-f2c0-425c-87ca-a4a8a1f8e0df-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-nj9bs" (UID: "ae39b787-f2c0-425c-87ca-a4a8a1f8e0df") : secret "monitoring-plugin-cert" not found Apr 21 04:26:43.737798 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:43.737757 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ae39b787-f2c0-425c-87ca-a4a8a1f8e0df-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-nj9bs\" (UID: \"ae39b787-f2c0-425c-87ca-a4a8a1f8e0df\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nj9bs" Apr 21 04:26:43.741139 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:43.741107 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ae39b787-f2c0-425c-87ca-a4a8a1f8e0df-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-nj9bs\" (UID: \"ae39b787-f2c0-425c-87ca-a4a8a1f8e0df\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nj9bs" Apr 21 04:26:43.778128 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:43.778078 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" event={"ID":"e6975186-f2b2-47eb-9ec1-e32179e2d5b9","Type":"ContainerStarted","Data":"911ee36f6d8affe10ac9de91ae640fe517feecf2e2ccdd873d096bde547c5fcf"} Apr 21 04:26:43.962352 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:43.962172 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nj9bs" Apr 21 04:26:44.488906 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.488871 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:26:44.503021 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.502996 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.504798 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.504768 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:26:44.508712 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.506410 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 04:26:44.508712 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.506779 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-mdtzn\"" Apr 21 04:26:44.510820 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.509379 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 04:26:44.510820 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.509673 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 04:26:44.510820 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.509796 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-atld98bogklo2\"" Apr 21 04:26:44.510820 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.509901 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 04:26:44.510820 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.510120 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 04:26:44.510820 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.509762 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 04:26:44.510820 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.510267 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 04:26:44.510820 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.510633 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 04:26:44.511681 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.510886 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 04:26:44.512630 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.512578 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 04:26:44.514392 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.514355 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 04:26:44.527108 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.527082 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 04:26:44.617781 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.617748 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-nj9bs"] Apr 21 04:26:44.621866 ip-10-0-139-26 
kubenswrapper[2574]: W0421 04:26:44.621822 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae39b787_f2c0_425c_87ca_a4a8a1f8e0df.slice/crio-0d41f96fec481bd817bb000c8ab042f41b671ff10e96617b9eaa944a3319fc0b WatchSource:0}: Error finding container 0d41f96fec481bd817bb000c8ab042f41b671ff10e96617b9eaa944a3319fc0b: Status 404 returned error can't find the container with id 0d41f96fec481bd817bb000c8ab042f41b671ff10e96617b9eaa944a3319fc0b Apr 21 04:26:44.646032 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.645994 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.646223 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.646040 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.646223 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.646065 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.646223 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.646101 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.646223 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.646129 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.646223 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.646147 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.646223 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.646164 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0687fbfc-67d2-4e25-9013-7470a6158b64-config-out\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.646223 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.646180 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-web-config\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.646223 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.646194 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5gqz\" (UniqueName: \"kubernetes.io/projected/0687fbfc-67d2-4e25-9013-7470a6158b64-kube-api-access-m5gqz\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.646627 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.646267 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.646627 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.646315 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.646627 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.646343 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0687fbfc-67d2-4e25-9013-7470a6158b64-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.646627 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.646369 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-config\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.646627 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.646407 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.646627 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.646433 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.646627 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.646477 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0687fbfc-67d2-4e25-9013-7470a6158b64-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.646627 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.646568 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.646627 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.646612 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.747680 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.747632 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.747871 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.747687 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.747871 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.747719 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.747871 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.747757 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.747871 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.747784 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.747871 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.747806 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" 
(UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.747871 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.747833 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.747871 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.747854 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.748223 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.747884 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0687fbfc-67d2-4e25-9013-7470a6158b64-config-out\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.748223 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.747908 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-web-config\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.748223 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.747930 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5gqz\" (UniqueName: \"kubernetes.io/projected/0687fbfc-67d2-4e25-9013-7470a6158b64-kube-api-access-m5gqz\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.748223 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.747963 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.748223 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.747995 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.748223 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.748019 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0687fbfc-67d2-4e25-9013-7470a6158b64-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.748223 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.748046 
2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-config\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.748223 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.748077 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.748223 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.748104 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.748223 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.748140 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0687fbfc-67d2-4e25-9013-7470a6158b64-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.748685 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.748636 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0687fbfc-67d2-4e25-9013-7470a6158b64-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.749302 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.749273 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.750301 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.750272 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.750731 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.750708 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.751083 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.751056 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.753424 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.753126 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.753670 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.753642 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-config\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.756537 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.754248 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-web-config\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.756537 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.755500 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0687fbfc-67d2-4e25-9013-7470a6158b64-config-out\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.756537 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.755889 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0687fbfc-67d2-4e25-9013-7470a6158b64-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.756537 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.756230 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.756537 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.756314 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.756882 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.756792 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.757426 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.757401 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: 
\"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.757517 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.757448 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.757632 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.757612 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.757888 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.757862 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.758638 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.758461 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5gqz\" (UniqueName: \"kubernetes.io/projected/0687fbfc-67d2-4e25-9013-7470a6158b64-kube-api-access-m5gqz\") pod \"prometheus-k8s-0\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:44.783948 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.783356 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rqbs9" event={"ID":"6252963e-17c6-4a33-86f2-a83c646d8f7c","Type":"ContainerStarted","Data":"79505437b980f438b50a115315ff6cef35a8193cfabb0dfd71d2bee227deb0f4"} Apr 21 04:26:44.785825 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.785787 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" event={"ID":"e6975186-f2b2-47eb-9ec1-e32179e2d5b9","Type":"ContainerStarted","Data":"52589dc673f5267e5d1ee7313910d0a8d46371aa8221db4ebddaf380f83f2e72"} Apr 21 04:26:44.786857 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.786829 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nj9bs" event={"ID":"ae39b787-f2c0-425c-87ca-a4a8a1f8e0df","Type":"ContainerStarted","Data":"0d41f96fec481bd817bb000c8ab042f41b671ff10e96617b9eaa944a3319fc0b"} Apr 21 04:26:44.815823 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.814267 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rqbs9" podStartSLOduration=138.664686561 podStartE2EDuration="2m20.814246411s" podCreationTimestamp="2026-04-21 04:24:24 +0000 UTC" firstStartedPulling="2026-04-21 04:26:42.309810076 +0000 UTC m=+170.690912943" lastFinishedPulling="2026-04-21 04:26:44.459369911 +0000 UTC m=+172.840472793" observedRunningTime="2026-04-21 04:26:44.797705342 +0000 UTC m=+173.178808231" watchObservedRunningTime="2026-04-21 04:26:44.814246411 +0000 UTC m=+173.195349300" Apr 21 04:26:44.815823 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.815607 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" podStartSLOduration=1.398891198 podStartE2EDuration="2.815581812s" podCreationTimestamp="2026-04-21 04:26:42 +0000 UTC" firstStartedPulling="2026-04-21 04:26:43.043082814 +0000 UTC m=+171.424185682" lastFinishedPulling="2026-04-21 04:26:44.459773411 +0000 UTC m=+172.840876296" observedRunningTime="2026-04-21 04:26:44.813545254 +0000 UTC m=+173.194648155" watchObservedRunningTime="2026-04-21 04:26:44.815581812 +0000 UTC m=+173.196684701" Apr 21 04:26:44.854109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:44.854070 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:26:45.010393 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:45.010279 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:26:45.014786 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:26:45.014749 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0687fbfc_67d2_4e25_9013_7470a6158b64.slice/crio-3857a34a042578247493b4cd10e52f2db56036e589faa875ee1035b8fde7eb3c WatchSource:0}: Error finding container 3857a34a042578247493b4cd10e52f2db56036e589faa875ee1035b8fde7eb3c: Status 404 returned error can't find the container with id 3857a34a042578247493b4cd10e52f2db56036e589faa875ee1035b8fde7eb3c Apr 21 04:26:45.749519 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:45.749480 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rn9gk" Apr 21 04:26:45.791876 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:45.791819 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0687fbfc-67d2-4e25-9013-7470a6158b64","Type":"ContainerStarted","Data":"3857a34a042578247493b4cd10e52f2db56036e589faa875ee1035b8fde7eb3c"} Apr 21 04:26:46.795302 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:46.795262 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nj9bs" event={"ID":"ae39b787-f2c0-425c-87ca-a4a8a1f8e0df","Type":"ContainerStarted","Data":"e722e74b1b4d8992835e1534ecd50592bfe36691bb57824ca160dc8acc6dd381"} Apr 21 04:26:46.795760 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:46.795477 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nj9bs" Apr 21 04:26:46.797017 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:46.796987 2574 generic.go:358] "Generic (PLEG): container finished" podID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerID="904bb17d5ad1b59943f32b774fb4a1788d8099d460b2d50c1f629e6c1db63938" exitCode=0 Apr 21 04:26:46.797165 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:46.797071 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0687fbfc-67d2-4e25-9013-7470a6158b64","Type":"ContainerDied","Data":"904bb17d5ad1b59943f32b774fb4a1788d8099d460b2d50c1f629e6c1db63938"} Apr 21 04:26:46.801168 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:46.801148 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nj9bs" Apr 21 04:26:46.810668 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:46.810614 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-nj9bs" podStartSLOduration=2.244875381 podStartE2EDuration="3.810583287s" podCreationTimestamp="2026-04-21 04:26:43 +0000 UTC" firstStartedPulling="2026-04-21 04:26:44.624035712 +0000 UTC m=+173.005138579" lastFinishedPulling="2026-04-21 04:26:46.189743614 +0000 UTC m=+174.570846485" observedRunningTime="2026-04-21 04:26:46.808852623 +0000 UTC m=+175.189955512" watchObservedRunningTime="2026-04-21 04:26:46.810583287 +0000 UTC m=+175.191686238" Apr 21 04:26:49.808958 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:49.808870 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0687fbfc-67d2-4e25-9013-7470a6158b64","Type":"ContainerStarted","Data":"66ee7f1f166961c0bcc215641425f92023ae00c0d671636cb1571854243aa9fd"} Apr 21 04:26:49.808958 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:49.808906 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0687fbfc-67d2-4e25-9013-7470a6158b64","Type":"ContainerStarted","Data":"90714eace617720fa910bc1326f0fcbbb96947ad7395c41c3bcbde943e86c893"} Apr 21 04:26:51.817979 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:51.817891 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0687fbfc-67d2-4e25-9013-7470a6158b64","Type":"ContainerStarted","Data":"ee1e8b78c24881d75278344f464c7b0ed8f4207f2d73c14f50066bb4640b211e"} Apr 21 04:26:51.817979 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:51.817927 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0687fbfc-67d2-4e25-9013-7470a6158b64","Type":"ContainerStarted","Data":"4e0a992a588dbc1e8dd6ad1c241af2ff794fe6f05b5dfcf51074ea57c31f5aa5"} Apr 21 04:26:51.817979 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:51.817938 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0687fbfc-67d2-4e25-9013-7470a6158b64","Type":"ContainerStarted","Data":"3b31724a1cccf993288813ca7684cfbf8d2550846e97276cad1cf70b93820907"} Apr 21 04:26:51.817979 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:51.817946 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0687fbfc-67d2-4e25-9013-7470a6158b64","Type":"ContainerStarted","Data":"88a1b9d017c0ec031ffa7dfc59780dbe8b11c18e6cec80cfd279cc68d5ab727b"} Apr 21 04:26:51.852949 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:51.852884 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.651082884 podStartE2EDuration="7.852864148s" podCreationTimestamp="2026-04-21 04:26:44 +0000 UTC" firstStartedPulling="2026-04-21 04:26:45.016902933 +0000 UTC m=+173.398005805" lastFinishedPulling="2026-04-21 04:26:51.218684198 +0000 UTC m=+179.599787069" observedRunningTime="2026-04-21 04:26:51.852428353 +0000 UTC m=+180.233531263" watchObservedRunningTime="2026-04-21 04:26:51.852864148 +0000 UTC m=+180.233967037" Apr 21 04:26:54.854458 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:26:54.854413 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:27:02.886314 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:27:02.886276 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" 
Apr 21 04:27:02.886314 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:27:02.886321 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:27:11.344745 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:27:11.344709 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rqbs9_6252963e-17c6-4a33-86f2-a83c646d8f7c/serve-healthcheck-canary/0.log" Apr 21 04:27:20.904842 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:27:20.904808 2574 generic.go:358] "Generic (PLEG): container finished" podID="0bb93254-dca8-4e59-9bd6-90ac24699232" containerID="238b4610742ced37a868111e8a70641f03ed7a85895ec14dd70b4593b5c0bb99" exitCode=0 Apr 21 04:27:20.905306 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:27:20.904882 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vklr2" event={"ID":"0bb93254-dca8-4e59-9bd6-90ac24699232","Type":"ContainerDied","Data":"238b4610742ced37a868111e8a70641f03ed7a85895ec14dd70b4593b5c0bb99"} Apr 21 04:27:20.905306 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:27:20.905284 2574 scope.go:117] "RemoveContainer" containerID="238b4610742ced37a868111e8a70641f03ed7a85895ec14dd70b4593b5c0bb99" Apr 21 04:27:21.909629 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:27:21.909571 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-vklr2" event={"ID":"0bb93254-dca8-4e59-9bd6-90ac24699232","Type":"ContainerStarted","Data":"31a74e020eb0d21f0d42b89f4cef589bd8a967213ed053b57a9fd9c1ad95c547"} Apr 21 04:27:22.893343 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:27:22.893286 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:27:22.902974 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:27:22.902936 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-57f8b777bc-fmfdt" Apr 21 04:27:30.939396 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:27:30.939360 2574 generic.go:358] "Generic (PLEG): container finished" podID="5f8c49b5-891d-4368-a08c-06834d2dff98" containerID="6368c5962dbb0b5ff2d4ea7005401bdd05c357aa8260b1eb20074aa0ba9f6023" exitCode=0 Apr 21 04:27:30.939824 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:27:30.939430 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wfrb6" event={"ID":"5f8c49b5-891d-4368-a08c-06834d2dff98","Type":"ContainerDied","Data":"6368c5962dbb0b5ff2d4ea7005401bdd05c357aa8260b1eb20074aa0ba9f6023"} Apr 21 04:27:30.939824 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:27:30.939755 2574 scope.go:117] "RemoveContainer" containerID="6368c5962dbb0b5ff2d4ea7005401bdd05c357aa8260b1eb20074aa0ba9f6023" Apr 21 04:27:31.946154 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:27:31.946118 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-wfrb6" event={"ID":"5f8c49b5-891d-4368-a08c-06834d2dff98","Type":"ContainerStarted","Data":"9542db092ee5ab987c752914929087de2091fbd2eb7cb08e244cda68945b11db"} Apr 21 04:27:44.855378 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:27:44.855329 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:27:44.874773 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:27:44.874740 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:27:44.997770 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:27:44.997738 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:02.820907 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:02.820814 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:28:02.821403 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:02.821365 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="thanos-sidecar" containerID="cri-o://88a1b9d017c0ec031ffa7dfc59780dbe8b11c18e6cec80cfd279cc68d5ab727b" gracePeriod=600 Apr 21 04:28:02.821483 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:02.821377 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="kube-rbac-proxy-thanos" containerID="cri-o://ee1e8b78c24881d75278344f464c7b0ed8f4207f2d73c14f50066bb4640b211e" gracePeriod=600 Apr 21 04:28:02.821483 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:02.821322 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="prometheus" containerID="cri-o://90714eace617720fa910bc1326f0fcbbb96947ad7395c41c3bcbde943e86c893" gracePeriod=600 Apr 21 04:28:02.821563 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:02.821364 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="kube-rbac-proxy" containerID="cri-o://4e0a992a588dbc1e8dd6ad1c241af2ff794fe6f05b5dfcf51074ea57c31f5aa5" gracePeriod=600 Apr 21 04:28:02.821563 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:02.821369 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="config-reloader" containerID="cri-o://66ee7f1f166961c0bcc215641425f92023ae00c0d671636cb1571854243aa9fd" gracePeriod=600 Apr 21 04:28:02.821563 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:02.821362 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="kube-rbac-proxy-web" containerID="cri-o://3b31724a1cccf993288813ca7684cfbf8d2550846e97276cad1cf70b93820907" gracePeriod=600 Apr 21 04:28:03.038738 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.038702 2574 generic.go:358] "Generic (PLEG): container finished" podID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerID="ee1e8b78c24881d75278344f464c7b0ed8f4207f2d73c14f50066bb4640b211e" exitCode=0 Apr 21 04:28:03.038738 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.038732 2574 generic.go:358] "Generic (PLEG): container finished" podID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerID="4e0a992a588dbc1e8dd6ad1c241af2ff794fe6f05b5dfcf51074ea57c31f5aa5" exitCode=0 Apr 21 04:28:03.038738 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.038740 2574 
generic.go:358] "Generic (PLEG): container finished" podID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerID="3b31724a1cccf993288813ca7684cfbf8d2550846e97276cad1cf70b93820907" exitCode=0 Apr 21 04:28:03.038988 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.038748 2574 generic.go:358] "Generic (PLEG): container finished" podID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerID="88a1b9d017c0ec031ffa7dfc59780dbe8b11c18e6cec80cfd279cc68d5ab727b" exitCode=0 Apr 21 04:28:03.038988 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.038756 2574 generic.go:358] "Generic (PLEG): container finished" podID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerID="66ee7f1f166961c0bcc215641425f92023ae00c0d671636cb1571854243aa9fd" exitCode=0 Apr 21 04:28:03.038988 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.038764 2574 generic.go:358] "Generic (PLEG): container finished" podID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerID="90714eace617720fa910bc1326f0fcbbb96947ad7395c41c3bcbde943e86c893" exitCode=0 Apr 21 04:28:03.038988 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.038775 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0687fbfc-67d2-4e25-9013-7470a6158b64","Type":"ContainerDied","Data":"ee1e8b78c24881d75278344f464c7b0ed8f4207f2d73c14f50066bb4640b211e"} Apr 21 04:28:03.038988 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.038816 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0687fbfc-67d2-4e25-9013-7470a6158b64","Type":"ContainerDied","Data":"4e0a992a588dbc1e8dd6ad1c241af2ff794fe6f05b5dfcf51074ea57c31f5aa5"} Apr 21 04:28:03.038988 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.038830 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0687fbfc-67d2-4e25-9013-7470a6158b64","Type":"ContainerDied","Data":"3b31724a1cccf993288813ca7684cfbf8d2550846e97276cad1cf70b93820907"} Apr 21 04:28:03.038988 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.038842 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0687fbfc-67d2-4e25-9013-7470a6158b64","Type":"ContainerDied","Data":"88a1b9d017c0ec031ffa7dfc59780dbe8b11c18e6cec80cfd279cc68d5ab727b"} Apr 21 04:28:03.038988 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.038855 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0687fbfc-67d2-4e25-9013-7470a6158b64","Type":"ContainerDied","Data":"66ee7f1f166961c0bcc215641425f92023ae00c0d671636cb1571854243aa9fd"} Apr 21 04:28:03.038988 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.038868 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0687fbfc-67d2-4e25-9013-7470a6158b64","Type":"ContainerDied","Data":"90714eace617720fa910bc1326f0fcbbb96947ad7395c41c3bcbde943e86c893"} Apr 21 04:28:03.081466 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.081436 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:03.145710 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.145680 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-web-config\") pod \"0687fbfc-67d2-4e25-9013-7470a6158b64\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " Apr 21 04:28:03.145899 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.145722 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"0687fbfc-67d2-4e25-9013-7470a6158b64\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " Apr 21 04:28:03.145899 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.145745 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-configmap-metrics-client-ca\") pod \"0687fbfc-67d2-4e25-9013-7470a6158b64\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " Apr 21 04:28:03.145899 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.145764 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-prometheus-k8s-tls\") pod \"0687fbfc-67d2-4e25-9013-7470a6158b64\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " Apr 21 04:28:03.145899 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.145785 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-config\") pod \"0687fbfc-67d2-4e25-9013-7470a6158b64\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " Apr 21 04:28:03.146120 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.145971 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-configmap-serving-certs-ca-bundle\") pod \"0687fbfc-67d2-4e25-9013-7470a6158b64\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " Apr 21 04:28:03.146120 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.146025 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-kube-rbac-proxy\") pod \"0687fbfc-67d2-4e25-9013-7470a6158b64\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " Apr 21 04:28:03.146120 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.146078 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0687fbfc-67d2-4e25-9013-7470a6158b64-tls-assets\") pod \"0687fbfc-67d2-4e25-9013-7470a6158b64\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " Apr 21 04:28:03.146120 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.146107 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0687fbfc-67d2-4e25-9013-7470a6158b64-config-out\") pod \"0687fbfc-67d2-4e25-9013-7470a6158b64\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " Apr 21 04:28:03.146314 ip-10-0-139-26 
kubenswrapper[2574]: I0421 04:28:03.146134 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-configmap-kubelet-serving-ca-bundle\") pod \"0687fbfc-67d2-4e25-9013-7470a6158b64\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " Apr 21 04:28:03.146314 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.146175 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-grpc-tls\") pod \"0687fbfc-67d2-4e25-9013-7470a6158b64\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " Apr 21 04:28:03.146314 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.146200 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-thanos-prometheus-http-client-file\") pod \"0687fbfc-67d2-4e25-9013-7470a6158b64\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " Apr 21 04:28:03.146314 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.146270 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"0687fbfc-67d2-4e25-9013-7470a6158b64\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " Apr 21 04:28:03.146314 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.146294 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5gqz\" (UniqueName: \"kubernetes.io/projected/0687fbfc-67d2-4e25-9013-7470a6158b64-kube-api-access-m5gqz\") pod \"0687fbfc-67d2-4e25-9013-7470a6158b64\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " Apr 21 04:28:03.146554 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.146319 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-prometheus-trusted-ca-bundle\") pod \"0687fbfc-67d2-4e25-9013-7470a6158b64\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " Apr 21 04:28:03.146554 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.146346 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0687fbfc-67d2-4e25-9013-7470a6158b64-prometheus-k8s-db\") pod \"0687fbfc-67d2-4e25-9013-7470a6158b64\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " Apr 21 04:28:03.146554 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.146375 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-prometheus-k8s-rulefiles-0\") pod \"0687fbfc-67d2-4e25-9013-7470a6158b64\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " Apr 21 04:28:03.146554 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.146412 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-metrics-client-certs\") pod \"0687fbfc-67d2-4e25-9013-7470a6158b64\" (UID: \"0687fbfc-67d2-4e25-9013-7470a6158b64\") " Apr 21 
04:28:03.146554 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.146434 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "0687fbfc-67d2-4e25-9013-7470a6158b64" (UID: "0687fbfc-67d2-4e25-9013-7470a6158b64"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:28:03.146845 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.146706 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-139-26.ec2.internal\" DevicePath \"\"" Apr 21 04:28:03.146845 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.146709 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "0687fbfc-67d2-4e25-9013-7470a6158b64" (UID: "0687fbfc-67d2-4e25-9013-7470a6158b64"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:28:03.148057 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.147718 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "0687fbfc-67d2-4e25-9013-7470a6158b64" (UID: "0687fbfc-67d2-4e25-9013-7470a6158b64"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:28:03.148057 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.147750 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "0687fbfc-67d2-4e25-9013-7470a6158b64" (UID: "0687fbfc-67d2-4e25-9013-7470a6158b64"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:28:03.150253 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.150225 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "0687fbfc-67d2-4e25-9013-7470a6158b64" (UID: "0687fbfc-67d2-4e25-9013-7470a6158b64"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:28:03.150398 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.150252 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "0687fbfc-67d2-4e25-9013-7470a6158b64" (UID: "0687fbfc-67d2-4e25-9013-7470a6158b64"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:28:03.150501 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.150302 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-config" (OuterVolumeSpecName: "config") pod "0687fbfc-67d2-4e25-9013-7470a6158b64" (UID: "0687fbfc-67d2-4e25-9013-7470a6158b64"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:28:03.150585 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.150500 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0687fbfc-67d2-4e25-9013-7470a6158b64-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "0687fbfc-67d2-4e25-9013-7470a6158b64" (UID: "0687fbfc-67d2-4e25-9013-7470a6158b64"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:28:03.151019 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.150994 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "0687fbfc-67d2-4e25-9013-7470a6158b64" (UID: "0687fbfc-67d2-4e25-9013-7470a6158b64"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:28:03.151124 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.151035 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "0687fbfc-67d2-4e25-9013-7470a6158b64" (UID: "0687fbfc-67d2-4e25-9013-7470a6158b64"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:28:03.151124 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.151099 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0687fbfc-67d2-4e25-9013-7470a6158b64-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "0687fbfc-67d2-4e25-9013-7470a6158b64" (UID: "0687fbfc-67d2-4e25-9013-7470a6158b64"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:28:03.151249 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.151146 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "0687fbfc-67d2-4e25-9013-7470a6158b64" (UID: "0687fbfc-67d2-4e25-9013-7470a6158b64"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:28:03.151643 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.151617 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0687fbfc-67d2-4e25-9013-7470a6158b64-config-out" (OuterVolumeSpecName: "config-out") pod "0687fbfc-67d2-4e25-9013-7470a6158b64" (UID: "0687fbfc-67d2-4e25-9013-7470a6158b64"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 04:28:03.151859 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.151831 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "0687fbfc-67d2-4e25-9013-7470a6158b64" (UID: "0687fbfc-67d2-4e25-9013-7470a6158b64"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:28:03.152064 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.152044 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "0687fbfc-67d2-4e25-9013-7470a6158b64" (UID: "0687fbfc-67d2-4e25-9013-7470a6158b64"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:28:03.152426 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.152407 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "0687fbfc-67d2-4e25-9013-7470a6158b64" (UID: "0687fbfc-67d2-4e25-9013-7470a6158b64"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:28:03.152912 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.152890 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0687fbfc-67d2-4e25-9013-7470a6158b64-kube-api-access-m5gqz" (OuterVolumeSpecName: "kube-api-access-m5gqz") pod "0687fbfc-67d2-4e25-9013-7470a6158b64" (UID: "0687fbfc-67d2-4e25-9013-7470a6158b64"). InnerVolumeSpecName "kube-api-access-m5gqz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:28:03.160217 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.160188 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-web-config" (OuterVolumeSpecName: "web-config") pod "0687fbfc-67d2-4e25-9013-7470a6158b64" (UID: "0687fbfc-67d2-4e25-9013-7470a6158b64"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:28:03.247916 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.247876 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-configmap-metrics-client-ca\") on node \"ip-10-0-139-26.ec2.internal\" DevicePath \"\"" Apr 21 04:28:03.247916 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.247908 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-prometheus-k8s-tls\") on node \"ip-10-0-139-26.ec2.internal\" DevicePath \"\"" Apr 21 04:28:03.247916 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.247921 2574 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-config\") on node \"ip-10-0-139-26.ec2.internal\" DevicePath \"\"" Apr 21 04:28:03.248182 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.247934 2574 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-kube-rbac-proxy\") on node \"ip-10-0-139-26.ec2.internal\" DevicePath \"\"" Apr 21 04:28:03.248182 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.247947 2574 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0687fbfc-67d2-4e25-9013-7470a6158b64-tls-assets\") on node \"ip-10-0-139-26.ec2.internal\" DevicePath \"\"" Apr 21 04:28:03.248182 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.247960 2574 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0687fbfc-67d2-4e25-9013-7470a6158b64-config-out\") on node \"ip-10-0-139-26.ec2.internal\" DevicePath \"\"" Apr 21 04:28:03.248182 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.247973 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-139-26.ec2.internal\" DevicePath \"\"" Apr 21 04:28:03.248182 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.247984 2574 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-grpc-tls\") on node \"ip-10-0-139-26.ec2.internal\" DevicePath \"\"" Apr 21 04:28:03.248182 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.247995 2574 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-thanos-prometheus-http-client-file\") on node \"ip-10-0-139-26.ec2.internal\" DevicePath \"\"" Apr 21 04:28:03.248182 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.248008 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-139-26.ec2.internal\" DevicePath \"\"" Apr 21 04:28:03.248182 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.248023 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m5gqz\" (UniqueName: 
\"kubernetes.io/projected/0687fbfc-67d2-4e25-9013-7470a6158b64-kube-api-access-m5gqz\") on node \"ip-10-0-139-26.ec2.internal\" DevicePath \"\"" Apr 21 04:28:03.248182 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.248036 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-prometheus-trusted-ca-bundle\") on node \"ip-10-0-139-26.ec2.internal\" DevicePath \"\"" Apr 21 04:28:03.248182 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.248049 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0687fbfc-67d2-4e25-9013-7470a6158b64-prometheus-k8s-db\") on node \"ip-10-0-139-26.ec2.internal\" DevicePath \"\"" Apr 21 04:28:03.248182 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.248061 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0687fbfc-67d2-4e25-9013-7470a6158b64-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-139-26.ec2.internal\" DevicePath \"\"" Apr 21 04:28:03.248182 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.248074 2574 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-metrics-client-certs\") on node \"ip-10-0-139-26.ec2.internal\" DevicePath \"\"" Apr 21 04:28:03.248182 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.248087 2574 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-web-config\") on node \"ip-10-0-139-26.ec2.internal\" DevicePath \"\"" Apr 21 04:28:03.248182 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.248100 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0687fbfc-67d2-4e25-9013-7470a6158b64-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-139-26.ec2.internal\" DevicePath \"\"" Apr 21 04:28:03.952567 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.952528 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs\") pod \"network-metrics-daemon-g7q5r\" (UID: \"d0b4a7b4-e6a1-4816-a96e-0792f47539d9\") " pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:28:03.954869 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:03.954838 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d0b4a7b4-e6a1-4816-a96e-0792f47539d9-metrics-certs\") pod \"network-metrics-daemon-g7q5r\" (UID: \"d0b4a7b4-e6a1-4816-a96e-0792f47539d9\") " pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:28:04.044361 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.044322 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0687fbfc-67d2-4e25-9013-7470a6158b64","Type":"ContainerDied","Data":"3857a34a042578247493b4cd10e52f2db56036e589faa875ee1035b8fde7eb3c"} Apr 21 04:28:04.044361 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.044371 2574 scope.go:117] "RemoveContainer" containerID="ee1e8b78c24881d75278344f464c7b0ed8f4207f2d73c14f50066bb4640b211e" Apr 21 04:28:04.044624 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.044374 2574 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.052872 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.052853 2574 scope.go:117] "RemoveContainer" containerID="4e0a992a588dbc1e8dd6ad1c241af2ff794fe6f05b5dfcf51074ea57c31f5aa5" Apr 21 04:28:04.059735 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.059717 2574 scope.go:117] "RemoveContainer" containerID="3b31724a1cccf993288813ca7684cfbf8d2550846e97276cad1cf70b93820907" Apr 21 04:28:04.066536 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.066503 2574 scope.go:117] "RemoveContainer" containerID="88a1b9d017c0ec031ffa7dfc59780dbe8b11c18e6cec80cfd279cc68d5ab727b" Apr 21 04:28:04.071101 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.071077 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:28:04.072450 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.072425 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:28:04.073954 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.073938 2574 scope.go:117] "RemoveContainer" containerID="66ee7f1f166961c0bcc215641425f92023ae00c0d671636cb1571854243aa9fd" Apr 21 04:28:04.080317 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.080298 2574 scope.go:117] "RemoveContainer" containerID="90714eace617720fa910bc1326f0fcbbb96947ad7395c41c3bcbde943e86c893" Apr 21 04:28:04.087170 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.087149 2574 scope.go:117] "RemoveContainer" containerID="904bb17d5ad1b59943f32b774fb4a1788d8099d460b2d50c1f629e6c1db63938" Apr 21 04:28:04.096579 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.096556 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:28:04.096886 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.096872 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="kube-rbac-proxy-thanos" Apr 21 04:28:04.096927 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.096888 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="kube-rbac-proxy-thanos" Apr 21 04:28:04.096927 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.096899 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="init-config-reloader" Apr 21 04:28:04.096927 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.096904 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="init-config-reloader" Apr 21 04:28:04.096927 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.096911 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="thanos-sidecar" Apr 21 04:28:04.096927 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.096917 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="thanos-sidecar" Apr 21 04:28:04.096927 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.096926 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="kube-rbac-proxy-web" Apr 21 04:28:04.097103 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.096931 2574 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="kube-rbac-proxy-web" Apr 21 04:28:04.097103 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.096938 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="prometheus" Apr 21 04:28:04.097103 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.096943 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="prometheus" Apr 21 04:28:04.097103 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.096949 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="config-reloader" Apr 21 04:28:04.097103 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.096954 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="config-reloader" Apr 21 04:28:04.097103 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.096963 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="kube-rbac-proxy" Apr 21 04:28:04.097103 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.096968 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="kube-rbac-proxy" Apr 21 04:28:04.097103 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.097011 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="thanos-sidecar" Apr 21 04:28:04.097103 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.097018 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="kube-rbac-proxy-thanos" Apr 21 04:28:04.097103 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.097024 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="prometheus" Apr 21 04:28:04.097103 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.097031 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="kube-rbac-proxy" Apr 21 04:28:04.097103 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.097037 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="kube-rbac-proxy-web" Apr 21 04:28:04.097103 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.097043 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" containerName="config-reloader" Apr 21 04:28:04.102752 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.102731 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.107466 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.107439 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 21 04:28:04.107652 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.107630 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 21 04:28:04.107778 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.107760 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 21 04:28:04.107911 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.107893 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 21 04:28:04.108081 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.108063 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 21 04:28:04.108151 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.108135 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 21 04:28:04.108416 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.108396 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 21 04:28:04.108494 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.108414 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 21 04:28:04.108626 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.108580 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-mdtzn\"" Apr 21 04:28:04.108741 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.108721 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-atld98bogklo2\"" Apr 21 04:28:04.108798 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.108772 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 21 04:28:04.110067 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.110051 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 21 04:28:04.112000 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.111979 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 21 04:28:04.112754 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.112733 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:28:04.114111 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.114091 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 21 04:28:04.154294 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.154255 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d34b0eee-6deb-439e-9d04-8b7bb13c4408-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.154294 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.154295 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d34b0eee-6deb-439e-9d04-8b7bb13c4408-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.154294 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.154317 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.154543 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.154349 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d34b0eee-6deb-439e-9d04-8b7bb13c4408-config-out\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.154543 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.154373 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5pzm\" (UniqueName: \"kubernetes.io/projected/d34b0eee-6deb-439e-9d04-8b7bb13c4408-kube-api-access-s5pzm\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.154543 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.154399 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d34b0eee-6deb-439e-9d04-8b7bb13c4408-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.154543 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.154475 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.154543 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.154497 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d34b0eee-6deb-439e-9d04-8b7bb13c4408-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.154543 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.154515 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.154543 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.154531 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d34b0eee-6deb-439e-9d04-8b7bb13c4408-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.154788 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.154549 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.154788 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.154628 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.154788 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.154672 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.154788 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.154690 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d34b0eee-6deb-439e-9d04-8b7bb13c4408-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.154788 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.154707 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-web-config\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.154788 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.154722 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d34b0eee-6deb-439e-9d04-8b7bb13c4408-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.154788 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.154747 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-config\") pod \"prometheus-k8s-0\" (UID: 
\"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.154788 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.154761 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.177510 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.177335 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0687fbfc-67d2-4e25-9013-7470a6158b64" path="/var/lib/kubelet/pods/0687fbfc-67d2-4e25-9013-7470a6158b64/volumes" Apr 21 04:28:04.179689 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.179668 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9c9js\"" Apr 21 04:28:04.188356 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.188340 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7q5r" Apr 21 04:28:04.255904 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.255867 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.256064 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.255914 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d34b0eee-6deb-439e-9d04-8b7bb13c4408-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.256064 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.255948 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.256064 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.255979 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d34b0eee-6deb-439e-9d04-8b7bb13c4408-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.256064 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.256007 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.256064 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.256031 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.256064 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.256062 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.256369 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.256087 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d34b0eee-6deb-439e-9d04-8b7bb13c4408-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.256929 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.256905 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d34b0eee-6deb-439e-9d04-8b7bb13c4408-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.257088 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.257063 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d34b0eee-6deb-439e-9d04-8b7bb13c4408-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.257223 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.257204 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-web-config\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.257626 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.257327 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d34b0eee-6deb-439e-9d04-8b7bb13c4408-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.257626 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.257391 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-config\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.257626 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.257450 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.257626 ip-10-0-139-26 kubenswrapper[2574]: I0421 
04:28:04.257502 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d34b0eee-6deb-439e-9d04-8b7bb13c4408-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.257626 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.257528 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d34b0eee-6deb-439e-9d04-8b7bb13c4408-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.257626 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.257579 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.257961 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.257637 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d34b0eee-6deb-439e-9d04-8b7bb13c4408-config-out\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.257961 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.257662 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5pzm\" (UniqueName: \"kubernetes.io/projected/d34b0eee-6deb-439e-9d04-8b7bb13c4408-kube-api-access-s5pzm\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.257961 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.257710 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d34b0eee-6deb-439e-9d04-8b7bb13c4408-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.259208 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.259123 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.260156 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.259378 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.260156 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.259439 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-secret-metrics-client-certs\") 
pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.260156 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.259533 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.260156 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.259897 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d34b0eee-6deb-439e-9d04-8b7bb13c4408-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.260156 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.259927 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d34b0eee-6deb-439e-9d04-8b7bb13c4408-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.260455 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.260192 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d34b0eee-6deb-439e-9d04-8b7bb13c4408-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.260988 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.260962 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d34b0eee-6deb-439e-9d04-8b7bb13c4408-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.261167 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.261144 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d34b0eee-6deb-439e-9d04-8b7bb13c4408-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.262133 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.262075 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.262304 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.262285 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-config\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.262442 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.262419 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-web-config\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.262682 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.262664 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d34b0eee-6deb-439e-9d04-8b7bb13c4408-config-out\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.262904 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.262888 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.263283 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.263265 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d34b0eee-6deb-439e-9d04-8b7bb13c4408-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.268367 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.268343 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5pzm\" (UniqueName: \"kubernetes.io/projected/d34b0eee-6deb-439e-9d04-8b7bb13c4408-kube-api-access-s5pzm\") pod \"prometheus-k8s-0\" (UID: \"d34b0eee-6deb-439e-9d04-8b7bb13c4408\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.311776 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.311748 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-g7q5r"] Apr 21 04:28:04.313688 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:28:04.313657 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0b4a7b4_e6a1_4816_a96e_0792f47539d9.slice/crio-1ef229729bbc50e4bd0b16bad4bf8a0d7e5e71bc8f3c2f51643957b98ad9d282 WatchSource:0}: Error finding container 1ef229729bbc50e4bd0b16bad4bf8a0d7e5e71bc8f3c2f51643957b98ad9d282: Status 404 returned error can't find the container with id 1ef229729bbc50e4bd0b16bad4bf8a0d7e5e71bc8f3c2f51643957b98ad9d282 Apr 21 04:28:04.418759 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.418671 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:04.558091 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:04.558042 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 21 04:28:04.560617 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:28:04.560575 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd34b0eee_6deb_439e_9d04_8b7bb13c4408.slice/crio-acc5230295beb746f3242aa3be7ac63ebbae08ba2278b239a6d0d055d7e885da WatchSource:0}: Error finding container acc5230295beb746f3242aa3be7ac63ebbae08ba2278b239a6d0d055d7e885da: Status 404 returned error can't find the container with id acc5230295beb746f3242aa3be7ac63ebbae08ba2278b239a6d0d055d7e885da Apr 21 04:28:05.048660 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:05.048575 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g7q5r" event={"ID":"d0b4a7b4-e6a1-4816-a96e-0792f47539d9","Type":"ContainerStarted","Data":"1ef229729bbc50e4bd0b16bad4bf8a0d7e5e71bc8f3c2f51643957b98ad9d282"} Apr 21 04:28:05.050963 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:05.050934 2574 generic.go:358] "Generic (PLEG): container finished" podID="d34b0eee-6deb-439e-9d04-8b7bb13c4408" containerID="51bf7478ca6e316306737e56e91a0ed2d395c5caf43c464f4647c8471e93cced" exitCode=0 Apr 21 04:28:05.051107 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:05.051005 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d34b0eee-6deb-439e-9d04-8b7bb13c4408","Type":"ContainerDied","Data":"51bf7478ca6e316306737e56e91a0ed2d395c5caf43c464f4647c8471e93cced"} Apr 21 04:28:05.051107 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:05.051027 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d34b0eee-6deb-439e-9d04-8b7bb13c4408","Type":"ContainerStarted","Data":"acc5230295beb746f3242aa3be7ac63ebbae08ba2278b239a6d0d055d7e885da"} Apr 21 04:28:06.060937 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:06.060901 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d34b0eee-6deb-439e-9d04-8b7bb13c4408","Type":"ContainerStarted","Data":"07f06b9d70c0e954cfa341beaf99857e1db62880138fc3089dc3159359353ac3"} Apr 21 04:28:06.060937 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:06.060941 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d34b0eee-6deb-439e-9d04-8b7bb13c4408","Type":"ContainerStarted","Data":"35fc1182a4d12db4035c88e31bba696563c651ebb6443fd702fe4bb5519d973b"} Apr 21 04:28:06.061419 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:06.060952 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d34b0eee-6deb-439e-9d04-8b7bb13c4408","Type":"ContainerStarted","Data":"84292849e86d43de7ed05856f150e6b6dabb89308aacd71d192b1fd326b29d42"} Apr 21 04:28:06.061419 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:06.060964 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d34b0eee-6deb-439e-9d04-8b7bb13c4408","Type":"ContainerStarted","Data":"921c4b1a9b59a9f7705a17034a2f40fd16a3207446046f4b01455a24672c8271"} Apr 21 04:28:06.061419 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:06.060976 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d34b0eee-6deb-439e-9d04-8b7bb13c4408","Type":"ContainerStarted","Data":"668e54c786fb009037ed8500a716eb945b891da3c6a702e0511c0dd662b7c6e5"} Apr 21 04:28:06.061419 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:06.060989 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d34b0eee-6deb-439e-9d04-8b7bb13c4408","Type":"ContainerStarted","Data":"e3850005e7e3afa18d1cf192e80ef5a10e18ec902b6fd817decf4a44a608f0f0"} Apr 21 04:28:06.062440 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:06.062410 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g7q5r" event={"ID":"d0b4a7b4-e6a1-4816-a96e-0792f47539d9","Type":"ContainerStarted","Data":"b1f1e7ef5adb0c21501d4f648820e929f85cd6c4428fe257d86d8800f0b9b5c6"} Apr 21 04:28:06.062539 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:06.062449 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g7q5r" event={"ID":"d0b4a7b4-e6a1-4816-a96e-0792f47539d9","Type":"ContainerStarted","Data":"a83207f8cd89398e95e5367885e3261d64ef289748d5af7689e9ae71de795e22"} Apr 21 04:28:06.085214 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:06.085157 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.085139771 podStartE2EDuration="2.085139771s" podCreationTimestamp="2026-04-21 04:28:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:28:06.083157178 +0000 UTC m=+254.464260065" watchObservedRunningTime="2026-04-21 04:28:06.085139771 +0000 UTC m=+254.466242704" Apr 21 04:28:06.098150 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:06.098094 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-g7q5r" podStartSLOduration=253.174342939 podStartE2EDuration="4m14.098076489s" podCreationTimestamp="2026-04-21 04:23:52 +0000 UTC" firstStartedPulling="2026-04-21 04:28:04.315424235 +0000 UTC m=+252.696527102" lastFinishedPulling="2026-04-21 04:28:05.239157772 +0000 UTC m=+253.620260652" observedRunningTime="2026-04-21 04:28:06.097094576 +0000 UTC m=+254.478197464" watchObservedRunningTime="2026-04-21 04:28:06.098076489 +0000 UTC m=+254.479179377" Apr 21 04:28:09.419603 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:09.419566 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:28:52.056853 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:52.056822 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tv4p6_1b413c11-6c0e-410d-bffc-fdd6ba8e6689/ovn-acl-logging/0.log" Apr 21 04:28:52.057268 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:52.056919 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tv4p6_1b413c11-6c0e-410d-bffc-fdd6ba8e6689/ovn-acl-logging/0.log" Apr 21 04:28:52.059828 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:28:52.059807 2574 kubelet.go:1628] "Image garbage collection succeeded" Apr 21 04:29:04.419678 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:29:04.419642 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:29:04.435768 ip-10-0-139-26 kubenswrapper[2574]: I0421 
04:29:04.435735 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:29:05.254643 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:29:05.254613 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 21 04:30:44.795301 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:30:44.795263 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-2dd27"] Apr 21 04:30:44.798551 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:30:44.798531 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2dd27" Apr 21 04:30:44.800671 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:30:44.800648 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 21 04:30:44.801474 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:30:44.801457 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-tccpt\"" Apr 21 04:30:44.801564 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:30:44.801471 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 21 04:30:44.813906 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:30:44.813873 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-2dd27"] Apr 21 04:30:44.966417 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:30:44.966373 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph7t6\" (UniqueName: \"kubernetes.io/projected/ab4ccffc-f76f-4f2b-aa81-9b9a24c60963-kube-api-access-ph7t6\") pod \"openshift-lws-operator-bfc7f696d-2dd27\" (UID: \"ab4ccffc-f76f-4f2b-aa81-9b9a24c60963\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2dd27" Apr 21 04:30:44.966627 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:30:44.966445 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ab4ccffc-f76f-4f2b-aa81-9b9a24c60963-tmp\") pod \"openshift-lws-operator-bfc7f696d-2dd27\" (UID: \"ab4ccffc-f76f-4f2b-aa81-9b9a24c60963\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2dd27" Apr 21 04:30:45.067386 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:30:45.067298 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ab4ccffc-f76f-4f2b-aa81-9b9a24c60963-tmp\") pod \"openshift-lws-operator-bfc7f696d-2dd27\" (UID: \"ab4ccffc-f76f-4f2b-aa81-9b9a24c60963\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2dd27" Apr 21 04:30:45.067541 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:30:45.067393 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ph7t6\" (UniqueName: \"kubernetes.io/projected/ab4ccffc-f76f-4f2b-aa81-9b9a24c60963-kube-api-access-ph7t6\") pod \"openshift-lws-operator-bfc7f696d-2dd27\" (UID: \"ab4ccffc-f76f-4f2b-aa81-9b9a24c60963\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2dd27" Apr 21 04:30:45.067834 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:30:45.067810 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ab4ccffc-f76f-4f2b-aa81-9b9a24c60963-tmp\") pod \"openshift-lws-operator-bfc7f696d-2dd27\" (UID: \"ab4ccffc-f76f-4f2b-aa81-9b9a24c60963\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2dd27" Apr 21 04:30:45.075038 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:30:45.075007 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph7t6\" (UniqueName: \"kubernetes.io/projected/ab4ccffc-f76f-4f2b-aa81-9b9a24c60963-kube-api-access-ph7t6\") pod \"openshift-lws-operator-bfc7f696d-2dd27\" (UID: \"ab4ccffc-f76f-4f2b-aa81-9b9a24c60963\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2dd27" Apr 21 04:30:45.119527 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:30:45.119484 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2dd27" Apr 21 04:30:45.239936 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:30:45.239899 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-2dd27"] Apr 21 04:30:45.243237 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:30:45.243203 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab4ccffc_f76f_4f2b_aa81_9b9a24c60963.slice/crio-7f9eccbc2d4e85b1be491528dfeef468be8a117133b499a888f3affe044b2c3e WatchSource:0}: Error finding container 7f9eccbc2d4e85b1be491528dfeef468be8a117133b499a888f3affe044b2c3e: Status 404 returned error can't find the container with id 7f9eccbc2d4e85b1be491528dfeef468be8a117133b499a888f3affe044b2c3e Apr 21 04:30:45.244722 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:30:45.244702 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 04:30:45.533724 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:30:45.533686 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2dd27" event={"ID":"ab4ccffc-f76f-4f2b-aa81-9b9a24c60963","Type":"ContainerStarted","Data":"7f9eccbc2d4e85b1be491528dfeef468be8a117133b499a888f3affe044b2c3e"} Apr 21 04:30:48.544860 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:30:48.544818 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2dd27" event={"ID":"ab4ccffc-f76f-4f2b-aa81-9b9a24c60963","Type":"ContainerStarted","Data":"60ae689f31d6a83d7b740453c30c56edc2f2ce61a4a89688b4973b9e85b90337"} Apr 21 04:30:48.563958 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:30:48.563895 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-2dd27" podStartSLOduration=1.444082508 podStartE2EDuration="4.563877721s" podCreationTimestamp="2026-04-21 04:30:44 +0000 UTC" firstStartedPulling="2026-04-21 04:30:45.244859035 +0000 UTC m=+413.625961902" lastFinishedPulling="2026-04-21 04:30:48.364654246 +0000 UTC m=+416.745757115" observedRunningTime="2026-04-21 04:30:48.562867385 +0000 UTC m=+416.943970275" watchObservedRunningTime="2026-04-21 04:30:48.563877721 +0000 UTC m=+416.944980608" Apr 21 04:31:07.822245 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:07.822166 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-52l7q"] Apr 21 04:31:07.825466 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:07.825446 2574 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-52l7q" Apr 21 04:31:07.827850 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:07.827825 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 21 04:31:07.829415 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:07.829393 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 21 04:31:07.829521 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:07.829419 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-zqtks\"" Apr 21 04:31:07.829647 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:07.829630 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 21 04:31:07.829728 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:07.829670 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 21 04:31:07.840104 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:07.840078 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-52l7q"] Apr 21 04:31:07.856715 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:07.856686 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b5b6e822-b2b1-4333-a8b1-22d85ee382ff-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-52l7q\" (UID: \"b5b6e822-b2b1-4333-a8b1-22d85ee382ff\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-52l7q" Apr 21 04:31:07.856856 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:07.856721 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htxb7\" (UniqueName: \"kubernetes.io/projected/b5b6e822-b2b1-4333-a8b1-22d85ee382ff-kube-api-access-htxb7\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-52l7q\" (UID: \"b5b6e822-b2b1-4333-a8b1-22d85ee382ff\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-52l7q" Apr 21 04:31:07.856856 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:07.856800 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b5b6e822-b2b1-4333-a8b1-22d85ee382ff-webhook-cert\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-52l7q\" (UID: \"b5b6e822-b2b1-4333-a8b1-22d85ee382ff\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-52l7q" Apr 21 04:31:07.957253 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:07.957214 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b5b6e822-b2b1-4333-a8b1-22d85ee382ff-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-52l7q\" (UID: \"b5b6e822-b2b1-4333-a8b1-22d85ee382ff\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-52l7q" Apr 21 04:31:07.957253 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:07.957255 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-htxb7\" (UniqueName: \"kubernetes.io/projected/b5b6e822-b2b1-4333-a8b1-22d85ee382ff-kube-api-access-htxb7\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-52l7q\" (UID: \"b5b6e822-b2b1-4333-a8b1-22d85ee382ff\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-52l7q" Apr 21 04:31:07.957464 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:07.957315 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b5b6e822-b2b1-4333-a8b1-22d85ee382ff-webhook-cert\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-52l7q\" (UID: \"b5b6e822-b2b1-4333-a8b1-22d85ee382ff\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-52l7q" Apr 21 04:31:07.959765 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:07.959740 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b5b6e822-b2b1-4333-a8b1-22d85ee382ff-webhook-cert\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-52l7q\" (UID: \"b5b6e822-b2b1-4333-a8b1-22d85ee382ff\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-52l7q" Apr 21 04:31:07.959877 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:07.959770 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b5b6e822-b2b1-4333-a8b1-22d85ee382ff-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-52l7q\" (UID: \"b5b6e822-b2b1-4333-a8b1-22d85ee382ff\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-52l7q" Apr 21 04:31:07.965495 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:07.965474 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-htxb7\" (UniqueName: \"kubernetes.io/projected/b5b6e822-b2b1-4333-a8b1-22d85ee382ff-kube-api-access-htxb7\") pod \"opendatahub-operator-controller-manager-5b6f69cdb8-52l7q\" (UID: \"b5b6e822-b2b1-4333-a8b1-22d85ee382ff\") " pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-52l7q" Apr 21 04:31:08.136299 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:08.136205 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-52l7q" Apr 21 04:31:08.258666 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:08.258516 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-52l7q"] Apr 21 04:31:08.261234 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:31:08.261197 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5b6e822_b2b1_4333_a8b1_22d85ee382ff.slice/crio-f08b6ca05ff488f1f957e9bbd540042468f9ec7fd0c749a58f188bf65d23d225 WatchSource:0}: Error finding container f08b6ca05ff488f1f957e9bbd540042468f9ec7fd0c749a58f188bf65d23d225: Status 404 returned error can't find the container with id f08b6ca05ff488f1f957e9bbd540042468f9ec7fd0c749a58f188bf65d23d225 Apr 21 04:31:08.605548 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:08.605508 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-52l7q" event={"ID":"b5b6e822-b2b1-4333-a8b1-22d85ee382ff","Type":"ContainerStarted","Data":"f08b6ca05ff488f1f957e9bbd540042468f9ec7fd0c749a58f188bf65d23d225"} Apr 21 04:31:11.617619 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:11.617561 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-52l7q" event={"ID":"b5b6e822-b2b1-4333-a8b1-22d85ee382ff","Type":"ContainerStarted","Data":"e097ca6221de09d5fec97463d9bddfed2fbdaa6c786ac5546ce1ada8b2e907eb"} Apr 21 04:31:11.618100 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:11.617667 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-52l7q" Apr 21 04:31:11.635132 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:11.635077 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-52l7q" podStartSLOduration=2.101534976 podStartE2EDuration="4.63506455s" podCreationTimestamp="2026-04-21 04:31:07 +0000 UTC" firstStartedPulling="2026-04-21 04:31:08.262943415 +0000 UTC m=+436.644046284" lastFinishedPulling="2026-04-21 04:31:10.796472981 +0000 UTC m=+439.177575858" observedRunningTime="2026-04-21 04:31:11.634042684 +0000 UTC m=+440.015145569" watchObservedRunningTime="2026-04-21 04:31:11.63506455 +0000 UTC m=+440.016167438" Apr 21 04:31:13.655814 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:13.655776 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-579f6d4cb9-664wz"] Apr 21 04:31:13.659281 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:13.659261 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-664wz" Apr 21 04:31:13.662171 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:13.662150 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 21 04:31:13.662286 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:13.662193 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-z9g9q\"" Apr 21 04:31:13.662286 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:13.662227 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 21 04:31:13.662286 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:13.662235 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 21 04:31:13.668387 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:13.668362 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-579f6d4cb9-664wz"] Apr 21 04:31:13.705433 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:13.705397 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6eebfacc-87b9-41a5-9b26-5e2d40d339f1-metrics-cert\") pod \"lws-controller-manager-579f6d4cb9-664wz\" (UID: \"6eebfacc-87b9-41a5-9b26-5e2d40d339f1\") " pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-664wz" Apr 21 04:31:13.705433 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:13.705431 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vr8g\" (UniqueName: \"kubernetes.io/projected/6eebfacc-87b9-41a5-9b26-5e2d40d339f1-kube-api-access-2vr8g\") pod \"lws-controller-manager-579f6d4cb9-664wz\" (UID: \"6eebfacc-87b9-41a5-9b26-5e2d40d339f1\") " pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-664wz" Apr 21 04:31:13.705715 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:13.705469 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6eebfacc-87b9-41a5-9b26-5e2d40d339f1-cert\") pod \"lws-controller-manager-579f6d4cb9-664wz\" (UID: \"6eebfacc-87b9-41a5-9b26-5e2d40d339f1\") " pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-664wz" Apr 21 04:31:13.705715 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:13.705686 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6eebfacc-87b9-41a5-9b26-5e2d40d339f1-manager-config\") pod \"lws-controller-manager-579f6d4cb9-664wz\" (UID: \"6eebfacc-87b9-41a5-9b26-5e2d40d339f1\") " pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-664wz" Apr 21 04:31:13.806537 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:13.806502 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6eebfacc-87b9-41a5-9b26-5e2d40d339f1-metrics-cert\") pod \"lws-controller-manager-579f6d4cb9-664wz\" (UID: \"6eebfacc-87b9-41a5-9b26-5e2d40d339f1\") " pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-664wz" Apr 21 04:31:13.806537 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:13.806549 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2vr8g\" (UniqueName: \"kubernetes.io/projected/6eebfacc-87b9-41a5-9b26-5e2d40d339f1-kube-api-access-2vr8g\") pod \"lws-controller-manager-579f6d4cb9-664wz\" (UID: \"6eebfacc-87b9-41a5-9b26-5e2d40d339f1\") " pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-664wz" Apr 21 04:31:13.806811 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:13.806623 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6eebfacc-87b9-41a5-9b26-5e2d40d339f1-cert\") pod \"lws-controller-manager-579f6d4cb9-664wz\" (UID: \"6eebfacc-87b9-41a5-9b26-5e2d40d339f1\") " pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-664wz" Apr 21 04:31:13.806811 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:13.806670 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6eebfacc-87b9-41a5-9b26-5e2d40d339f1-manager-config\") pod \"lws-controller-manager-579f6d4cb9-664wz\" (UID: \"6eebfacc-87b9-41a5-9b26-5e2d40d339f1\") " pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-664wz" Apr 21 04:31:13.807460 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:13.807436 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6eebfacc-87b9-41a5-9b26-5e2d40d339f1-manager-config\") pod \"lws-controller-manager-579f6d4cb9-664wz\" (UID: \"6eebfacc-87b9-41a5-9b26-5e2d40d339f1\") " pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-664wz" Apr 21 04:31:13.809299 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:13.809277 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6eebfacc-87b9-41a5-9b26-5e2d40d339f1-metrics-cert\") pod \"lws-controller-manager-579f6d4cb9-664wz\" (UID: \"6eebfacc-87b9-41a5-9b26-5e2d40d339f1\") " pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-664wz" Apr 21 04:31:13.810100 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:13.810080 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6eebfacc-87b9-41a5-9b26-5e2d40d339f1-cert\") pod \"lws-controller-manager-579f6d4cb9-664wz\" (UID: \"6eebfacc-87b9-41a5-9b26-5e2d40d339f1\") " pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-664wz" Apr 21 04:31:13.819168 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:13.819137 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vr8g\" (UniqueName: \"kubernetes.io/projected/6eebfacc-87b9-41a5-9b26-5e2d40d339f1-kube-api-access-2vr8g\") pod \"lws-controller-manager-579f6d4cb9-664wz\" (UID: \"6eebfacc-87b9-41a5-9b26-5e2d40d339f1\") " pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-664wz" Apr 21 04:31:13.969481 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:13.969442 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-664wz" Apr 21 04:31:14.088634 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:14.088548 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-579f6d4cb9-664wz"] Apr 21 04:31:14.091080 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:31:14.091042 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eebfacc_87b9_41a5_9b26_5e2d40d339f1.slice/crio-e6acddd8b48a3d9808048d260338c901b230ec8b28d44fdd78132e17c3c6b2cf WatchSource:0}: Error finding container e6acddd8b48a3d9808048d260338c901b230ec8b28d44fdd78132e17c3c6b2cf: Status 404 returned error can't find the container with id e6acddd8b48a3d9808048d260338c901b230ec8b28d44fdd78132e17c3c6b2cf Apr 21 04:31:14.627620 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:14.627568 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-664wz" event={"ID":"6eebfacc-87b9-41a5-9b26-5e2d40d339f1","Type":"ContainerStarted","Data":"e6acddd8b48a3d9808048d260338c901b230ec8b28d44fdd78132e17c3c6b2cf"} Apr 21 04:31:16.635550 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:16.635516 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-664wz" event={"ID":"6eebfacc-87b9-41a5-9b26-5e2d40d339f1","Type":"ContainerStarted","Data":"f26f31095e40d0c47063859c4a6fbdde8077fd5f23307208acf3cc90848f896a"} Apr 21 04:31:16.635962 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:16.635580 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-664wz" Apr 21 04:31:16.651904 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:16.651856 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-664wz" podStartSLOduration=2.087925516 podStartE2EDuration="3.651842097s" podCreationTimestamp="2026-04-21 04:31:13 +0000 UTC" firstStartedPulling="2026-04-21 04:31:14.092948987 +0000 UTC m=+442.474051853" lastFinishedPulling="2026-04-21 04:31:15.656865564 +0000 UTC m=+444.037968434" observedRunningTime="2026-04-21 04:31:16.650884354 +0000 UTC m=+445.031987241" watchObservedRunningTime="2026-04-21 04:31:16.651842097 +0000 UTC m=+445.032944984" Apr 21 04:31:22.623620 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:22.623560 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5b6f69cdb8-52l7q" Apr 21 04:31:25.413479 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:25.413436 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-5f98864f9-t6qwc"] Apr 21 04:31:25.418430 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:25.418401 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5f98864f9-t6qwc" Apr 21 04:31:25.420638 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:25.420610 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 21 04:31:25.420777 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:25.420585 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 04:31:25.421383 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:25.421351 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-m778c\"" Apr 21 04:31:25.421477 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:25.421385 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 04:31:25.421477 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:25.421453 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 21 04:31:25.424330 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:25.424308 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5f98864f9-t6qwc"] Apr 21 04:31:25.503963 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:25.503916 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d01edc4d-43c6-4980-9d79-1d640eee85db-tmp\") pod \"kube-auth-proxy-5f98864f9-t6qwc\" (UID: \"d01edc4d-43c6-4980-9d79-1d640eee85db\") " pod="openshift-ingress/kube-auth-proxy-5f98864f9-t6qwc" Apr 21 04:31:25.503963 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:25.503967 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d01edc4d-43c6-4980-9d79-1d640eee85db-tls-certs\") pod \"kube-auth-proxy-5f98864f9-t6qwc\" (UID: \"d01edc4d-43c6-4980-9d79-1d640eee85db\") " pod="openshift-ingress/kube-auth-proxy-5f98864f9-t6qwc" Apr 21 04:31:25.504182 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:25.503985 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp9nt\" (UniqueName: \"kubernetes.io/projected/d01edc4d-43c6-4980-9d79-1d640eee85db-kube-api-access-mp9nt\") pod \"kube-auth-proxy-5f98864f9-t6qwc\" (UID: \"d01edc4d-43c6-4980-9d79-1d640eee85db\") " pod="openshift-ingress/kube-auth-proxy-5f98864f9-t6qwc" Apr 21 04:31:25.604502 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:25.604458 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d01edc4d-43c6-4980-9d79-1d640eee85db-tmp\") pod \"kube-auth-proxy-5f98864f9-t6qwc\" (UID: \"d01edc4d-43c6-4980-9d79-1d640eee85db\") " pod="openshift-ingress/kube-auth-proxy-5f98864f9-t6qwc" Apr 21 04:31:25.604701 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:25.604510 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d01edc4d-43c6-4980-9d79-1d640eee85db-tls-certs\") pod \"kube-auth-proxy-5f98864f9-t6qwc\" (UID: \"d01edc4d-43c6-4980-9d79-1d640eee85db\") " pod="openshift-ingress/kube-auth-proxy-5f98864f9-t6qwc" Apr 21 04:31:25.604701 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:25.604539 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mp9nt\" (UniqueName: \"kubernetes.io/projected/d01edc4d-43c6-4980-9d79-1d640eee85db-kube-api-access-mp9nt\") pod \"kube-auth-proxy-5f98864f9-t6qwc\" (UID: \"d01edc4d-43c6-4980-9d79-1d640eee85db\") " pod="openshift-ingress/kube-auth-proxy-5f98864f9-t6qwc" Apr 21 04:31:25.606929 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:25.606890 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d01edc4d-43c6-4980-9d79-1d640eee85db-tmp\") pod \"kube-auth-proxy-5f98864f9-t6qwc\" (UID: \"d01edc4d-43c6-4980-9d79-1d640eee85db\") " pod="openshift-ingress/kube-auth-proxy-5f98864f9-t6qwc" Apr 21 04:31:25.607091 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:25.607068 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d01edc4d-43c6-4980-9d79-1d640eee85db-tls-certs\") pod \"kube-auth-proxy-5f98864f9-t6qwc\" (UID: \"d01edc4d-43c6-4980-9d79-1d640eee85db\") " pod="openshift-ingress/kube-auth-proxy-5f98864f9-t6qwc" Apr 21 04:31:25.611888 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:25.611860 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp9nt\" (UniqueName: \"kubernetes.io/projected/d01edc4d-43c6-4980-9d79-1d640eee85db-kube-api-access-mp9nt\") pod \"kube-auth-proxy-5f98864f9-t6qwc\" (UID: \"d01edc4d-43c6-4980-9d79-1d640eee85db\") " pod="openshift-ingress/kube-auth-proxy-5f98864f9-t6qwc" Apr 21 04:31:25.731210 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:25.731172 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-5f98864f9-t6qwc" Apr 21 04:31:25.855773 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:25.855737 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-5f98864f9-t6qwc"] Apr 21 04:31:25.859016 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:31:25.858987 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd01edc4d_43c6_4980_9d79_1d640eee85db.slice/crio-7b0d1b7605e32f467098049747e61cbc6e81ced4c9f29562bc9ddd33e6e2b5a4 WatchSource:0}: Error finding container 7b0d1b7605e32f467098049747e61cbc6e81ced4c9f29562bc9ddd33e6e2b5a4: Status 404 returned error can't find the container with id 7b0d1b7605e32f467098049747e61cbc6e81ced4c9f29562bc9ddd33e6e2b5a4 Apr 21 04:31:26.670615 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:26.670477 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5f98864f9-t6qwc" event={"ID":"d01edc4d-43c6-4980-9d79-1d640eee85db","Type":"ContainerStarted","Data":"7b0d1b7605e32f467098049747e61cbc6e81ced4c9f29562bc9ddd33e6e2b5a4"} Apr 21 04:31:27.641860 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:27.641831 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-579f6d4cb9-664wz" Apr 21 04:31:29.682613 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:29.682562 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-5f98864f9-t6qwc" event={"ID":"d01edc4d-43c6-4980-9d79-1d640eee85db","Type":"ContainerStarted","Data":"8a9c8483cb07fa4751d43142504e037366d137939e4c267abff2c8bc49cf99de"} Apr 21 04:31:29.697576 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:31:29.697441 2574 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-ingress/kube-auth-proxy-5f98864f9-t6qwc" podStartSLOduration=1.530787241 podStartE2EDuration="4.697422487s" podCreationTimestamp="2026-04-21 04:31:25 +0000 UTC" firstStartedPulling="2026-04-21 04:31:25.860672391 +0000 UTC m=+454.241775257" lastFinishedPulling="2026-04-21 04:31:29.027307637 +0000 UTC m=+457.408410503" observedRunningTime="2026-04-21 04:31:29.697295705 +0000 UTC m=+458.078398594" watchObservedRunningTime="2026-04-21 04:31:29.697422487 +0000 UTC m=+458.078525377" Apr 21 04:33:14.922007 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:14.921964 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-4dhfg"] Apr 21 04:33:14.925096 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:14.925069 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-4dhfg" Apr 21 04:33:14.927547 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:14.927510 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 21 04:33:14.927708 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:14.927539 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 21 04:33:14.927708 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:14.927543 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-cbpgz\"" Apr 21 04:33:14.927708 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:14.927509 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 21 04:33:14.927708 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:14.927606 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 21 04:33:14.933771 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:14.933746 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-4dhfg"] Apr 21 04:33:15.024662 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:15.024579 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/40cd261b-27b6-495b-87b6-5b7734d535cd-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-4dhfg\" (UID: \"40cd261b-27b6-495b-87b6-5b7734d535cd\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-4dhfg" Apr 21 04:33:15.024844 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:15.024742 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/40cd261b-27b6-495b-87b6-5b7734d535cd-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-4dhfg\" (UID: \"40cd261b-27b6-495b-87b6-5b7734d535cd\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-4dhfg" Apr 21 04:33:15.024844 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:15.024780 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9h4b\" (UniqueName: \"kubernetes.io/projected/40cd261b-27b6-495b-87b6-5b7734d535cd-kube-api-access-z9h4b\") pod \"kuadrant-console-plugin-6cb54b5c86-4dhfg\" (UID: \"40cd261b-27b6-495b-87b6-5b7734d535cd\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-4dhfg" Apr 21 04:33:15.125169 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:15.125129 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/40cd261b-27b6-495b-87b6-5b7734d535cd-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-4dhfg\" (UID: \"40cd261b-27b6-495b-87b6-5b7734d535cd\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-4dhfg" Apr 21 04:33:15.125374 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:15.125196 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/40cd261b-27b6-495b-87b6-5b7734d535cd-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-4dhfg\" (UID: \"40cd261b-27b6-495b-87b6-5b7734d535cd\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-4dhfg" Apr 21 04:33:15.125374 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:15.125220 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9h4b\" (UniqueName: \"kubernetes.io/projected/40cd261b-27b6-495b-87b6-5b7734d535cd-kube-api-access-z9h4b\") pod \"kuadrant-console-plugin-6cb54b5c86-4dhfg\" (UID: \"40cd261b-27b6-495b-87b6-5b7734d535cd\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-4dhfg" Apr 21 04:33:15.125920 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:15.125888 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/40cd261b-27b6-495b-87b6-5b7734d535cd-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-4dhfg\" (UID: \"40cd261b-27b6-495b-87b6-5b7734d535cd\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-4dhfg" Apr 21 04:33:15.127715 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:15.127692 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/40cd261b-27b6-495b-87b6-5b7734d535cd-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-4dhfg\" (UID: \"40cd261b-27b6-495b-87b6-5b7734d535cd\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-4dhfg" Apr 21 04:33:15.136302 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:15.136268 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9h4b\" (UniqueName: \"kubernetes.io/projected/40cd261b-27b6-495b-87b6-5b7734d535cd-kube-api-access-z9h4b\") pod \"kuadrant-console-plugin-6cb54b5c86-4dhfg\" (UID: \"40cd261b-27b6-495b-87b6-5b7734d535cd\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-4dhfg" Apr 21 04:33:15.234804 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:15.234769 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-4dhfg" Apr 21 04:33:15.356615 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:15.356559 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-4dhfg"] Apr 21 04:33:16.039169 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:16.039125 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-4dhfg" event={"ID":"40cd261b-27b6-495b-87b6-5b7734d535cd","Type":"ContainerStarted","Data":"f99cd1c60c6d01fe0625f8978d7c1b7f0698e912321575790b60246e4d7d5d82"} Apr 21 04:33:40.132168 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:40.132128 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-4dhfg" event={"ID":"40cd261b-27b6-495b-87b6-5b7734d535cd","Type":"ContainerStarted","Data":"0665fd6d55514471f02b57146f6418441b205c0081591044f2ba46b5262d196e"} Apr 21 04:33:40.147531 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:40.147452 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-4dhfg" podStartSLOduration=2.077657351 podStartE2EDuration="26.147428443s" podCreationTimestamp="2026-04-21 04:33:14 +0000 UTC" firstStartedPulling="2026-04-21 04:33:15.362205722 +0000 UTC m=+563.743308588" lastFinishedPulling="2026-04-21 04:33:39.431976808 +0000 UTC m=+587.813079680" observedRunningTime="2026-04-21 04:33:40.145777597 +0000 UTC m=+588.526880499" watchObservedRunningTime="2026-04-21 04:33:40.147428443 +0000 UTC m=+588.528531330" Apr 21 04:33:40.850988 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:40.850951 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xblq8"] Apr 21 04:33:40.878914 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:40.878877 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xblq8"] Apr 21 04:33:40.879063 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:40.879003 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xblq8" Apr 21 04:33:40.881515 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:40.881483 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-vc2h7\"" Apr 21 04:33:40.963307 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:40.963268 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a7a9d821-f536-4e4e-96e1-3822ebb3972e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-xblq8\" (UID: \"a7a9d821-f536-4e4e-96e1-3822ebb3972e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xblq8" Apr 21 04:33:40.963466 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:40.963371 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sv7z\" (UniqueName: \"kubernetes.io/projected/a7a9d821-f536-4e4e-96e1-3822ebb3972e-kube-api-access-5sv7z\") pod \"kuadrant-operator-controller-manager-55c7f4c975-xblq8\" (UID: \"a7a9d821-f536-4e4e-96e1-3822ebb3972e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xblq8" Apr 21 04:33:41.064162 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:41.064125 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5sv7z\" (UniqueName: \"kubernetes.io/projected/a7a9d821-f536-4e4e-96e1-3822ebb3972e-kube-api-access-5sv7z\") pod \"kuadrant-operator-controller-manager-55c7f4c975-xblq8\" (UID: \"a7a9d821-f536-4e4e-96e1-3822ebb3972e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xblq8" Apr 21 04:33:41.064323 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:41.064185 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a7a9d821-f536-4e4e-96e1-3822ebb3972e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-xblq8\" (UID: \"a7a9d821-f536-4e4e-96e1-3822ebb3972e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xblq8" Apr 21 04:33:41.064475 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:41.064458 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/a7a9d821-f536-4e4e-96e1-3822ebb3972e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-xblq8\" (UID: \"a7a9d821-f536-4e4e-96e1-3822ebb3972e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xblq8" Apr 21 04:33:41.072160 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:41.072140 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sv7z\" (UniqueName: \"kubernetes.io/projected/a7a9d821-f536-4e4e-96e1-3822ebb3972e-kube-api-access-5sv7z\") pod \"kuadrant-operator-controller-manager-55c7f4c975-xblq8\" (UID: \"a7a9d821-f536-4e4e-96e1-3822ebb3972e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xblq8" Apr 21 04:33:41.190259 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:41.190169 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xblq8" Apr 21 04:33:41.315519 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:41.315487 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xblq8"] Apr 21 04:33:41.319247 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:33:41.319214 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7a9d821_f536_4e4e_96e1_3822ebb3972e.slice/crio-e0dbe8bfa7e6bfaeb93cdf3e4a7b3a3fed0db2c3c9bcdfa7c0758c0735d8db3f WatchSource:0}: Error finding container e0dbe8bfa7e6bfaeb93cdf3e4a7b3a3fed0db2c3c9bcdfa7c0758c0735d8db3f: Status 404 returned error can't find the container with id e0dbe8bfa7e6bfaeb93cdf3e4a7b3a3fed0db2c3c9bcdfa7c0758c0735d8db3f Apr 21 04:33:42.139548 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:42.139499 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xblq8" event={"ID":"a7a9d821-f536-4e4e-96e1-3822ebb3972e","Type":"ContainerStarted","Data":"e0dbe8bfa7e6bfaeb93cdf3e4a7b3a3fed0db2c3c9bcdfa7c0758c0735d8db3f"} Apr 21 04:33:47.163444 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:47.163321 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xblq8" event={"ID":"a7a9d821-f536-4e4e-96e1-3822ebb3972e","Type":"ContainerStarted","Data":"50e490b71b033161e11823501f72433883e5d53726e7831e2f6c2fbe2ac79404"} Apr 21 04:33:47.163444 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:47.163418 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xblq8" Apr 21 04:33:47.185177 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:47.185119 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xblq8" podStartSLOduration=1.779314529 podStartE2EDuration="7.185102957s" podCreationTimestamp="2026-04-21 04:33:40 +0000 UTC" firstStartedPulling="2026-04-21 04:33:41.321513825 +0000 UTC m=+589.702616691" lastFinishedPulling="2026-04-21 04:33:46.727302251 +0000 UTC m=+595.108405119" observedRunningTime="2026-04-21 04:33:47.183181566 +0000 UTC m=+595.564284453" watchObservedRunningTime="2026-04-21 04:33:47.185102957 +0000 UTC m=+595.566205845" Apr 21 04:33:52.081965 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:52.081933 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tv4p6_1b413c11-6c0e-410d-bffc-fdd6ba8e6689/ovn-acl-logging/0.log" Apr 21 04:33:52.082358 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:52.082279 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tv4p6_1b413c11-6c0e-410d-bffc-fdd6ba8e6689/ovn-acl-logging/0.log" Apr 21 04:33:58.168953 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:33:58.168923 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-xblq8" Apr 21 04:34:17.163412 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:34:17.163371 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-dvr2q"] Apr 21 04:34:17.245870 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:34:17.245839 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/authorino-7498df8756-dvr2q"] Apr 21 04:34:17.245982 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:34:17.245970 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-dvr2q" Apr 21 04:34:17.249478 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:34:17.249457 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-tzgn6\"" Apr 21 04:34:17.286257 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:34:17.286214 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rjv8\" (UniqueName: \"kubernetes.io/projected/195b4e5d-dc5a-4e0d-9800-80a2f60a926f-kube-api-access-7rjv8\") pod \"authorino-7498df8756-dvr2q\" (UID: \"195b4e5d-dc5a-4e0d-9800-80a2f60a926f\") " pod="kuadrant-system/authorino-7498df8756-dvr2q" Apr 21 04:34:17.386721 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:34:17.386672 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7rjv8\" (UniqueName: \"kubernetes.io/projected/195b4e5d-dc5a-4e0d-9800-80a2f60a926f-kube-api-access-7rjv8\") pod \"authorino-7498df8756-dvr2q\" (UID: \"195b4e5d-dc5a-4e0d-9800-80a2f60a926f\") " pod="kuadrant-system/authorino-7498df8756-dvr2q" Apr 21 04:34:17.402550 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:34:17.402522 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rjv8\" (UniqueName: \"kubernetes.io/projected/195b4e5d-dc5a-4e0d-9800-80a2f60a926f-kube-api-access-7rjv8\") pod \"authorino-7498df8756-dvr2q\" (UID: \"195b4e5d-dc5a-4e0d-9800-80a2f60a926f\") " pod="kuadrant-system/authorino-7498df8756-dvr2q" Apr 21 04:34:17.554611 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:34:17.554549 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-dvr2q" Apr 21 04:34:17.683172 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:34:17.683147 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-dvr2q"] Apr 21 04:34:17.685762 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:34:17.685732 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod195b4e5d_dc5a_4e0d_9800_80a2f60a926f.slice/crio-88407f106aceb9c9e2f84ccf93c7b952119a0b8ff6adbfbb96f136dd04bf499b WatchSource:0}: Error finding container 88407f106aceb9c9e2f84ccf93c7b952119a0b8ff6adbfbb96f136dd04bf499b: Status 404 returned error can't find the container with id 88407f106aceb9c9e2f84ccf93c7b952119a0b8ff6adbfbb96f136dd04bf499b Apr 21 04:34:18.266026 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:34:18.265975 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-dvr2q" event={"ID":"195b4e5d-dc5a-4e0d-9800-80a2f60a926f","Type":"ContainerStarted","Data":"88407f106aceb9c9e2f84ccf93c7b952119a0b8ff6adbfbb96f136dd04bf499b"} Apr 21 04:34:21.278730 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:34:21.278691 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-dvr2q" event={"ID":"195b4e5d-dc5a-4e0d-9800-80a2f60a926f","Type":"ContainerStarted","Data":"2870d735652f97d2d22dd739f1de1cfa77defd81ca4a14447a92a07782e28aca"} Apr 21 04:34:21.294129 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:34:21.294072 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-dvr2q" podStartSLOduration=0.857774295 podStartE2EDuration="4.294056366s" podCreationTimestamp="2026-04-21 04:34:17 +0000 UTC" firstStartedPulling="2026-04-21 04:34:17.68715218 +0000 UTC m=+626.068255049" lastFinishedPulling="2026-04-21 04:34:21.123434253 +0000 UTC m=+629.504537120" observedRunningTime="2026-04-21 04:34:21.292729311 +0000 UTC m=+629.673832198" watchObservedRunningTime="2026-04-21 04:34:21.294056366 +0000 UTC m=+629.675159254" Apr 21 04:35:43.123958 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:43.123870 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-dvr2q"] Apr 21 04:35:43.124406 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:43.124158 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-dvr2q" podUID="195b4e5d-dc5a-4e0d-9800-80a2f60a926f" containerName="authorino" containerID="cri-o://2870d735652f97d2d22dd739f1de1cfa77defd81ca4a14447a92a07782e28aca" gracePeriod=30 Apr 21 04:35:43.363975 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:43.363947 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-dvr2q" Apr 21 04:35:43.542384 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:43.542351 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rjv8\" (UniqueName: \"kubernetes.io/projected/195b4e5d-dc5a-4e0d-9800-80a2f60a926f-kube-api-access-7rjv8\") pod \"195b4e5d-dc5a-4e0d-9800-80a2f60a926f\" (UID: \"195b4e5d-dc5a-4e0d-9800-80a2f60a926f\") " Apr 21 04:35:43.544603 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:43.544553 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/195b4e5d-dc5a-4e0d-9800-80a2f60a926f-kube-api-access-7rjv8" (OuterVolumeSpecName: "kube-api-access-7rjv8") pod "195b4e5d-dc5a-4e0d-9800-80a2f60a926f" (UID: "195b4e5d-dc5a-4e0d-9800-80a2f60a926f"). InnerVolumeSpecName "kube-api-access-7rjv8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:35:43.562105 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:43.562077 2574 generic.go:358] "Generic (PLEG): container finished" podID="195b4e5d-dc5a-4e0d-9800-80a2f60a926f" containerID="2870d735652f97d2d22dd739f1de1cfa77defd81ca4a14447a92a07782e28aca" exitCode=0 Apr 21 04:35:43.562225 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:43.562125 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-dvr2q" Apr 21 04:35:43.562225 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:43.562141 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-dvr2q" event={"ID":"195b4e5d-dc5a-4e0d-9800-80a2f60a926f","Type":"ContainerDied","Data":"2870d735652f97d2d22dd739f1de1cfa77defd81ca4a14447a92a07782e28aca"} Apr 21 04:35:43.562225 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:43.562167 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-dvr2q" event={"ID":"195b4e5d-dc5a-4e0d-9800-80a2f60a926f","Type":"ContainerDied","Data":"88407f106aceb9c9e2f84ccf93c7b952119a0b8ff6adbfbb96f136dd04bf499b"} Apr 21 04:35:43.562225 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:43.562182 2574 scope.go:117] "RemoveContainer" containerID="2870d735652f97d2d22dd739f1de1cfa77defd81ca4a14447a92a07782e28aca" Apr 21 04:35:43.571363 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:43.571346 2574 scope.go:117] "RemoveContainer" containerID="2870d735652f97d2d22dd739f1de1cfa77defd81ca4a14447a92a07782e28aca" Apr 21 04:35:43.571628 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:35:43.571601 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2870d735652f97d2d22dd739f1de1cfa77defd81ca4a14447a92a07782e28aca\": container with ID starting with 2870d735652f97d2d22dd739f1de1cfa77defd81ca4a14447a92a07782e28aca not found: ID does not exist" containerID="2870d735652f97d2d22dd739f1de1cfa77defd81ca4a14447a92a07782e28aca" Apr 21 04:35:43.571681 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:43.571641 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2870d735652f97d2d22dd739f1de1cfa77defd81ca4a14447a92a07782e28aca"} err="failed to get container status \"2870d735652f97d2d22dd739f1de1cfa77defd81ca4a14447a92a07782e28aca\": rpc error: code = NotFound desc = could not find container \"2870d735652f97d2d22dd739f1de1cfa77defd81ca4a14447a92a07782e28aca\": container with ID starting with 2870d735652f97d2d22dd739f1de1cfa77defd81ca4a14447a92a07782e28aca 
not found: ID does not exist" Apr 21 04:35:43.582805 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:43.582781 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-dvr2q"] Apr 21 04:35:43.586446 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:43.586424 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-dvr2q"] Apr 21 04:35:43.643298 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:43.643267 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7rjv8\" (UniqueName: \"kubernetes.io/projected/195b4e5d-dc5a-4e0d-9800-80a2f60a926f-kube-api-access-7rjv8\") on node \"ip-10-0-139-26.ec2.internal\" DevicePath \"\"" Apr 21 04:35:43.849245 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:43.849164 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-fqtgh"] Apr 21 04:35:43.849493 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:43.849482 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="195b4e5d-dc5a-4e0d-9800-80a2f60a926f" containerName="authorino" Apr 21 04:35:43.849535 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:43.849495 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="195b4e5d-dc5a-4e0d-9800-80a2f60a926f" containerName="authorino" Apr 21 04:35:43.849604 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:43.849581 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="195b4e5d-dc5a-4e0d-9800-80a2f60a926f" containerName="authorino" Apr 21 04:35:43.851871 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:43.851848 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-fqtgh" Apr 21 04:35:43.856213 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:43.856189 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-l2vw6\"" Apr 21 04:35:43.863981 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:43.863952 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-fqtgh"] Apr 21 04:35:43.945213 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:43.945171 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7pld\" (UniqueName: \"kubernetes.io/projected/90abea31-6fa7-4ea8-989d-e8d8552052c1-kube-api-access-q7pld\") pod \"maas-controller-6d4c8f55f9-fqtgh\" (UID: \"90abea31-6fa7-4ea8-989d-e8d8552052c1\") " pod="opendatahub/maas-controller-6d4c8f55f9-fqtgh" Apr 21 04:35:44.045854 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:44.045811 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7pld\" (UniqueName: \"kubernetes.io/projected/90abea31-6fa7-4ea8-989d-e8d8552052c1-kube-api-access-q7pld\") pod \"maas-controller-6d4c8f55f9-fqtgh\" (UID: \"90abea31-6fa7-4ea8-989d-e8d8552052c1\") " pod="opendatahub/maas-controller-6d4c8f55f9-fqtgh" Apr 21 04:35:44.054477 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:44.054452 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7pld\" (UniqueName: \"kubernetes.io/projected/90abea31-6fa7-4ea8-989d-e8d8552052c1-kube-api-access-q7pld\") pod \"maas-controller-6d4c8f55f9-fqtgh\" (UID: \"90abea31-6fa7-4ea8-989d-e8d8552052c1\") " pod="opendatahub/maas-controller-6d4c8f55f9-fqtgh" Apr 21 04:35:44.107625 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:44.107505 
2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-fqtgh"] Apr 21 04:35:44.107857 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:44.107841 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-fqtgh" Apr 21 04:35:44.178248 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:44.178220 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="195b4e5d-dc5a-4e0d-9800-80a2f60a926f" path="/var/lib/kubelet/pods/195b4e5d-dc5a-4e0d-9800-80a2f60a926f/volumes" Apr 21 04:35:44.230671 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:44.230649 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-fqtgh"] Apr 21 04:35:44.232708 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:35:44.232676 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90abea31_6fa7_4ea8_989d_e8d8552052c1.slice/crio-a03e94e6c1a195bb6482eb90d7110d499d2808f7c906234f745fca3c395d7766 WatchSource:0}: Error finding container a03e94e6c1a195bb6482eb90d7110d499d2808f7c906234f745fca3c395d7766: Status 404 returned error can't find the container with id a03e94e6c1a195bb6482eb90d7110d499d2808f7c906234f745fca3c395d7766 Apr 21 04:35:44.569086 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:44.569032 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-fqtgh" event={"ID":"90abea31-6fa7-4ea8-989d-e8d8552052c1","Type":"ContainerStarted","Data":"a03e94e6c1a195bb6482eb90d7110d499d2808f7c906234f745fca3c395d7766"} Apr 21 04:35:47.581478 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:47.581438 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-fqtgh" event={"ID":"90abea31-6fa7-4ea8-989d-e8d8552052c1","Type":"ContainerStarted","Data":"a7659839d2da63d910bde19933dc924653de33089fc31eba2d6c1db30dc748f8"} Apr 21 04:35:47.581937 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:47.581555 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-fqtgh" podUID="90abea31-6fa7-4ea8-989d-e8d8552052c1" containerName="manager" containerID="cri-o://a7659839d2da63d910bde19933dc924653de33089fc31eba2d6c1db30dc748f8" gracePeriod=10 Apr 21 04:35:47.581937 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:47.581615 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-fqtgh" Apr 21 04:35:47.595487 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:47.595423 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-fqtgh" podStartSLOduration=2.2503697369999998 podStartE2EDuration="4.595402976s" podCreationTimestamp="2026-04-21 04:35:43 +0000 UTC" firstStartedPulling="2026-04-21 04:35:44.233927039 +0000 UTC m=+712.615029905" lastFinishedPulling="2026-04-21 04:35:46.578960265 +0000 UTC m=+714.960063144" observedRunningTime="2026-04-21 04:35:47.594822927 +0000 UTC m=+715.975925816" watchObservedRunningTime="2026-04-21 04:35:47.595402976 +0000 UTC m=+715.976505865" Apr 21 04:35:47.814800 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:47.814774 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-fqtgh" Apr 21 04:35:47.982953 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:47.982921 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7pld\" (UniqueName: \"kubernetes.io/projected/90abea31-6fa7-4ea8-989d-e8d8552052c1-kube-api-access-q7pld\") pod \"90abea31-6fa7-4ea8-989d-e8d8552052c1\" (UID: \"90abea31-6fa7-4ea8-989d-e8d8552052c1\") " Apr 21 04:35:47.985193 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:47.985165 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90abea31-6fa7-4ea8-989d-e8d8552052c1-kube-api-access-q7pld" (OuterVolumeSpecName: "kube-api-access-q7pld") pod "90abea31-6fa7-4ea8-989d-e8d8552052c1" (UID: "90abea31-6fa7-4ea8-989d-e8d8552052c1"). InnerVolumeSpecName "kube-api-access-q7pld". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:35:48.084083 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:48.084028 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q7pld\" (UniqueName: \"kubernetes.io/projected/90abea31-6fa7-4ea8-989d-e8d8552052c1-kube-api-access-q7pld\") on node \"ip-10-0-139-26.ec2.internal\" DevicePath \"\"" Apr 21 04:35:48.586336 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:48.586304 2574 generic.go:358] "Generic (PLEG): container finished" podID="90abea31-6fa7-4ea8-989d-e8d8552052c1" containerID="a7659839d2da63d910bde19933dc924653de33089fc31eba2d6c1db30dc748f8" exitCode=0 Apr 21 04:35:48.586802 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:48.586361 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-fqtgh" event={"ID":"90abea31-6fa7-4ea8-989d-e8d8552052c1","Type":"ContainerDied","Data":"a7659839d2da63d910bde19933dc924653de33089fc31eba2d6c1db30dc748f8"} Apr 21 04:35:48.586802 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:48.586371 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-fqtgh" Apr 21 04:35:48.586802 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:48.586388 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-fqtgh" event={"ID":"90abea31-6fa7-4ea8-989d-e8d8552052c1","Type":"ContainerDied","Data":"a03e94e6c1a195bb6482eb90d7110d499d2808f7c906234f745fca3c395d7766"} Apr 21 04:35:48.586802 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:48.586402 2574 scope.go:117] "RemoveContainer" containerID="a7659839d2da63d910bde19933dc924653de33089fc31eba2d6c1db30dc748f8" Apr 21 04:35:48.596718 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:48.596586 2574 scope.go:117] "RemoveContainer" containerID="a7659839d2da63d910bde19933dc924653de33089fc31eba2d6c1db30dc748f8" Apr 21 04:35:48.597075 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:35:48.597055 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7659839d2da63d910bde19933dc924653de33089fc31eba2d6c1db30dc748f8\": container with ID starting with a7659839d2da63d910bde19933dc924653de33089fc31eba2d6c1db30dc748f8 not found: ID does not exist" containerID="a7659839d2da63d910bde19933dc924653de33089fc31eba2d6c1db30dc748f8" Apr 21 04:35:48.597137 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:48.597087 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7659839d2da63d910bde19933dc924653de33089fc31eba2d6c1db30dc748f8"} err="failed to get container status \"a7659839d2da63d910bde19933dc924653de33089fc31eba2d6c1db30dc748f8\": rpc error: code = NotFound desc = could not find container \"a7659839d2da63d910bde19933dc924653de33089fc31eba2d6c1db30dc748f8\": container with ID starting with a7659839d2da63d910bde19933dc924653de33089fc31eba2d6c1db30dc748f8 not found: ID does not exist" Apr 21 04:35:48.600529 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:48.600505 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-fqtgh"] Apr 21 04:35:48.604500 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:48.604477 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-fqtgh"] Apr 21 04:35:50.176545 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:35:50.176500 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90abea31-6fa7-4ea8-989d-e8d8552052c1" path="/var/lib/kubelet/pods/90abea31-6fa7-4ea8-989d-e8d8552052c1/volumes" Apr 21 04:36:05.262242 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:05.262206 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-855577bb5c-kz9g5"] Apr 21 04:36:05.262737 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:05.262573 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90abea31-6fa7-4ea8-989d-e8d8552052c1" containerName="manager" Apr 21 04:36:05.262737 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:05.262586 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="90abea31-6fa7-4ea8-989d-e8d8552052c1" containerName="manager" Apr 21 04:36:05.262737 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:05.262671 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="90abea31-6fa7-4ea8-989d-e8d8552052c1" containerName="manager" Apr 21 04:36:05.265423 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:05.265406 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-855577bb5c-kz9g5" Apr 21 04:36:05.267673 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:05.267641 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 21 04:36:05.267673 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:05.267657 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 21 04:36:05.267877 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:05.267664 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-x48nz\"" Apr 21 04:36:05.274828 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:05.274802 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-855577bb5c-kz9g5"] Apr 21 04:36:05.327581 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:05.327541 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh46n\" (UniqueName: \"kubernetes.io/projected/b472bda0-d509-45c6-a1cb-0185bb748967-kube-api-access-gh46n\") pod \"maas-api-855577bb5c-kz9g5\" (UID: \"b472bda0-d509-45c6-a1cb-0185bb748967\") " pod="opendatahub/maas-api-855577bb5c-kz9g5" Apr 21 04:36:05.327767 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:05.327680 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/b472bda0-d509-45c6-a1cb-0185bb748967-maas-api-tls\") pod \"maas-api-855577bb5c-kz9g5\" (UID: \"b472bda0-d509-45c6-a1cb-0185bb748967\") " pod="opendatahub/maas-api-855577bb5c-kz9g5" Apr 21 04:36:05.429205 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:05.429160 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gh46n\" (UniqueName: \"kubernetes.io/projected/b472bda0-d509-45c6-a1cb-0185bb748967-kube-api-access-gh46n\") pod \"maas-api-855577bb5c-kz9g5\" (UID: \"b472bda0-d509-45c6-a1cb-0185bb748967\") " pod="opendatahub/maas-api-855577bb5c-kz9g5" Apr 21 04:36:05.429380 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:05.429253 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/b472bda0-d509-45c6-a1cb-0185bb748967-maas-api-tls\") pod \"maas-api-855577bb5c-kz9g5\" (UID: \"b472bda0-d509-45c6-a1cb-0185bb748967\") " pod="opendatahub/maas-api-855577bb5c-kz9g5" Apr 21 04:36:05.429380 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:36:05.429364 2574 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found Apr 21 04:36:05.429458 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:36:05.429430 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b472bda0-d509-45c6-a1cb-0185bb748967-maas-api-tls podName:b472bda0-d509-45c6-a1cb-0185bb748967 nodeName:}" failed. No retries permitted until 2026-04-21 04:36:05.929410305 +0000 UTC m=+734.310513173 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/b472bda0-d509-45c6-a1cb-0185bb748967-maas-api-tls") pod "maas-api-855577bb5c-kz9g5" (UID: "b472bda0-d509-45c6-a1cb-0185bb748967") : secret "maas-api-serving-cert" not found Apr 21 04:36:05.439825 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:05.439797 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh46n\" (UniqueName: \"kubernetes.io/projected/b472bda0-d509-45c6-a1cb-0185bb748967-kube-api-access-gh46n\") pod \"maas-api-855577bb5c-kz9g5\" (UID: \"b472bda0-d509-45c6-a1cb-0185bb748967\") " pod="opendatahub/maas-api-855577bb5c-kz9g5" Apr 21 04:36:05.934239 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:05.934205 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/b472bda0-d509-45c6-a1cb-0185bb748967-maas-api-tls\") pod \"maas-api-855577bb5c-kz9g5\" (UID: \"b472bda0-d509-45c6-a1cb-0185bb748967\") " pod="opendatahub/maas-api-855577bb5c-kz9g5" Apr 21 04:36:05.936670 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:05.936644 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/b472bda0-d509-45c6-a1cb-0185bb748967-maas-api-tls\") pod \"maas-api-855577bb5c-kz9g5\" (UID: \"b472bda0-d509-45c6-a1cb-0185bb748967\") " pod="opendatahub/maas-api-855577bb5c-kz9g5" Apr 21 04:36:06.176925 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:06.176894 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-855577bb5c-kz9g5" Apr 21 04:36:06.507029 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:06.506991 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-855577bb5c-kz9g5"] Apr 21 04:36:06.510516 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:36:06.510486 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb472bda0_d509_45c6_a1cb_0185bb748967.slice/crio-c2e7a89d3cf6e77e3ad360b0f4857bd097913f66926a1592672e31f1b473e41c WatchSource:0}: Error finding container c2e7a89d3cf6e77e3ad360b0f4857bd097913f66926a1592672e31f1b473e41c: Status 404 returned error can't find the container with id c2e7a89d3cf6e77e3ad360b0f4857bd097913f66926a1592672e31f1b473e41c Apr 21 04:36:06.512220 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:06.512200 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 04:36:06.646792 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:06.646754 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-855577bb5c-kz9g5" event={"ID":"b472bda0-d509-45c6-a1cb-0185bb748967","Type":"ContainerStarted","Data":"c2e7a89d3cf6e77e3ad360b0f4857bd097913f66926a1592672e31f1b473e41c"} Apr 21 04:36:08.655733 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:08.655693 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-855577bb5c-kz9g5" event={"ID":"b472bda0-d509-45c6-a1cb-0185bb748967","Type":"ContainerStarted","Data":"588eabe82d3a4c64e1d4ad8d465d4ff213ade4ec47265010114d32851c1aa2dd"} Apr 21 04:36:08.656125 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:08.655752 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-855577bb5c-kz9g5" Apr 21 04:36:08.672091 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:08.672043 2574 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-855577bb5c-kz9g5" podStartSLOduration=2.24836073 podStartE2EDuration="3.67202537s" podCreationTimestamp="2026-04-21 04:36:05 +0000 UTC" firstStartedPulling="2026-04-21 04:36:06.512323487 +0000 UTC m=+734.893426353" lastFinishedPulling="2026-04-21 04:36:07.935988127 +0000 UTC m=+736.317090993" observedRunningTime="2026-04-21 04:36:08.671043067 +0000 UTC m=+737.052145955" watchObservedRunningTime="2026-04-21 04:36:08.67202537 +0000 UTC m=+737.053128259" Apr 21 04:36:14.665476 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:14.665443 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-855577bb5c-kz9g5" Apr 21 04:36:32.704531 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:32.704493 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-7cb5f4f657-45dbm"] Apr 21 04:36:32.710442 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:32.710409 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7cb5f4f657-45dbm" Apr 21 04:36:32.714858 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:32.714826 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7cb5f4f657-45dbm"] Apr 21 04:36:32.883101 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:32.883053 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/5ab973a7-9702-43fe-b3c4-568140e1c750-maas-api-tls\") pod \"maas-api-7cb5f4f657-45dbm\" (UID: \"5ab973a7-9702-43fe-b3c4-568140e1c750\") " pod="opendatahub/maas-api-7cb5f4f657-45dbm" Apr 21 04:36:32.883291 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:32.883121 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42l7z\" (UniqueName: \"kubernetes.io/projected/5ab973a7-9702-43fe-b3c4-568140e1c750-kube-api-access-42l7z\") pod \"maas-api-7cb5f4f657-45dbm\" (UID: \"5ab973a7-9702-43fe-b3c4-568140e1c750\") " pod="opendatahub/maas-api-7cb5f4f657-45dbm" Apr 21 04:36:32.983975 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:32.983938 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/5ab973a7-9702-43fe-b3c4-568140e1c750-maas-api-tls\") pod \"maas-api-7cb5f4f657-45dbm\" (UID: \"5ab973a7-9702-43fe-b3c4-568140e1c750\") " pod="opendatahub/maas-api-7cb5f4f657-45dbm" Apr 21 04:36:32.984170 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:32.983986 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42l7z\" (UniqueName: \"kubernetes.io/projected/5ab973a7-9702-43fe-b3c4-568140e1c750-kube-api-access-42l7z\") pod \"maas-api-7cb5f4f657-45dbm\" (UID: \"5ab973a7-9702-43fe-b3c4-568140e1c750\") " pod="opendatahub/maas-api-7cb5f4f657-45dbm" Apr 21 04:36:32.986457 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:32.986420 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/5ab973a7-9702-43fe-b3c4-568140e1c750-maas-api-tls\") pod \"maas-api-7cb5f4f657-45dbm\" (UID: \"5ab973a7-9702-43fe-b3c4-568140e1c750\") " pod="opendatahub/maas-api-7cb5f4f657-45dbm" Apr 21 04:36:32.992045 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:32.992018 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42l7z\" (UniqueName: 
\"kubernetes.io/projected/5ab973a7-9702-43fe-b3c4-568140e1c750-kube-api-access-42l7z\") pod \"maas-api-7cb5f4f657-45dbm\" (UID: \"5ab973a7-9702-43fe-b3c4-568140e1c750\") " pod="opendatahub/maas-api-7cb5f4f657-45dbm" Apr 21 04:36:33.022718 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:33.022676 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-7cb5f4f657-45dbm" Apr 21 04:36:33.152716 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:33.152688 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-7cb5f4f657-45dbm"] Apr 21 04:36:33.155473 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:36:33.155436 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ab973a7_9702_43fe_b3c4_568140e1c750.slice/crio-4f895c56efa9d0d6afb6c125b3bc561ef645b3a1eb783179c09a20fd29a9021b WatchSource:0}: Error finding container 4f895c56efa9d0d6afb6c125b3bc561ef645b3a1eb783179c09a20fd29a9021b: Status 404 returned error can't find the container with id 4f895c56efa9d0d6afb6c125b3bc561ef645b3a1eb783179c09a20fd29a9021b Apr 21 04:36:33.744068 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:33.744024 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7cb5f4f657-45dbm" event={"ID":"5ab973a7-9702-43fe-b3c4-568140e1c750","Type":"ContainerStarted","Data":"4f895c56efa9d0d6afb6c125b3bc561ef645b3a1eb783179c09a20fd29a9021b"} Apr 21 04:36:34.752143 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:34.752109 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-7cb5f4f657-45dbm" event={"ID":"5ab973a7-9702-43fe-b3c4-568140e1c750","Type":"ContainerStarted","Data":"cf00efd0efd827cd73cfa60cfdbd3cc846e4e48a69b6c958922084b37048cd7d"} Apr 21 04:36:35.755902 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:35.755867 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-7cb5f4f657-45dbm" Apr 21 04:36:35.771299 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:35.771251 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-7cb5f4f657-45dbm" podStartSLOduration=2.240031768 podStartE2EDuration="3.771230385s" podCreationTimestamp="2026-04-21 04:36:32 +0000 UTC" firstStartedPulling="2026-04-21 04:36:33.15674212 +0000 UTC m=+761.537844986" lastFinishedPulling="2026-04-21 04:36:34.687940724 +0000 UTC m=+763.069043603" observedRunningTime="2026-04-21 04:36:35.770455899 +0000 UTC m=+764.151558788" watchObservedRunningTime="2026-04-21 04:36:35.771230385 +0000 UTC m=+764.152333254" Apr 21 04:36:41.765223 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:41.765191 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-7cb5f4f657-45dbm" Apr 21 04:36:41.812985 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:41.812942 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-855577bb5c-kz9g5"] Apr 21 04:36:41.813230 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:41.813202 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-855577bb5c-kz9g5" podUID="b472bda0-d509-45c6-a1cb-0185bb748967" containerName="maas-api" containerID="cri-o://588eabe82d3a4c64e1d4ad8d465d4ff213ade4ec47265010114d32851c1aa2dd" gracePeriod=30 Apr 21 04:36:42.060322 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:42.060296 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-855577bb5c-kz9g5" Apr 21 04:36:42.159652 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:42.159617 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh46n\" (UniqueName: \"kubernetes.io/projected/b472bda0-d509-45c6-a1cb-0185bb748967-kube-api-access-gh46n\") pod \"b472bda0-d509-45c6-a1cb-0185bb748967\" (UID: \"b472bda0-d509-45c6-a1cb-0185bb748967\") " Apr 21 04:36:42.159844 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:42.159675 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/b472bda0-d509-45c6-a1cb-0185bb748967-maas-api-tls\") pod \"b472bda0-d509-45c6-a1cb-0185bb748967\" (UID: \"b472bda0-d509-45c6-a1cb-0185bb748967\") " Apr 21 04:36:42.161720 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:42.161693 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b472bda0-d509-45c6-a1cb-0185bb748967-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "b472bda0-d509-45c6-a1cb-0185bb748967" (UID: "b472bda0-d509-45c6-a1cb-0185bb748967"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:36:42.161815 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:42.161764 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b472bda0-d509-45c6-a1cb-0185bb748967-kube-api-access-gh46n" (OuterVolumeSpecName: "kube-api-access-gh46n") pod "b472bda0-d509-45c6-a1cb-0185bb748967" (UID: "b472bda0-d509-45c6-a1cb-0185bb748967"). InnerVolumeSpecName "kube-api-access-gh46n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:36:42.260322 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:42.260265 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gh46n\" (UniqueName: \"kubernetes.io/projected/b472bda0-d509-45c6-a1cb-0185bb748967-kube-api-access-gh46n\") on node \"ip-10-0-139-26.ec2.internal\" DevicePath \"\"" Apr 21 04:36:42.260322 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:42.260320 2574 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/b472bda0-d509-45c6-a1cb-0185bb748967-maas-api-tls\") on node \"ip-10-0-139-26.ec2.internal\" DevicePath \"\"" Apr 21 04:36:42.786109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:42.786066 2574 generic.go:358] "Generic (PLEG): container finished" podID="b472bda0-d509-45c6-a1cb-0185bb748967" containerID="588eabe82d3a4c64e1d4ad8d465d4ff213ade4ec47265010114d32851c1aa2dd" exitCode=0 Apr 21 04:36:42.786537 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:42.786140 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-855577bb5c-kz9g5" Apr 21 04:36:42.786537 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:42.786152 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-855577bb5c-kz9g5" event={"ID":"b472bda0-d509-45c6-a1cb-0185bb748967","Type":"ContainerDied","Data":"588eabe82d3a4c64e1d4ad8d465d4ff213ade4ec47265010114d32851c1aa2dd"} Apr 21 04:36:42.786537 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:42.786190 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-855577bb5c-kz9g5" event={"ID":"b472bda0-d509-45c6-a1cb-0185bb748967","Type":"ContainerDied","Data":"c2e7a89d3cf6e77e3ad360b0f4857bd097913f66926a1592672e31f1b473e41c"} Apr 21 04:36:42.786537 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:42.786213 2574 scope.go:117] "RemoveContainer" containerID="588eabe82d3a4c64e1d4ad8d465d4ff213ade4ec47265010114d32851c1aa2dd" Apr 21 04:36:42.794428 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:42.794409 2574 scope.go:117] "RemoveContainer" containerID="588eabe82d3a4c64e1d4ad8d465d4ff213ade4ec47265010114d32851c1aa2dd" Apr 21 04:36:42.794704 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:36:42.794684 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"588eabe82d3a4c64e1d4ad8d465d4ff213ade4ec47265010114d32851c1aa2dd\": container with ID starting with 588eabe82d3a4c64e1d4ad8d465d4ff213ade4ec47265010114d32851c1aa2dd not found: ID does not exist" containerID="588eabe82d3a4c64e1d4ad8d465d4ff213ade4ec47265010114d32851c1aa2dd" Apr 21 04:36:42.794766 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:42.794712 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"588eabe82d3a4c64e1d4ad8d465d4ff213ade4ec47265010114d32851c1aa2dd"} err="failed to get container status \"588eabe82d3a4c64e1d4ad8d465d4ff213ade4ec47265010114d32851c1aa2dd\": rpc error: code = NotFound desc = could not find container \"588eabe82d3a4c64e1d4ad8d465d4ff213ade4ec47265010114d32851c1aa2dd\": container with ID starting with 588eabe82d3a4c64e1d4ad8d465d4ff213ade4ec47265010114d32851c1aa2dd not found: ID does not exist" Apr 21 04:36:42.801824 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:42.801792 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-855577bb5c-kz9g5"] Apr 21 04:36:42.804416 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:42.804390 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-855577bb5c-kz9g5"] Apr 21 04:36:44.177232 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:44.177193 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b472bda0-d509-45c6-a1cb-0185bb748967" path="/var/lib/kubelet/pods/b472bda0-d509-45c6-a1cb-0185bb748967/volumes" Apr 21 04:36:53.719526 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:53.719485 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-dbd57cc7-z87qs"] Apr 21 04:36:53.720062 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:53.720019 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b472bda0-d509-45c6-a1cb-0185bb748967" containerName="maas-api" Apr 21 04:36:53.720062 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:53.720046 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b472bda0-d509-45c6-a1cb-0185bb748967" containerName="maas-api" Apr 21 04:36:53.720180 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:53.720143 2574 
memory_manager.go:356] "RemoveStaleState removing state" podUID="b472bda0-d509-45c6-a1cb-0185bb748967" containerName="maas-api" Apr 21 04:36:53.724977 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:53.724953 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-dbd57cc7-z87qs" Apr 21 04:36:53.727789 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:53.727763 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-tzgn6\"" Apr 21 04:36:53.728166 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:53.728138 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 21 04:36:53.728166 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:53.728158 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"authorino-oidc-ca\"" Apr 21 04:36:53.729503 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:53.729476 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-dbd57cc7-z87qs"] Apr 21 04:36:53.758322 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:53.758285 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hknqb\" (UniqueName: \"kubernetes.io/projected/50f7a369-8eec-4cf0-8bfd-55f2812fb1e6-kube-api-access-hknqb\") pod \"authorino-dbd57cc7-z87qs\" (UID: \"50f7a369-8eec-4cf0-8bfd-55f2812fb1e6\") " pod="kuadrant-system/authorino-dbd57cc7-z87qs" Apr 21 04:36:53.758503 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:53.758379 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/50f7a369-8eec-4cf0-8bfd-55f2812fb1e6-oidc-ca\") pod \"authorino-dbd57cc7-z87qs\" (UID: \"50f7a369-8eec-4cf0-8bfd-55f2812fb1e6\") " pod="kuadrant-system/authorino-dbd57cc7-z87qs" Apr 21 04:36:53.758503 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:53.758412 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/50f7a369-8eec-4cf0-8bfd-55f2812fb1e6-tls-cert\") pod \"authorino-dbd57cc7-z87qs\" (UID: \"50f7a369-8eec-4cf0-8bfd-55f2812fb1e6\") " pod="kuadrant-system/authorino-dbd57cc7-z87qs" Apr 21 04:36:53.859384 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:53.859343 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hknqb\" (UniqueName: \"kubernetes.io/projected/50f7a369-8eec-4cf0-8bfd-55f2812fb1e6-kube-api-access-hknqb\") pod \"authorino-dbd57cc7-z87qs\" (UID: \"50f7a369-8eec-4cf0-8bfd-55f2812fb1e6\") " pod="kuadrant-system/authorino-dbd57cc7-z87qs" Apr 21 04:36:53.859576 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:53.859427 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/50f7a369-8eec-4cf0-8bfd-55f2812fb1e6-oidc-ca\") pod \"authorino-dbd57cc7-z87qs\" (UID: \"50f7a369-8eec-4cf0-8bfd-55f2812fb1e6\") " pod="kuadrant-system/authorino-dbd57cc7-z87qs" Apr 21 04:36:53.859576 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:53.859458 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/50f7a369-8eec-4cf0-8bfd-55f2812fb1e6-tls-cert\") pod \"authorino-dbd57cc7-z87qs\" (UID: \"50f7a369-8eec-4cf0-8bfd-55f2812fb1e6\") 
" pod="kuadrant-system/authorino-dbd57cc7-z87qs" Apr 21 04:36:53.860177 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:53.860153 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/50f7a369-8eec-4cf0-8bfd-55f2812fb1e6-oidc-ca\") pod \"authorino-dbd57cc7-z87qs\" (UID: \"50f7a369-8eec-4cf0-8bfd-55f2812fb1e6\") " pod="kuadrant-system/authorino-dbd57cc7-z87qs" Apr 21 04:36:53.861869 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:53.861848 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/50f7a369-8eec-4cf0-8bfd-55f2812fb1e6-tls-cert\") pod \"authorino-dbd57cc7-z87qs\" (UID: \"50f7a369-8eec-4cf0-8bfd-55f2812fb1e6\") " pod="kuadrant-system/authorino-dbd57cc7-z87qs" Apr 21 04:36:53.867256 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:53.867232 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hknqb\" (UniqueName: \"kubernetes.io/projected/50f7a369-8eec-4cf0-8bfd-55f2812fb1e6-kube-api-access-hknqb\") pod \"authorino-dbd57cc7-z87qs\" (UID: \"50f7a369-8eec-4cf0-8bfd-55f2812fb1e6\") " pod="kuadrant-system/authorino-dbd57cc7-z87qs" Apr 21 04:36:54.035616 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:54.035502 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-dbd57cc7-z87qs" Apr 21 04:36:54.177448 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:36:54.177409 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50f7a369_8eec_4cf0_8bfd_55f2812fb1e6.slice/crio-a5da120b23be5b7c53d9ac7d72164ccbf55993dfd7aeddf1b16e05f1eb11a5c6 WatchSource:0}: Error finding container a5da120b23be5b7c53d9ac7d72164ccbf55993dfd7aeddf1b16e05f1eb11a5c6: Status 404 returned error can't find the container with id a5da120b23be5b7c53d9ac7d72164ccbf55993dfd7aeddf1b16e05f1eb11a5c6 Apr 21 04:36:54.178748 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:54.178728 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-dbd57cc7-z87qs"] Apr 21 04:36:54.832823 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:54.832663 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-dbd57cc7-z87qs" event={"ID":"50f7a369-8eec-4cf0-8bfd-55f2812fb1e6","Type":"ContainerStarted","Data":"58e47ae308b3424e1fc42a09e90b6a600e67f6e0194d013582f7314d64cbebd2"} Apr 21 04:36:54.832823 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:54.832716 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-dbd57cc7-z87qs" event={"ID":"50f7a369-8eec-4cf0-8bfd-55f2812fb1e6","Type":"ContainerStarted","Data":"a5da120b23be5b7c53d9ac7d72164ccbf55993dfd7aeddf1b16e05f1eb11a5c6"} Apr 21 04:36:54.849283 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:36:54.849225 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-dbd57cc7-z87qs" podStartSLOduration=1.4657889210000001 podStartE2EDuration="1.849209962s" podCreationTimestamp="2026-04-21 04:36:53 +0000 UTC" firstStartedPulling="2026-04-21 04:36:54.179139934 +0000 UTC m=+782.560242801" lastFinishedPulling="2026-04-21 04:36:54.56256097 +0000 UTC m=+782.943663842" observedRunningTime="2026-04-21 04:36:54.848235783 +0000 UTC m=+783.229338672" watchObservedRunningTime="2026-04-21 04:36:54.849209962 +0000 UTC m=+783.230313134" Apr 21 04:37:08.618024 ip-10-0-139-26 kubenswrapper[2574]: I0421 
04:37:08.617940 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62"] Apr 21 04:37:08.622804 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:08.622781 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" Apr 21 04:37:08.625486 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:08.625458 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 21 04:37:08.626412 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:08.626375 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-569jl\"" Apr 21 04:37:08.626412 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:08.626396 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 21 04:37:08.626608 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:08.626436 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 21 04:37:08.633725 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:08.633697 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62"] Apr 21 04:37:08.687353 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:08.687310 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1c600334-e8a2-4df1-a0bb-a52e380063e5-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62\" (UID: \"1c600334-e8a2-4df1-a0bb-a52e380063e5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" Apr 21 04:37:08.687353 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:08.687355 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c600334-e8a2-4df1-a0bb-a52e380063e5-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62\" (UID: \"1c600334-e8a2-4df1-a0bb-a52e380063e5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" Apr 21 04:37:08.687652 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:08.687401 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1c600334-e8a2-4df1-a0bb-a52e380063e5-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62\" (UID: \"1c600334-e8a2-4df1-a0bb-a52e380063e5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" Apr 21 04:37:08.687652 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:08.687486 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1c600334-e8a2-4df1-a0bb-a52e380063e5-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62\" (UID: \"1c600334-e8a2-4df1-a0bb-a52e380063e5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" Apr 21 04:37:08.687652 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:08.687510 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfgj6\" (UniqueName: \"kubernetes.io/projected/1c600334-e8a2-4df1-a0bb-a52e380063e5-kube-api-access-zfgj6\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62\" (UID: \"1c600334-e8a2-4df1-a0bb-a52e380063e5\") " 
pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" Apr 21 04:37:08.687652 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:08.687571 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1c600334-e8a2-4df1-a0bb-a52e380063e5-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62\" (UID: \"1c600334-e8a2-4df1-a0bb-a52e380063e5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" Apr 21 04:37:08.788527 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:08.788478 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1c600334-e8a2-4df1-a0bb-a52e380063e5-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62\" (UID: \"1c600334-e8a2-4df1-a0bb-a52e380063e5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" Apr 21 04:37:08.788773 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:08.788538 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zfgj6\" (UniqueName: \"kubernetes.io/projected/1c600334-e8a2-4df1-a0bb-a52e380063e5-kube-api-access-zfgj6\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62\" (UID: \"1c600334-e8a2-4df1-a0bb-a52e380063e5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" Apr 21 04:37:08.788773 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:08.788585 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1c600334-e8a2-4df1-a0bb-a52e380063e5-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62\" (UID: \"1c600334-e8a2-4df1-a0bb-a52e380063e5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" Apr 21 04:37:08.788773 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:08.788698 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1c600334-e8a2-4df1-a0bb-a52e380063e5-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62\" (UID: \"1c600334-e8a2-4df1-a0bb-a52e380063e5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" Apr 21 04:37:08.788773 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:08.788724 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c600334-e8a2-4df1-a0bb-a52e380063e5-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62\" (UID: \"1c600334-e8a2-4df1-a0bb-a52e380063e5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" Apr 21 04:37:08.788773 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:08.788756 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1c600334-e8a2-4df1-a0bb-a52e380063e5-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62\" (UID: \"1c600334-e8a2-4df1-a0bb-a52e380063e5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" Apr 21 04:37:08.789089 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:08.789060 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1c600334-e8a2-4df1-a0bb-a52e380063e5-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62\" (UID: 
\"1c600334-e8a2-4df1-a0bb-a52e380063e5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" Apr 21 04:37:08.789234 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:08.789198 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1c600334-e8a2-4df1-a0bb-a52e380063e5-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62\" (UID: \"1c600334-e8a2-4df1-a0bb-a52e380063e5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" Apr 21 04:37:08.789304 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:08.789249 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1c600334-e8a2-4df1-a0bb-a52e380063e5-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62\" (UID: \"1c600334-e8a2-4df1-a0bb-a52e380063e5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" Apr 21 04:37:08.791012 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:08.790977 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1c600334-e8a2-4df1-a0bb-a52e380063e5-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62\" (UID: \"1c600334-e8a2-4df1-a0bb-a52e380063e5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" Apr 21 04:37:08.791293 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:08.791274 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1c600334-e8a2-4df1-a0bb-a52e380063e5-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62\" (UID: \"1c600334-e8a2-4df1-a0bb-a52e380063e5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" Apr 21 04:37:08.796885 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:08.796860 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfgj6\" (UniqueName: \"kubernetes.io/projected/1c600334-e8a2-4df1-a0bb-a52e380063e5-kube-api-access-zfgj6\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62\" (UID: \"1c600334-e8a2-4df1-a0bb-a52e380063e5\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" Apr 21 04:37:08.935197 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:08.935096 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" Apr 21 04:37:09.068562 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:09.068537 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62"] Apr 21 04:37:09.070587 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:37:09.070548 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c600334_e8a2_4df1_a0bb_a52e380063e5.slice/crio-8170da91684e390725aa4811a5a35767ee3218e62a8f78a72104a0724c0f11ed WatchSource:0}: Error finding container 8170da91684e390725aa4811a5a35767ee3218e62a8f78a72104a0724c0f11ed: Status 404 returned error can't find the container with id 8170da91684e390725aa4811a5a35767ee3218e62a8f78a72104a0724c0f11ed Apr 21 04:37:09.884513 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:09.884471 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" event={"ID":"1c600334-e8a2-4df1-a0bb-a52e380063e5","Type":"ContainerStarted","Data":"8170da91684e390725aa4811a5a35767ee3218e62a8f78a72104a0724c0f11ed"} Apr 21 04:37:15.911419 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:15.911381 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" event={"ID":"1c600334-e8a2-4df1-a0bb-a52e380063e5","Type":"ContainerStarted","Data":"363376f2f9206cfa7528585b0fa3e2f5ccc1f9d7b903a4ad722d5d9d72c9f627"} Apr 21 04:37:20.932841 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:20.932807 2574 generic.go:358] "Generic (PLEG): container finished" podID="1c600334-e8a2-4df1-a0bb-a52e380063e5" containerID="363376f2f9206cfa7528585b0fa3e2f5ccc1f9d7b903a4ad722d5d9d72c9f627" exitCode=0 Apr 21 04:37:20.933221 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:20.932867 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" event={"ID":"1c600334-e8a2-4df1-a0bb-a52e380063e5","Type":"ContainerDied","Data":"363376f2f9206cfa7528585b0fa3e2f5ccc1f9d7b903a4ad722d5d9d72c9f627"} Apr 21 04:37:22.943307 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:22.943269 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" event={"ID":"1c600334-e8a2-4df1-a0bb-a52e380063e5","Type":"ContainerStarted","Data":"da4f91cd65f389271236765bbf0d9302d84d30d156f736066327f7df351b2fce"} Apr 21 04:37:22.943809 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:22.943493 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" Apr 21 04:37:22.960261 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:22.960202 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" podStartSLOduration=2.029453969 podStartE2EDuration="14.960185609s" podCreationTimestamp="2026-04-21 04:37:08 +0000 UTC" firstStartedPulling="2026-04-21 04:37:09.072562541 +0000 UTC m=+797.453665407" lastFinishedPulling="2026-04-21 04:37:22.003294181 +0000 UTC m=+810.384397047" observedRunningTime="2026-04-21 04:37:22.959000544 +0000 UTC m=+811.340103431" watchObservedRunningTime="2026-04-21 04:37:22.960185609 +0000 UTC m=+811.341288494" Apr 21 04:37:25.012901 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.012858 2574 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh"] Apr 21 04:37:25.016523 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.016498 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" Apr 21 04:37:25.018780 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.018755 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 21 04:37:25.023956 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.023927 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh"] Apr 21 04:37:25.139403 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.139354 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/889861c8-8a0f-49e8-b3c9-e2731cc3e331-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh\" (UID: \"889861c8-8a0f-49e8-b3c9-e2731cc3e331\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" Apr 21 04:37:25.139403 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.139415 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/889861c8-8a0f-49e8-b3c9-e2731cc3e331-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh\" (UID: \"889861c8-8a0f-49e8-b3c9-e2731cc3e331\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" Apr 21 04:37:25.139700 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.139499 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l78zq\" (UniqueName: \"kubernetes.io/projected/889861c8-8a0f-49e8-b3c9-e2731cc3e331-kube-api-access-l78zq\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh\" (UID: \"889861c8-8a0f-49e8-b3c9-e2731cc3e331\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" Apr 21 04:37:25.139700 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.139537 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/889861c8-8a0f-49e8-b3c9-e2731cc3e331-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh\" (UID: \"889861c8-8a0f-49e8-b3c9-e2731cc3e331\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" Apr 21 04:37:25.139700 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.139630 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/889861c8-8a0f-49e8-b3c9-e2731cc3e331-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh\" (UID: \"889861c8-8a0f-49e8-b3c9-e2731cc3e331\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" Apr 21 04:37:25.139700 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.139652 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/889861c8-8a0f-49e8-b3c9-e2731cc3e331-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh\" (UID: \"889861c8-8a0f-49e8-b3c9-e2731cc3e331\") " 
pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" Apr 21 04:37:25.240921 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.240871 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/889861c8-8a0f-49e8-b3c9-e2731cc3e331-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh\" (UID: \"889861c8-8a0f-49e8-b3c9-e2731cc3e331\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" Apr 21 04:37:25.240921 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.240915 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/889861c8-8a0f-49e8-b3c9-e2731cc3e331-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh\" (UID: \"889861c8-8a0f-49e8-b3c9-e2731cc3e331\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" Apr 21 04:37:25.241182 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.240974 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/889861c8-8a0f-49e8-b3c9-e2731cc3e331-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh\" (UID: \"889861c8-8a0f-49e8-b3c9-e2731cc3e331\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" Apr 21 04:37:25.241182 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.241021 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/889861c8-8a0f-49e8-b3c9-e2731cc3e331-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh\" (UID: \"889861c8-8a0f-49e8-b3c9-e2731cc3e331\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" Apr 21 04:37:25.241182 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.241061 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l78zq\" (UniqueName: \"kubernetes.io/projected/889861c8-8a0f-49e8-b3c9-e2731cc3e331-kube-api-access-l78zq\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh\" (UID: \"889861c8-8a0f-49e8-b3c9-e2731cc3e331\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" Apr 21 04:37:25.241182 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.241087 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/889861c8-8a0f-49e8-b3c9-e2731cc3e331-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh\" (UID: \"889861c8-8a0f-49e8-b3c9-e2731cc3e331\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" Apr 21 04:37:25.241394 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.241355 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/889861c8-8a0f-49e8-b3c9-e2731cc3e331-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh\" (UID: \"889861c8-8a0f-49e8-b3c9-e2731cc3e331\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" Apr 21 04:37:25.241394 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.241379 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/889861c8-8a0f-49e8-b3c9-e2731cc3e331-home\") pod 
\"premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh\" (UID: \"889861c8-8a0f-49e8-b3c9-e2731cc3e331\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" Apr 21 04:37:25.241515 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.241491 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/889861c8-8a0f-49e8-b3c9-e2731cc3e331-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh\" (UID: \"889861c8-8a0f-49e8-b3c9-e2731cc3e331\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" Apr 21 04:37:25.243317 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.243294 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/889861c8-8a0f-49e8-b3c9-e2731cc3e331-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh\" (UID: \"889861c8-8a0f-49e8-b3c9-e2731cc3e331\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" Apr 21 04:37:25.243544 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.243523 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/889861c8-8a0f-49e8-b3c9-e2731cc3e331-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh\" (UID: \"889861c8-8a0f-49e8-b3c9-e2731cc3e331\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" Apr 21 04:37:25.248761 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.248736 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l78zq\" (UniqueName: \"kubernetes.io/projected/889861c8-8a0f-49e8-b3c9-e2731cc3e331-kube-api-access-l78zq\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh\" (UID: \"889861c8-8a0f-49e8-b3c9-e2731cc3e331\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" Apr 21 04:37:25.343722 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.343632 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" Apr 21 04:37:25.470438 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.470411 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh"] Apr 21 04:37:25.472320 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:37:25.472291 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod889861c8_8a0f_49e8_b3c9_e2731cc3e331.slice/crio-2af8cc043865a63bd093e2c4f4067771ee9a30ed5635053797bd81557e9888cc WatchSource:0}: Error finding container 2af8cc043865a63bd093e2c4f4067771ee9a30ed5635053797bd81557e9888cc: Status 404 returned error can't find the container with id 2af8cc043865a63bd093e2c4f4067771ee9a30ed5635053797bd81557e9888cc Apr 21 04:37:25.955007 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.954959 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" event={"ID":"889861c8-8a0f-49e8-b3c9-e2731cc3e331","Type":"ContainerStarted","Data":"c536296becffed971b692662cc95910c6b49563c01e57f446f10940bb5065f1c"} Apr 21 04:37:25.955007 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:25.955011 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" event={"ID":"889861c8-8a0f-49e8-b3c9-e2731cc3e331","Type":"ContainerStarted","Data":"2af8cc043865a63bd093e2c4f4067771ee9a30ed5635053797bd81557e9888cc"} Apr 21 04:37:33.960916 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:33.960881 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62" Apr 21 04:37:34.909733 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:34.909650 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg"] Apr 21 04:37:34.913530 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:34.913509 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" Apr 21 04:37:34.915647 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:34.915626 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 21 04:37:34.922340 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:34.922320 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg"] Apr 21 04:37:34.932077 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:34.932051 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ea625ee0-d68d-4de4-98fa-69a25a73beb2-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-vvwqg\" (UID: \"ea625ee0-d68d-4de4-98fa-69a25a73beb2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" Apr 21 04:37:34.932173 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:34.932095 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea625ee0-d68d-4de4-98fa-69a25a73beb2-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-vvwqg\" (UID: \"ea625ee0-d68d-4de4-98fa-69a25a73beb2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" Apr 21 04:37:34.932228 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:34.932205 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xvlc\" (UniqueName: \"kubernetes.io/projected/ea625ee0-d68d-4de4-98fa-69a25a73beb2-kube-api-access-6xvlc\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-vvwqg\" (UID: \"ea625ee0-d68d-4de4-98fa-69a25a73beb2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" Apr 21 04:37:34.932270 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:34.932249 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ea625ee0-d68d-4de4-98fa-69a25a73beb2-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-vvwqg\" (UID: \"ea625ee0-d68d-4de4-98fa-69a25a73beb2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" Apr 21 04:37:34.932307 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:34.932286 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ea625ee0-d68d-4de4-98fa-69a25a73beb2-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-vvwqg\" (UID: \"ea625ee0-d68d-4de4-98fa-69a25a73beb2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" Apr 21 04:37:34.932340 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:34.932321 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ea625ee0-d68d-4de4-98fa-69a25a73beb2-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-vvwqg\" (UID: \"ea625ee0-d68d-4de4-98fa-69a25a73beb2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" Apr 21 04:37:34.993699 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:34.993662 2574 generic.go:358] "Generic (PLEG): container finished" podID="889861c8-8a0f-49e8-b3c9-e2731cc3e331" containerID="c536296becffed971b692662cc95910c6b49563c01e57f446f10940bb5065f1c" exitCode=0 Apr 21 04:37:34.994247 ip-10-0-139-26 
kubenswrapper[2574]: I0421 04:37:34.993750 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" event={"ID":"889861c8-8a0f-49e8-b3c9-e2731cc3e331","Type":"ContainerDied","Data":"c536296becffed971b692662cc95910c6b49563c01e57f446f10940bb5065f1c"} Apr 21 04:37:35.032815 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:35.032780 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ea625ee0-d68d-4de4-98fa-69a25a73beb2-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-vvwqg\" (UID: \"ea625ee0-d68d-4de4-98fa-69a25a73beb2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" Apr 21 04:37:35.033001 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:35.032842 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea625ee0-d68d-4de4-98fa-69a25a73beb2-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-vvwqg\" (UID: \"ea625ee0-d68d-4de4-98fa-69a25a73beb2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" Apr 21 04:37:35.033001 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:35.032903 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6xvlc\" (UniqueName: \"kubernetes.io/projected/ea625ee0-d68d-4de4-98fa-69a25a73beb2-kube-api-access-6xvlc\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-vvwqg\" (UID: \"ea625ee0-d68d-4de4-98fa-69a25a73beb2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" Apr 21 04:37:35.033001 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:35.032932 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ea625ee0-d68d-4de4-98fa-69a25a73beb2-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-vvwqg\" (UID: \"ea625ee0-d68d-4de4-98fa-69a25a73beb2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" Apr 21 04:37:35.033001 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:35.032981 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ea625ee0-d68d-4de4-98fa-69a25a73beb2-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-vvwqg\" (UID: \"ea625ee0-d68d-4de4-98fa-69a25a73beb2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" Apr 21 04:37:35.033274 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:35.033249 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ea625ee0-d68d-4de4-98fa-69a25a73beb2-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-vvwqg\" (UID: \"ea625ee0-d68d-4de4-98fa-69a25a73beb2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" Apr 21 04:37:35.033367 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:35.033278 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea625ee0-d68d-4de4-98fa-69a25a73beb2-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-vvwqg\" (UID: \"ea625ee0-d68d-4de4-98fa-69a25a73beb2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" Apr 21 04:37:35.033367 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:35.033328 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"home\" (UniqueName: \"kubernetes.io/empty-dir/ea625ee0-d68d-4de4-98fa-69a25a73beb2-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-vvwqg\" (UID: \"ea625ee0-d68d-4de4-98fa-69a25a73beb2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" Apr 21 04:37:35.033551 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:35.033532 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ea625ee0-d68d-4de4-98fa-69a25a73beb2-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-vvwqg\" (UID: \"ea625ee0-d68d-4de4-98fa-69a25a73beb2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" Apr 21 04:37:35.035356 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:35.035329 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ea625ee0-d68d-4de4-98fa-69a25a73beb2-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-vvwqg\" (UID: \"ea625ee0-d68d-4de4-98fa-69a25a73beb2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" Apr 21 04:37:35.035576 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:35.035558 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ea625ee0-d68d-4de4-98fa-69a25a73beb2-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-vvwqg\" (UID: \"ea625ee0-d68d-4de4-98fa-69a25a73beb2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" Apr 21 04:37:35.040420 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:35.040398 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xvlc\" (UniqueName: \"kubernetes.io/projected/ea625ee0-d68d-4de4-98fa-69a25a73beb2-kube-api-access-6xvlc\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-vvwqg\" (UID: \"ea625ee0-d68d-4de4-98fa-69a25a73beb2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" Apr 21 04:37:35.225449 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:35.225416 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" Apr 21 04:37:35.353733 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:35.353699 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg"] Apr 21 04:37:35.356022 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:37:35.355993 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea625ee0_d68d_4de4_98fa_69a25a73beb2.slice/crio-29a00db950bfdf46548982e648bae969946ab516ef519d359109e6ed2fd3b91b WatchSource:0}: Error finding container 29a00db950bfdf46548982e648bae969946ab516ef519d359109e6ed2fd3b91b: Status 404 returned error can't find the container with id 29a00db950bfdf46548982e648bae969946ab516ef519d359109e6ed2fd3b91b Apr 21 04:37:35.999814 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:35.999772 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" event={"ID":"ea625ee0-d68d-4de4-98fa-69a25a73beb2","Type":"ContainerStarted","Data":"cda8372d0e97abc56b31911c420411603f8866b07641d209effa27028be1a693"} Apr 21 04:37:35.999814 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:35.999820 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" event={"ID":"ea625ee0-d68d-4de4-98fa-69a25a73beb2","Type":"ContainerStarted","Data":"29a00db950bfdf46548982e648bae969946ab516ef519d359109e6ed2fd3b91b"} Apr 21 04:37:36.001495 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:36.001467 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" event={"ID":"889861c8-8a0f-49e8-b3c9-e2731cc3e331","Type":"ContainerStarted","Data":"c68d43123daba990f7bea749b2360e658f8a20351abfd72cb7bf971add4bd642"} Apr 21 04:37:36.001664 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:36.001644 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" Apr 21 04:37:36.035337 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:36.035285 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" podStartSLOduration=11.879917751 podStartE2EDuration="12.035267849s" podCreationTimestamp="2026-04-21 04:37:24 +0000 UTC" firstStartedPulling="2026-04-21 04:37:34.994895022 +0000 UTC m=+823.375997888" lastFinishedPulling="2026-04-21 04:37:35.150245117 +0000 UTC m=+823.531347986" observedRunningTime="2026-04-21 04:37:36.034270987 +0000 UTC m=+824.415373874" watchObservedRunningTime="2026-04-21 04:37:36.035267849 +0000 UTC m=+824.416370741" Apr 21 04:37:42.025530 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:42.025491 2574 generic.go:358] "Generic (PLEG): container finished" podID="ea625ee0-d68d-4de4-98fa-69a25a73beb2" containerID="cda8372d0e97abc56b31911c420411603f8866b07641d209effa27028be1a693" exitCode=0 Apr 21 04:37:42.025530 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:42.025533 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" event={"ID":"ea625ee0-d68d-4de4-98fa-69a25a73beb2","Type":"ContainerDied","Data":"cda8372d0e97abc56b31911c420411603f8866b07641d209effa27028be1a693"} Apr 21 04:37:43.031047 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:43.031013 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" event={"ID":"ea625ee0-d68d-4de4-98fa-69a25a73beb2","Type":"ContainerStarted","Data":"5d16e12549bc09f95c761cecc75e2a82d0b76bbf1495201cac95850ccecfb29e"} Apr 21 04:37:43.031435 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:43.031235 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" Apr 21 04:37:43.049315 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:43.049257 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" podStartSLOduration=8.664299917 podStartE2EDuration="9.04924047s" podCreationTimestamp="2026-04-21 04:37:34 +0000 UTC" firstStartedPulling="2026-04-21 04:37:42.026154636 +0000 UTC m=+830.407257502" lastFinishedPulling="2026-04-21 04:37:42.411095189 +0000 UTC m=+830.792198055" observedRunningTime="2026-04-21 04:37:43.047258855 +0000 UTC m=+831.428361754" watchObservedRunningTime="2026-04-21 04:37:43.04924047 +0000 UTC m=+831.430343360" Apr 21 04:37:47.019029 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:47.018998 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh" Apr 21 04:37:54.053751 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:37:54.053719 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-vvwqg" Apr 21 04:38:32.917277 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:32.917197 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-d84884d8-2xnj2"] Apr 21 04:38:32.921042 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:32.921020 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-d84884d8-2xnj2" Apr 21 04:38:32.927723 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:32.927696 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-d84884d8-2xnj2"] Apr 21 04:38:32.961957 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:32.961929 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/90a04616-bcb9-4215-b7f5-061420076cd6-tls-cert\") pod \"authorino-d84884d8-2xnj2\" (UID: \"90a04616-bcb9-4215-b7f5-061420076cd6\") " pod="kuadrant-system/authorino-d84884d8-2xnj2" Apr 21 04:38:32.962091 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:32.961983 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncn8z\" (UniqueName: \"kubernetes.io/projected/90a04616-bcb9-4215-b7f5-061420076cd6-kube-api-access-ncn8z\") pod \"authorino-d84884d8-2xnj2\" (UID: \"90a04616-bcb9-4215-b7f5-061420076cd6\") " pod="kuadrant-system/authorino-d84884d8-2xnj2" Apr 21 04:38:32.962091 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:32.962011 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/90a04616-bcb9-4215-b7f5-061420076cd6-oidc-ca\") pod \"authorino-d84884d8-2xnj2\" (UID: \"90a04616-bcb9-4215-b7f5-061420076cd6\") " pod="kuadrant-system/authorino-d84884d8-2xnj2" Apr 21 04:38:33.062553 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:33.062513 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/90a04616-bcb9-4215-b7f5-061420076cd6-tls-cert\") pod \"authorino-d84884d8-2xnj2\" (UID: \"90a04616-bcb9-4215-b7f5-061420076cd6\") " pod="kuadrant-system/authorino-d84884d8-2xnj2" Apr 21 04:38:33.062740 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:33.062574 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ncn8z\" (UniqueName: \"kubernetes.io/projected/90a04616-bcb9-4215-b7f5-061420076cd6-kube-api-access-ncn8z\") pod \"authorino-d84884d8-2xnj2\" (UID: \"90a04616-bcb9-4215-b7f5-061420076cd6\") " pod="kuadrant-system/authorino-d84884d8-2xnj2" Apr 21 04:38:33.062740 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:33.062637 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/90a04616-bcb9-4215-b7f5-061420076cd6-oidc-ca\") pod \"authorino-d84884d8-2xnj2\" (UID: \"90a04616-bcb9-4215-b7f5-061420076cd6\") " pod="kuadrant-system/authorino-d84884d8-2xnj2" Apr 21 04:38:33.063245 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:33.063223 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/90a04616-bcb9-4215-b7f5-061420076cd6-oidc-ca\") pod \"authorino-d84884d8-2xnj2\" (UID: \"90a04616-bcb9-4215-b7f5-061420076cd6\") " pod="kuadrant-system/authorino-d84884d8-2xnj2" Apr 21 04:38:33.064982 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:33.064961 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/90a04616-bcb9-4215-b7f5-061420076cd6-tls-cert\") pod \"authorino-d84884d8-2xnj2\" (UID: \"90a04616-bcb9-4215-b7f5-061420076cd6\") " pod="kuadrant-system/authorino-d84884d8-2xnj2" Apr 21 04:38:33.069618 ip-10-0-139-26 kubenswrapper[2574]: I0421 
04:38:33.069569 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncn8z\" (UniqueName: \"kubernetes.io/projected/90a04616-bcb9-4215-b7f5-061420076cd6-kube-api-access-ncn8z\") pod \"authorino-d84884d8-2xnj2\" (UID: \"90a04616-bcb9-4215-b7f5-061420076cd6\") " pod="kuadrant-system/authorino-d84884d8-2xnj2" Apr 21 04:38:33.258508 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:33.258481 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-d84884d8-2xnj2" Apr 21 04:38:33.382390 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:33.382362 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-d84884d8-2xnj2"] Apr 21 04:38:33.384811 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:38:33.384781 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90a04616_bcb9_4215_b7f5_061420076cd6.slice/crio-c1cca93f37ac43dff0d498c22fbfea0a1c8365dab80c23eb99c4b405e62aafea WatchSource:0}: Error finding container c1cca93f37ac43dff0d498c22fbfea0a1c8365dab80c23eb99c4b405e62aafea: Status 404 returned error can't find the container with id c1cca93f37ac43dff0d498c22fbfea0a1c8365dab80c23eb99c4b405e62aafea Apr 21 04:38:34.220961 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:34.220920 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-d84884d8-2xnj2" event={"ID":"90a04616-bcb9-4215-b7f5-061420076cd6","Type":"ContainerStarted","Data":"451fd71c2f970d8699eb903155d9db8fb07ee1c227055e98283475860dab831f"} Apr 21 04:38:34.221496 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:34.221467 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-d84884d8-2xnj2" event={"ID":"90a04616-bcb9-4215-b7f5-061420076cd6","Type":"ContainerStarted","Data":"c1cca93f37ac43dff0d498c22fbfea0a1c8365dab80c23eb99c4b405e62aafea"} Apr 21 04:38:34.241069 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:34.241009 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-d84884d8-2xnj2" podStartSLOduration=1.7789383509999999 podStartE2EDuration="2.24099281s" podCreationTimestamp="2026-04-21 04:38:32 +0000 UTC" firstStartedPulling="2026-04-21 04:38:33.386042307 +0000 UTC m=+881.767145172" lastFinishedPulling="2026-04-21 04:38:33.848096748 +0000 UTC m=+882.229199631" observedRunningTime="2026-04-21 04:38:34.23996664 +0000 UTC m=+882.621069535" watchObservedRunningTime="2026-04-21 04:38:34.24099281 +0000 UTC m=+882.622095723" Apr 21 04:38:34.269823 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:34.269785 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-dbd57cc7-z87qs"] Apr 21 04:38:34.270091 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:34.270060 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-dbd57cc7-z87qs" podUID="50f7a369-8eec-4cf0-8bfd-55f2812fb1e6" containerName="authorino" containerID="cri-o://58e47ae308b3424e1fc42a09e90b6a600e67f6e0194d013582f7314d64cbebd2" gracePeriod=30 Apr 21 04:38:34.524149 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:34.524125 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-dbd57cc7-z87qs" Apr 21 04:38:34.579475 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:34.579439 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/50f7a369-8eec-4cf0-8bfd-55f2812fb1e6-oidc-ca\") pod \"50f7a369-8eec-4cf0-8bfd-55f2812fb1e6\" (UID: \"50f7a369-8eec-4cf0-8bfd-55f2812fb1e6\") " Apr 21 04:38:34.579662 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:34.579496 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/50f7a369-8eec-4cf0-8bfd-55f2812fb1e6-tls-cert\") pod \"50f7a369-8eec-4cf0-8bfd-55f2812fb1e6\" (UID: \"50f7a369-8eec-4cf0-8bfd-55f2812fb1e6\") " Apr 21 04:38:34.579662 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:34.579555 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hknqb\" (UniqueName: \"kubernetes.io/projected/50f7a369-8eec-4cf0-8bfd-55f2812fb1e6-kube-api-access-hknqb\") pod \"50f7a369-8eec-4cf0-8bfd-55f2812fb1e6\" (UID: \"50f7a369-8eec-4cf0-8bfd-55f2812fb1e6\") " Apr 21 04:38:34.581619 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:34.581566 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50f7a369-8eec-4cf0-8bfd-55f2812fb1e6-kube-api-access-hknqb" (OuterVolumeSpecName: "kube-api-access-hknqb") pod "50f7a369-8eec-4cf0-8bfd-55f2812fb1e6" (UID: "50f7a369-8eec-4cf0-8bfd-55f2812fb1e6"). InnerVolumeSpecName "kube-api-access-hknqb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 04:38:34.584566 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:34.584537 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50f7a369-8eec-4cf0-8bfd-55f2812fb1e6-oidc-ca" (OuterVolumeSpecName: "oidc-ca") pod "50f7a369-8eec-4cf0-8bfd-55f2812fb1e6" (UID: "50f7a369-8eec-4cf0-8bfd-55f2812fb1e6"). InnerVolumeSpecName "oidc-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 04:38:34.589852 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:34.589827 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50f7a369-8eec-4cf0-8bfd-55f2812fb1e6-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "50f7a369-8eec-4cf0-8bfd-55f2812fb1e6" (UID: "50f7a369-8eec-4cf0-8bfd-55f2812fb1e6"). InnerVolumeSpecName "tls-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 04:38:34.680350 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:34.680310 2574 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/50f7a369-8eec-4cf0-8bfd-55f2812fb1e6-tls-cert\") on node \"ip-10-0-139-26.ec2.internal\" DevicePath \"\"" Apr 21 04:38:34.680350 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:34.680348 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hknqb\" (UniqueName: \"kubernetes.io/projected/50f7a369-8eec-4cf0-8bfd-55f2812fb1e6-kube-api-access-hknqb\") on node \"ip-10-0-139-26.ec2.internal\" DevicePath \"\"" Apr 21 04:38:34.680350 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:34.680358 2574 reconciler_common.go:299] "Volume detached for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/50f7a369-8eec-4cf0-8bfd-55f2812fb1e6-oidc-ca\") on node \"ip-10-0-139-26.ec2.internal\" DevicePath \"\"" Apr 21 04:38:35.225841 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:35.225812 2574 generic.go:358] "Generic (PLEG): container finished" podID="50f7a369-8eec-4cf0-8bfd-55f2812fb1e6" containerID="58e47ae308b3424e1fc42a09e90b6a600e67f6e0194d013582f7314d64cbebd2" exitCode=0 Apr 21 04:38:35.226313 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:35.225855 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-dbd57cc7-z87qs" Apr 21 04:38:35.226313 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:35.225894 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-dbd57cc7-z87qs" event={"ID":"50f7a369-8eec-4cf0-8bfd-55f2812fb1e6","Type":"ContainerDied","Data":"58e47ae308b3424e1fc42a09e90b6a600e67f6e0194d013582f7314d64cbebd2"} Apr 21 04:38:35.226313 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:35.225935 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-dbd57cc7-z87qs" event={"ID":"50f7a369-8eec-4cf0-8bfd-55f2812fb1e6","Type":"ContainerDied","Data":"a5da120b23be5b7c53d9ac7d72164ccbf55993dfd7aeddf1b16e05f1eb11a5c6"} Apr 21 04:38:35.226313 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:35.225950 2574 scope.go:117] "RemoveContainer" containerID="58e47ae308b3424e1fc42a09e90b6a600e67f6e0194d013582f7314d64cbebd2" Apr 21 04:38:35.234722 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:35.234704 2574 scope.go:117] "RemoveContainer" containerID="58e47ae308b3424e1fc42a09e90b6a600e67f6e0194d013582f7314d64cbebd2" Apr 21 04:38:35.234970 ip-10-0-139-26 kubenswrapper[2574]: E0421 04:38:35.234951 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58e47ae308b3424e1fc42a09e90b6a600e67f6e0194d013582f7314d64cbebd2\": container with ID starting with 58e47ae308b3424e1fc42a09e90b6a600e67f6e0194d013582f7314d64cbebd2 not found: ID does not exist" containerID="58e47ae308b3424e1fc42a09e90b6a600e67f6e0194d013582f7314d64cbebd2" Apr 21 04:38:35.235019 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:35.234979 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58e47ae308b3424e1fc42a09e90b6a600e67f6e0194d013582f7314d64cbebd2"} err="failed to get container status \"58e47ae308b3424e1fc42a09e90b6a600e67f6e0194d013582f7314d64cbebd2\": rpc error: code = NotFound desc = could not find container \"58e47ae308b3424e1fc42a09e90b6a600e67f6e0194d013582f7314d64cbebd2\": container with ID starting with 
58e47ae308b3424e1fc42a09e90b6a600e67f6e0194d013582f7314d64cbebd2 not found: ID does not exist" Apr 21 04:38:35.246484 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:35.246460 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-dbd57cc7-z87qs"] Apr 21 04:38:35.250210 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:35.250188 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-dbd57cc7-z87qs"] Apr 21 04:38:36.177171 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:36.177141 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50f7a369-8eec-4cf0-8bfd-55f2812fb1e6" path="/var/lib/kubelet/pods/50f7a369-8eec-4cf0-8bfd-55f2812fb1e6/volumes" Apr 21 04:38:52.107697 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:52.107668 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tv4p6_1b413c11-6c0e-410d-bffc-fdd6ba8e6689/ovn-acl-logging/0.log" Apr 21 04:38:52.108916 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:38:52.108894 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tv4p6_1b413c11-6c0e-410d-bffc-fdd6ba8e6689/ovn-acl-logging/0.log" Apr 21 04:39:12.973904 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:12.973871 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-d84884d8-2xnj2_90a04616-bcb9-4215-b7f5-061420076cd6/authorino/0.log" Apr 21 04:39:16.764549 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:16.764513 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-7cb5f4f657-45dbm_5ab973a7-9702-43fe-b3c4-568140e1c750/maas-api/0.log" Apr 21 04:39:17.105779 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:17.105680 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5b6f69cdb8-52l7q_b5b6e822-b2b1-4333-a8b1-22d85ee382ff/manager/0.log" Apr 21 04:39:18.668237 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:18.668202 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-d84884d8-2xnj2_90a04616-bcb9-4215-b7f5-061420076cd6/authorino/0.log" Apr 21 04:39:18.987466 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:18.987439 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-4dhfg_40cd261b-27b6-495b-87b6-5b7734d535cd/kuadrant-console-plugin/0.log" Apr 21 04:39:19.218030 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:19.218000 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-xblq8_a7a9d821-f536-4e4e-96e1-3822ebb3972e/manager/0.log" Apr 21 04:39:20.099655 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:20.099625 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5f98864f9-t6qwc_d01edc4d-43c6-4980-9d79-1d640eee85db/kube-auth-proxy/0.log" Apr 21 04:39:20.656164 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:20.656130 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62_1c600334-e8a2-4df1-a0bb-a52e380063e5/main/0.log" Apr 21 04:39:20.668641 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:20.668619 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-b8c62_1c600334-e8a2-4df1-a0bb-a52e380063e5/storage-initializer/0.log" Apr 21 04:39:20.897005 ip-10-0-139-26 
kubenswrapper[2574]: I0421 04:39:20.896974 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-vvwqg_ea625ee0-d68d-4de4-98fa-69a25a73beb2/storage-initializer/0.log" Apr 21 04:39:20.904609 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:20.904566 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-vvwqg_ea625ee0-d68d-4de4-98fa-69a25a73beb2/main/0.log" Apr 21 04:39:21.249860 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:21.249828 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh_889861c8-8a0f-49e8-b3c9-e2731cc3e331/storage-initializer/0.log" Apr 21 04:39:21.259397 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:21.259369 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-hdsgh_889861c8-8a0f-49e8-b3c9-e2731cc3e331/main/0.log" Apr 21 04:39:27.563653 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:27.563616 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-qbdpv_3d73e541-4e3f-47bf-8031-d49890b7f8d2/global-pull-secret-syncer/0.log" Apr 21 04:39:27.637975 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:27.637940 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-h4mfk_3d78d6d3-1607-4a6a-88f3-3f0d7eeda73a/konnectivity-agent/0.log" Apr 21 04:39:27.708428 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:27.708390 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-26.ec2.internal_cd7c596238e57e8ba8682f24a2cccbe3/haproxy/0.log" Apr 21 04:39:31.995944 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:31.995917 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-d84884d8-2xnj2_90a04616-bcb9-4215-b7f5-061420076cd6/authorino/0.log" Apr 21 04:39:32.068649 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:32.068532 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-4dhfg_40cd261b-27b6-495b-87b6-5b7734d535cd/kuadrant-console-plugin/0.log" Apr 21 04:39:32.159884 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:32.159821 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-xblq8_a7a9d821-f536-4e4e-96e1-3822ebb3972e/manager/0.log" Apr 21 04:39:33.797508 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:33.797479 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-m5jgk_dc428b9d-35a8-45e5-b688-c59395e673af/kube-state-metrics/0.log" Apr 21 04:39:33.815650 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:33.815622 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-m5jgk_dc428b9d-35a8-45e5-b688-c59395e673af/kube-rbac-proxy-main/0.log" Apr 21 04:39:33.840790 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:33.840756 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-m5jgk_dc428b9d-35a8-45e5-b688-c59395e673af/kube-rbac-proxy-self/0.log" Apr 21 04:39:33.868640 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:33.868565 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_metrics-server-57f8b777bc-fmfdt_e6975186-f2b2-47eb-9ec1-e32179e2d5b9/metrics-server/0.log" Apr 21 04:39:33.897311 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:33.897277 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-nj9bs_ae39b787-f2c0-425c-87ca-a4a8a1f8e0df/monitoring-plugin/0.log" Apr 21 04:39:33.928283 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:33.928252 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6xx4x_b807a52f-f037-480f-8383-561de6752c10/node-exporter/0.log" Apr 21 04:39:33.952897 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:33.952856 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6xx4x_b807a52f-f037-480f-8383-561de6752c10/kube-rbac-proxy/0.log" Apr 21 04:39:33.970863 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:33.970839 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-6xx4x_b807a52f-f037-480f-8383-561de6752c10/init-textfile/0.log" Apr 21 04:39:34.217291 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:34.217258 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d34b0eee-6deb-439e-9d04-8b7bb13c4408/prometheus/0.log" Apr 21 04:39:34.252395 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:34.252359 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d34b0eee-6deb-439e-9d04-8b7bb13c4408/config-reloader/0.log" Apr 21 04:39:34.273535 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:34.273512 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d34b0eee-6deb-439e-9d04-8b7bb13c4408/thanos-sidecar/0.log" Apr 21 04:39:34.293377 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:34.293349 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d34b0eee-6deb-439e-9d04-8b7bb13c4408/kube-rbac-proxy-web/0.log" Apr 21 04:39:34.312081 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:34.312055 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d34b0eee-6deb-439e-9d04-8b7bb13c4408/kube-rbac-proxy/0.log" Apr 21 04:39:34.330891 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:34.330862 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d34b0eee-6deb-439e-9d04-8b7bb13c4408/kube-rbac-proxy-thanos/0.log" Apr 21 04:39:34.348741 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:34.348723 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d34b0eee-6deb-439e-9d04-8b7bb13c4408/init-config-reloader/0.log" Apr 21 04:39:34.379204 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:34.379180 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-2frq4_bc41b2e4-073e-4fce-a850-7e5fa29c9da4/prometheus-operator/0.log" Apr 21 04:39:34.393649 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:34.393627 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-2frq4_bc41b2e4-073e-4fce-a850-7e5fa29c9da4/kube-rbac-proxy/0.log" Apr 21 04:39:34.414073 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:34.414050 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-ncwvv_63181619-2d3e-4acb-842b-6eb60832feee/prometheus-operator-admission-webhook/0.log" Apr 21 04:39:36.177492 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.177464 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-pv5rw/perf-node-gather-daemonset-lrxrx"] Apr 21 04:39:36.177867 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.177820 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="50f7a369-8eec-4cf0-8bfd-55f2812fb1e6" containerName="authorino" Apr 21 04:39:36.177867 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.177832 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f7a369-8eec-4cf0-8bfd-55f2812fb1e6" containerName="authorino" Apr 21 04:39:36.177950 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.177900 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="50f7a369-8eec-4cf0-8bfd-55f2812fb1e6" containerName="authorino" Apr 21 04:39:36.181119 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.181099 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-lrxrx" Apr 21 04:39:36.183391 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.183371 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pv5rw\"/\"openshift-service-ca.crt\"" Apr 21 04:39:36.184170 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.184154 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-pv5rw\"/\"kube-root-ca.crt\"" Apr 21 04:39:36.184257 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.184155 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-pv5rw\"/\"default-dockercfg-zc9pn\"" Apr 21 04:39:36.190228 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.190208 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pv5rw/perf-node-gather-daemonset-lrxrx"] Apr 21 04:39:36.336449 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.336415 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/80591983-8b10-47da-b452-13162576aa61-sys\") pod \"perf-node-gather-daemonset-lrxrx\" (UID: \"80591983-8b10-47da-b452-13162576aa61\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-lrxrx" Apr 21 04:39:36.336650 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.336456 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/80591983-8b10-47da-b452-13162576aa61-podres\") pod \"perf-node-gather-daemonset-lrxrx\" (UID: \"80591983-8b10-47da-b452-13162576aa61\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-lrxrx" Apr 21 04:39:36.336650 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.336495 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/80591983-8b10-47da-b452-13162576aa61-lib-modules\") pod \"perf-node-gather-daemonset-lrxrx\" (UID: \"80591983-8b10-47da-b452-13162576aa61\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-lrxrx" Apr 21 04:39:36.336650 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.336546 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsv9t\" (UniqueName: \"kubernetes.io/projected/80591983-8b10-47da-b452-13162576aa61-kube-api-access-lsv9t\") pod \"perf-node-gather-daemonset-lrxrx\" (UID: \"80591983-8b10-47da-b452-13162576aa61\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-lrxrx" Apr 21 04:39:36.336650 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.336646 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/80591983-8b10-47da-b452-13162576aa61-proc\") pod \"perf-node-gather-daemonset-lrxrx\" (UID: \"80591983-8b10-47da-b452-13162576aa61\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-lrxrx" Apr 21 04:39:36.437212 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.437142 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lsv9t\" (UniqueName: \"kubernetes.io/projected/80591983-8b10-47da-b452-13162576aa61-kube-api-access-lsv9t\") pod \"perf-node-gather-daemonset-lrxrx\" (UID: \"80591983-8b10-47da-b452-13162576aa61\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-lrxrx" Apr 21 04:39:36.437212 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.437206 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/80591983-8b10-47da-b452-13162576aa61-proc\") pod \"perf-node-gather-daemonset-lrxrx\" (UID: \"80591983-8b10-47da-b452-13162576aa61\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-lrxrx" Apr 21 04:39:36.437394 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.437240 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/80591983-8b10-47da-b452-13162576aa61-sys\") pod \"perf-node-gather-daemonset-lrxrx\" (UID: \"80591983-8b10-47da-b452-13162576aa61\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-lrxrx" Apr 21 04:39:36.437394 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.437273 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/80591983-8b10-47da-b452-13162576aa61-podres\") pod \"perf-node-gather-daemonset-lrxrx\" (UID: \"80591983-8b10-47da-b452-13162576aa61\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-lrxrx" Apr 21 04:39:36.437394 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.437287 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/80591983-8b10-47da-b452-13162576aa61-proc\") pod \"perf-node-gather-daemonset-lrxrx\" (UID: \"80591983-8b10-47da-b452-13162576aa61\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-lrxrx" Apr 21 04:39:36.437394 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.437309 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/80591983-8b10-47da-b452-13162576aa61-lib-modules\") pod \"perf-node-gather-daemonset-lrxrx\" (UID: \"80591983-8b10-47da-b452-13162576aa61\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-lrxrx" Apr 21 04:39:36.437394 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.437349 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/80591983-8b10-47da-b452-13162576aa61-sys\") pod 
\"perf-node-gather-daemonset-lrxrx\" (UID: \"80591983-8b10-47da-b452-13162576aa61\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-lrxrx" Apr 21 04:39:36.437570 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.437416 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/80591983-8b10-47da-b452-13162576aa61-podres\") pod \"perf-node-gather-daemonset-lrxrx\" (UID: \"80591983-8b10-47da-b452-13162576aa61\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-lrxrx" Apr 21 04:39:36.437570 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.437465 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/80591983-8b10-47da-b452-13162576aa61-lib-modules\") pod \"perf-node-gather-daemonset-lrxrx\" (UID: \"80591983-8b10-47da-b452-13162576aa61\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-lrxrx" Apr 21 04:39:36.445139 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.445119 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsv9t\" (UniqueName: \"kubernetes.io/projected/80591983-8b10-47da-b452-13162576aa61-kube-api-access-lsv9t\") pod \"perf-node-gather-daemonset-lrxrx\" (UID: \"80591983-8b10-47da-b452-13162576aa61\") " pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-lrxrx" Apr 21 04:39:36.490856 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.490834 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-lrxrx" Apr 21 04:39:36.612905 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:36.612874 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-pv5rw/perf-node-gather-daemonset-lrxrx"] Apr 21 04:39:36.614917 ip-10-0-139-26 kubenswrapper[2574]: W0421 04:39:36.614889 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod80591983_8b10_47da_b452_13162576aa61.slice/crio-bf34a87e1e78d10d0d3a5e0a3179bd017c2bf5f94da5f311726207e10481d24e WatchSource:0}: Error finding container bf34a87e1e78d10d0d3a5e0a3179bd017c2bf5f94da5f311726207e10481d24e: Status 404 returned error can't find the container with id bf34a87e1e78d10d0d3a5e0a3179bd017c2bf5f94da5f311726207e10481d24e Apr 21 04:39:37.262780 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:37.262753 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-6btnv_64eb89e0-020c-47da-9bd6-7413a505a504/volume-data-source-validator/0.log" Apr 21 04:39:37.449441 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:37.449407 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-lrxrx" event={"ID":"80591983-8b10-47da-b452-13162576aa61","Type":"ContainerStarted","Data":"ccec4263c823d36e972269d3415454ba0170d1fb2c9fa456fac63c034a94aa9b"} Apr 21 04:39:37.449441 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:37.449444 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-lrxrx" event={"ID":"80591983-8b10-47da-b452-13162576aa61","Type":"ContainerStarted","Data":"bf34a87e1e78d10d0d3a5e0a3179bd017c2bf5f94da5f311726207e10481d24e"} Apr 21 04:39:37.449681 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:37.449467 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-lrxrx" Apr 21 04:39:37.465631 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:37.465553 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-lrxrx" podStartSLOduration=1.46553525 podStartE2EDuration="1.46553525s" podCreationTimestamp="2026-04-21 04:39:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 04:39:37.463177909 +0000 UTC m=+945.844280797" watchObservedRunningTime="2026-04-21 04:39:37.46553525 +0000 UTC m=+945.846638151" Apr 21 04:39:38.065019 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:38.064977 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rn9gk_0a79ec68-2e66-4417-8b48-1d40b7272c91/dns/0.log" Apr 21 04:39:38.084249 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:38.084219 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rn9gk_0a79ec68-2e66-4417-8b48-1d40b7272c91/kube-rbac-proxy/0.log" Apr 21 04:39:38.225748 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:38.225709 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hpx9f_50d19486-d861-4624-a572-7c1d8e897542/dns-node-resolver/0.log" Apr 21 04:39:38.873514 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:38.873477 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-rnwcg_493c7d5b-0f42-40a4-ab37-19d6681834e3/node-ca/0.log" Apr 21 04:39:39.746180 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:39.746151 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-5f98864f9-t6qwc_d01edc4d-43c6-4980-9d79-1d640eee85db/kube-auth-proxy/0.log" Apr 21 04:39:40.295041 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:40.295011 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rqbs9_6252963e-17c6-4a33-86f2-a83c646d8f7c/serve-healthcheck-canary/0.log" Apr 21 04:39:40.731510 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:40.731481 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-425bd_0abaf46b-78e1-4455-b661-112f2e50f6af/kube-rbac-proxy/0.log" Apr 21 04:39:40.756930 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:40.756908 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-425bd_0abaf46b-78e1-4455-b661-112f2e50f6af/exporter/0.log" Apr 21 04:39:40.777562 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:40.777539 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-425bd_0abaf46b-78e1-4455-b661-112f2e50f6af/extractor/0.log" Apr 21 04:39:42.766300 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:42.766271 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-7cb5f4f657-45dbm_5ab973a7-9702-43fe-b3c4-568140e1c750/maas-api/0.log" Apr 21 04:39:42.845297 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:42.845268 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5b6f69cdb8-52l7q_b5b6e822-b2b1-4333-a8b1-22d85ee382ff/manager/0.log" Apr 21 04:39:43.462364 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:43.462338 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-must-gather-pv5rw/perf-node-gather-daemonset-lrxrx" Apr 21 04:39:44.105326 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:44.105297 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-579f6d4cb9-664wz_6eebfacc-87b9-41a5-9b26-5e2d40d339f1/manager/0.log" Apr 21 04:39:44.153091 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:44.153061 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-2dd27_ab4ccffc-f76f-4f2b-aa81-9b9a24c60963/openshift-lws-operator/0.log" Apr 21 04:39:48.468041 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:48.468013 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-jwcj4_1dc5b891-b82e-43cb-9d7b-5575fff9501b/migrator/0.log" Apr 21 04:39:48.484962 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:48.484933 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-jwcj4_1dc5b891-b82e-43cb-9d7b-5575fff9501b/graceful-termination/0.log" Apr 21 04:39:48.823710 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:48.823624 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-vklr2_0bb93254-dca8-4e59-9bd6-90ac24699232/kube-storage-version-migrator-operator/1.log" Apr 21 04:39:48.825378 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:48.825351 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-vklr2_0bb93254-dca8-4e59-9bd6-90ac24699232/kube-storage-version-migrator-operator/0.log" Apr 21 04:39:49.862817 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:49.862778 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8hxd5_8910acd9-f7d1-43e0-86a5-84c8a0670a16/kube-multus/0.log" Apr 21 04:39:50.435177 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:50.435146 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d4kbl_1182efca-c6ba-4b0f-9492-7a32d77ea693/kube-multus-additional-cni-plugins/0.log" Apr 21 04:39:50.472192 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:50.472148 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d4kbl_1182efca-c6ba-4b0f-9492-7a32d77ea693/egress-router-binary-copy/0.log" Apr 21 04:39:50.498952 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:50.498927 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d4kbl_1182efca-c6ba-4b0f-9492-7a32d77ea693/cni-plugins/0.log" Apr 21 04:39:50.518284 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:50.518257 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d4kbl_1182efca-c6ba-4b0f-9492-7a32d77ea693/bond-cni-plugin/0.log" Apr 21 04:39:50.538317 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:50.538293 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d4kbl_1182efca-c6ba-4b0f-9492-7a32d77ea693/routeoverride-cni/0.log" Apr 21 04:39:50.556410 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:50.556381 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d4kbl_1182efca-c6ba-4b0f-9492-7a32d77ea693/whereabouts-cni-bincopy/0.log" Apr 21 04:39:50.574743 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:50.574717 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-d4kbl_1182efca-c6ba-4b0f-9492-7a32d77ea693/whereabouts-cni/0.log" Apr 21 04:39:50.640109 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:50.640078 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-g7q5r_d0b4a7b4-e6a1-4816-a96e-0792f47539d9/network-metrics-daemon/0.log" Apr 21 04:39:50.657454 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:50.657425 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-g7q5r_d0b4a7b4-e6a1-4816-a96e-0792f47539d9/kube-rbac-proxy/0.log" Apr 21 04:39:52.034568 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:52.034532 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tv4p6_1b413c11-6c0e-410d-bffc-fdd6ba8e6689/ovn-controller/0.log" Apr 21 04:39:52.048801 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:52.048766 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tv4p6_1b413c11-6c0e-410d-bffc-fdd6ba8e6689/ovn-acl-logging/0.log" Apr 21 04:39:52.058355 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:52.058324 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tv4p6_1b413c11-6c0e-410d-bffc-fdd6ba8e6689/ovn-acl-logging/1.log" Apr 21 04:39:52.078492 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:52.078458 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tv4p6_1b413c11-6c0e-410d-bffc-fdd6ba8e6689/kube-rbac-proxy-node/0.log" Apr 21 04:39:52.098794 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:52.098757 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tv4p6_1b413c11-6c0e-410d-bffc-fdd6ba8e6689/kube-rbac-proxy-ovn-metrics/0.log" Apr 21 04:39:52.113274 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:52.113249 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tv4p6_1b413c11-6c0e-410d-bffc-fdd6ba8e6689/northd/0.log" Apr 21 04:39:52.136231 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:52.136200 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tv4p6_1b413c11-6c0e-410d-bffc-fdd6ba8e6689/nbdb/0.log" Apr 21 04:39:52.154517 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:52.154480 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tv4p6_1b413c11-6c0e-410d-bffc-fdd6ba8e6689/sbdb/0.log" Apr 21 04:39:52.331751 ip-10-0-139-26 kubenswrapper[2574]: I0421 04:39:52.331670 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tv4p6_1b413c11-6c0e-410d-bffc-fdd6ba8e6689/ovnkube-controller/0.log"