Apr 16 22:13:49.287844 ip-10-0-140-65 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 22:13:49.776331 ip-10-0-140-65 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:49.776331 ip-10-0-140-65 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 22:13:49.776331 ip-10-0-140-65 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:49.776331 ip-10-0-140-65 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 22:13:49.776331 ip-10-0-140-65 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:13:49.780166 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.780065 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 22:13:49.784555 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784532 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:49.784593 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784561 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:49.784593 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784567 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:49.784593 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784570 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:49.784593 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784573 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:49.784593 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784576 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:49.784593 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784580 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:49.784593 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784583 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:49.784593 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784586 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:49.784593 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784588 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:49.784593 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784591 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:49.784593 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784595 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:49.784593 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784598 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:49.785001 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784601 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:49.785001 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784604 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:49.785001 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784607 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:49.785001 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784610 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:49.785001 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784612 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:49.785001 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784615 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:49.785001 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784617 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:49.785001 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784620 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:49.785001 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784626 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:49.785001 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784629 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:49.785001 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784632 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:49.785001 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784634 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:49.785001 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784637 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:49.785001 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784639 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:49.785001 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784642 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:49.785001 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784645 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:49.785001 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784647 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:49.785001 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784650 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:49.785001 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784653 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:49.785001 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784655 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:49.785715 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784657 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:49.785715 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784660 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:49.785715 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784663 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:49.785715 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784665 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:49.785715 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784668 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:49.785715 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784670 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:49.785715 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784673 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:49.785715 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784675 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:49.785715 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784678 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:49.785715 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784681 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:49.785715 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784684 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:49.785715 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784687 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:49.785715 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784689 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:49.785715 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784692 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:49.785715 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784695 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:49.785715 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784698 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:49.785715 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784701 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:49.785715 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784704 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:49.785715 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784707 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:49.785715 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784709 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:49.786364 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784711 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:49.786364 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784716 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:49.786364 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784720 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:49.786364 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784723 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:49.786364 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784726 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:49.786364 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784728 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:49.786364 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784731 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:49.786364 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784733 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:49.786364 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784736 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:49.786364 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784739 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:49.786364 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784741 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:49.786364 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784744 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:49.786364 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784746 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:49.786364 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784749 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:49.786364 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784751 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:49.786364 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784754 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:49.786364 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784756 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:49.786364 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784759 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:49.786364 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784761 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:49.786364 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784764 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:49.786887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784766 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:49.786887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784769 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:49.786887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784771 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:49.786887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784774 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:49.786887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784784 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:49.786887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784788 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:49.786887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784790 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:49.786887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784793 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:49.786887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784796 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:49.786887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784799 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:49.786887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784802 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:49.786887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784804 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:49.786887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.784807 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:49.786887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.785914 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:49.786887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.785928 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:49.786887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.785933 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:49.786887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.785966 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:49.786887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.785973 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:49.786887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.785979 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:49.786887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.785984 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:49.787371 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.785990 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:49.787371 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.785994 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:49.787371 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786002 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:49.787371 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786007 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:49.787371 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786011 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:49.787371 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786015 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:49.787371 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786018 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:49.787371 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786034 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:49.787371 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786044 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:49.787371 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786049 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:49.787371 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786055 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:49.787371 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786180 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:49.787371 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786191 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:49.787371 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786196 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:49.787371 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786203 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:49.787371 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786209 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:49.787371 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786213 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:49.787371 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786218 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:49.787371 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786222 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:49.787371 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786226 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:49.787887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786230 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:49.787887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786234 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:49.787887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786239 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:49.787887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786243 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:49.787887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786247 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:49.787887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786251 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:49.787887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786256 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:49.787887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786260 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:49.787887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786266 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:49.787887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786270 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:49.787887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786274 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:49.787887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786279 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:49.787887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786283 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:49.787887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786287 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:49.787887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786291 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:49.787887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786295 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:49.787887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786299 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:49.787887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786302 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:49.787887 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786306 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:49.788360 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786311 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:49.788360 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786315 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:49.788360 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786320 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:49.788360 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786324 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:49.788360 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786328 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:49.788360 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786332 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:49.788360 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786336 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:49.788360 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786340 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:49.788360 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786344 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:49.788360 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786348 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:49.788360 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786353 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:49.788360 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786358 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:49.788360 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786362 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:49.788360 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786366 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:49.788360 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786369 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:49.788360 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786372 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:49.788360 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786375 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:49.788360 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786378 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:49.788360 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786380 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:49.788360 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786383 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:49.788900 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786385 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:49.788900 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786388 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:49.788900 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786391 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:49.788900 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786394 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:49.788900 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786396 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:49.788900 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786402 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:49.788900 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786405 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:49.788900 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786407 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:49.788900 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786410 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:49.788900 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786413 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:49.788900 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786416 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:49.788900 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786419 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:49.788900 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786423 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:49.788900 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786426 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:49.788900 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786429 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:49.788900 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786434 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:49.788900 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786438 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:49.788900 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786441 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:49.788900 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786444 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:49.789370 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.786447 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:49.789370 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787790 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 22:13:49.789370 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787800 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 22:13:49.789370 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787807 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 22:13:49.789370 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787812 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 22:13:49.789370 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787816 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 22:13:49.789370 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787820 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 22:13:49.789370 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787826 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 22:13:49.789370 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787830 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 22:13:49.789370 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787834 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 22:13:49.789370 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787837 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 22:13:49.789370 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787840 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 22:13:49.789370 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787844 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 22:13:49.789370 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787847 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 22:13:49.789370 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787850 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 16 22:13:49.789370 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787853 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 22:13:49.789370 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787856 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 16 22:13:49.789370 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787859 2572 flags.go:64] FLAG: --cloud-config=""
Apr 16 22:13:49.789370 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787861 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 22:13:49.789370 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787864 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 22:13:49.789370 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787869 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 16 22:13:49.789370 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787872 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 22:13:49.789370 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787875 2572 flags.go:64] FLAG: --config-dir=""
Apr 16 22:13:49.789370 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787878 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 22:13:49.789977 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787881 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 22:13:49.789977 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787885 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 22:13:49.789977 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787889 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 22:13:49.789977 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787893 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 22:13:49.789977 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787896 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 22:13:49.789977 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787899 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 22:13:49.789977 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787903 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 22:13:49.789977 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787906 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 22:13:49.789977 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787909 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 22:13:49.789977 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787912 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 22:13:49.789977 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787916 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 22:13:49.789977 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787919 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 22:13:49.789977 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787922 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 22:13:49.789977 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787925 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 22:13:49.789977 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787928 2572 flags.go:64] FLAG: --enable-server="true"
Apr 16 22:13:49.789977 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787931 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 22:13:49.789977 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787937 2572 flags.go:64] FLAG: --event-burst="100"
Apr 16 22:13:49.789977 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787940 2572 flags.go:64] FLAG: --event-qps="50"
Apr 16 22:13:49.789977 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787943 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 22:13:49.789977 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787946 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 22:13:49.789977 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787949 2572 flags.go:64] FLAG: --eviction-hard=""
Apr 16 22:13:49.789977 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787953 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 22:13:49.789977 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787956 2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 22:13:49.789977 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787960 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 22:13:49.789977 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787963 2572 flags.go:64] FLAG: --eviction-soft=""
Apr 16 22:13:49.790601 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787966 2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 22:13:49.790601 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787969 2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 22:13:49.790601 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787972 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 22:13:49.790601 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787975 2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 22:13:49.790601 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787978 2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 22:13:49.790601 ip-10-0-140-65
kubenswrapper[2572]: I0416 22:13:49.787981 2572 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 22:13:49.790601 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787984 2572 flags.go:64] FLAG: --feature-gates="" Apr 16 22:13:49.790601 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787988 2572 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 22:13:49.790601 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787991 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 22:13:49.790601 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787994 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 22:13:49.790601 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.787997 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 22:13:49.790601 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788001 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 16 22:13:49.790601 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788004 2572 flags.go:64] FLAG: --help="false" Apr 16 22:13:49.790601 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788007 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-140-65.ec2.internal" Apr 16 22:13:49.790601 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788010 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 22:13:49.790601 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788013 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 22:13:49.790601 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788016 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 22:13:49.790601 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788020 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 22:13:49.790601 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788023 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 
22:13:49.790601 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788027 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 22:13:49.790601 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788030 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 22:13:49.790601 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788032 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 22:13:49.790601 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788035 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 22:13:49.791162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788038 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 22:13:49.791162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788041 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 22:13:49.791162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788044 2572 flags.go:64] FLAG: --kube-reserved="" Apr 16 22:13:49.791162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788047 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 22:13:49.791162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788050 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 22:13:49.791162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788053 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 22:13:49.791162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788056 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 22:13:49.791162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788058 2572 flags.go:64] FLAG: --lock-file="" Apr 16 22:13:49.791162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788061 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 22:13:49.791162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788064 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 22:13:49.791162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788067 2572 flags.go:64] FLAG: 
--log-json-info-buffer-size="0" Apr 16 22:13:49.791162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788073 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 22:13:49.791162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788076 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 22:13:49.791162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788078 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 22:13:49.791162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788081 2572 flags.go:64] FLAG: --logging-format="text" Apr 16 22:13:49.791162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788084 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 22:13:49.791162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788087 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 22:13:49.791162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788090 2572 flags.go:64] FLAG: --manifest-url="" Apr 16 22:13:49.791162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788093 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 16 22:13:49.791162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788097 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 22:13:49.791162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788104 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 22:13:49.791162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788109 2572 flags.go:64] FLAG: --max-pods="110" Apr 16 22:13:49.791162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788112 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 22:13:49.791162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788115 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 22:13:49.791162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788118 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 22:13:49.791761 ip-10-0-140-65 
kubenswrapper[2572]: I0416 22:13:49.788121 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 22:13:49.791761 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788124 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 22:13:49.791761 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788127 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 22:13:49.791761 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788130 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 22:13:49.791761 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788139 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 22:13:49.791761 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788142 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 22:13:49.791761 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788145 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 22:13:49.791761 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788148 2572 flags.go:64] FLAG: --pod-cidr="" Apr 16 22:13:49.791761 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788151 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 22:13:49.791761 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788156 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 22:13:49.791761 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788159 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 22:13:49.791761 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788162 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 16 22:13:49.791761 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788165 2572 flags.go:64] FLAG: --port="10250" Apr 16 22:13:49.791761 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788169 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 22:13:49.791761 
ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788171 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0370954362c2363f2" Apr 16 22:13:49.791761 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788175 2572 flags.go:64] FLAG: --qos-reserved="" Apr 16 22:13:49.791761 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788178 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 16 22:13:49.791761 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788181 2572 flags.go:64] FLAG: --register-node="true" Apr 16 22:13:49.791761 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788184 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 16 22:13:49.791761 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788186 2572 flags.go:64] FLAG: --register-with-taints="" Apr 16 22:13:49.791761 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788190 2572 flags.go:64] FLAG: --registry-burst="10" Apr 16 22:13:49.791761 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788193 2572 flags.go:64] FLAG: --registry-qps="5" Apr 16 22:13:49.791761 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788196 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 16 22:13:49.791761 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788199 2572 flags.go:64] FLAG: --reserved-memory="" Apr 16 22:13:49.791761 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788206 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 22:13:49.792366 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788209 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 22:13:49.792366 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788212 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 22:13:49.792366 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788215 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 22:13:49.792366 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788219 2572 flags.go:64] FLAG: --runonce="false" Apr 16 22:13:49.792366 ip-10-0-140-65 
kubenswrapper[2572]: I0416 22:13:49.788222 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 22:13:49.792366 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788226 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 22:13:49.792366 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788229 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 16 22:13:49.792366 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788232 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 22:13:49.792366 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788235 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 22:13:49.792366 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788238 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 22:13:49.792366 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788241 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 22:13:49.792366 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788244 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 22:13:49.792366 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788247 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 22:13:49.792366 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788250 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 22:13:49.792366 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788253 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 22:13:49.792366 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788256 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 22:13:49.792366 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788259 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 22:13:49.792366 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788262 2572 flags.go:64] FLAG: --system-cgroups="" Apr 16 22:13:49.792366 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788264 2572 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 22:13:49.792366 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788270 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 22:13:49.792366 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788273 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 16 22:13:49.792366 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788276 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 22:13:49.792366 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788280 2572 flags.go:64] FLAG: --tls-min-version="" Apr 16 22:13:49.792366 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788283 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 22:13:49.792366 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788285 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 22:13:49.792976 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788289 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 22:13:49.792976 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788292 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 22:13:49.792976 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788295 2572 flags.go:64] FLAG: --v="2" Apr 16 22:13:49.792976 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788299 2572 flags.go:64] FLAG: --version="false" Apr 16 22:13:49.792976 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788303 2572 flags.go:64] FLAG: --vmodule="" Apr 16 22:13:49.792976 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788308 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 22:13:49.792976 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788312 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 22:13:49.792976 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788432 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:13:49.792976 ip-10-0-140-65 
kubenswrapper[2572]: W0416 22:13:49.788436 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:13:49.792976 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788439 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:13:49.792976 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788442 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:13:49.792976 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788448 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:13:49.792976 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788451 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:13:49.792976 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788454 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:13:49.792976 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788457 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:13:49.792976 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788459 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:13:49.792976 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788462 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:13:49.792976 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788464 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:13:49.792976 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788467 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:13:49.792976 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788469 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:13:49.792976 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788472 2572 feature_gate.go:328] unrecognized feature gate: 
ExternalSnapshotMetadata Apr 16 22:13:49.793491 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788474 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:13:49.793491 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788477 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:13:49.793491 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788479 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:13:49.793491 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788484 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 22:13:49.793491 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788487 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:13:49.793491 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788490 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:13:49.793491 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788493 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:13:49.793491 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788495 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:13:49.793491 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788498 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:13:49.793491 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788501 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:13:49.793491 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788504 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:13:49.793491 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788506 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:13:49.793491 ip-10-0-140-65 
kubenswrapper[2572]: W0416 22:13:49.788509 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:13:49.793491 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788512 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:13:49.793491 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788514 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:13:49.793491 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788517 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:13:49.793491 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788519 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:13:49.793491 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788523 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:13:49.793491 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788526 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:13:49.793491 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788528 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:13:49.794014 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788531 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:13:49.794014 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788533 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:13:49.794014 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788536 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:13:49.794014 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788543 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:13:49.794014 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788545 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 
16 22:13:49.794014 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788561 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:13:49.794014 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788564 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:13:49.794014 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788567 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:13:49.794014 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788569 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:13:49.794014 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788572 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:13:49.794014 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788575 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:13:49.794014 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788578 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:13:49.794014 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788581 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:13:49.794014 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788583 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:13:49.794014 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788586 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:13:49.794014 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788588 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:13:49.794014 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788591 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:13:49.794014 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788593 2572 feature_gate.go:328] unrecognized 
feature gate: GatewayAPI Apr 16 22:13:49.794014 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788596 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:13:49.794014 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788599 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:13:49.794513 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788601 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:13:49.794513 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788604 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:13:49.794513 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788606 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:13:49.794513 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788609 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:13:49.794513 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788611 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:13:49.794513 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788614 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:13:49.794513 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788616 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:13:49.794513 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788619 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:13:49.794513 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788622 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:13:49.794513 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788626 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:13:49.794513 ip-10-0-140-65 kubenswrapper[2572]: W0416 
22:13:49.788629 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:13:49.794513 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788632 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:13:49.794513 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788634 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:13:49.794513 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788637 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:13:49.794513 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788641 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:13:49.794513 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788644 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:13:49.794513 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788648 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:13:49.794513 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788651 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:13:49.794513 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788653 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:13:49.794513 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788656 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:13:49.795087 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788658 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:13:49.795087 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788661 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:13:49.795087 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788664 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:13:49.795087 
ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788666 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:49.795087 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788669 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:49.795087 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788673 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:49.795087 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788676 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:49.795087 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788679 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:49.795087 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788682 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:49.795087 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788684 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:49.795087 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788687 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:49.795087 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.788689 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:49.795087 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.788698 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 22:13:49.798005 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.797986 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 22:13:49.798053 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.798006 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 22:13:49.798083 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798061 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:49.798083 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798066 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:49.798083 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798070 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:49.798083 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798073 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:49.798083 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798077 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:49.798083 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798080 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:49.798083 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798083 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:49.798083 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798086 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:49.798288 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798090 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:49.798288 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798093 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:49.798288 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798096 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:49.798288 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798098 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:49.798288 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798101 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:49.798288 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798104 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:49.798288 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798107 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:49.798288 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798109 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:49.798288 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798112 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:49.798288 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798115 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:49.798288 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798117 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:49.798288 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798120 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:49.798288 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798123 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:49.798288 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798126 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:49.798288 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798128 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:49.798288 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798131 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:49.798288 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798133 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:49.798288 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798136 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:49.798288 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798141 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:49.798792 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798145 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:49.798792 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798148 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:49.798792 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798151 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:49.798792 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798154 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:49.798792 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798158 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:49.798792 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798161 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:49.798792 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798164 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:49.798792 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798167 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:49.798792 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798170 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:49.798792 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798172 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:49.798792 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798176 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:49.798792 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798178 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:49.798792 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798181 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:49.798792 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798183 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:49.798792 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798187 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:49.798792 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798190 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:49.798792 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798193 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:49.798792 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798195 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:49.798792 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798198 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:49.798792 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798201 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:49.799276 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798203 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:49.799276 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798206 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:49.799276 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798209 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:49.799276 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798211 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:49.799276 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798214 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:49.799276 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798216 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:49.799276 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798219 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:49.799276 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798222 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:49.799276 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798224 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:49.799276 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798227 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:49.799276 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798229 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:49.799276 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798232 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:49.799276 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798235 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:49.799276 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798237 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:49.799276 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798240 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:49.799276 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798242 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:49.799276 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798245 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:49.799276 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798249 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:49.799276 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798252 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:49.799276 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798254 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:49.799830 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798257 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:49.799830 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798259 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:49.799830 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798262 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:49.799830 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798265 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:49.799830 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798267 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:49.799830 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798270 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:49.799830 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798273 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:49.799830 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798276 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:49.799830 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798278 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:49.799830 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798281 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:49.799830 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798284 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:49.799830 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798287 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:49.799830 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798289 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:49.799830 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798292 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:49.799830 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798295 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:49.799830 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798298 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:49.799830 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798300 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:49.799830 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798303 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:49.799830 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798307 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:49.800338 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.798313 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 22:13:49.800338 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798423 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:13:49.800338 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798429 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:13:49.800338 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798432 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:13:49.800338 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798435 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:13:49.800338 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798438 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:13:49.800338 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798440 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:13:49.800338 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798443 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:13:49.800338 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798445 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:13:49.800338 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798448 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:13:49.800338 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798451 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:13:49.800338 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798459 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:13:49.800338 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798461 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:13:49.800338 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798464 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:13:49.800338 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798467 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:13:49.800338 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798470 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:13:49.800784 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798473 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:13:49.800784 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798476 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:13:49.800784 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798478 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:13:49.800784 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798481 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:13:49.800784 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798484 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:13:49.800784 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798487 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:13:49.800784 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798489 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:13:49.800784 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798492 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:13:49.800784 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798495 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:13:49.800784 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798497 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:13:49.800784 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798500 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:13:49.800784 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798503 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:13:49.800784 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798505 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:13:49.800784 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798519 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:13:49.800784 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798522 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:13:49.800784 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798524 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:13:49.800784 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798527 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:13:49.800784 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798530 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:13:49.800784 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798533 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:13:49.800784 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798535 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:13:49.801272 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798538 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:13:49.801272 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798541 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:13:49.801272 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798543 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:13:49.801272 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798559 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:13:49.801272 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798562 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:13:49.801272 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798564 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:13:49.801272 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798567 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:13:49.801272 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798571 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:13:49.801272 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798575 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:13:49.801272 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798578 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:13:49.801272 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798582 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:13:49.801272 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798585 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:13:49.801272 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798588 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:13:49.801272 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798591 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:13:49.801272 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798593 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:13:49.801272 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798596 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:13:49.801272 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798599 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:13:49.801272 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798602 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:13:49.801272 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798605 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:13:49.801272 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798608 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:13:49.801785 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798610 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:13:49.801785 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798613 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:13:49.801785 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798616 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:13:49.801785 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798618 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:13:49.801785 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798621 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:13:49.801785 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798624 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:13:49.801785 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798628 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:13:49.801785 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798631 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:13:49.801785 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798634 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:13:49.801785 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798636 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:13:49.801785 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798639 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:13:49.801785 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798642 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:13:49.801785 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798644 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:13:49.801785 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798647 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:13:49.801785 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798649 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:13:49.801785 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798652 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:13:49.801785 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798654 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:13:49.801785 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798657 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:13:49.801785 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798659 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:13:49.802240 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798662 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:13:49.802240 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798664 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:13:49.802240 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798668 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:13:49.802240 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798670 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:13:49.802240 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798673 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:13:49.802240 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798676 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:13:49.802240 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798679 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:13:49.802240 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798681 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:13:49.802240 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798684 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:13:49.802240 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798687 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:13:49.802240 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798689 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:13:49.802240 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:49.798692 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:13:49.802240 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.798697 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 22:13:49.802240 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.799530 2572 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 22:13:49.802721 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.802707 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 22:13:49.803768 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.803757 2572 server.go:1019] "Starting client certificate rotation"
Apr 16 22:13:49.803871 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.803856 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 22:13:49.803908 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.803897 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 22:13:49.834803 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.834782 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 22:13:49.837737 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.837717 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 22:13:49.848776 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.848758 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 16 22:13:49.854986 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.854970 2572 log.go:25] "Validated CRI v1 image API"
Apr 16 22:13:49.856107 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.856092 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 22:13:49.860267 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.860247 2572 fs.go:135] Filesystem UUIDs: map[2a7100b2-e775-469e-8a53-b6baf2d9287b:/dev/nvme0n1p3 62328bf3-d8c7-4bdb-a13e-adc6b2357f02:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 16 22:13:49.860351 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.860267 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 22:13:49.865036 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.865019 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 22:13:49.866087 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.865983 2572 manager.go:217] Machine: {Timestamp:2026-04-16 22:13:49.864316858 +0000 UTC m=+0.445299061 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3018870 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2de54e72bd26e9e311fd654334aeea SystemUUID:ec2de54e-72bd-26e9-e311-fd654334aeea BootID:9c6a8df1-a154-41b1-b0c4-a85478c12a33 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:b4:2b:54:d1:61 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:b4:2b:54:d1:61 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ba:5d:40:dd:c1:cb Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 22:13:49.866087 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.866083 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 22:13:49.866191 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.866169 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 22:13:49.867940 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.867912 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 22:13:49.868086 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.867943 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-65.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOp
tions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 22:13:49.868128 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.868093 2572 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 22:13:49.868128 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.868102 2572 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 22:13:49.868128 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.868114 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 22:13:49.869045 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.869035 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 22:13:49.870970 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.870960 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 16 22:13:49.871072 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.871064 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 22:13:49.873423 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.873413 2572 kubelet.go:491] "Attempting to sync node with API server" Apr 16 22:13:49.873869 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.873859 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 22:13:49.873906 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.873880 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 22:13:49.873906 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.873890 2572 kubelet.go:397] "Adding apiserver pod source" Apr 16 22:13:49.873906 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.873899 2572 apiserver.go:42] 
"Waiting for node sync before watching apiserver pods" Apr 16 22:13:49.875848 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.875837 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 22:13:49.875893 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.875857 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 22:13:49.879048 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.879029 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 22:13:49.880354 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.880341 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 22:13:49.882216 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.882205 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 22:13:49.882272 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.882222 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 22:13:49.882272 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.882228 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 22:13:49.882272 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.882234 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 22:13:49.882272 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.882240 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 22:13:49.882272 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.882245 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 22:13:49.882272 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.882251 2572 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 16 22:13:49.882272 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.882256 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 22:13:49.882272 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.882263 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 22:13:49.882272 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.882269 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 22:13:49.882535 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.882287 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 22:13:49.882535 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.882296 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 22:13:49.883109 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.883100 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 22:13:49.883109 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.883109 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 22:13:49.884190 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:49.884169 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-65.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 22:13:49.884190 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:49.884181 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 22:13:49.885421 ip-10-0-140-65 kubenswrapper[2572]: I0416 
22:13:49.885406 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-65.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 22:13:49.886682 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.886670 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 22:13:49.886726 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.886705 2572 server.go:1295] "Started kubelet" Apr 16 22:13:49.886823 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.886797 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 22:13:49.886871 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.886806 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 22:13:49.886919 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.886878 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 22:13:49.887544 ip-10-0-140-65 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 22:13:49.888039 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.887900 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 22:13:49.888269 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.888255 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-t777b" Apr 16 22:13:49.890899 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.890883 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 16 22:13:49.893683 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.893666 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-t777b" Apr 16 22:13:49.895964 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:49.892659 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-140-65.ec2.internal.18a6f60af058927e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-140-65.ec2.internal,UID:ip-10-0-140-65.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-140-65.ec2.internal,},FirstTimestamp:2026-04-16 22:13:49.886681726 +0000 UTC m=+0.467663921,LastTimestamp:2026-04-16 22:13:49.886681726 +0000 UTC m=+0.467663921,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-140-65.ec2.internal,}" Apr 16 22:13:49.896063 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.895967 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 22:13:49.896512 ip-10-0-140-65 
kubenswrapper[2572]: I0416 22:13:49.896491 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 22:13:49.897268 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.897249 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 22:13:49.897355 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.897283 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 22:13:49.897355 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.897257 2572 factory.go:55] Registering systemd factory Apr 16 22:13:49.897355 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.897299 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 22:13:49.897355 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.897346 2572 factory.go:223] Registration of the systemd container factory successfully Apr 16 22:13:49.897355 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.897356 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 16 22:13:49.897606 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.897364 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 16 22:13:49.897681 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.897604 2572 factory.go:153] Registering CRI-O factory Apr 16 22:13:49.897681 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.897618 2572 factory.go:223] Registration of the crio container factory successfully Apr 16 22:13:49.897777 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.897681 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 22:13:49.897777 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.897718 2572 factory.go:103] Registering Raw factory Apr 16 22:13:49.897777 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.897735 2572 manager.go:1196] Started watching for 
new ooms in manager Apr 16 22:13:49.897906 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:49.897847 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-65.ec2.internal\" not found" Apr 16 22:13:49.898478 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.898462 2572 manager.go:319] Starting recovery of all containers Apr 16 22:13:49.898865 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.898848 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:49.899161 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:49.899138 2572 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 22:13:49.902371 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:49.902347 2572 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-140-65.ec2.internal\" not found" node="ip-10-0-140-65.ec2.internal" Apr 16 22:13:49.908060 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.907905 2572 manager.go:324] Recovery completed Apr 16 22:13:49.909211 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:49.909191 2572 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 16 22:13:49.912222 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.912210 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:49.915084 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.915067 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-65.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:49.915084 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.915097 2572 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-65.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:49.915220 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.915106 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-65.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:49.915644 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.915623 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 22:13:49.915644 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.915638 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 22:13:49.915741 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.915657 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 16 22:13:49.918099 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.918087 2572 policy_none.go:49] "None policy: Start" Apr 16 22:13:49.918146 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.918103 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 22:13:49.918146 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.918113 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 16 22:13:49.966240 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.966218 2572 manager.go:341] "Starting Device Plugin manager" Apr 16 22:13:49.972842 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:49.966275 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 22:13:49.972842 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.966294 2572 server.go:85] "Starting device plugin registration server" Apr 16 22:13:49.972842 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.966599 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 22:13:49.972842 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.966611 2572 container_log_manager.go:189] "Initializing container log rotate workers" 
workers=1 monitorPeriod="10s" Apr 16 22:13:49.972842 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.966709 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 22:13:49.972842 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.966783 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 22:13:49.972842 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:49.966793 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 22:13:49.972842 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:49.968396 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 22:13:49.972842 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:49.968451 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-65.ec2.internal\" not found" Apr 16 22:13:50.035750 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.035651 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 22:13:50.036930 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.036912 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 22:13:50.036995 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.036945 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 22:13:50.036995 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.036968 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 22:13:50.036995 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.036977 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 22:13:50.037126 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:50.037019 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 22:13:50.041542 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.041523 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:50.066780 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.066752 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:50.067677 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.067659 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-65.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:50.067777 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.067691 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-65.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:50.067777 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.067707 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-65.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:50.067777 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.067731 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-65.ec2.internal" Apr 16 22:13:50.075505 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.075491 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-65.ec2.internal" Apr 16 22:13:50.075585 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:50.075511 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-65.ec2.internal\": node \"ip-10-0-140-65.ec2.internal\" not found" Apr 16 22:13:50.089411 
ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:50.089387 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-65.ec2.internal\" not found" Apr 16 22:13:50.138145 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.138107 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-65.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-140-65.ec2.internal"] Apr 16 22:13:50.138212 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.138189 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:50.139061 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.139044 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-65.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:50.139134 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.139073 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-65.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:50.139134 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.139083 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-65.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:50.140448 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.140425 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:50.140599 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.140583 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-65.ec2.internal" Apr 16 22:13:50.140653 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.140625 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:50.141133 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.141119 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-65.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:50.141133 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.141127 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-65.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:50.141239 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.141145 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-65.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:50.141239 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.141159 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-65.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:50.141239 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.141148 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-65.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:50.141239 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.141221 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-65.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:50.142282 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.142268 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-65.ec2.internal" Apr 16 22:13:50.142352 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.142296 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 22:13:50.142945 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.142931 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-65.ec2.internal" event="NodeHasSufficientMemory" Apr 16 22:13:50.143006 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.142955 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-65.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 22:13:50.143006 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.142965 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-65.ec2.internal" event="NodeHasSufficientPID" Apr 16 22:13:50.168128 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:50.168108 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-65.ec2.internal\" not found" node="ip-10-0-140-65.ec2.internal" Apr 16 22:13:50.172503 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:50.172489 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-65.ec2.internal\" not found" node="ip-10-0-140-65.ec2.internal" Apr 16 22:13:50.189472 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:50.189445 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-65.ec2.internal\" not found" Apr 16 22:13:50.289831 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:50.289767 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-65.ec2.internal\" not found" Apr 16 22:13:50.298139 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.298118 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ff930f10e817d00d22cb5161373a31a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-65.ec2.internal\" (UID: \"9ff930f10e817d00d22cb5161373a31a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-65.ec2.internal" Apr 16 22:13:50.298198 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.298147 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cb37a96c125ac85aff486f553494b98e-config\") pod \"kube-apiserver-proxy-ip-10-0-140-65.ec2.internal\" (UID: \"cb37a96c125ac85aff486f553494b98e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-65.ec2.internal" Apr 16 22:13:50.298198 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.298163 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9ff930f10e817d00d22cb5161373a31a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-65.ec2.internal\" (UID: \"9ff930f10e817d00d22cb5161373a31a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-65.ec2.internal" Apr 16 22:13:50.390084 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:50.390058 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-65.ec2.internal\" not found" Apr 16 22:13:50.398356 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.398340 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9ff930f10e817d00d22cb5161373a31a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-65.ec2.internal\" (UID: \"9ff930f10e817d00d22cb5161373a31a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-65.ec2.internal" Apr 16 22:13:50.398411 ip-10-0-140-65 
kubenswrapper[2572]: I0416 22:13:50.398363 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ff930f10e817d00d22cb5161373a31a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-65.ec2.internal\" (UID: \"9ff930f10e817d00d22cb5161373a31a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-65.ec2.internal"
Apr 16 22:13:50.398411 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.398383 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cb37a96c125ac85aff486f553494b98e-config\") pod \"kube-apiserver-proxy-ip-10-0-140-65.ec2.internal\" (UID: \"cb37a96c125ac85aff486f553494b98e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-65.ec2.internal"
Apr 16 22:13:50.398480 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.398421 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cb37a96c125ac85aff486f553494b98e-config\") pod \"kube-apiserver-proxy-ip-10-0-140-65.ec2.internal\" (UID: \"cb37a96c125ac85aff486f553494b98e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-65.ec2.internal"
Apr 16 22:13:50.398480 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.398447 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ff930f10e817d00d22cb5161373a31a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-65.ec2.internal\" (UID: \"9ff930f10e817d00d22cb5161373a31a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-65.ec2.internal"
Apr 16 22:13:50.398480 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.398462 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9ff930f10e817d00d22cb5161373a31a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-65.ec2.internal\" (UID: \"9ff930f10e817d00d22cb5161373a31a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-65.ec2.internal"
Apr 16 22:13:50.470557 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.470526 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-65.ec2.internal"
Apr 16 22:13:50.475134 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.475111 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-65.ec2.internal"
Apr 16 22:13:50.490756 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:50.490734 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-65.ec2.internal\" not found"
Apr 16 22:13:50.591242 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:50.591152 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-65.ec2.internal\" not found"
Apr 16 22:13:50.691694 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:50.691664 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-65.ec2.internal\" not found"
Apr 16 22:13:50.792274 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:50.792242 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-65.ec2.internal\" not found"
Apr 16 22:13:50.803679 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.803661 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 22:13:50.803801 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.803786 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 22:13:50.803847 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.803827 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 22:13:50.893162 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:50.893136 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-65.ec2.internal\" not found"
Apr 16 22:13:50.896397 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.896364 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 22:08:49 +0000 UTC" deadline="2027-10-23 23:43:56.840354151 +0000 UTC"
Apr 16 22:13:50.896493 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.896404 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13321h30m5.94395337s"
Apr 16 22:13:50.896493 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.896386 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 22:13:50.905522 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.905349 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 22:13:50.927994 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.927968 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-ddkqz"
Apr 16 22:13:50.933371 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.933351 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-ddkqz"
Apr 16 22:13:50.968046 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:50.967991 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ff930f10e817d00d22cb5161373a31a.slice/crio-2769dca1dd1193ccf32c60a603d716c3a929eff88941e31356122105aea2f11f WatchSource:0}: Error finding container 2769dca1dd1193ccf32c60a603d716c3a929eff88941e31356122105aea2f11f: Status 404 returned error can't find the container with id 2769dca1dd1193ccf32c60a603d716c3a929eff88941e31356122105aea2f11f
Apr 16 22:13:50.972509 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:50.972483 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb37a96c125ac85aff486f553494b98e.slice/crio-fa2175ff06cc947aa9125a4a5b3082586cf55108543285c9281619c41b540447 WatchSource:0}: Error finding container fa2175ff06cc947aa9125a4a5b3082586cf55108543285c9281619c41b540447: Status 404 returned error can't find the container with id fa2175ff06cc947aa9125a4a5b3082586cf55108543285c9281619c41b540447
Apr 16 22:13:50.973248 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:50.973233 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 22:13:50.993763 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:50.993740 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-65.ec2.internal\" not found"
Apr 16 22:13:51.040188 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.040142 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-65.ec2.internal" event={"ID":"cb37a96c125ac85aff486f553494b98e","Type":"ContainerStarted","Data":"fa2175ff06cc947aa9125a4a5b3082586cf55108543285c9281619c41b540447"}
Apr 16 22:13:51.041056 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.041035 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-65.ec2.internal" event={"ID":"9ff930f10e817d00d22cb5161373a31a","Type":"ContainerStarted","Data":"2769dca1dd1193ccf32c60a603d716c3a929eff88941e31356122105aea2f11f"}
Apr 16 22:13:51.094216 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:51.094187 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-65.ec2.internal\" not found"
Apr 16 22:13:51.194788 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:51.194712 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-65.ec2.internal\" not found"
Apr 16 22:13:51.295282 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:51.295241 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-65.ec2.internal\" not found"
Apr 16 22:13:51.396132 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:51.396096 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-65.ec2.internal\" not found"
Apr 16 22:13:51.402944 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.402913 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:51.426848 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.426819 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:13:51.497098 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.497030 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-65.ec2.internal"
Apr 16 22:13:51.509194 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.509159 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 22:13:51.510307 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.510283 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-65.ec2.internal"
Apr 16 22:13:51.517421 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.517399 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 22:13:51.875283 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.875239 2572 apiserver.go:52] "Watching apiserver"
Apr 16 22:13:51.883813 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.883788 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 22:13:51.884194 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.884166 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64","openshift-cluster-node-tuning-operator/tuned-2z49w","openshift-image-registry/node-ca-fcr4l","openshift-multus/multus-additional-cni-plugins-6z8bb","openshift-network-diagnostics/network-check-target-jmbfd","openshift-network-operator/iptables-alerter-4mbj2","openshift-dns/node-resolver-4h7zb","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-65.ec2.internal","openshift-multus/multus-rztk7","openshift-multus/network-metrics-daemon-cd2s2","openshift-ovn-kubernetes/ovnkube-node-qntcx","kube-system/konnectivity-agent-fdq6c","kube-system/kube-apiserver-proxy-ip-10-0-140-65.ec2.internal"]
Apr 16 22:13:51.886864 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.886804 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4h7zb"
Apr 16 22:13:51.887974 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.887960 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2z49w"
Apr 16 22:13:51.888072 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.888057 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fcr4l"
Apr 16 22:13:51.889281 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.889211 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 22:13:51.890047 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.889684 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-4bxft\""
Apr 16 22:13:51.891395 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.890273 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6z8bb"
Apr 16 22:13:51.891395 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.890511 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 22:13:51.891395 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.890681 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 22:13:51.891395 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.890872 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 22:13:51.891395 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.891074 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2shqw\""
Apr 16 22:13:51.891395 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.891190 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 22:13:51.891720 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.891531 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-qqddp\""
Apr 16 22:13:51.891771 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.891734 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 22:13:51.891819 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.891778 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 22:13:51.892823 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.892776 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jmbfd"
Apr 16 22:13:51.893118 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:51.892938 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jmbfd" podUID="782f4e27-ac2a-4597-87e3-319c3917a087"
Apr 16 22:13:51.893864 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.893801 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 22:13:51.893864 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.893819 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 22:13:51.893864 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.893851 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 22:13:51.894121 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.893965 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-mzrbl\""
Apr 16 22:13:51.894121 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.893967 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 22:13:51.894216 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.894197 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 22:13:51.895930 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.895911 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4mbj2"
Apr 16 22:13:51.896044 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.896023 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64"
Apr 16 22:13:51.898511 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.898492 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 22:13:51.898701 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.898681 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 22:13:51.898807 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.898681 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 22:13:51.898891 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.898873 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 22:13:51.899030 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.899011 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rztk7"
Apr 16 22:13:51.899156 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.899077 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 22:13:51.899283 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.899265 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-p97zh\""
Apr 16 22:13:51.899385 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.899360 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 22:13:51.899532 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.899510 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-thbxz\""
Apr 16 22:13:51.901220 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.901198 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-5wclm\""
Apr 16 22:13:51.901309 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.901222 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 22:13:51.901598 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.901580 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cd2s2"
Apr 16 22:13:51.901695 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:51.901650 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cd2s2" podUID="4ffa9102-be6a-431e-b1c8-ee3b5b01e588"
Apr 16 22:13:51.901753 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.901701 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qntcx"
Apr 16 22:13:51.902939 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.902918 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fdq6c"
Apr 16 22:13:51.903866 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.903728 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 22:13:51.904077 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.904013 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 22:13:51.904077 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.904027 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dk4kq\""
Apr 16 22:13:51.904424 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.904276 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 22:13:51.904424 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.904280 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 22:13:51.904602 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.904584 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 22:13:51.904707 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.904687 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 22:13:51.905290 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905185 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bcafa867-5e82-4ecb-b67b-a3ca411c6879-etc-selinux\") pod \"aws-ebs-csi-driver-node-kdw64\" (UID: \"bcafa867-5e82-4ecb-b67b-a3ca411c6879\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64"
Apr 16 22:13:51.905290 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905218 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bcafa867-5e82-4ecb-b67b-a3ca411c6879-sys-fs\") pod \"aws-ebs-csi-driver-node-kdw64\" (UID: \"bcafa867-5e82-4ecb-b67b-a3ca411c6879\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64"
Apr 16 22:13:51.905290 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905242 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tf4c\" (UniqueName: \"kubernetes.io/projected/bcafa867-5e82-4ecb-b67b-a3ca411c6879-kube-api-access-8tf4c\") pod \"aws-ebs-csi-driver-node-kdw64\" (UID: \"bcafa867-5e82-4ecb-b67b-a3ca411c6879\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64"
Apr 16 22:13:51.905290 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905264 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/52f439d6-f3b5-4680-9824-b2e26d67be20-os-release\") pod \"multus-additional-cni-plugins-6z8bb\" (UID: \"52f439d6-f3b5-4680-9824-b2e26d67be20\") " pod="openshift-multus/multus-additional-cni-plugins-6z8bb"
Apr 16 22:13:51.905290 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905269 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-r2mbz\""
Apr 16 22:13:51.905572 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905299 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-system-cni-dir\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7"
Apr 16 22:13:51.905572 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905324 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-etc-sysctl-conf\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w"
Apr 16 22:13:51.905572 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905348 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/52f439d6-f3b5-4680-9824-b2e26d67be20-system-cni-dir\") pod \"multus-additional-cni-plugins-6z8bb\" (UID: \"52f439d6-f3b5-4680-9824-b2e26d67be20\") " pod="openshift-multus/multus-additional-cni-plugins-6z8bb"
Apr 16 22:13:51.905572 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905387 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-cnibin\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7"
Apr 16 22:13:51.905572 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905412 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/428e9197-b90c-43be-a86a-81a7795c4f68-multus-daemon-config\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7"
Apr 16 22:13:51.905572 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905435 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp527\" (UniqueName: \"kubernetes.io/projected/428e9197-b90c-43be-a86a-81a7795c4f68-kube-api-access-tp527\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7"
Apr 16 22:13:51.905572 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905456 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 22:13:51.905572 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905459 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-etc-sysconfig\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w"
Apr 16 22:13:51.905572 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905495 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f190b2ca-0861-4d63-b3a6-20531ce22e01-etc-tuned\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w"
Apr 16 22:13:51.905572 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905517 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03fed82b-78d8-4490-8c58-013c3b157475-host\") pod \"node-ca-fcr4l\" (UID: \"03fed82b-78d8-4490-8c58-013c3b157475\") " pod="openshift-image-registry/node-ca-fcr4l"
Apr 16 22:13:51.905572 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905541 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/428e9197-b90c-43be-a86a-81a7795c4f68-cni-binary-copy\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7"
Apr 16 22:13:51.906081 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905594 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-host-run-multus-certs\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7"
Apr 16 22:13:51.906081 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905620 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-etc-systemd\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w"
Apr 16 22:13:51.906081 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905642 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-host\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w"
Apr 16 22:13:51.906081 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905677 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49tjg\" (UniqueName: \"kubernetes.io/projected/f190b2ca-0861-4d63-b3a6-20531ce22e01-kube-api-access-49tjg\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w"
Apr 16 22:13:51.906081 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905693 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 22:13:51.906081 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905713 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dltz5\" (UniqueName: \"kubernetes.io/projected/52f439d6-f3b5-4680-9824-b2e26d67be20-kube-api-access-dltz5\") pod \"multus-additional-cni-plugins-6z8bb\" (UID: \"52f439d6-f3b5-4680-9824-b2e26d67be20\") " pod="openshift-multus/multus-additional-cni-plugins-6z8bb"
Apr 16 22:13:51.906081 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905744 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-os-release\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7"
Apr 16 22:13:51.906081 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905771 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-host-run-k8s-cni-cncf-io\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7"
Apr 16 22:13:51.906081 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905793 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-etc-kubernetes\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w"
Apr 16 22:13:51.906081 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905815 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-run\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w"
Apr 16 22:13:51.906081 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905837 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f190b2ca-0861-4d63-b3a6-20531ce22e01-tmp\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w"
Apr 16 22:13:51.906081 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905874 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/340bef7a-cec6-49ff-9089-98ba19d935b8-hosts-file\") pod \"node-resolver-4h7zb\" (UID: \"340bef7a-cec6-49ff-9089-98ba19d935b8\") " pod="openshift-dns/node-resolver-4h7zb"
Apr 16 22:13:51.906081 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905924 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-etc-kubernetes\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7"
Apr 16 22:13:51.906081 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905952 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-host-var-lib-cni-multus\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7"
Apr 16 22:13:51.906081 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.905974 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8eecc640-8a16-4e14-9e02-e8cf75b619f9-host-slash\") pod \"iptables-alerter-4mbj2\" (UID: \"8eecc640-8a16-4e14-9e02-e8cf75b619f9\") " pod="openshift-network-operator/iptables-alerter-4mbj2"
Apr 16 22:13:51.906081 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906006 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bcafa867-5e82-4ecb-b67b-a3ca411c6879-registration-dir\") pod \"aws-ebs-csi-driver-node-kdw64\" (UID: \"bcafa867-5e82-4ecb-b67b-a3ca411c6879\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64"
Apr 16 22:13:51.906081 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906047 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/52f439d6-f3b5-4680-9824-b2e26d67be20-cnibin\") pod \"multus-additional-cni-plugins-6z8bb\" (UID: \"52f439d6-f3b5-4680-9824-b2e26d67be20\") " pod="openshift-multus/multus-additional-cni-plugins-6z8bb"
Apr 16 22:13:51.906880 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906087 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc5nd\" (UniqueName: \"kubernetes.io/projected/8eecc640-8a16-4e14-9e02-e8cf75b619f9-kube-api-access-hc5nd\") pod \"iptables-alerter-4mbj2\" (UID: \"8eecc640-8a16-4e14-9e02-e8cf75b619f9\") " pod="openshift-network-operator/iptables-alerter-4mbj2"
Apr 16 22:13:51.906880 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906136 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-host-var-lib-kubelet\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7"
Apr 16 22:13:51.906880 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906201 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-hostroot\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7"
Apr 16 22:13:51.906880 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906233 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-sys\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w"
Apr 16 22:13:51.906880 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906261 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-lib-modules\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w"
Apr 16 22:13:51.906880 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906294 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5dfq\" (UniqueName: \"kubernetes.io/projected/03fed82b-78d8-4490-8c58-013c3b157475-kube-api-access-f5dfq\") pod \"node-ca-fcr4l\" (UID: \"03fed82b-78d8-4490-8c58-013c3b157475\") " pod="openshift-image-registry/node-ca-fcr4l"
Apr 16 22:13:51.906880 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906342 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/52f439d6-f3b5-4680-9824-b2e26d67be20-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6z8bb\" (UID: \"52f439d6-f3b5-4680-9824-b2e26d67be20\") " pod="openshift-multus/multus-additional-cni-plugins-6z8bb"
Apr 16 22:13:51.906880 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906368 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bcafa867-5e82-4ecb-b67b-a3ca411c6879-socket-dir\") pod \"aws-ebs-csi-driver-node-kdw64\" (UID: \"bcafa867-5e82-4ecb-b67b-a3ca411c6879\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64"
Apr 16 22:13:51.906880 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906395 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/340bef7a-cec6-49ff-9089-98ba19d935b8-tmp-dir\") pod \"node-resolver-4h7zb\" (UID: \"340bef7a-cec6-49ff-9089-98ba19d935b8\") " pod="openshift-dns/node-resolver-4h7zb"
Apr 16 22:13:51.906880 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906458 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-multus-socket-dir-parent\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7"
Apr 16 22:13:51.906880 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906507 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j9qm\" (UniqueName: \"kubernetes.io/projected/340bef7a-cec6-49ff-9089-98ba19d935b8-kube-api-access-8j9qm\") pod \"node-resolver-4h7zb\" (UID: \"340bef7a-cec6-49ff-9089-98ba19d935b8\") " pod="openshift-dns/node-resolver-4h7zb"
Apr 16 22:13:51.906880 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906538 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-host-run-netns\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7"
Apr 16 22:13:51.906880 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906583 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/03fed82b-78d8-4490-8c58-013c3b157475-serviceca\") pod \"node-ca-fcr4l\" (UID: \"03fed82b-78d8-4490-8c58-013c3b157475\") " pod="openshift-image-registry/node-ca-fcr4l"
Apr 16 22:13:51.906880 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906600 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/52f439d6-f3b5-4680-9824-b2e26d67be20-cni-binary-copy\") pod \"multus-additional-cni-plugins-6z8bb\" (UID: \"52f439d6-f3b5-4680-9824-b2e26d67be20\") " pod="openshift-multus/multus-additional-cni-plugins-6z8bb"
Apr 16 22:13:51.906880 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906632 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8eecc640-8a16-4e14-9e02-e8cf75b619f9-iptables-alerter-script\") pod \"iptables-alerter-4mbj2\" (UID: \"8eecc640-8a16-4e14-9e02-e8cf75b619f9\") " pod="openshift-network-operator/iptables-alerter-4mbj2"
Apr 16 22:13:51.906880 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906664 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName:
\"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-multus-cni-dir\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:51.907537 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906713 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-host-var-lib-cni-bin\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:51.907537 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906773 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-var-lib-kubelet\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:51.907537 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906809 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/52f439d6-f3b5-4680-9824-b2e26d67be20-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6z8bb\" (UID: \"52f439d6-f3b5-4680-9824-b2e26d67be20\") " pod="openshift-multus/multus-additional-cni-plugins-6z8bb" Apr 16 22:13:51.907537 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906865 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-multus-conf-dir\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:51.907537 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906888 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bcafa867-5e82-4ecb-b67b-a3ca411c6879-device-dir\") pod \"aws-ebs-csi-driver-node-kdw64\" (UID: \"bcafa867-5e82-4ecb-b67b-a3ca411c6879\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64" Apr 16 22:13:51.907537 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906912 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/52f439d6-f3b5-4680-9824-b2e26d67be20-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6z8bb\" (UID: \"52f439d6-f3b5-4680-9824-b2e26d67be20\") " pod="openshift-multus/multus-additional-cni-plugins-6z8bb" Apr 16 22:13:51.907537 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906939 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxvhn\" (UniqueName: \"kubernetes.io/projected/782f4e27-ac2a-4597-87e3-319c3917a087-kube-api-access-zxvhn\") pod \"network-check-target-jmbfd\" (UID: \"782f4e27-ac2a-4597-87e3-319c3917a087\") " pod="openshift-network-diagnostics/network-check-target-jmbfd" Apr 16 22:13:51.907537 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.906991 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bcafa867-5e82-4ecb-b67b-a3ca411c6879-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kdw64\" (UID: \"bcafa867-5e82-4ecb-b67b-a3ca411c6879\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64" Apr 16 22:13:51.907537 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.907014 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-etc-modprobe-d\") pod \"tuned-2z49w\" 
(UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:51.907537 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.907038 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-etc-sysctl-d\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:51.934125 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.934070 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 22:08:50 +0000 UTC" deadline="2027-10-20 23:23:07.833870103 +0000 UTC" Apr 16 22:13:51.934125 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.934124 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13249h9m15.899749798s" Apr 16 22:13:51.998575 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:51.998535 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 22:13:52.007562 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.007520 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bcafa867-5e82-4ecb-b67b-a3ca411c6879-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kdw64\" (UID: \"bcafa867-5e82-4ecb-b67b-a3ca411c6879\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64" Apr 16 22:13:52.007701 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.007570 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-etc-modprobe-d\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " 
pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.007701 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.007631 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bcafa867-5e82-4ecb-b67b-a3ca411c6879-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kdw64\" (UID: \"bcafa867-5e82-4ecb-b67b-a3ca411c6879\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64" Apr 16 22:13:52.007701 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.007669 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-etc-sysctl-d\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.007701 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.007691 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-etc-openvswitch\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.007920 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.007709 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-log-socket\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.007920 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.007693 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-etc-modprobe-d\") pod \"tuned-2z49w\" (UID: 
\"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.007920 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.007726 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bcafa867-5e82-4ecb-b67b-a3ca411c6879-etc-selinux\") pod \"aws-ebs-csi-driver-node-kdw64\" (UID: \"bcafa867-5e82-4ecb-b67b-a3ca411c6879\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64" Apr 16 22:13:52.007920 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.007773 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bcafa867-5e82-4ecb-b67b-a3ca411c6879-sys-fs\") pod \"aws-ebs-csi-driver-node-kdw64\" (UID: \"bcafa867-5e82-4ecb-b67b-a3ca411c6879\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64" Apr 16 22:13:52.007920 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.007776 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-etc-sysctl-d\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.007920 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.007787 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bcafa867-5e82-4ecb-b67b-a3ca411c6879-etc-selinux\") pod \"aws-ebs-csi-driver-node-kdw64\" (UID: \"bcafa867-5e82-4ecb-b67b-a3ca411c6879\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64" Apr 16 22:13:52.007920 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.007793 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8tf4c\" (UniqueName: 
\"kubernetes.io/projected/bcafa867-5e82-4ecb-b67b-a3ca411c6879-kube-api-access-8tf4c\") pod \"aws-ebs-csi-driver-node-kdw64\" (UID: \"bcafa867-5e82-4ecb-b67b-a3ca411c6879\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64" Apr 16 22:13:52.007920 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.007811 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/52f439d6-f3b5-4680-9824-b2e26d67be20-os-release\") pod \"multus-additional-cni-plugins-6z8bb\" (UID: \"52f439d6-f3b5-4680-9824-b2e26d67be20\") " pod="openshift-multus/multus-additional-cni-plugins-6z8bb" Apr 16 22:13:52.007920 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.007827 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-system-cni-dir\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.007920 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.007843 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-etc-sysctl-conf\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.007920 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.007843 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bcafa867-5e82-4ecb-b67b-a3ca411c6879-sys-fs\") pod \"aws-ebs-csi-driver-node-kdw64\" (UID: \"bcafa867-5e82-4ecb-b67b-a3ca411c6879\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64" Apr 16 22:13:52.007920 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.007860 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs\") pod \"network-metrics-daemon-cd2s2\" (UID: \"4ffa9102-be6a-431e-b1c8-ee3b5b01e588\") " pod="openshift-multus/network-metrics-daemon-cd2s2" Apr 16 22:13:52.007920 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.007898 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-system-cni-dir\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.007920 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.007926 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/52f439d6-f3b5-4680-9824-b2e26d67be20-system-cni-dir\") pod \"multus-additional-cni-plugins-6z8bb\" (UID: \"52f439d6-f3b5-4680-9824-b2e26d67be20\") " pod="openshift-multus/multus-additional-cni-plugins-6z8bb" Apr 16 22:13:52.007920 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.007933 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/52f439d6-f3b5-4680-9824-b2e26d67be20-os-release\") pod \"multus-additional-cni-plugins-6z8bb\" (UID: \"52f439d6-f3b5-4680-9824-b2e26d67be20\") " pod="openshift-multus/multus-additional-cni-plugins-6z8bb" Apr 16 22:13:52.008595 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.007954 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-cnibin\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.008595 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.007973 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/52f439d6-f3b5-4680-9824-b2e26d67be20-system-cni-dir\") pod \"multus-additional-cni-plugins-6z8bb\" (UID: \"52f439d6-f3b5-4680-9824-b2e26d67be20\") " pod="openshift-multus/multus-additional-cni-plugins-6z8bb" Apr 16 22:13:52.008595 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.007979 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/428e9197-b90c-43be-a86a-81a7795c4f68-multus-daemon-config\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.008595 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.007993 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-etc-sysctl-conf\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.008595 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008017 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tp527\" (UniqueName: \"kubernetes.io/projected/428e9197-b90c-43be-a86a-81a7795c4f68-kube-api-access-tp527\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.008595 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008017 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-cnibin\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.008595 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008137 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-etc-sysconfig\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.008595 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008173 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f190b2ca-0861-4d63-b3a6-20531ce22e01-etc-tuned\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.008595 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008200 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03fed82b-78d8-4490-8c58-013c3b157475-host\") pod \"node-ca-fcr4l\" (UID: \"03fed82b-78d8-4490-8c58-013c3b157475\") " pod="openshift-image-registry/node-ca-fcr4l" Apr 16 22:13:52.008595 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008217 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-etc-sysconfig\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.008595 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008223 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/428e9197-b90c-43be-a86a-81a7795c4f68-cni-binary-copy\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.008595 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008266 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/03fed82b-78d8-4490-8c58-013c3b157475-host\") pod \"node-ca-fcr4l\" (UID: \"03fed82b-78d8-4490-8c58-013c3b157475\") " pod="openshift-image-registry/node-ca-fcr4l" Apr 16 22:13:52.008595 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008281 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-host-run-multus-certs\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.008595 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008311 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-etc-systemd\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.008595 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008316 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-host-run-multus-certs\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.008595 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008334 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-host\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.008595 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008359 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49tjg\" (UniqueName: 
\"kubernetes.io/projected/f190b2ca-0861-4d63-b3a6-20531ce22e01-kube-api-access-49tjg\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.008595 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008366 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-etc-systemd\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.009375 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008388 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-node-log\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.009375 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008409 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-host\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.009375 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008417 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dltz5\" (UniqueName: \"kubernetes.io/projected/52f439d6-f3b5-4680-9824-b2e26d67be20-kube-api-access-dltz5\") pod \"multus-additional-cni-plugins-6z8bb\" (UID: \"52f439d6-f3b5-4680-9824-b2e26d67be20\") " pod="openshift-multus/multus-additional-cni-plugins-6z8bb" Apr 16 22:13:52.009375 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008445 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"os-release\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-os-release\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.009375 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008469 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-host-run-k8s-cni-cncf-io\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.009375 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008488 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 22:13:52.009375 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008505 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-etc-kubernetes\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.009375 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008529 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-run\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.009375 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008575 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-os-release\") pod \"multus-rztk7\" (UID: 
\"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.009375 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008601 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-run\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.009375 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008600 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-host-run-k8s-cni-cncf-io\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.009375 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008631 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f190b2ca-0861-4d63-b3a6-20531ce22e01-tmp\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.009375 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008660 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.009375 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008688 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/829af633-ea5e-4051-916a-45ddab265148-ovn-node-metrics-cert\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.009375 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008631 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-etc-kubernetes\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.009375 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008715 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/340bef7a-cec6-49ff-9089-98ba19d935b8-hosts-file\") pod \"node-resolver-4h7zb\" (UID: \"340bef7a-cec6-49ff-9089-98ba19d935b8\") " pod="openshift-dns/node-resolver-4h7zb" Apr 16 22:13:52.009375 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008743 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-etc-kubernetes\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.009375 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008793 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/340bef7a-cec6-49ff-9089-98ba19d935b8-hosts-file\") pod \"node-resolver-4h7zb\" (UID: \"340bef7a-cec6-49ff-9089-98ba19d935b8\") " pod="openshift-dns/node-resolver-4h7zb" Apr 16 22:13:52.010200 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008794 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-host-var-lib-cni-multus\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.010200 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008850 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-systemd-units\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.010200 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008896 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-host-var-lib-cni-multus\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.010200 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008923 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-etc-kubernetes\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.010200 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008927 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8eecc640-8a16-4e14-9e02-e8cf75b619f9-host-slash\") pod \"iptables-alerter-4mbj2\" (UID: \"8eecc640-8a16-4e14-9e02-e8cf75b619f9\") " pod="openshift-network-operator/iptables-alerter-4mbj2" Apr 16 22:13:52.010200 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.008964 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/8eecc640-8a16-4e14-9e02-e8cf75b619f9-host-slash\") pod \"iptables-alerter-4mbj2\" (UID: \"8eecc640-8a16-4e14-9e02-e8cf75b619f9\") " pod="openshift-network-operator/iptables-alerter-4mbj2" Apr 16 22:13:52.010200 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009000 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg9jb\" (UniqueName: \"kubernetes.io/projected/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-kube-api-access-pg9jb\") pod \"network-metrics-daemon-cd2s2\" (UID: \"4ffa9102-be6a-431e-b1c8-ee3b5b01e588\") " pod="openshift-multus/network-metrics-daemon-cd2s2" Apr 16 22:13:52.010200 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009039 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bcafa867-5e82-4ecb-b67b-a3ca411c6879-registration-dir\") pod \"aws-ebs-csi-driver-node-kdw64\" (UID: \"bcafa867-5e82-4ecb-b67b-a3ca411c6879\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64" Apr 16 22:13:52.010200 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009103 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/52f439d6-f3b5-4680-9824-b2e26d67be20-cnibin\") pod \"multus-additional-cni-plugins-6z8bb\" (UID: \"52f439d6-f3b5-4680-9824-b2e26d67be20\") " pod="openshift-multus/multus-additional-cni-plugins-6z8bb" Apr 16 22:13:52.010200 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009132 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hc5nd\" (UniqueName: \"kubernetes.io/projected/8eecc640-8a16-4e14-9e02-e8cf75b619f9-kube-api-access-hc5nd\") pod \"iptables-alerter-4mbj2\" (UID: \"8eecc640-8a16-4e14-9e02-e8cf75b619f9\") " pod="openshift-network-operator/iptables-alerter-4mbj2" Apr 16 22:13:52.010200 ip-10-0-140-65 kubenswrapper[2572]: I0416 
22:13:52.009150 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bcafa867-5e82-4ecb-b67b-a3ca411c6879-registration-dir\") pod \"aws-ebs-csi-driver-node-kdw64\" (UID: \"bcafa867-5e82-4ecb-b67b-a3ca411c6879\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64" Apr 16 22:13:52.010200 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009160 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-host-var-lib-kubelet\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.010200 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009185 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-hostroot\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.010200 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009203 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-sys\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.010200 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009218 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-lib-modules\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.010200 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009234 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/52f439d6-f3b5-4680-9824-b2e26d67be20-cnibin\") pod \"multus-additional-cni-plugins-6z8bb\" (UID: \"52f439d6-f3b5-4680-9824-b2e26d67be20\") " pod="openshift-multus/multus-additional-cni-plugins-6z8bb" Apr 16 22:13:52.010200 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009237 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/33fba0e3-6c1e-4bb4-a2e7-db58d15d18a4-agent-certs\") pod \"konnectivity-agent-fdq6c\" (UID: \"33fba0e3-6c1e-4bb4-a2e7-db58d15d18a4\") " pod="kube-system/konnectivity-agent-fdq6c" Apr 16 22:13:52.010905 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009276 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/428e9197-b90c-43be-a86a-81a7795c4f68-multus-daemon-config\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.010905 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009277 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5dfq\" (UniqueName: \"kubernetes.io/projected/03fed82b-78d8-4490-8c58-013c3b157475-kube-api-access-f5dfq\") pod \"node-ca-fcr4l\" (UID: \"03fed82b-78d8-4490-8c58-013c3b157475\") " pod="openshift-image-registry/node-ca-fcr4l" Apr 16 22:13:52.010905 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009341 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/52f439d6-f3b5-4680-9824-b2e26d67be20-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6z8bb\" (UID: \"52f439d6-f3b5-4680-9824-b2e26d67be20\") " pod="openshift-multus/multus-additional-cni-plugins-6z8bb" Apr 16 
22:13:52.010905 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009374 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/33fba0e3-6c1e-4bb4-a2e7-db58d15d18a4-konnectivity-ca\") pod \"konnectivity-agent-fdq6c\" (UID: \"33fba0e3-6c1e-4bb4-a2e7-db58d15d18a4\") " pod="kube-system/konnectivity-agent-fdq6c" Apr 16 22:13:52.010905 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009399 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-host-slash\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.010905 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009426 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-host-run-netns\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.010905 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009444 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-hostroot\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.010905 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009455 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-host-run-ovn-kubernetes\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.010905 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009478 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-host-cni-bin\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.010905 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009486 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-host-var-lib-kubelet\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.010905 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009486 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-sys\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.010905 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009501 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bcafa867-5e82-4ecb-b67b-a3ca411c6879-socket-dir\") pod \"aws-ebs-csi-driver-node-kdw64\" (UID: \"bcafa867-5e82-4ecb-b67b-a3ca411c6879\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64" Apr 16 22:13:52.010905 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009568 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/340bef7a-cec6-49ff-9089-98ba19d935b8-tmp-dir\") pod \"node-resolver-4h7zb\" (UID: \"340bef7a-cec6-49ff-9089-98ba19d935b8\") " 
pod="openshift-dns/node-resolver-4h7zb" Apr 16 22:13:52.010905 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009591 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-lib-modules\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.010905 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009627 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-multus-socket-dir-parent\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.010905 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009639 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bcafa867-5e82-4ecb-b67b-a3ca411c6879-socket-dir\") pod \"aws-ebs-csi-driver-node-kdw64\" (UID: \"bcafa867-5e82-4ecb-b67b-a3ca411c6879\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64" Apr 16 22:13:52.010905 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009657 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-run-systemd\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.011426 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009684 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-var-lib-openvswitch\") pod \"ovnkube-node-qntcx\" (UID: 
\"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.011426 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009696 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-multus-socket-dir-parent\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.011426 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009717 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-run-ovn\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.011426 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009743 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-host-cni-netd\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.011426 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009767 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/829af633-ea5e-4051-916a-45ddab265148-ovnkube-script-lib\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.011426 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009795 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8j9qm\" (UniqueName: 
\"kubernetes.io/projected/340bef7a-cec6-49ff-9089-98ba19d935b8-kube-api-access-8j9qm\") pod \"node-resolver-4h7zb\" (UID: \"340bef7a-cec6-49ff-9089-98ba19d935b8\") " pod="openshift-dns/node-resolver-4h7zb" Apr 16 22:13:52.011426 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009819 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/829af633-ea5e-4051-916a-45ddab265148-env-overrides\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.011426 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009860 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scbx9\" (UniqueName: \"kubernetes.io/projected/829af633-ea5e-4051-916a-45ddab265148-kube-api-access-scbx9\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.011426 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009913 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/340bef7a-cec6-49ff-9089-98ba19d935b8-tmp-dir\") pod \"node-resolver-4h7zb\" (UID: \"340bef7a-cec6-49ff-9089-98ba19d935b8\") " pod="openshift-dns/node-resolver-4h7zb" Apr 16 22:13:52.011426 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009919 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-host-run-netns\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.011426 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.009949 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-run-openvswitch\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.011426 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010005 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/03fed82b-78d8-4490-8c58-013c3b157475-serviceca\") pod \"node-ca-fcr4l\" (UID: \"03fed82b-78d8-4490-8c58-013c3b157475\") " pod="openshift-image-registry/node-ca-fcr4l" Apr 16 22:13:52.011426 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010034 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/52f439d6-f3b5-4680-9824-b2e26d67be20-cni-binary-copy\") pod \"multus-additional-cni-plugins-6z8bb\" (UID: \"52f439d6-f3b5-4680-9824-b2e26d67be20\") " pod="openshift-multus/multus-additional-cni-plugins-6z8bb" Apr 16 22:13:52.011426 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010055 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-host-run-netns\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.011426 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010059 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/52f439d6-f3b5-4680-9824-b2e26d67be20-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6z8bb\" (UID: \"52f439d6-f3b5-4680-9824-b2e26d67be20\") " pod="openshift-multus/multus-additional-cni-plugins-6z8bb" Apr 16 22:13:52.011426 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010061 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8eecc640-8a16-4e14-9e02-e8cf75b619f9-iptables-alerter-script\") pod \"iptables-alerter-4mbj2\" (UID: \"8eecc640-8a16-4e14-9e02-e8cf75b619f9\") " pod="openshift-network-operator/iptables-alerter-4mbj2" Apr 16 22:13:52.011426 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010097 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/428e9197-b90c-43be-a86a-81a7795c4f68-cni-binary-copy\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.012044 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010113 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-multus-cni-dir\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.012044 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010142 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-host-var-lib-cni-bin\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.012044 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010168 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-var-lib-kubelet\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.012044 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010213 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/52f439d6-f3b5-4680-9824-b2e26d67be20-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6z8bb\" (UID: \"52f439d6-f3b5-4680-9824-b2e26d67be20\") " pod="openshift-multus/multus-additional-cni-plugins-6z8bb" Apr 16 22:13:52.012044 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010237 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-multus-conf-dir\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.012044 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010260 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bcafa867-5e82-4ecb-b67b-a3ca411c6879-device-dir\") pod \"aws-ebs-csi-driver-node-kdw64\" (UID: \"bcafa867-5e82-4ecb-b67b-a3ca411c6879\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64" Apr 16 22:13:52.012044 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010262 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-host-var-lib-cni-bin\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.012044 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010313 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f190b2ca-0861-4d63-b3a6-20531ce22e01-var-lib-kubelet\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.012044 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010329 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bcafa867-5e82-4ecb-b67b-a3ca411c6879-device-dir\") pod \"aws-ebs-csi-driver-node-kdw64\" (UID: \"bcafa867-5e82-4ecb-b67b-a3ca411c6879\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64" Apr 16 22:13:52.012044 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010424 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/52f439d6-f3b5-4680-9824-b2e26d67be20-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6z8bb\" (UID: \"52f439d6-f3b5-4680-9824-b2e26d67be20\") " pod="openshift-multus/multus-additional-cni-plugins-6z8bb" Apr 16 22:13:52.012044 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010434 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/03fed82b-78d8-4490-8c58-013c3b157475-serviceca\") pod \"node-ca-fcr4l\" (UID: \"03fed82b-78d8-4490-8c58-013c3b157475\") " pod="openshift-image-registry/node-ca-fcr4l" Apr 16 22:13:52.012044 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010444 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-multus-cni-dir\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.012044 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010468 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/428e9197-b90c-43be-a86a-81a7795c4f68-multus-conf-dir\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.012044 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010471 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/52f439d6-f3b5-4680-9824-b2e26d67be20-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6z8bb\" (UID: \"52f439d6-f3b5-4680-9824-b2e26d67be20\") " pod="openshift-multus/multus-additional-cni-plugins-6z8bb" Apr 16 22:13:52.012044 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010518 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxvhn\" (UniqueName: \"kubernetes.io/projected/782f4e27-ac2a-4597-87e3-319c3917a087-kube-api-access-zxvhn\") pod \"network-check-target-jmbfd\" (UID: \"782f4e27-ac2a-4597-87e3-319c3917a087\") " pod="openshift-network-diagnostics/network-check-target-jmbfd" Apr 16 22:13:52.012044 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010540 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/52f439d6-f3b5-4680-9824-b2e26d67be20-cni-binary-copy\") pod \"multus-additional-cni-plugins-6z8bb\" (UID: \"52f439d6-f3b5-4680-9824-b2e26d67be20\") " pod="openshift-multus/multus-additional-cni-plugins-6z8bb" Apr 16 22:13:52.012044 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010564 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-host-kubelet\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.012602 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010591 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/829af633-ea5e-4051-916a-45ddab265148-ovnkube-config\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.012602 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010598 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8eecc640-8a16-4e14-9e02-e8cf75b619f9-iptables-alerter-script\") pod \"iptables-alerter-4mbj2\" (UID: \"8eecc640-8a16-4e14-9e02-e8cf75b619f9\") " pod="openshift-network-operator/iptables-alerter-4mbj2" Apr 16 22:13:52.012602 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.010825 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/52f439d6-f3b5-4680-9824-b2e26d67be20-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6z8bb\" (UID: \"52f439d6-f3b5-4680-9824-b2e26d67be20\") " pod="openshift-multus/multus-additional-cni-plugins-6z8bb" Apr 16 22:13:52.012602 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.011807 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f190b2ca-0861-4d63-b3a6-20531ce22e01-tmp\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.012602 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.012152 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f190b2ca-0861-4d63-b3a6-20531ce22e01-etc-tuned\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.020758 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.020039 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dltz5\" (UniqueName: \"kubernetes.io/projected/52f439d6-f3b5-4680-9824-b2e26d67be20-kube-api-access-dltz5\") pod \"multus-additional-cni-plugins-6z8bb\" (UID: 
\"52f439d6-f3b5-4680-9824-b2e26d67be20\") " pod="openshift-multus/multus-additional-cni-plugins-6z8bb" Apr 16 22:13:52.020758 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:52.020139 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:52.020758 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.020157 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tf4c\" (UniqueName: \"kubernetes.io/projected/bcafa867-5e82-4ecb-b67b-a3ca411c6879-kube-api-access-8tf4c\") pod \"aws-ebs-csi-driver-node-kdw64\" (UID: \"bcafa867-5e82-4ecb-b67b-a3ca411c6879\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64" Apr 16 22:13:52.020758 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:52.020172 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:52.020758 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:52.020195 2572 projected.go:194] Error preparing data for projected volume kube-api-access-zxvhn for pod openshift-network-diagnostics/network-check-target-jmbfd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:52.020758 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.020209 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49tjg\" (UniqueName: \"kubernetes.io/projected/f190b2ca-0861-4d63-b3a6-20531ce22e01-kube-api-access-49tjg\") pod \"tuned-2z49w\" (UID: \"f190b2ca-0861-4d63-b3a6-20531ce22e01\") " pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.020758 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:52.020274 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/782f4e27-ac2a-4597-87e3-319c3917a087-kube-api-access-zxvhn podName:782f4e27-ac2a-4597-87e3-319c3917a087 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:52.520252405 +0000 UTC m=+3.101234608 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-zxvhn" (UniqueName: "kubernetes.io/projected/782f4e27-ac2a-4597-87e3-319c3917a087-kube-api-access-zxvhn") pod "network-check-target-jmbfd" (UID: "782f4e27-ac2a-4597-87e3-319c3917a087") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:52.022585 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.022563 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc5nd\" (UniqueName: \"kubernetes.io/projected/8eecc640-8a16-4e14-9e02-e8cf75b619f9-kube-api-access-hc5nd\") pod \"iptables-alerter-4mbj2\" (UID: \"8eecc640-8a16-4e14-9e02-e8cf75b619f9\") " pod="openshift-network-operator/iptables-alerter-4mbj2" Apr 16 22:13:52.022943 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.022922 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp527\" (UniqueName: \"kubernetes.io/projected/428e9197-b90c-43be-a86a-81a7795c4f68-kube-api-access-tp527\") pod \"multus-rztk7\" (UID: \"428e9197-b90c-43be-a86a-81a7795c4f68\") " pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.023357 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.023213 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5dfq\" (UniqueName: \"kubernetes.io/projected/03fed82b-78d8-4490-8c58-013c3b157475-kube-api-access-f5dfq\") pod \"node-ca-fcr4l\" (UID: \"03fed82b-78d8-4490-8c58-013c3b157475\") " pod="openshift-image-registry/node-ca-fcr4l" Apr 16 22:13:52.025050 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.025032 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8j9qm\" (UniqueName: \"kubernetes.io/projected/340bef7a-cec6-49ff-9089-98ba19d935b8-kube-api-access-8j9qm\") pod \"node-resolver-4h7zb\" (UID: \"340bef7a-cec6-49ff-9089-98ba19d935b8\") " pod="openshift-dns/node-resolver-4h7zb" Apr 16 22:13:52.026855 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.026837 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:52.111641 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.111607 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-run-openvswitch\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.111811 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.111672 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-host-kubelet\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.111811 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.111697 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/829af633-ea5e-4051-916a-45ddab265148-ovnkube-config\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.111811 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.111726 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-etc-openvswitch\") pod \"ovnkube-node-qntcx\" (UID: 
\"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.111811 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.111726 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-run-openvswitch\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.111811 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.111752 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-log-socket\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.111811 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.111786 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs\") pod \"network-metrics-daemon-cd2s2\" (UID: \"4ffa9102-be6a-431e-b1c8-ee3b5b01e588\") " pod="openshift-multus/network-metrics-daemon-cd2s2" Apr 16 22:13:52.111811 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.111793 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-etc-openvswitch\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.111811 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.111795 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-host-kubelet\") pod \"ovnkube-node-qntcx\" (UID: 
\"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112201 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.111819 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-node-log\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112201 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.111826 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-log-socket\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112201 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.111849 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112201 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.111877 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/829af633-ea5e-4051-916a-45ddab265148-ovn-node-metrics-cert\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112201 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:52.111896 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:52.112201 
ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.111907 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112201 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.111893 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-node-log\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112201 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.111941 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-systemd-units\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112201 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:52.111965 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs podName:4ffa9102-be6a-431e-b1c8-ee3b5b01e588 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:52.61194417 +0000 UTC m=+3.192926358 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs") pod "network-metrics-daemon-cd2s2" (UID: "4ffa9102-be6a-431e-b1c8-ee3b5b01e588") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:52.112201 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.111988 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-systemd-units\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112201 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.112004 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pg9jb\" (UniqueName: \"kubernetes.io/projected/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-kube-api-access-pg9jb\") pod \"network-metrics-daemon-cd2s2\" (UID: \"4ffa9102-be6a-431e-b1c8-ee3b5b01e588\") " pod="openshift-multus/network-metrics-daemon-cd2s2" Apr 16 22:13:52.112201 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.112035 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/33fba0e3-6c1e-4bb4-a2e7-db58d15d18a4-agent-certs\") pod \"konnectivity-agent-fdq6c\" (UID: \"33fba0e3-6c1e-4bb4-a2e7-db58d15d18a4\") " pod="kube-system/konnectivity-agent-fdq6c" Apr 16 22:13:52.112201 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.112066 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/33fba0e3-6c1e-4bb4-a2e7-db58d15d18a4-konnectivity-ca\") pod \"konnectivity-agent-fdq6c\" (UID: \"33fba0e3-6c1e-4bb4-a2e7-db58d15d18a4\") " pod="kube-system/konnectivity-agent-fdq6c" Apr 16 22:13:52.112201 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.112090 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-host-slash\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112201 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.112129 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-host-slash\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112201 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.112189 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-host-run-netns\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112959 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.112221 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-host-run-ovn-kubernetes\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112959 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.112249 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-host-cni-bin\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112959 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.112261 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-host-run-netns\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112959 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.112271 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-host-run-ovn-kubernetes\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112959 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.112275 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-run-systemd\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112959 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.112317 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-var-lib-openvswitch\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112959 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.112322 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-run-systemd\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112959 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.112345 
2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-run-ovn\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112959 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.112371 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-host-cni-netd\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112959 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.112391 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-run-ovn\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112959 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.112395 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/829af633-ea5e-4051-916a-45ddab265148-ovnkube-script-lib\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112959 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.112402 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-var-lib-openvswitch\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112959 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.112417 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-host-cni-netd\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112959 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.112373 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/829af633-ea5e-4051-916a-45ddab265148-host-cni-bin\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112959 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.112422 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/829af633-ea5e-4051-916a-45ddab265148-env-overrides\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112959 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.112481 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-scbx9\" (UniqueName: \"kubernetes.io/projected/829af633-ea5e-4051-916a-45ddab265148-kube-api-access-scbx9\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.112959 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.112366 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/829af633-ea5e-4051-916a-45ddab265148-ovnkube-config\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.113649 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.113220 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/829af633-ea5e-4051-916a-45ddab265148-env-overrides\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.113649 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.113228 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/829af633-ea5e-4051-916a-45ddab265148-ovnkube-script-lib\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.113649 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.113313 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/33fba0e3-6c1e-4bb4-a2e7-db58d15d18a4-konnectivity-ca\") pod \"konnectivity-agent-fdq6c\" (UID: \"33fba0e3-6c1e-4bb4-a2e7-db58d15d18a4\") " pod="kube-system/konnectivity-agent-fdq6c" Apr 16 22:13:52.114875 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.114831 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/829af633-ea5e-4051-916a-45ddab265148-ovn-node-metrics-cert\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.114875 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.114869 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/33fba0e3-6c1e-4bb4-a2e7-db58d15d18a4-agent-certs\") pod \"konnectivity-agent-fdq6c\" (UID: \"33fba0e3-6c1e-4bb4-a2e7-db58d15d18a4\") " pod="kube-system/konnectivity-agent-fdq6c" Apr 16 22:13:52.120768 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.120746 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-scbx9\" (UniqueName: \"kubernetes.io/projected/829af633-ea5e-4051-916a-45ddab265148-kube-api-access-scbx9\") pod \"ovnkube-node-qntcx\" (UID: \"829af633-ea5e-4051-916a-45ddab265148\") " pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.120879 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.120849 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg9jb\" (UniqueName: \"kubernetes.io/projected/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-kube-api-access-pg9jb\") pod \"network-metrics-daemon-cd2s2\" (UID: \"4ffa9102-be6a-431e-b1c8-ee3b5b01e588\") " pod="openshift-multus/network-metrics-daemon-cd2s2" Apr 16 22:13:52.159010 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.158946 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:13:52.201412 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.201379 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-4h7zb" Apr 16 22:13:52.208194 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.208172 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2z49w" Apr 16 22:13:52.217320 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.217296 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-rztk7" Apr 16 22:13:52.222362 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.222334 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fcr4l" Apr 16 22:13:52.228894 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.228877 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6z8bb" Apr 16 22:13:52.235490 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.235459 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4mbj2" Apr 16 22:13:52.241095 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.241074 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64" Apr 16 22:13:52.246702 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.246681 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:13:52.250214 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.250196 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fdq6c" Apr 16 22:13:52.489372 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:52.489344 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod428e9197_b90c_43be_a86a_81a7795c4f68.slice/crio-1b410bbec3aae48d6eba61f40f7f680a68997531b36194d2ffb7706344608fa7 WatchSource:0}: Error finding container 1b410bbec3aae48d6eba61f40f7f680a68997531b36194d2ffb7706344608fa7: Status 404 returned error can't find the container with id 1b410bbec3aae48d6eba61f40f7f680a68997531b36194d2ffb7706344608fa7 Apr 16 22:13:52.490230 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:52.490205 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf190b2ca_0861_4d63_b3a6_20531ce22e01.slice/crio-4075f7ed859cfe46b7c7c63311591805fe9ee136fd637ac044479818e62108e8 WatchSource:0}: Error finding container 4075f7ed859cfe46b7c7c63311591805fe9ee136fd637ac044479818e62108e8: Status 404 returned error can't find the container with id 
4075f7ed859cfe46b7c7c63311591805fe9ee136fd637ac044479818e62108e8 Apr 16 22:13:52.491146 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:52.491123 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33fba0e3_6c1e_4bb4_a2e7_db58d15d18a4.slice/crio-bfd7d59cab42be1d40a1bdae6efbbb78910a6d50d371f59dd90cfe2de5f73bb2 WatchSource:0}: Error finding container bfd7d59cab42be1d40a1bdae6efbbb78910a6d50d371f59dd90cfe2de5f73bb2: Status 404 returned error can't find the container with id bfd7d59cab42be1d40a1bdae6efbbb78910a6d50d371f59dd90cfe2de5f73bb2 Apr 16 22:13:52.493084 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:52.493060 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52f439d6_f3b5_4680_9824_b2e26d67be20.slice/crio-6b065f75b6edb784f59267c308ebb8d6cf1d8b81fd4d12df352775ba28f3c9d8 WatchSource:0}: Error finding container 6b065f75b6edb784f59267c308ebb8d6cf1d8b81fd4d12df352775ba28f3c9d8: Status 404 returned error can't find the container with id 6b065f75b6edb784f59267c308ebb8d6cf1d8b81fd4d12df352775ba28f3c9d8 Apr 16 22:13:52.495894 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:52.495873 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8eecc640_8a16_4e14_9e02_e8cf75b619f9.slice/crio-2834ab40af2c30c47579fbc0dfca82d2ce1100da75f070171c701ec0c529bcd2 WatchSource:0}: Error finding container 2834ab40af2c30c47579fbc0dfca82d2ce1100da75f070171c701ec0c529bcd2: Status 404 returned error can't find the container with id 2834ab40af2c30c47579fbc0dfca82d2ce1100da75f070171c701ec0c529bcd2 Apr 16 22:13:52.496872 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:52.496760 2572 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03fed82b_78d8_4490_8c58_013c3b157475.slice/crio-ee7cdeb73217c3304ac0a93178e056432eb1f205fb5a1caddfc9daa62426df18 WatchSource:0}: Error finding container ee7cdeb73217c3304ac0a93178e056432eb1f205fb5a1caddfc9daa62426df18: Status 404 returned error can't find the container with id ee7cdeb73217c3304ac0a93178e056432eb1f205fb5a1caddfc9daa62426df18 Apr 16 22:13:52.497895 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:52.497860 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcafa867_5e82_4ecb_b67b_a3ca411c6879.slice/crio-4a0bdd61f65fd3e009fce5c68cecfaf96d17e2c4733040f67183fbacee63f2b8 WatchSource:0}: Error finding container 4a0bdd61f65fd3e009fce5c68cecfaf96d17e2c4733040f67183fbacee63f2b8: Status 404 returned error can't find the container with id 4a0bdd61f65fd3e009fce5c68cecfaf96d17e2c4733040f67183fbacee63f2b8 Apr 16 22:13:52.498700 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:52.498678 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod829af633_ea5e_4051_916a_45ddab265148.slice/crio-41d0aa06ff42bac0abd4fd43b5153f2d0d2b6bccc2494a1ef9bb5df094de4734 WatchSource:0}: Error finding container 41d0aa06ff42bac0abd4fd43b5153f2d0d2b6bccc2494a1ef9bb5df094de4734: Status 404 returned error can't find the container with id 41d0aa06ff42bac0abd4fd43b5153f2d0d2b6bccc2494a1ef9bb5df094de4734 Apr 16 22:13:52.499980 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:13:52.499939 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod340bef7a_cec6_49ff_9089_98ba19d935b8.slice/crio-9098b6e523c1e6740808cc6ccfd077ce0d81c48a78fe1d878e22d90b297f759d WatchSource:0}: Error finding container 9098b6e523c1e6740808cc6ccfd077ce0d81c48a78fe1d878e22d90b297f759d: Status 404 returned error can't find the 
container with id 9098b6e523c1e6740808cc6ccfd077ce0d81c48a78fe1d878e22d90b297f759d Apr 16 22:13:52.615850 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.615673 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxvhn\" (UniqueName: \"kubernetes.io/projected/782f4e27-ac2a-4597-87e3-319c3917a087-kube-api-access-zxvhn\") pod \"network-check-target-jmbfd\" (UID: \"782f4e27-ac2a-4597-87e3-319c3917a087\") " pod="openshift-network-diagnostics/network-check-target-jmbfd" Apr 16 22:13:52.615850 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:52.615838 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:13:52.615850 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:52.615855 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:13:52.616043 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:52.615864 2572 projected.go:194] Error preparing data for projected volume kube-api-access-zxvhn for pod openshift-network-diagnostics/network-check-target-jmbfd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:52.616043 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.615886 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs\") pod \"network-metrics-daemon-cd2s2\" (UID: \"4ffa9102-be6a-431e-b1c8-ee3b5b01e588\") " pod="openshift-multus/network-metrics-daemon-cd2s2" Apr 16 22:13:52.616043 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:52.615911 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/782f4e27-ac2a-4597-87e3-319c3917a087-kube-api-access-zxvhn podName:782f4e27-ac2a-4597-87e3-319c3917a087 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:53.615897946 +0000 UTC m=+4.196880143 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-zxvhn" (UniqueName: "kubernetes.io/projected/782f4e27-ac2a-4597-87e3-319c3917a087-kube-api-access-zxvhn") pod "network-check-target-jmbfd" (UID: "782f4e27-ac2a-4597-87e3-319c3917a087") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:13:52.616043 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:52.615957 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:52.616043 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:52.616000 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs podName:4ffa9102-be6a-431e-b1c8-ee3b5b01e588 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:53.615988995 +0000 UTC m=+4.196971177 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs") pod "network-metrics-daemon-cd2s2" (UID: "4ffa9102-be6a-431e-b1c8-ee3b5b01e588") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:13:52.934963 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.934883 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 22:08:50 +0000 UTC" deadline="2028-02-01 18:07:04.615568658 +0000 UTC" Apr 16 22:13:52.934963 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:52.934927 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15739h53m11.680646014s" Apr 16 22:13:53.037462 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:53.037431 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cd2s2" Apr 16 22:13:53.038000 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:53.037962 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cd2s2" podUID="4ffa9102-be6a-431e-b1c8-ee3b5b01e588"
Apr 16 22:13:53.056741 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:53.056705 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4h7zb" event={"ID":"340bef7a-cec6-49ff-9089-98ba19d935b8","Type":"ContainerStarted","Data":"9098b6e523c1e6740808cc6ccfd077ce0d81c48a78fe1d878e22d90b297f759d"}
Apr 16 22:13:53.075319 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:53.075261 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" event={"ID":"829af633-ea5e-4051-916a-45ddab265148","Type":"ContainerStarted","Data":"41d0aa06ff42bac0abd4fd43b5153f2d0d2b6bccc2494a1ef9bb5df094de4734"}
Apr 16 22:13:53.084165 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:53.084134 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fcr4l" event={"ID":"03fed82b-78d8-4490-8c58-013c3b157475","Type":"ContainerStarted","Data":"ee7cdeb73217c3304ac0a93178e056432eb1f205fb5a1caddfc9daa62426df18"}
Apr 16 22:13:53.090277 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:53.090210 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4mbj2" event={"ID":"8eecc640-8a16-4e14-9e02-e8cf75b619f9","Type":"ContainerStarted","Data":"2834ab40af2c30c47579fbc0dfca82d2ce1100da75f070171c701ec0c529bcd2"}
Apr 16 22:13:53.092786 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:53.092740 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fdq6c" event={"ID":"33fba0e3-6c1e-4bb4-a2e7-db58d15d18a4","Type":"ContainerStarted","Data":"bfd7d59cab42be1d40a1bdae6efbbb78910a6d50d371f59dd90cfe2de5f73bb2"}
Apr 16 22:13:53.097544 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:53.097016 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2z49w" event={"ID":"f190b2ca-0861-4d63-b3a6-20531ce22e01","Type":"ContainerStarted","Data":"4075f7ed859cfe46b7c7c63311591805fe9ee136fd637ac044479818e62108e8"}
Apr 16 22:13:53.100166 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:53.099807 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-65.ec2.internal" event={"ID":"cb37a96c125ac85aff486f553494b98e","Type":"ContainerStarted","Data":"ef841cb8b1b37fb7de6a3ea20ab620bf626e16a8558a9a55a38711f72ad69023"}
Apr 16 22:13:53.109420 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:53.109395 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64" event={"ID":"bcafa867-5e82-4ecb-b67b-a3ca411c6879","Type":"ContainerStarted","Data":"4a0bdd61f65fd3e009fce5c68cecfaf96d17e2c4733040f67183fbacee63f2b8"}
Apr 16 22:13:53.119253 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:53.119227 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z8bb" event={"ID":"52f439d6-f3b5-4680-9824-b2e26d67be20","Type":"ContainerStarted","Data":"6b065f75b6edb784f59267c308ebb8d6cf1d8b81fd4d12df352775ba28f3c9d8"}
Apr 16 22:13:53.143716 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:53.143684 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rztk7" event={"ID":"428e9197-b90c-43be-a86a-81a7795c4f68","Type":"ContainerStarted","Data":"1b410bbec3aae48d6eba61f40f7f680a68997531b36194d2ffb7706344608fa7"}
Apr 16 22:13:53.623417 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:53.622710 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxvhn\" (UniqueName: \"kubernetes.io/projected/782f4e27-ac2a-4597-87e3-319c3917a087-kube-api-access-zxvhn\") pod \"network-check-target-jmbfd\" (UID: \"782f4e27-ac2a-4597-87e3-319c3917a087\") " pod="openshift-network-diagnostics/network-check-target-jmbfd"
Apr 16 22:13:53.623417 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:53.622769 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs\") pod \"network-metrics-daemon-cd2s2\" (UID: \"4ffa9102-be6a-431e-b1c8-ee3b5b01e588\") " pod="openshift-multus/network-metrics-daemon-cd2s2"
Apr 16 22:13:53.623417 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:53.622913 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:53.623417 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:53.622913 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:53.623417 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:53.623008 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:53.623417 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:53.623023 2572 projected.go:194] Error preparing data for projected volume kube-api-access-zxvhn for pod openshift-network-diagnostics/network-check-target-jmbfd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:53.623417 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:53.623025 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs podName:4ffa9102-be6a-431e-b1c8-ee3b5b01e588 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:55.623005688 +0000 UTC m=+6.203987876 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs") pod "network-metrics-daemon-cd2s2" (UID: "4ffa9102-be6a-431e-b1c8-ee3b5b01e588") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:53.623417 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:53.623079 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/782f4e27-ac2a-4597-87e3-319c3917a087-kube-api-access-zxvhn podName:782f4e27-ac2a-4597-87e3-319c3917a087 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:55.623061646 +0000 UTC m=+6.204043846 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-zxvhn" (UniqueName: "kubernetes.io/projected/782f4e27-ac2a-4597-87e3-319c3917a087-kube-api-access-zxvhn") pod "network-check-target-jmbfd" (UID: "782f4e27-ac2a-4597-87e3-319c3917a087") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:54.041189 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:54.040635 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jmbfd"
Apr 16 22:13:54.041189 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:54.040765 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jmbfd" podUID="782f4e27-ac2a-4597-87e3-319c3917a087"
Apr 16 22:13:54.164315 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:54.163497 2572 generic.go:358] "Generic (PLEG): container finished" podID="9ff930f10e817d00d22cb5161373a31a" containerID="abf94c8c680c3c7362fa0d32e1157ae98f17abe3ba3990e5c39f56e7c4e0f216" exitCode=0
Apr 16 22:13:54.164493 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:54.164450 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-65.ec2.internal" event={"ID":"9ff930f10e817d00d22cb5161373a31a","Type":"ContainerDied","Data":"abf94c8c680c3c7362fa0d32e1157ae98f17abe3ba3990e5c39f56e7c4e0f216"}
Apr 16 22:13:54.177856 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:54.176982 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-65.ec2.internal" podStartSLOduration=3.176962833 podStartE2EDuration="3.176962833s" podCreationTimestamp="2026-04-16 22:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:13:53.113264781 +0000 UTC m=+3.694246981" watchObservedRunningTime="2026-04-16 22:13:54.176962833 +0000 UTC m=+4.757945039"
Apr 16 22:13:55.037931 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:55.037886 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cd2s2"
Apr 16 22:13:55.038126 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:55.038031 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cd2s2" podUID="4ffa9102-be6a-431e-b1c8-ee3b5b01e588"
Apr 16 22:13:55.181541 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:55.181507 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-65.ec2.internal" event={"ID":"9ff930f10e817d00d22cb5161373a31a","Type":"ContainerStarted","Data":"148fe3203a9ebe9014d774ec019cc34a8e733298bdabd1814922ec564422dd22"}
Apr 16 22:13:55.643117 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:55.642783 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxvhn\" (UniqueName: \"kubernetes.io/projected/782f4e27-ac2a-4597-87e3-319c3917a087-kube-api-access-zxvhn\") pod \"network-check-target-jmbfd\" (UID: \"782f4e27-ac2a-4597-87e3-319c3917a087\") " pod="openshift-network-diagnostics/network-check-target-jmbfd"
Apr 16 22:13:55.643117 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:55.642855 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs\") pod \"network-metrics-daemon-cd2s2\" (UID: \"4ffa9102-be6a-431e-b1c8-ee3b5b01e588\") " pod="openshift-multus/network-metrics-daemon-cd2s2"
Apr 16 22:13:55.643117 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:55.642990 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:55.643117 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:55.643049 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs podName:4ffa9102-be6a-431e-b1c8-ee3b5b01e588 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:59.643031361 +0000 UTC m=+10.224013551 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs") pod "network-metrics-daemon-cd2s2" (UID: "4ffa9102-be6a-431e-b1c8-ee3b5b01e588") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:55.643469 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:55.643442 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:55.643469 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:55.643462 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:55.643643 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:55.643475 2572 projected.go:194] Error preparing data for projected volume kube-api-access-zxvhn for pod openshift-network-diagnostics/network-check-target-jmbfd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:55.643643 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:55.643525 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/782f4e27-ac2a-4597-87e3-319c3917a087-kube-api-access-zxvhn podName:782f4e27-ac2a-4597-87e3-319c3917a087 nodeName:}" failed. No retries permitted until 2026-04-16 22:13:59.643508677 +0000 UTC m=+10.224490877 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-zxvhn" (UniqueName: "kubernetes.io/projected/782f4e27-ac2a-4597-87e3-319c3917a087-kube-api-access-zxvhn") pod "network-check-target-jmbfd" (UID: "782f4e27-ac2a-4597-87e3-319c3917a087") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:56.040668 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:56.040133 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jmbfd"
Apr 16 22:13:56.040668 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:56.040258 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jmbfd" podUID="782f4e27-ac2a-4597-87e3-319c3917a087"
Apr 16 22:13:57.037854 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:57.037817 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cd2s2"
Apr 16 22:13:57.038322 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:57.037967 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cd2s2" podUID="4ffa9102-be6a-431e-b1c8-ee3b5b01e588"
Apr 16 22:13:58.039471 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:58.038788 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jmbfd"
Apr 16 22:13:58.039471 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:58.038943 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jmbfd" podUID="782f4e27-ac2a-4597-87e3-319c3917a087"
Apr 16 22:13:59.037994 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:59.037959 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cd2s2"
Apr 16 22:13:59.038192 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:59.038107 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cd2s2" podUID="4ffa9102-be6a-431e-b1c8-ee3b5b01e588"
Apr 16 22:13:59.678587 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:59.678518 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxvhn\" (UniqueName: \"kubernetes.io/projected/782f4e27-ac2a-4597-87e3-319c3917a087-kube-api-access-zxvhn\") pod \"network-check-target-jmbfd\" (UID: \"782f4e27-ac2a-4597-87e3-319c3917a087\") " pod="openshift-network-diagnostics/network-check-target-jmbfd"
Apr 16 22:13:59.678587 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:13:59.678584 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs\") pod \"network-metrics-daemon-cd2s2\" (UID: \"4ffa9102-be6a-431e-b1c8-ee3b5b01e588\") " pod="openshift-multus/network-metrics-daemon-cd2s2"
Apr 16 22:13:59.679040 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:59.678694 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:13:59.679040 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:59.678713 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:13:59.679040 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:59.678724 2572 projected.go:194] Error preparing data for projected volume kube-api-access-zxvhn for pod openshift-network-diagnostics/network-check-target-jmbfd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:59.679040 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:59.678773 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/782f4e27-ac2a-4597-87e3-319c3917a087-kube-api-access-zxvhn podName:782f4e27-ac2a-4597-87e3-319c3917a087 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:07.678758415 +0000 UTC m=+18.259740597 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-zxvhn" (UniqueName: "kubernetes.io/projected/782f4e27-ac2a-4597-87e3-319c3917a087-kube-api-access-zxvhn") pod "network-check-target-jmbfd" (UID: "782f4e27-ac2a-4597-87e3-319c3917a087") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:13:59.679040 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:59.678699 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:13:59.679040 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:13:59.678846 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs podName:4ffa9102-be6a-431e-b1c8-ee3b5b01e588 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:07.678825738 +0000 UTC m=+18.259807944 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs") pod "network-metrics-daemon-cd2s2" (UID: "4ffa9102-be6a-431e-b1c8-ee3b5b01e588") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:14:00.040282 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:00.040250 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jmbfd"
Apr 16 22:14:00.040498 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:00.040379 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jmbfd" podUID="782f4e27-ac2a-4597-87e3-319c3917a087"
Apr 16 22:14:01.037374 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:01.037343 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cd2s2"
Apr 16 22:14:01.037873 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:01.037564 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cd2s2" podUID="4ffa9102-be6a-431e-b1c8-ee3b5b01e588"
Apr 16 22:14:02.037991 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:02.037949 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jmbfd"
Apr 16 22:14:02.038454 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:02.038076 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jmbfd" podUID="782f4e27-ac2a-4597-87e3-319c3917a087"
Apr 16 22:14:03.037725 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:03.037682 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cd2s2"
Apr 16 22:14:03.038017 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:03.037832 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cd2s2" podUID="4ffa9102-be6a-431e-b1c8-ee3b5b01e588"
Apr 16 22:14:04.038104 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:04.038069 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jmbfd"
Apr 16 22:14:04.038625 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:04.038222 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jmbfd" podUID="782f4e27-ac2a-4597-87e3-319c3917a087"
Apr 16 22:14:05.037768 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:05.037723 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cd2s2"
Apr 16 22:14:05.037966 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:05.037869 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cd2s2" podUID="4ffa9102-be6a-431e-b1c8-ee3b5b01e588"
Apr 16 22:14:06.037604 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:06.037565 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jmbfd"
Apr 16 22:14:06.038006 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:06.037698 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jmbfd" podUID="782f4e27-ac2a-4597-87e3-319c3917a087"
Apr 16 22:14:07.037930 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:07.037892 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cd2s2"
Apr 16 22:14:07.038480 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:07.038024 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cd2s2" podUID="4ffa9102-be6a-431e-b1c8-ee3b5b01e588"
Apr 16 22:14:07.741244 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:07.741156 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxvhn\" (UniqueName: \"kubernetes.io/projected/782f4e27-ac2a-4597-87e3-319c3917a087-kube-api-access-zxvhn\") pod \"network-check-target-jmbfd\" (UID: \"782f4e27-ac2a-4597-87e3-319c3917a087\") " pod="openshift-network-diagnostics/network-check-target-jmbfd"
Apr 16 22:14:07.741244 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:07.741212 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs\") pod \"network-metrics-daemon-cd2s2\" (UID: \"4ffa9102-be6a-431e-b1c8-ee3b5b01e588\") " pod="openshift-multus/network-metrics-daemon-cd2s2"
Apr 16 22:14:07.741477 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:07.741328 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:14:07.741477 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:07.741393 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs podName:4ffa9102-be6a-431e-b1c8-ee3b5b01e588 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:23.74137563 +0000 UTC m=+34.322357812 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs") pod "network-metrics-daemon-cd2s2" (UID: "4ffa9102-be6a-431e-b1c8-ee3b5b01e588") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:14:07.741477 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:07.741326 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:14:07.741477 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:07.741443 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:14:07.741477 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:07.741458 2572 projected.go:194] Error preparing data for projected volume kube-api-access-zxvhn for pod openshift-network-diagnostics/network-check-target-jmbfd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:14:07.741747 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:07.741506 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/782f4e27-ac2a-4597-87e3-319c3917a087-kube-api-access-zxvhn podName:782f4e27-ac2a-4597-87e3-319c3917a087 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:23.741494259 +0000 UTC m=+34.322476442 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-zxvhn" (UniqueName: "kubernetes.io/projected/782f4e27-ac2a-4597-87e3-319c3917a087-kube-api-access-zxvhn") pod "network-check-target-jmbfd" (UID: "782f4e27-ac2a-4597-87e3-319c3917a087") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:14:08.037696 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:08.037657 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jmbfd"
Apr 16 22:14:08.037874 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:08.037792 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jmbfd" podUID="782f4e27-ac2a-4597-87e3-319c3917a087"
Apr 16 22:14:09.037912 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:09.037893 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cd2s2"
Apr 16 22:14:09.038196 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:09.037991 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cd2s2" podUID="4ffa9102-be6a-431e-b1c8-ee3b5b01e588"
Apr 16 22:14:10.038247 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:10.037967 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jmbfd"
Apr 16 22:14:10.038820 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:10.038320 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jmbfd" podUID="782f4e27-ac2a-4597-87e3-319c3917a087"
Apr 16 22:14:10.207959 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:10.207928 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" event={"ID":"829af633-ea5e-4051-916a-45ddab265148","Type":"ContainerStarted","Data":"74c1a317decd118fe5820921a18c5eab13ba4a82706db95d9c4cbbb13da3fee6"}
Apr 16 22:14:10.208081 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:10.207969 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" event={"ID":"829af633-ea5e-4051-916a-45ddab265148","Type":"ContainerStarted","Data":"383fd30d61d09fe782a985827d197fc8274578911797e3fbd63ace792c11aff8"}
Apr 16 22:14:10.208081 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:10.207980 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" event={"ID":"829af633-ea5e-4051-916a-45ddab265148","Type":"ContainerStarted","Data":"ae0744e700153841060cdcbe2256e808edb421f90ec9e717905faf0a85a59d47"}
Apr 16 22:14:10.208081 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:10.207990 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" event={"ID":"829af633-ea5e-4051-916a-45ddab265148","Type":"ContainerStarted","Data":"f1b35341175268228d491286c73058f91ac3718f8eeb81383f68feda06ef8b24"}
Apr 16 22:14:10.208081 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:10.207997 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" event={"ID":"829af633-ea5e-4051-916a-45ddab265148","Type":"ContainerStarted","Data":"807fa8e748e880760bc2a866286d813162e165933128f17803c1f43368706181"}
Apr 16 22:14:10.208999 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:10.208972 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fcr4l" event={"ID":"03fed82b-78d8-4490-8c58-013c3b157475","Type":"ContainerStarted","Data":"2859d4c262ed288cf61d452d2f1e8ab006e854a9659f91e4f6b290af85355388"}
Apr 16 22:14:10.210036 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:10.210014 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fdq6c" event={"ID":"33fba0e3-6c1e-4bb4-a2e7-db58d15d18a4","Type":"ContainerStarted","Data":"c63c92d023bd74eef752da173886d05bf100f5ae4db53f82fb7e319d7a409a74"}
Apr 16 22:14:10.211090 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:10.211068 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2z49w" event={"ID":"f190b2ca-0861-4d63-b3a6-20531ce22e01","Type":"ContainerStarted","Data":"8ee9a61f0d29f9194dbf5037b435c29893c0be8711e4345475838dfee465dba6"}
Apr 16 22:14:10.212242 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:10.212221 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64" event={"ID":"bcafa867-5e82-4ecb-b67b-a3ca411c6879","Type":"ContainerStarted","Data":"22c2cad9023d8b78debe5c9c8cca1eb80662379db4937364dff01a0a874e0e1e"}
Apr 16 22:14:10.213416 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:10.213397 2572 generic.go:358] "Generic (PLEG): container finished" podID="52f439d6-f3b5-4680-9824-b2e26d67be20" containerID="571cec8bceac100a425ad5d2bcb98d5c1d0a5583507e0c725029746cfe3a8ea3" exitCode=0
Apr 16 22:14:10.213501 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:10.213450 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z8bb" event={"ID":"52f439d6-f3b5-4680-9824-b2e26d67be20","Type":"ContainerDied","Data":"571cec8bceac100a425ad5d2bcb98d5c1d0a5583507e0c725029746cfe3a8ea3"}
Apr 16 22:14:10.214735 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:10.214682 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-rztk7" event={"ID":"428e9197-b90c-43be-a86a-81a7795c4f68","Type":"ContainerStarted","Data":"316c992adb20ff00aaa8394d77b08012032d16f8a002f93916593af711f27b4d"}
Apr 16 22:14:10.215881 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:10.215855 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-4h7zb" event={"ID":"340bef7a-cec6-49ff-9089-98ba19d935b8","Type":"ContainerStarted","Data":"d4acff00aecf7180669b523a6e5a66a8cc9d98a21e396ccb8c69eaaf4f2300f2"}
Apr 16 22:14:10.223498 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:10.223463 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fcr4l" podStartSLOduration=3.536516539 podStartE2EDuration="20.223452447s" podCreationTimestamp="2026-04-16 22:13:50 +0000 UTC" firstStartedPulling="2026-04-16 22:13:52.499636964 +0000 UTC m=+3.080619154" lastFinishedPulling="2026-04-16 22:14:09.186572879 +0000 UTC m=+19.767555062" observedRunningTime="2026-04-16 22:14:10.223409304 +0000 UTC m=+20.804391508" watchObservedRunningTime="2026-04-16 22:14:10.223452447 +0000 UTC m=+20.804434653"
Apr 16 22:14:10.223767 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:10.223741 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-65.ec2.internal" podStartSLOduration=19.223733133 podStartE2EDuration="19.223733133s" podCreationTimestamp="2026-04-16 22:13:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:13:55.194991034 +0000 UTC m=+5.775973241" watchObservedRunningTime="2026-04-16 22:14:10.223733133 +0000 UTC m=+20.804715376"
Apr 16 22:14:10.237283 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:10.237247 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-fdq6c" podStartSLOduration=11.414223738 podStartE2EDuration="20.237240197s" podCreationTimestamp="2026-04-16 22:13:50 +0000 UTC" firstStartedPulling="2026-04-16 22:13:52.493748027 +0000 UTC m=+3.074730211" lastFinishedPulling="2026-04-16 22:14:01.316764486 +0000 UTC m=+11.897746670" observedRunningTime="2026-04-16 22:14:10.236859897 +0000 UTC m=+20.817842103" watchObservedRunningTime="2026-04-16 22:14:10.237240197 +0000 UTC m=+20.818222439"
Apr 16 22:14:10.252744 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:10.252702 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-2z49w" podStartSLOduration=3.556781269 podStartE2EDuration="20.252694621s" podCreationTimestamp="2026-04-16 22:13:50 +0000 UTC" firstStartedPulling="2026-04-16 22:13:52.4922938 +0000 UTC m=+3.073275983" lastFinishedPulling="2026-04-16 22:14:09.188207149 +0000 UTC m=+19.769189335" observedRunningTime="2026-04-16 22:14:10.252378224 +0000 UTC m=+20.833360428" watchObservedRunningTime="2026-04-16 22:14:10.252694621 +0000 UTC m=+20.833676825"
Apr 16 22:14:10.272386 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:10.272342 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-rztk7" podStartSLOduration=3.5494667 podStartE2EDuration="20.272328469s" podCreationTimestamp="2026-04-16 22:13:50 +0000 UTC" firstStartedPulling="2026-04-16 22:13:52.491213546 +0000 UTC m=+3.072195730" lastFinishedPulling="2026-04-16 22:14:09.214075314 +0000 UTC m=+19.795057499" observedRunningTime="2026-04-16 22:14:10.271868086 +0000 UTC m=+20.852850292" watchObservedRunningTime="2026-04-16 22:14:10.272328469 +0000 UTC m=+20.853310674"
Apr 16 22:14:10.293044 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:10.293000 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-4h7zb" podStartSLOduration=3.770466012 podStartE2EDuration="20.292988227s" podCreationTimestamp="2026-04-16 22:13:50 +0000 UTC" firstStartedPulling="2026-04-16 22:13:52.503588045 +0000 UTC m=+3.084570234" lastFinishedPulling="2026-04-16 22:14:09.026110251 +0000 UTC m=+19.607092449" observedRunningTime="2026-04-16 22:14:10.292733291 +0000 UTC m=+20.873715521" watchObservedRunningTime="2026-04-16 22:14:10.292988227 +0000 UTC m=+20.873970431"
Apr 16 22:14:10.710163 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:10.710113 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 22:14:10.978148 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:10.977853 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T22:14:10.710136654Z","UUID":"19eb7172-7a82-40b8-b1c5-9c0e1f270255","Handler":null,"Name":"","Endpoint":""}
Apr 16 22:14:10.979801 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:10.979772 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 22:14:10.979801 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:10.979807 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 22:14:11.037406 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:11.037378 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cd2s2" Apr 16 22:14:11.037599 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:11.037510 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cd2s2" podUID="4ffa9102-be6a-431e-b1c8-ee3b5b01e588" Apr 16 22:14:11.220183 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:11.220112 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64" event={"ID":"bcafa867-5e82-4ecb-b67b-a3ca411c6879","Type":"ContainerStarted","Data":"556b156b0905aa3c4d5b59a94624576066d9273519e759105f7cdf0b0cf4f755"} Apr 16 22:14:11.225766 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:11.225725 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" event={"ID":"829af633-ea5e-4051-916a-45ddab265148","Type":"ContainerStarted","Data":"fa23a3e2a19cb7c2b636404b29679fdd0c00f878a5c0a4c8deca442ba3016cbc"} Apr 16 22:14:11.227767 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:11.227732 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4mbj2" event={"ID":"8eecc640-8a16-4e14-9e02-e8cf75b619f9","Type":"ContainerStarted","Data":"7713ea2665d83d5f4285f1bc38a75231fc450978d82061f7c439ec871ff6f44f"} Apr 16 22:14:11.244337 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:11.244126 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-4mbj2" podStartSLOduration=4.71658926 podStartE2EDuration="21.244102592s" podCreationTimestamp="2026-04-16 22:13:50 +0000 UTC" firstStartedPulling="2026-04-16 22:13:52.498590006 +0000 UTC m=+3.079572189" 
lastFinishedPulling="2026-04-16 22:14:09.026103323 +0000 UTC m=+19.607085521" observedRunningTime="2026-04-16 22:14:11.242945535 +0000 UTC m=+21.823927740" watchObservedRunningTime="2026-04-16 22:14:11.244102592 +0000 UTC m=+21.825084798" Apr 16 22:14:12.037287 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:12.037253 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jmbfd" Apr 16 22:14:12.037472 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:12.037390 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jmbfd" podUID="782f4e27-ac2a-4597-87e3-319c3917a087" Apr 16 22:14:12.231835 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:12.231742 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64" event={"ID":"bcafa867-5e82-4ecb-b67b-a3ca411c6879","Type":"ContainerStarted","Data":"0bb1de81cec8a85e00b7d19ad2b855511fe231ce828113ca47d214f924ea4568"} Apr 16 22:14:12.256239 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:12.256193 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kdw64" podStartSLOduration=2.878737785 podStartE2EDuration="22.256173108s" podCreationTimestamp="2026-04-16 22:13:50 +0000 UTC" firstStartedPulling="2026-04-16 22:13:52.500919168 +0000 UTC m=+3.081901362" lastFinishedPulling="2026-04-16 22:14:11.878354502 +0000 UTC m=+22.459336685" observedRunningTime="2026-04-16 22:14:12.254628606 +0000 UTC m=+22.835610809" watchObservedRunningTime="2026-04-16 22:14:12.256173108 +0000 UTC m=+22.837155313" Apr 16 22:14:13.037822 ip-10-0-140-65 
kubenswrapper[2572]: I0416 22:14:13.037794 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cd2s2" Apr 16 22:14:13.037989 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:13.037931 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cd2s2" podUID="4ffa9102-be6a-431e-b1c8-ee3b5b01e588" Apr 16 22:14:13.237379 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:13.237196 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" event={"ID":"829af633-ea5e-4051-916a-45ddab265148","Type":"ContainerStarted","Data":"da12efd088d20495deb7d835a6bda8e807ea5dff04c270a688bbb60418c33d60"} Apr 16 22:14:14.037811 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:14.037775 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jmbfd" Apr 16 22:14:14.038012 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:14.037912 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jmbfd" podUID="782f4e27-ac2a-4597-87e3-319c3917a087" Apr 16 22:14:14.895565 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:14.895513 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-fdq6c" Apr 16 22:14:14.896310 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:14.896131 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-fdq6c" Apr 16 22:14:15.037635 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:15.037609 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cd2s2" Apr 16 22:14:15.037788 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:15.037720 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cd2s2" podUID="4ffa9102-be6a-431e-b1c8-ee3b5b01e588" Apr 16 22:14:15.244229 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:15.244189 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" event={"ID":"829af633-ea5e-4051-916a-45ddab265148","Type":"ContainerStarted","Data":"06410156ae3d9e3b6355919763f5a0d26bddb8df2e463723ae0e57414521d86b"} Apr 16 22:14:15.244427 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:15.244407 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:14:15.244573 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:15.244521 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:14:15.245752 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:15.245730 2572 generic.go:358] "Generic (PLEG): container finished" podID="52f439d6-f3b5-4680-9824-b2e26d67be20" containerID="a9c3a186a93915ad4310ac70bf1eaca84cb2da27142a03ec006a720befcf53ab" exitCode=0 Apr 16 22:14:15.245854 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:15.245807 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z8bb" event={"ID":"52f439d6-f3b5-4680-9824-b2e26d67be20","Type":"ContainerDied","Data":"a9c3a186a93915ad4310ac70bf1eaca84cb2da27142a03ec006a720befcf53ab"} Apr 16 22:14:15.259923 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:15.259904 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:14:15.271233 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:15.271198 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" podStartSLOduration=8.523156411 podStartE2EDuration="25.271187002s" podCreationTimestamp="2026-04-16 22:13:50 +0000 
UTC" firstStartedPulling="2026-04-16 22:13:52.502225203 +0000 UTC m=+3.083207387" lastFinishedPulling="2026-04-16 22:14:09.250255795 +0000 UTC m=+19.831237978" observedRunningTime="2026-04-16 22:14:15.270156082 +0000 UTC m=+25.851138287" watchObservedRunningTime="2026-04-16 22:14:15.271187002 +0000 UTC m=+25.852169207" Apr 16 22:14:16.037510 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:16.037479 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jmbfd" Apr 16 22:14:16.037886 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:16.037601 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jmbfd" podUID="782f4e27-ac2a-4597-87e3-319c3917a087" Apr 16 22:14:16.248214 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:16.248183 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:14:16.261147 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:16.261124 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qntcx" Apr 16 22:14:16.671999 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:16.671919 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jmbfd"] Apr 16 22:14:16.672148 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:16.672062 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jmbfd" Apr 16 22:14:16.672240 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:16.672171 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jmbfd" podUID="782f4e27-ac2a-4597-87e3-319c3917a087" Apr 16 22:14:16.673366 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:16.673342 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cd2s2"] Apr 16 22:14:16.673481 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:16.673436 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cd2s2" Apr 16 22:14:16.673596 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:16.673573 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cd2s2" podUID="4ffa9102-be6a-431e-b1c8-ee3b5b01e588" Apr 16 22:14:17.251066 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:17.251033 2572 generic.go:358] "Generic (PLEG): container finished" podID="52f439d6-f3b5-4680-9824-b2e26d67be20" containerID="9095121017d97a7d5c18dc3385b531e97a50aba022977b5b42fa9fd4c63964bf" exitCode=0 Apr 16 22:14:17.251435 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:17.251120 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z8bb" event={"ID":"52f439d6-f3b5-4680-9824-b2e26d67be20","Type":"ContainerDied","Data":"9095121017d97a7d5c18dc3385b531e97a50aba022977b5b42fa9fd4c63964bf"} Apr 16 22:14:18.037423 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:18.037390 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jmbfd" Apr 16 22:14:18.037681 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:18.037493 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jmbfd" podUID="782f4e27-ac2a-4597-87e3-319c3917a087" Apr 16 22:14:18.062243 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:18.062214 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-fdq6c" Apr 16 22:14:18.062372 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:18.062352 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 22:14:18.062883 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:18.062855 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-fdq6c" Apr 16 22:14:19.037184 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:19.037156 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cd2s2" Apr 16 22:14:19.037628 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:19.037280 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cd2s2" podUID="4ffa9102-be6a-431e-b1c8-ee3b5b01e588" Apr 16 22:14:19.257915 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:19.257789 2572 generic.go:358] "Generic (PLEG): container finished" podID="52f439d6-f3b5-4680-9824-b2e26d67be20" containerID="ceea271dc5ed8653d9da9e57ebaf53f2fb85fd8bf39fa2ec50d372a439e512ab" exitCode=0 Apr 16 22:14:19.257915 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:19.257832 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z8bb" event={"ID":"52f439d6-f3b5-4680-9824-b2e26d67be20","Type":"ContainerDied","Data":"ceea271dc5ed8653d9da9e57ebaf53f2fb85fd8bf39fa2ec50d372a439e512ab"} Apr 16 22:14:20.039336 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:20.039297 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jmbfd" Apr 16 22:14:20.039903 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:20.039417 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jmbfd" podUID="782f4e27-ac2a-4597-87e3-319c3917a087" Apr 16 22:14:21.037690 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:21.037656 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cd2s2" Apr 16 22:14:21.037874 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:21.037796 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cd2s2" podUID="4ffa9102-be6a-431e-b1c8-ee3b5b01e588" Apr 16 22:14:22.037651 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.037620 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jmbfd" Apr 16 22:14:22.038091 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:22.037730 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jmbfd" podUID="782f4e27-ac2a-4597-87e3-319c3917a087" Apr 16 22:14:22.223218 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.223137 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-65.ec2.internal" event="NodeReady" Apr 16 22:14:22.223387 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.223301 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 22:14:22.277193 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.277166 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-x4vpp"] Apr 16 22:14:22.294153 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.294121 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-g8j9n"] Apr 16 22:14:22.294327 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.294312 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-x4vpp" Apr 16 22:14:22.298969 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.298940 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 22:14:22.299115 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.298989 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 22:14:22.299641 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.299616 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fk2j6\"" Apr 16 22:14:22.320659 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.320638 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x4vpp"] Apr 16 22:14:22.320659 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.320662 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-g8j9n"] Apr 16 22:14:22.320878 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.320761 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-g8j9n" Apr 16 22:14:22.323823 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.323801 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-86pmr\"" Apr 16 22:14:22.324773 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.324752 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 22:14:22.325050 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.325025 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 22:14:22.325200 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.325182 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 22:14:22.453013 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.452965 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls\") pod \"dns-default-x4vpp\" (UID: \"6f971928-4414-4b05-9352-534e9045942e\") " pod="openshift-dns/dns-default-x4vpp" Apr 16 22:14:22.453013 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.453012 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hv7p\" (UniqueName: \"kubernetes.io/projected/6f971928-4414-4b05-9352-534e9045942e-kube-api-access-6hv7p\") pod \"dns-default-x4vpp\" (UID: \"6f971928-4414-4b05-9352-534e9045942e\") " pod="openshift-dns/dns-default-x4vpp" Apr 16 22:14:22.453232 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.453072 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxd84\" (UniqueName: 
\"kubernetes.io/projected/f12d17a9-d1af-465c-a23e-81d8f09a5156-kube-api-access-sxd84\") pod \"ingress-canary-g8j9n\" (UID: \"f12d17a9-d1af-465c-a23e-81d8f09a5156\") " pod="openshift-ingress-canary/ingress-canary-g8j9n" Apr 16 22:14:22.453232 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.453160 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f971928-4414-4b05-9352-534e9045942e-config-volume\") pod \"dns-default-x4vpp\" (UID: \"6f971928-4414-4b05-9352-534e9045942e\") " pod="openshift-dns/dns-default-x4vpp" Apr 16 22:14:22.453232 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.453204 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6f971928-4414-4b05-9352-534e9045942e-tmp-dir\") pod \"dns-default-x4vpp\" (UID: \"6f971928-4414-4b05-9352-534e9045942e\") " pod="openshift-dns/dns-default-x4vpp" Apr 16 22:14:22.453232 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.453229 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert\") pod \"ingress-canary-g8j9n\" (UID: \"f12d17a9-d1af-465c-a23e-81d8f09a5156\") " pod="openshift-ingress-canary/ingress-canary-g8j9n" Apr 16 22:14:22.553821 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.553781 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f971928-4414-4b05-9352-534e9045942e-config-volume\") pod \"dns-default-x4vpp\" (UID: \"6f971928-4414-4b05-9352-534e9045942e\") " pod="openshift-dns/dns-default-x4vpp" Apr 16 22:14:22.553821 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.553826 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/6f971928-4414-4b05-9352-534e9045942e-tmp-dir\") pod \"dns-default-x4vpp\" (UID: \"6f971928-4414-4b05-9352-534e9045942e\") " pod="openshift-dns/dns-default-x4vpp" Apr 16 22:14:22.554072 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.553856 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert\") pod \"ingress-canary-g8j9n\" (UID: \"f12d17a9-d1af-465c-a23e-81d8f09a5156\") " pod="openshift-ingress-canary/ingress-canary-g8j9n" Apr 16 22:14:22.554072 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.553896 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls\") pod \"dns-default-x4vpp\" (UID: \"6f971928-4414-4b05-9352-534e9045942e\") " pod="openshift-dns/dns-default-x4vpp" Apr 16 22:14:22.554072 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.553920 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hv7p\" (UniqueName: \"kubernetes.io/projected/6f971928-4414-4b05-9352-534e9045942e-kube-api-access-6hv7p\") pod \"dns-default-x4vpp\" (UID: \"6f971928-4414-4b05-9352-534e9045942e\") " pod="openshift-dns/dns-default-x4vpp" Apr 16 22:14:22.554072 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:22.554010 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:14:22.554072 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.554061 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sxd84\" (UniqueName: \"kubernetes.io/projected/f12d17a9-d1af-465c-a23e-81d8f09a5156-kube-api-access-sxd84\") pod \"ingress-canary-g8j9n\" (UID: \"f12d17a9-d1af-465c-a23e-81d8f09a5156\") " pod="openshift-ingress-canary/ingress-canary-g8j9n" Apr 16 
22:14:22.554301 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:22.554095 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert podName:f12d17a9-d1af-465c-a23e-81d8f09a5156 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:23.054073556 +0000 UTC m=+33.635055742 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert") pod "ingress-canary-g8j9n" (UID: "f12d17a9-d1af-465c-a23e-81d8f09a5156") : secret "canary-serving-cert" not found
Apr 16 22:14:22.554301 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:22.554014 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:22.554301 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:22.554193 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls podName:6f971928-4414-4b05-9352-534e9045942e nodeName:}" failed. No retries permitted until 2026-04-16 22:14:23.054180067 +0000 UTC m=+33.635162254 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls") pod "dns-default-x4vpp" (UID: "6f971928-4414-4b05-9352-534e9045942e") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:22.554301 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.554245 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6f971928-4414-4b05-9352-534e9045942e-tmp-dir\") pod \"dns-default-x4vpp\" (UID: \"6f971928-4414-4b05-9352-534e9045942e\") " pod="openshift-dns/dns-default-x4vpp"
Apr 16 22:14:22.566347 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.566322 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hv7p\" (UniqueName: \"kubernetes.io/projected/6f971928-4414-4b05-9352-534e9045942e-kube-api-access-6hv7p\") pod \"dns-default-x4vpp\" (UID: \"6f971928-4414-4b05-9352-534e9045942e\") " pod="openshift-dns/dns-default-x4vpp"
Apr 16 22:14:22.566454 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.566422 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxd84\" (UniqueName: \"kubernetes.io/projected/f12d17a9-d1af-465c-a23e-81d8f09a5156-kube-api-access-sxd84\") pod \"ingress-canary-g8j9n\" (UID: \"f12d17a9-d1af-465c-a23e-81d8f09a5156\") " pod="openshift-ingress-canary/ingress-canary-g8j9n"
Apr 16 22:14:22.566790 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:22.566773 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f971928-4414-4b05-9352-534e9045942e-config-volume\") pod \"dns-default-x4vpp\" (UID: \"6f971928-4414-4b05-9352-534e9045942e\") " pod="openshift-dns/dns-default-x4vpp"
Apr 16 22:14:23.037419 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:23.037380 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cd2s2"
Apr 16 22:14:23.040374 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:23.040350 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7dlkb\""
Apr 16 22:14:23.041048 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:23.040390 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 22:14:23.059235 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:23.059203 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert\") pod \"ingress-canary-g8j9n\" (UID: \"f12d17a9-d1af-465c-a23e-81d8f09a5156\") " pod="openshift-ingress-canary/ingress-canary-g8j9n"
Apr 16 22:14:23.059381 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:23.059259 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls\") pod \"dns-default-x4vpp\" (UID: \"6f971928-4414-4b05-9352-534e9045942e\") " pod="openshift-dns/dns-default-x4vpp"
Apr 16 22:14:23.059447 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:23.059371 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:23.059492 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:23.059447 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert podName:f12d17a9-d1af-465c-a23e-81d8f09a5156 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:24.059427103 +0000 UTC m=+34.640409287 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert") pod "ingress-canary-g8j9n" (UID: "f12d17a9-d1af-465c-a23e-81d8f09a5156") : secret "canary-serving-cert" not found
Apr 16 22:14:23.059492 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:23.059457 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:23.059616 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:23.059513 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls podName:6f971928-4414-4b05-9352-534e9045942e nodeName:}" failed. No retries permitted until 2026-04-16 22:14:24.059497486 +0000 UTC m=+34.640479670 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls") pod "dns-default-x4vpp" (UID: "6f971928-4414-4b05-9352-534e9045942e") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:23.764258 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:23.764222 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxvhn\" (UniqueName: \"kubernetes.io/projected/782f4e27-ac2a-4597-87e3-319c3917a087-kube-api-access-zxvhn\") pod \"network-check-target-jmbfd\" (UID: \"782f4e27-ac2a-4597-87e3-319c3917a087\") " pod="openshift-network-diagnostics/network-check-target-jmbfd"
Apr 16 22:14:23.764258 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:23.764264 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs\") pod \"network-metrics-daemon-cd2s2\" (UID: \"4ffa9102-be6a-431e-b1c8-ee3b5b01e588\") " pod="openshift-multus/network-metrics-daemon-cd2s2"
Apr 16 22:14:23.764525 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:23.764400 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 22:14:23.764525 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:23.764409 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:14:23.764525 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:23.764432 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:14:23.764525 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:23.764441 2572 projected.go:194] Error preparing data for projected volume kube-api-access-zxvhn for pod openshift-network-diagnostics/network-check-target-jmbfd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:14:23.764525 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:23.764457 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs podName:4ffa9102-be6a-431e-b1c8-ee3b5b01e588 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:55.764442918 +0000 UTC m=+66.345425101 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs") pod "network-metrics-daemon-cd2s2" (UID: "4ffa9102-be6a-431e-b1c8-ee3b5b01e588") : secret "metrics-daemon-secret" not found
Apr 16 22:14:23.764525 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:23.764487 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/782f4e27-ac2a-4597-87e3-319c3917a087-kube-api-access-zxvhn podName:782f4e27-ac2a-4597-87e3-319c3917a087 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:55.764470053 +0000 UTC m=+66.345452239 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-zxvhn" (UniqueName: "kubernetes.io/projected/782f4e27-ac2a-4597-87e3-319c3917a087-kube-api-access-zxvhn") pod "network-check-target-jmbfd" (UID: "782f4e27-ac2a-4597-87e3-319c3917a087") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:14:24.037742 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:24.037650 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jmbfd"
Apr 16 22:14:24.040671 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:24.040542 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-2r56c\""
Apr 16 22:14:24.040671 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:24.040612 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 22:14:24.041412 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:24.041392 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 22:14:24.067334 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:24.067299 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert\") pod \"ingress-canary-g8j9n\" (UID: \"f12d17a9-d1af-465c-a23e-81d8f09a5156\") " pod="openshift-ingress-canary/ingress-canary-g8j9n"
Apr 16 22:14:24.067334 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:24.067348 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls\") pod \"dns-default-x4vpp\" (UID: \"6f971928-4414-4b05-9352-534e9045942e\") " pod="openshift-dns/dns-default-x4vpp"
Apr 16 22:14:24.067591 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:24.067469 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:24.067591 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:24.067489 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:24.067591 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:24.067569 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert podName:f12d17a9-d1af-465c-a23e-81d8f09a5156 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:26.067533159 +0000 UTC m=+36.648515345 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert") pod "ingress-canary-g8j9n" (UID: "f12d17a9-d1af-465c-a23e-81d8f09a5156") : secret "canary-serving-cert" not found
Apr 16 22:14:24.067591 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:24.067591 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls podName:6f971928-4414-4b05-9352-534e9045942e nodeName:}" failed. No retries permitted until 2026-04-16 22:14:26.0675803 +0000 UTC m=+36.648562490 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls") pod "dns-default-x4vpp" (UID: "6f971928-4414-4b05-9352-534e9045942e") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:25.273488 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:25.273218 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z8bb" event={"ID":"52f439d6-f3b5-4680-9824-b2e26d67be20","Type":"ContainerStarted","Data":"b84cfbd566f2c085bf9e2b6b3c9cb399ec60fba691fb890b12f0f12f309a5188"}
Apr 16 22:14:26.081750 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:26.081710 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert\") pod \"ingress-canary-g8j9n\" (UID: \"f12d17a9-d1af-465c-a23e-81d8f09a5156\") " pod="openshift-ingress-canary/ingress-canary-g8j9n"
Apr 16 22:14:26.081914 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:26.081771 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls\") pod \"dns-default-x4vpp\" (UID: \"6f971928-4414-4b05-9352-534e9045942e\") " pod="openshift-dns/dns-default-x4vpp"
Apr 16 22:14:26.081914 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:26.081864 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:26.081914 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:26.081873 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:26.082011 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:26.081924 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert podName:f12d17a9-d1af-465c-a23e-81d8f09a5156 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:30.081907112 +0000 UTC m=+40.662889300 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert") pod "ingress-canary-g8j9n" (UID: "f12d17a9-d1af-465c-a23e-81d8f09a5156") : secret "canary-serving-cert" not found
Apr 16 22:14:26.082011 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:26.081938 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls podName:6f971928-4414-4b05-9352-534e9045942e nodeName:}" failed. No retries permitted until 2026-04-16 22:14:30.081932605 +0000 UTC m=+40.662914787 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls") pod "dns-default-x4vpp" (UID: "6f971928-4414-4b05-9352-534e9045942e") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:26.277119 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:26.277087 2572 generic.go:358] "Generic (PLEG): container finished" podID="52f439d6-f3b5-4680-9824-b2e26d67be20" containerID="b84cfbd566f2c085bf9e2b6b3c9cb399ec60fba691fb890b12f0f12f309a5188" exitCode=0
Apr 16 22:14:26.277451 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:26.277135 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z8bb" event={"ID":"52f439d6-f3b5-4680-9824-b2e26d67be20","Type":"ContainerDied","Data":"b84cfbd566f2c085bf9e2b6b3c9cb399ec60fba691fb890b12f0f12f309a5188"}
Apr 16 22:14:27.284209 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:27.284178 2572 generic.go:358] "Generic (PLEG): container finished" podID="52f439d6-f3b5-4680-9824-b2e26d67be20" containerID="63d1259787938d82c8ad7c1920ce73bc2b685b1883dde1ed6193fa02fc1bf46f" exitCode=0
Apr 16 22:14:27.284626 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:27.284240 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z8bb" event={"ID":"52f439d6-f3b5-4680-9824-b2e26d67be20","Type":"ContainerDied","Data":"63d1259787938d82c8ad7c1920ce73bc2b685b1883dde1ed6193fa02fc1bf46f"}
Apr 16 22:14:27.986136 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:27.986105 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-84ffc658ff-bz9xq"]
Apr 16 22:14:27.988030 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:27.988016 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84ffc658ff-bz9xq"
Apr 16 22:14:27.990786 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:27.990763 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 16 22:14:27.990983 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:27.990969 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 16 22:14:27.991037 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:27.991010 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 16 22:14:27.991127 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:27.991107 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 16 22:14:28.000037 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:28.000015 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-84ffc658ff-bz9xq"]
Apr 16 22:14:28.100172 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:28.100136 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3ba2875b-6c1b-4079-bff7-93fdd5a00802-tmp\") pod \"klusterlet-addon-workmgr-84ffc658ff-bz9xq\" (UID: \"3ba2875b-6c1b-4079-bff7-93fdd5a00802\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84ffc658ff-bz9xq"
Apr 16 22:14:28.100172 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:28.100180 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6wpq\" (UniqueName: \"kubernetes.io/projected/3ba2875b-6c1b-4079-bff7-93fdd5a00802-kube-api-access-g6wpq\") pod \"klusterlet-addon-workmgr-84ffc658ff-bz9xq\" (UID: \"3ba2875b-6c1b-4079-bff7-93fdd5a00802\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84ffc658ff-bz9xq"
Apr 16 22:14:28.100389 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:28.100287 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3ba2875b-6c1b-4079-bff7-93fdd5a00802-klusterlet-config\") pod \"klusterlet-addon-workmgr-84ffc658ff-bz9xq\" (UID: \"3ba2875b-6c1b-4079-bff7-93fdd5a00802\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84ffc658ff-bz9xq"
Apr 16 22:14:28.201565 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:28.201522 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3ba2875b-6c1b-4079-bff7-93fdd5a00802-klusterlet-config\") pod \"klusterlet-addon-workmgr-84ffc658ff-bz9xq\" (UID: \"3ba2875b-6c1b-4079-bff7-93fdd5a00802\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84ffc658ff-bz9xq"
Apr 16 22:14:28.201742 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:28.201627 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3ba2875b-6c1b-4079-bff7-93fdd5a00802-tmp\") pod \"klusterlet-addon-workmgr-84ffc658ff-bz9xq\" (UID: \"3ba2875b-6c1b-4079-bff7-93fdd5a00802\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84ffc658ff-bz9xq"
Apr 16 22:14:28.201742 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:28.201664 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g6wpq\" (UniqueName: \"kubernetes.io/projected/3ba2875b-6c1b-4079-bff7-93fdd5a00802-kube-api-access-g6wpq\") pod \"klusterlet-addon-workmgr-84ffc658ff-bz9xq\" (UID: \"3ba2875b-6c1b-4079-bff7-93fdd5a00802\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84ffc658ff-bz9xq"
Apr 16 22:14:28.202044 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:28.202022 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3ba2875b-6c1b-4079-bff7-93fdd5a00802-tmp\") pod \"klusterlet-addon-workmgr-84ffc658ff-bz9xq\" (UID: \"3ba2875b-6c1b-4079-bff7-93fdd5a00802\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84ffc658ff-bz9xq"
Apr 16 22:14:28.204780 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:28.204762 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3ba2875b-6c1b-4079-bff7-93fdd5a00802-klusterlet-config\") pod \"klusterlet-addon-workmgr-84ffc658ff-bz9xq\" (UID: \"3ba2875b-6c1b-4079-bff7-93fdd5a00802\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84ffc658ff-bz9xq"
Apr 16 22:14:28.209615 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:28.209590 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6wpq\" (UniqueName: \"kubernetes.io/projected/3ba2875b-6c1b-4079-bff7-93fdd5a00802-kube-api-access-g6wpq\") pod \"klusterlet-addon-workmgr-84ffc658ff-bz9xq\" (UID: \"3ba2875b-6c1b-4079-bff7-93fdd5a00802\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84ffc658ff-bz9xq"
Apr 16 22:14:28.288928 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:28.288896 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6z8bb" event={"ID":"52f439d6-f3b5-4680-9824-b2e26d67be20","Type":"ContainerStarted","Data":"6fd376b78883767fc4952c10d172080a6fa4890377ad021051397f975752336f"}
Apr 16 22:14:28.297412 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:28.297395 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84ffc658ff-bz9xq"
Apr 16 22:14:28.308922 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:28.308345 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6z8bb" podStartSLOduration=5.707608278 podStartE2EDuration="38.308330788s" podCreationTimestamp="2026-04-16 22:13:50 +0000 UTC" firstStartedPulling="2026-04-16 22:13:52.495265259 +0000 UTC m=+3.076247445" lastFinishedPulling="2026-04-16 22:14:25.095987772 +0000 UTC m=+35.676969955" observedRunningTime="2026-04-16 22:14:28.308135162 +0000 UTC m=+38.889117367" watchObservedRunningTime="2026-04-16 22:14:28.308330788 +0000 UTC m=+38.889312996"
Apr 16 22:14:28.420336 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:28.420309 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-84ffc658ff-bz9xq"]
Apr 16 22:14:28.424417 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:14:28.424384 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ba2875b_6c1b_4079_bff7_93fdd5a00802.slice/crio-28bb65aba0d80ba71cc3bf5798892d4978ab4bbcadd829f8ce778491daaf13dc WatchSource:0}: Error finding container 28bb65aba0d80ba71cc3bf5798892d4978ab4bbcadd829f8ce778491daaf13dc: Status 404 returned error can't find the container with id 28bb65aba0d80ba71cc3bf5798892d4978ab4bbcadd829f8ce778491daaf13dc
Apr 16 22:14:29.292632 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:29.292590 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84ffc658ff-bz9xq" event={"ID":"3ba2875b-6c1b-4079-bff7-93fdd5a00802","Type":"ContainerStarted","Data":"28bb65aba0d80ba71cc3bf5798892d4978ab4bbcadd829f8ce778491daaf13dc"}
Apr 16 22:14:30.115715 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:30.115675 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls\") pod \"dns-default-x4vpp\" (UID: \"6f971928-4414-4b05-9352-534e9045942e\") " pod="openshift-dns/dns-default-x4vpp"
Apr 16 22:14:30.115893 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:30.115779 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert\") pod \"ingress-canary-g8j9n\" (UID: \"f12d17a9-d1af-465c-a23e-81d8f09a5156\") " pod="openshift-ingress-canary/ingress-canary-g8j9n"
Apr 16 22:14:30.115969 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:30.115893 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:30.115969 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:30.115951 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert podName:f12d17a9-d1af-465c-a23e-81d8f09a5156 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:38.115937474 +0000 UTC m=+48.696919657 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert") pod "ingress-canary-g8j9n" (UID: "f12d17a9-d1af-465c-a23e-81d8f09a5156") : secret "canary-serving-cert" not found
Apr 16 22:14:30.116207 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:30.116174 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:30.116320 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:30.116241 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls podName:6f971928-4414-4b05-9352-534e9045942e nodeName:}" failed. No retries permitted until 2026-04-16 22:14:38.116224018 +0000 UTC m=+48.697206216 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls") pod "dns-default-x4vpp" (UID: "6f971928-4414-4b05-9352-534e9045942e") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:32.299676 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:32.299647 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84ffc658ff-bz9xq" event={"ID":"3ba2875b-6c1b-4079-bff7-93fdd5a00802","Type":"ContainerStarted","Data":"666703ff874986678676b1ffdf36d992022dba9d6d79ad17eaaf26642bcab445"}
Apr 16 22:14:32.300044 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:32.299948 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84ffc658ff-bz9xq"
Apr 16 22:14:32.301473 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:32.301450 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84ffc658ff-bz9xq"
Apr 16 22:14:32.313704 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:32.313666 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84ffc658ff-bz9xq" podStartSLOduration=1.886300779 podStartE2EDuration="5.313655138s" podCreationTimestamp="2026-04-16 22:14:27 +0000 UTC" firstStartedPulling="2026-04-16 22:14:28.426208379 +0000 UTC m=+39.007190573" lastFinishedPulling="2026-04-16 22:14:31.853562748 +0000 UTC m=+42.434544932" observedRunningTime="2026-04-16 22:14:32.313103389 +0000 UTC m=+42.894085596" watchObservedRunningTime="2026-04-16 22:14:32.313655138 +0000 UTC m=+42.894637342"
Apr 16 22:14:38.171183 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:38.171111 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls\") pod \"dns-default-x4vpp\" (UID: \"6f971928-4414-4b05-9352-534e9045942e\") " pod="openshift-dns/dns-default-x4vpp"
Apr 16 22:14:38.171709 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:38.171232 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert\") pod \"ingress-canary-g8j9n\" (UID: \"f12d17a9-d1af-465c-a23e-81d8f09a5156\") " pod="openshift-ingress-canary/ingress-canary-g8j9n"
Apr 16 22:14:38.171709 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:38.171271 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:38.171709 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:38.171331 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:38.171709 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:38.171340 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls podName:6f971928-4414-4b05-9352-534e9045942e nodeName:}" failed. No retries permitted until 2026-04-16 22:14:54.171323818 +0000 UTC m=+64.752306001 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls") pod "dns-default-x4vpp" (UID: "6f971928-4414-4b05-9352-534e9045942e") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:38.171709 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:38.171384 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert podName:f12d17a9-d1af-465c-a23e-81d8f09a5156 nodeName:}" failed. No retries permitted until 2026-04-16 22:14:54.171368242 +0000 UTC m=+64.752350424 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert") pod "ingress-canary-g8j9n" (UID: "f12d17a9-d1af-465c-a23e-81d8f09a5156") : secret "canary-serving-cert" not found
Apr 16 22:14:48.268564 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:48.268522 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qntcx"
Apr 16 22:14:54.185229 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:54.185192 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert\") pod \"ingress-canary-g8j9n\" (UID: \"f12d17a9-d1af-465c-a23e-81d8f09a5156\") " pod="openshift-ingress-canary/ingress-canary-g8j9n"
Apr 16 22:14:54.185650 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:54.185244 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls\") pod \"dns-default-x4vpp\" (UID: \"6f971928-4414-4b05-9352-534e9045942e\") " pod="openshift-dns/dns-default-x4vpp"
Apr 16 22:14:54.185650 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:54.185346 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 22:14:54.185650 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:54.185403 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls podName:6f971928-4414-4b05-9352-534e9045942e nodeName:}" failed. No retries permitted until 2026-04-16 22:15:26.185388498 +0000 UTC m=+96.766370681 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls") pod "dns-default-x4vpp" (UID: "6f971928-4414-4b05-9352-534e9045942e") : secret "dns-default-metrics-tls" not found
Apr 16 22:14:54.185650 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:54.185346 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 22:14:54.185650 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:54.185465 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert podName:f12d17a9-d1af-465c-a23e-81d8f09a5156 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:26.185453969 +0000 UTC m=+96.766436156 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert") pod "ingress-canary-g8j9n" (UID: "f12d17a9-d1af-465c-a23e-81d8f09a5156") : secret "canary-serving-cert" not found
Apr 16 22:14:55.795480 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:55.795437 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zxvhn\" (UniqueName: \"kubernetes.io/projected/782f4e27-ac2a-4597-87e3-319c3917a087-kube-api-access-zxvhn\") pod \"network-check-target-jmbfd\" (UID: \"782f4e27-ac2a-4597-87e3-319c3917a087\") " pod="openshift-network-diagnostics/network-check-target-jmbfd"
Apr 16 22:14:55.795480 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:55.795485 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs\") pod \"network-metrics-daemon-cd2s2\" (UID: \"4ffa9102-be6a-431e-b1c8-ee3b5b01e588\") " pod="openshift-multus/network-metrics-daemon-cd2s2"
Apr 16 22:14:55.795973 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:55.795648 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 22:14:55.795973 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:14:55.795716 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs podName:4ffa9102-be6a-431e-b1c8-ee3b5b01e588 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:59.795701235 +0000 UTC m=+130.376683418 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs") pod "network-metrics-daemon-cd2s2" (UID: "4ffa9102-be6a-431e-b1c8-ee3b5b01e588") : secret "metrics-daemon-secret" not found
Apr 16 22:14:55.798295 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:55.798276 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 22:14:55.808218 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:55.808197 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 22:14:55.818649 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:55.818625 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxvhn\" (UniqueName: \"kubernetes.io/projected/782f4e27-ac2a-4597-87e3-319c3917a087-kube-api-access-zxvhn\") pod \"network-check-target-jmbfd\" (UID: \"782f4e27-ac2a-4597-87e3-319c3917a087\") " pod="openshift-network-diagnostics/network-check-target-jmbfd"
Apr 16 22:14:55.850244 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:55.850217 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-2r56c\""
Apr 16 22:14:55.858101 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:55.858084 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jmbfd"
Apr 16 22:14:55.979434 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:55.979400 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jmbfd"]
Apr 16 22:14:55.982739 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:14:55.982706 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod782f4e27_ac2a_4597_87e3_319c3917a087.slice/crio-be0baa8af80b7d8d81aef2a694b0f845d6b2733741d2b58a57c93b53535a5a4c WatchSource:0}: Error finding container be0baa8af80b7d8d81aef2a694b0f845d6b2733741d2b58a57c93b53535a5a4c: Status 404 returned error can't find the container with id be0baa8af80b7d8d81aef2a694b0f845d6b2733741d2b58a57c93b53535a5a4c
Apr 16 22:14:56.345341 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:56.345295 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jmbfd" event={"ID":"782f4e27-ac2a-4597-87e3-319c3917a087","Type":"ContainerStarted","Data":"be0baa8af80b7d8d81aef2a694b0f845d6b2733741d2b58a57c93b53535a5a4c"}
Apr 16 22:14:59.352929 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:59.352892 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jmbfd" event={"ID":"782f4e27-ac2a-4597-87e3-319c3917a087","Type":"ContainerStarted","Data":"ed47224711afc48296f5fe475ee1164c811a1a26fd4246967c31890a42b7d80a"}
Apr 16 22:14:59.353303 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:59.353018 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-jmbfd"
Apr 16 22:14:59.373101 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:14:59.370949 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-jmbfd"
podStartSLOduration=66.907867216 podStartE2EDuration="1m9.370925178s" podCreationTimestamp="2026-04-16 22:13:50 +0000 UTC" firstStartedPulling="2026-04-16 22:14:55.984632824 +0000 UTC m=+66.565615010" lastFinishedPulling="2026-04-16 22:14:58.447690788 +0000 UTC m=+69.028672972" observedRunningTime="2026-04-16 22:14:59.369688696 +0000 UTC m=+69.950670912" watchObservedRunningTime="2026-04-16 22:14:59.370925178 +0000 UTC m=+69.951907383" Apr 16 22:15:26.210507 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:26.210379 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls\") pod \"dns-default-x4vpp\" (UID: \"6f971928-4414-4b05-9352-534e9045942e\") " pod="openshift-dns/dns-default-x4vpp" Apr 16 22:15:26.210507 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:26.210437 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert\") pod \"ingress-canary-g8j9n\" (UID: \"f12d17a9-d1af-465c-a23e-81d8f09a5156\") " pod="openshift-ingress-canary/ingress-canary-g8j9n" Apr 16 22:15:26.211005 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:26.210520 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:15:26.211005 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:26.210526 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:15:26.211005 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:26.210600 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert podName:f12d17a9-d1af-465c-a23e-81d8f09a5156 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:30.210585673 +0000 UTC m=+160.791567857 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert") pod "ingress-canary-g8j9n" (UID: "f12d17a9-d1af-465c-a23e-81d8f09a5156") : secret "canary-serving-cert" not found Apr 16 22:15:26.211005 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:26.210615 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls podName:6f971928-4414-4b05-9352-534e9045942e nodeName:}" failed. No retries permitted until 2026-04-16 22:16:30.210609239 +0000 UTC m=+160.791591422 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls") pod "dns-default-x4vpp" (UID: "6f971928-4414-4b05-9352-534e9045942e") : secret "dns-default-metrics-tls" not found Apr 16 22:15:30.358213 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:30.358184 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-jmbfd" Apr 16 22:15:54.668899 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.668861 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vnpn9"] Apr 16 22:15:54.671833 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.671629 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vnpn9" Apr 16 22:15:54.674088 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.674065 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:15:54.674249 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.674185 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-mvw5b\"" Apr 16 22:15:54.674323 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.674255 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 22:15:54.683244 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.683222 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vnpn9"] Apr 16 22:15:54.781053 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.781022 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-nf8x4"] Apr 16 22:15:54.783719 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.783688 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-86cf8799dd-t7q4v"] Apr 16 22:15:54.783856 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.783838 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-nf8x4" Apr 16 22:15:54.786469 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.786448 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-54f7d4b79c-8d9n5"] Apr 16 22:15:54.786632 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.786614 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-86cf8799dd-t7q4v" Apr 16 22:15:54.786863 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.786798 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 22:15:54.787461 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.787195 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 22:15:54.787461 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.787200 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 22:15:54.787461 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.787205 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-ncb9n\"" Apr 16 22:15:54.787461 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.787218 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 22:15:54.789450 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.789431 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:54.790114 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.790094 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 22:15:54.790210 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.790186 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 22:15:54.790299 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.790280 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 22:15:54.790599 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.790537 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-xbzv6\"" Apr 16 22:15:54.790700 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.790584 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 22:15:54.790791 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.790772 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 22:15:54.790960 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.790922 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 22:15:54.794465 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.794078 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 22:15:54.794589 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.794486 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 
22:15:54.795642 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.795615 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 22:15:54.795973 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.795955 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-48hdh\"" Apr 16 22:15:54.799193 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.799139 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 22:15:54.800294 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.800274 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-nf8x4"] Apr 16 22:15:54.803303 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.803279 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-86cf8799dd-t7q4v"] Apr 16 22:15:54.803748 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.803720 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25rzf\" (UniqueName: \"kubernetes.io/projected/664ba6c7-b752-4fc9-b6c6-50847231a8a0-kube-api-access-25rzf\") pod \"volume-data-source-validator-7c6cbb6c87-vnpn9\" (UID: \"664ba6c7-b752-4fc9-b6c6-50847231a8a0\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vnpn9" Apr 16 22:15:54.804422 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.804406 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 22:15:54.805864 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.805838 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-54f7d4b79c-8d9n5"] Apr 16 22:15:54.875521 ip-10-0-140-65 
kubenswrapper[2572]: I0416 22:15:54.875492 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6zgrq"] Apr 16 22:15:54.878377 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.878362 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6zgrq" Apr 16 22:15:54.880632 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.880610 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 22:15:54.880857 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.880843 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:15:54.880929 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.880904 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-4wbcd\"" Apr 16 22:15:54.880979 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.880904 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 22:15:54.881217 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.881199 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 22:15:54.886828 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.886809 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6zgrq"] Apr 16 22:15:54.904545 ip-10-0-140-65 kubenswrapper[2572]: I0416 
22:15:54.904523 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25rzf\" (UniqueName: \"kubernetes.io/projected/664ba6c7-b752-4fc9-b6c6-50847231a8a0-kube-api-access-25rzf\") pod \"volume-data-source-validator-7c6cbb6c87-vnpn9\" (UID: \"664ba6c7-b752-4fc9-b6c6-50847231a8a0\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vnpn9" Apr 16 22:15:54.904667 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.904570 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-bound-sa-token\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:54.904667 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.904595 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9277225-df0b-4b17-9847-c37f05b00c9d-serving-cert\") pod \"insights-operator-585dfdc468-nf8x4\" (UID: \"e9277225-df0b-4b17-9847-c37f05b00c9d\") " pod="openshift-insights/insights-operator-585dfdc468-nf8x4" Apr 16 22:15:54.904667 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.904626 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-image-registry-private-configuration\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:54.904781 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.904682 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/e9277225-df0b-4b17-9847-c37f05b00c9d-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-nf8x4\" (UID: \"e9277225-df0b-4b17-9847-c37f05b00c9d\") " pod="openshift-insights/insights-operator-585dfdc468-nf8x4" Apr 16 22:15:54.904781 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.904740 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9277225-df0b-4b17-9847-c37f05b00c9d-service-ca-bundle\") pod \"insights-operator-585dfdc468-nf8x4\" (UID: \"e9277225-df0b-4b17-9847-c37f05b00c9d\") " pod="openshift-insights/insights-operator-585dfdc468-nf8x4" Apr 16 22:15:54.904781 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.904778 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9cv9\" (UniqueName: \"kubernetes.io/projected/b1c23d84-45f9-42eb-9087-7a389e8722e3-kube-api-access-w9cv9\") pod \"router-default-86cf8799dd-t7q4v\" (UID: \"b1c23d84-45f9-42eb-9087-7a389e8722e3\") " pod="openshift-ingress/router-default-86cf8799dd-t7q4v" Apr 16 22:15:54.904892 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.904795 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-tls\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:54.904892 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.904823 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-certificates\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " 
pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:54.904892 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.904839 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e9277225-df0b-4b17-9847-c37f05b00c9d-snapshots\") pod \"insights-operator-585dfdc468-nf8x4\" (UID: \"e9277225-df0b-4b17-9847-c37f05b00c9d\") " pod="openshift-insights/insights-operator-585dfdc468-nf8x4" Apr 16 22:15:54.904892 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.904856 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9jlg\" (UniqueName: \"kubernetes.io/projected/e9277225-df0b-4b17-9847-c37f05b00c9d-kube-api-access-g9jlg\") pod \"insights-operator-585dfdc468-nf8x4\" (UID: \"e9277225-df0b-4b17-9847-c37f05b00c9d\") " pod="openshift-insights/insights-operator-585dfdc468-nf8x4" Apr 16 22:15:54.904892 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.904884 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-ca-trust-extracted\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:54.905125 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.904932 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-trusted-ca\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:54.905125 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.904984 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gmv9\" (UniqueName: \"kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-kube-api-access-9gmv9\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:54.905125 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.905095 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-installation-pull-secrets\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:54.905125 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.905122 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9277225-df0b-4b17-9847-c37f05b00c9d-tmp\") pod \"insights-operator-585dfdc468-nf8x4\" (UID: \"e9277225-df0b-4b17-9847-c37f05b00c9d\") " pod="openshift-insights/insights-operator-585dfdc468-nf8x4" Apr 16 22:15:54.905308 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.905149 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1c23d84-45f9-42eb-9087-7a389e8722e3-service-ca-bundle\") pod \"router-default-86cf8799dd-t7q4v\" (UID: \"b1c23d84-45f9-42eb-9087-7a389e8722e3\") " pod="openshift-ingress/router-default-86cf8799dd-t7q4v" Apr 16 22:15:54.905308 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.905174 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b1c23d84-45f9-42eb-9087-7a389e8722e3-stats-auth\") pod 
\"router-default-86cf8799dd-t7q4v\" (UID: \"b1c23d84-45f9-42eb-9087-7a389e8722e3\") " pod="openshift-ingress/router-default-86cf8799dd-t7q4v" Apr 16 22:15:54.905308 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.905206 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b1c23d84-45f9-42eb-9087-7a389e8722e3-default-certificate\") pod \"router-default-86cf8799dd-t7q4v\" (UID: \"b1c23d84-45f9-42eb-9087-7a389e8722e3\") " pod="openshift-ingress/router-default-86cf8799dd-t7q4v" Apr 16 22:15:54.905308 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.905230 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1c23d84-45f9-42eb-9087-7a389e8722e3-metrics-certs\") pod \"router-default-86cf8799dd-t7q4v\" (UID: \"b1c23d84-45f9-42eb-9087-7a389e8722e3\") " pod="openshift-ingress/router-default-86cf8799dd-t7q4v" Apr 16 22:15:54.912382 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.912354 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-25rzf\" (UniqueName: \"kubernetes.io/projected/664ba6c7-b752-4fc9-b6c6-50847231a8a0-kube-api-access-25rzf\") pod \"volume-data-source-validator-7c6cbb6c87-vnpn9\" (UID: \"664ba6c7-b752-4fc9-b6c6-50847231a8a0\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vnpn9" Apr 16 22:15:54.980207 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:54.980124 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vnpn9" Apr 16 22:15:55.005748 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.005712 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b1c23d84-45f9-42eb-9087-7a389e8722e3-stats-auth\") pod \"router-default-86cf8799dd-t7q4v\" (UID: \"b1c23d84-45f9-42eb-9087-7a389e8722e3\") " pod="openshift-ingress/router-default-86cf8799dd-t7q4v" Apr 16 22:15:55.005895 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.005767 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt55s\" (UniqueName: \"kubernetes.io/projected/090695a5-dd56-4979-855d-5b1e3e681e54-kube-api-access-pt55s\") pod \"kube-storage-version-migrator-operator-6769c5d45-6zgrq\" (UID: \"090695a5-dd56-4979-855d-5b1e3e681e54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6zgrq" Apr 16 22:15:55.005895 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.005805 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b1c23d84-45f9-42eb-9087-7a389e8722e3-default-certificate\") pod \"router-default-86cf8799dd-t7q4v\" (UID: \"b1c23d84-45f9-42eb-9087-7a389e8722e3\") " pod="openshift-ingress/router-default-86cf8799dd-t7q4v" Apr 16 22:15:55.005895 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.005832 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/090695a5-dd56-4979-855d-5b1e3e681e54-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6zgrq\" (UID: \"090695a5-dd56-4979-855d-5b1e3e681e54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6zgrq" Apr 16 
22:15:55.006045 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.005898 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-bound-sa-token\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:55.006045 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.005937 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9277225-df0b-4b17-9847-c37f05b00c9d-serving-cert\") pod \"insights-operator-585dfdc468-nf8x4\" (UID: \"e9277225-df0b-4b17-9847-c37f05b00c9d\") " pod="openshift-insights/insights-operator-585dfdc468-nf8x4" Apr 16 22:15:55.006045 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.005975 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9277225-df0b-4b17-9847-c37f05b00c9d-service-ca-bundle\") pod \"insights-operator-585dfdc468-nf8x4\" (UID: \"e9277225-df0b-4b17-9847-c37f05b00c9d\") " pod="openshift-insights/insights-operator-585dfdc468-nf8x4" Apr 16 22:15:55.006045 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.005994 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/090695a5-dd56-4979-855d-5b1e3e681e54-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6zgrq\" (UID: \"090695a5-dd56-4979-855d-5b1e3e681e54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6zgrq" Apr 16 22:15:55.006045 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.006022 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-tls\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:55.006285 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.006047 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9cv9\" (UniqueName: \"kubernetes.io/projected/b1c23d84-45f9-42eb-9087-7a389e8722e3-kube-api-access-w9cv9\") pod \"router-default-86cf8799dd-t7q4v\" (UID: \"b1c23d84-45f9-42eb-9087-7a389e8722e3\") " pod="openshift-ingress/router-default-86cf8799dd-t7q4v" Apr 16 22:15:55.006285 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.006074 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-certificates\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:55.006285 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.006100 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e9277225-df0b-4b17-9847-c37f05b00c9d-snapshots\") pod \"insights-operator-585dfdc468-nf8x4\" (UID: \"e9277225-df0b-4b17-9847-c37f05b00c9d\") " pod="openshift-insights/insights-operator-585dfdc468-nf8x4" Apr 16 22:15:55.006285 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.006140 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-trusted-ca\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:55.006285 ip-10-0-140-65 
kubenswrapper[2572]: I0416 22:15:55.006171 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1c23d84-45f9-42eb-9087-7a389e8722e3-service-ca-bundle\") pod \"router-default-86cf8799dd-t7q4v\" (UID: \"b1c23d84-45f9-42eb-9087-7a389e8722e3\") " pod="openshift-ingress/router-default-86cf8799dd-t7q4v" Apr 16 22:15:55.006285 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.006208 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9277225-df0b-4b17-9847-c37f05b00c9d-tmp\") pod \"insights-operator-585dfdc468-nf8x4\" (UID: \"e9277225-df0b-4b17-9847-c37f05b00c9d\") " pod="openshift-insights/insights-operator-585dfdc468-nf8x4" Apr 16 22:15:55.006285 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.006255 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1c23d84-45f9-42eb-9087-7a389e8722e3-metrics-certs\") pod \"router-default-86cf8799dd-t7q4v\" (UID: \"b1c23d84-45f9-42eb-9087-7a389e8722e3\") " pod="openshift-ingress/router-default-86cf8799dd-t7q4v" Apr 16 22:15:55.006652 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.006296 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-image-registry-private-configuration\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:55.006652 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.006328 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9277225-df0b-4b17-9847-c37f05b00c9d-trusted-ca-bundle\") pod 
\"insights-operator-585dfdc468-nf8x4\" (UID: \"e9277225-df0b-4b17-9847-c37f05b00c9d\") " pod="openshift-insights/insights-operator-585dfdc468-nf8x4" Apr 16 22:15:55.006652 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.006372 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9jlg\" (UniqueName: \"kubernetes.io/projected/e9277225-df0b-4b17-9847-c37f05b00c9d-kube-api-access-g9jlg\") pod \"insights-operator-585dfdc468-nf8x4\" (UID: \"e9277225-df0b-4b17-9847-c37f05b00c9d\") " pod="openshift-insights/insights-operator-585dfdc468-nf8x4" Apr 16 22:15:55.006652 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.006407 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-ca-trust-extracted\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:55.006652 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.006433 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9gmv9\" (UniqueName: \"kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-kube-api-access-9gmv9\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:55.006652 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.006470 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-installation-pull-secrets\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:55.006652 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.006643 
2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9277225-df0b-4b17-9847-c37f05b00c9d-service-ca-bundle\") pod \"insights-operator-585dfdc468-nf8x4\" (UID: \"e9277225-df0b-4b17-9847-c37f05b00c9d\") " pod="openshift-insights/insights-operator-585dfdc468-nf8x4" Apr 16 22:15:55.006998 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:55.006727 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 22:15:55.006998 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:55.006802 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1c23d84-45f9-42eb-9087-7a389e8722e3-metrics-certs podName:b1c23d84-45f9-42eb-9087-7a389e8722e3 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:55.506780064 +0000 UTC m=+126.087762250 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1c23d84-45f9-42eb-9087-7a389e8722e3-metrics-certs") pod "router-default-86cf8799dd-t7q4v" (UID: "b1c23d84-45f9-42eb-9087-7a389e8722e3") : secret "router-metrics-certs-default" not found Apr 16 22:15:55.006998 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.006849 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-certificates\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:55.006998 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:55.006924 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:15:55.006998 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:55.006938 2572 projected.go:194] Error preparing 
data for projected volume registry-tls for pod openshift-image-registry/image-registry-54f7d4b79c-8d9n5: secret "image-registry-tls" not found Apr 16 22:15:55.007253 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:55.006991 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-tls podName:b1abd3ee-aae6-4e57-961f-53cc5f3d40c0 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:55.506977519 +0000 UTC m=+126.087959721 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-tls") pod "image-registry-54f7d4b79c-8d9n5" (UID: "b1abd3ee-aae6-4e57-961f-53cc5f3d40c0") : secret "image-registry-tls" not found Apr 16 22:15:55.007397 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.007371 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-ca-trust-extracted\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:55.007536 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:55.007504 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b1c23d84-45f9-42eb-9087-7a389e8722e3-service-ca-bundle podName:b1c23d84-45f9-42eb-9087-7a389e8722e3 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:55.50748422 +0000 UTC m=+126.088466406 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b1c23d84-45f9-42eb-9087-7a389e8722e3-service-ca-bundle") pod "router-default-86cf8799dd-t7q4v" (UID: "b1c23d84-45f9-42eb-9087-7a389e8722e3") : configmap references non-existent config key: service-ca.crt Apr 16 22:15:55.007700 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.007582 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e9277225-df0b-4b17-9847-c37f05b00c9d-snapshots\") pod \"insights-operator-585dfdc468-nf8x4\" (UID: \"e9277225-df0b-4b17-9847-c37f05b00c9d\") " pod="openshift-insights/insights-operator-585dfdc468-nf8x4" Apr 16 22:15:55.007884 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.007843 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e9277225-df0b-4b17-9847-c37f05b00c9d-tmp\") pod \"insights-operator-585dfdc468-nf8x4\" (UID: \"e9277225-df0b-4b17-9847-c37f05b00c9d\") " pod="openshift-insights/insights-operator-585dfdc468-nf8x4" Apr 16 22:15:55.008303 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.008281 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-trusted-ca\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:55.008725 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.008686 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e9277225-df0b-4b17-9847-c37f05b00c9d-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-nf8x4\" (UID: \"e9277225-df0b-4b17-9847-c37f05b00c9d\") " pod="openshift-insights/insights-operator-585dfdc468-nf8x4" Apr 16 22:15:55.008833 ip-10-0-140-65 
kubenswrapper[2572]: I0416 22:15:55.008815 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e9277225-df0b-4b17-9847-c37f05b00c9d-serving-cert\") pod \"insights-operator-585dfdc468-nf8x4\" (UID: \"e9277225-df0b-4b17-9847-c37f05b00c9d\") " pod="openshift-insights/insights-operator-585dfdc468-nf8x4" Apr 16 22:15:55.009197 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.009173 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b1c23d84-45f9-42eb-9087-7a389e8722e3-stats-auth\") pod \"router-default-86cf8799dd-t7q4v\" (UID: \"b1c23d84-45f9-42eb-9087-7a389e8722e3\") " pod="openshift-ingress/router-default-86cf8799dd-t7q4v" Apr 16 22:15:55.009888 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.009861 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b1c23d84-45f9-42eb-9087-7a389e8722e3-default-certificate\") pod \"router-default-86cf8799dd-t7q4v\" (UID: \"b1c23d84-45f9-42eb-9087-7a389e8722e3\") " pod="openshift-ingress/router-default-86cf8799dd-t7q4v" Apr 16 22:15:55.009986 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.009918 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-image-registry-private-configuration\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:55.010952 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.010933 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-installation-pull-secrets\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: 
\"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:55.020807 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.020772 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9jlg\" (UniqueName: \"kubernetes.io/projected/e9277225-df0b-4b17-9847-c37f05b00c9d-kube-api-access-g9jlg\") pod \"insights-operator-585dfdc468-nf8x4\" (UID: \"e9277225-df0b-4b17-9847-c37f05b00c9d\") " pod="openshift-insights/insights-operator-585dfdc468-nf8x4" Apr 16 22:15:55.021164 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.021129 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-bound-sa-token\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:55.021332 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.021311 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9cv9\" (UniqueName: \"kubernetes.io/projected/b1c23d84-45f9-42eb-9087-7a389e8722e3-kube-api-access-w9cv9\") pod \"router-default-86cf8799dd-t7q4v\" (UID: \"b1c23d84-45f9-42eb-9087-7a389e8722e3\") " pod="openshift-ingress/router-default-86cf8799dd-t7q4v" Apr 16 22:15:55.021917 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.021899 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gmv9\" (UniqueName: \"kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-kube-api-access-9gmv9\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:55.094578 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.094534 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-nf8x4" Apr 16 22:15:55.098635 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.098609 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vnpn9"] Apr 16 22:15:55.101710 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:15:55.101684 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod664ba6c7_b752_4fc9_b6c6_50847231a8a0.slice/crio-3cc3a60e07675843ae107d86c08b889920fc5e04e8a69c60e2485969c47ece71 WatchSource:0}: Error finding container 3cc3a60e07675843ae107d86c08b889920fc5e04e8a69c60e2485969c47ece71: Status 404 returned error can't find the container with id 3cc3a60e07675843ae107d86c08b889920fc5e04e8a69c60e2485969c47ece71 Apr 16 22:15:55.107450 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.107424 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pt55s\" (UniqueName: \"kubernetes.io/projected/090695a5-dd56-4979-855d-5b1e3e681e54-kube-api-access-pt55s\") pod \"kube-storage-version-migrator-operator-6769c5d45-6zgrq\" (UID: \"090695a5-dd56-4979-855d-5b1e3e681e54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6zgrq" Apr 16 22:15:55.107541 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.107460 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/090695a5-dd56-4979-855d-5b1e3e681e54-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6zgrq\" (UID: \"090695a5-dd56-4979-855d-5b1e3e681e54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6zgrq" Apr 16 22:15:55.107541 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.107494 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/090695a5-dd56-4979-855d-5b1e3e681e54-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6zgrq\" (UID: \"090695a5-dd56-4979-855d-5b1e3e681e54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6zgrq" Apr 16 22:15:55.107972 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.107949 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/090695a5-dd56-4979-855d-5b1e3e681e54-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-6zgrq\" (UID: \"090695a5-dd56-4979-855d-5b1e3e681e54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6zgrq" Apr 16 22:15:55.109519 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.109495 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/090695a5-dd56-4979-855d-5b1e3e681e54-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-6zgrq\" (UID: \"090695a5-dd56-4979-855d-5b1e3e681e54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6zgrq" Apr 16 22:15:55.116219 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.116197 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt55s\" (UniqueName: \"kubernetes.io/projected/090695a5-dd56-4979-855d-5b1e3e681e54-kube-api-access-pt55s\") pod \"kube-storage-version-migrator-operator-6769c5d45-6zgrq\" (UID: \"090695a5-dd56-4979-855d-5b1e3e681e54\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6zgrq" Apr 16 22:15:55.186854 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.186823 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6zgrq" Apr 16 22:15:55.225960 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.225926 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-nf8x4"] Apr 16 22:15:55.229580 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:15:55.229537 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9277225_df0b_4b17_9847_c37f05b00c9d.slice/crio-984f6dd24f72001dcfd686ed398dd06f2e94b998efe26a9610cd94d046d15b54 WatchSource:0}: Error finding container 984f6dd24f72001dcfd686ed398dd06f2e94b998efe26a9610cd94d046d15b54: Status 404 returned error can't find the container with id 984f6dd24f72001dcfd686ed398dd06f2e94b998efe26a9610cd94d046d15b54 Apr 16 22:15:55.303981 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.303852 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6zgrq"] Apr 16 22:15:55.306287 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:15:55.306262 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod090695a5_dd56_4979_855d_5b1e3e681e54.slice/crio-d6f5560530aa18281086d18cf3aeb692cee17f67645780a12de389bab6c53f40 WatchSource:0}: Error finding container d6f5560530aa18281086d18cf3aeb692cee17f67645780a12de389bab6c53f40: Status 404 returned error can't find the container with id d6f5560530aa18281086d18cf3aeb692cee17f67645780a12de389bab6c53f40 Apr 16 22:15:55.455193 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.455159 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vnpn9" 
event={"ID":"664ba6c7-b752-4fc9-b6c6-50847231a8a0","Type":"ContainerStarted","Data":"3cc3a60e07675843ae107d86c08b889920fc5e04e8a69c60e2485969c47ece71"} Apr 16 22:15:55.456066 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.456041 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6zgrq" event={"ID":"090695a5-dd56-4979-855d-5b1e3e681e54","Type":"ContainerStarted","Data":"d6f5560530aa18281086d18cf3aeb692cee17f67645780a12de389bab6c53f40"} Apr 16 22:15:55.456915 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.456893 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-nf8x4" event={"ID":"e9277225-df0b-4b17-9847-c37f05b00c9d","Type":"ContainerStarted","Data":"984f6dd24f72001dcfd686ed398dd06f2e94b998efe26a9610cd94d046d15b54"} Apr 16 22:15:55.511824 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.511783 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-tls\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:55.511990 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.511852 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1c23d84-45f9-42eb-9087-7a389e8722e3-service-ca-bundle\") pod \"router-default-86cf8799dd-t7q4v\" (UID: \"b1c23d84-45f9-42eb-9087-7a389e8722e3\") " pod="openshift-ingress/router-default-86cf8799dd-t7q4v" Apr 16 22:15:55.511990 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:55.511889 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/b1c23d84-45f9-42eb-9087-7a389e8722e3-metrics-certs\") pod \"router-default-86cf8799dd-t7q4v\" (UID: \"b1c23d84-45f9-42eb-9087-7a389e8722e3\") " pod="openshift-ingress/router-default-86cf8799dd-t7q4v" Apr 16 22:15:55.511990 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:55.511927 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:15:55.511990 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:55.511946 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54f7d4b79c-8d9n5: secret "image-registry-tls" not found Apr 16 22:15:55.512159 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:55.511997 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-tls podName:b1abd3ee-aae6-4e57-961f-53cc5f3d40c0 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:56.511981916 +0000 UTC m=+127.092964112 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-tls") pod "image-registry-54f7d4b79c-8d9n5" (UID: "b1abd3ee-aae6-4e57-961f-53cc5f3d40c0") : secret "image-registry-tls" not found Apr 16 22:15:55.512159 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:55.512005 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 22:15:55.512159 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:55.512010 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b1c23d84-45f9-42eb-9087-7a389e8722e3-service-ca-bundle podName:b1c23d84-45f9-42eb-9087-7a389e8722e3 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:56.512003692 +0000 UTC m=+127.092985875 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b1c23d84-45f9-42eb-9087-7a389e8722e3-service-ca-bundle") pod "router-default-86cf8799dd-t7q4v" (UID: "b1c23d84-45f9-42eb-9087-7a389e8722e3") : configmap references non-existent config key: service-ca.crt Apr 16 22:15:55.512159 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:55.512066 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1c23d84-45f9-42eb-9087-7a389e8722e3-metrics-certs podName:b1c23d84-45f9-42eb-9087-7a389e8722e3 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:56.512050347 +0000 UTC m=+127.093032534 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1c23d84-45f9-42eb-9087-7a389e8722e3-metrics-certs") pod "router-default-86cf8799dd-t7q4v" (UID: "b1c23d84-45f9-42eb-9087-7a389e8722e3") : secret "router-metrics-certs-default" not found Apr 16 22:15:56.520717 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:56.520678 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-tls\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:15:56.521153 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:56.520751 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1c23d84-45f9-42eb-9087-7a389e8722e3-service-ca-bundle\") pod \"router-default-86cf8799dd-t7q4v\" (UID: \"b1c23d84-45f9-42eb-9087-7a389e8722e3\") " pod="openshift-ingress/router-default-86cf8799dd-t7q4v" Apr 16 22:15:56.521153 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:56.520799 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1c23d84-45f9-42eb-9087-7a389e8722e3-metrics-certs\") pod \"router-default-86cf8799dd-t7q4v\" (UID: \"b1c23d84-45f9-42eb-9087-7a389e8722e3\") " pod="openshift-ingress/router-default-86cf8799dd-t7q4v" Apr 16 22:15:56.521153 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:56.520842 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 22:15:56.521153 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:56.520866 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54f7d4b79c-8d9n5: secret "image-registry-tls" not found Apr 16 22:15:56.521153 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:56.520928 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-tls podName:b1abd3ee-aae6-4e57-961f-53cc5f3d40c0 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:58.520906495 +0000 UTC m=+129.101888681 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-tls") pod "image-registry-54f7d4b79c-8d9n5" (UID: "b1abd3ee-aae6-4e57-961f-53cc5f3d40c0") : secret "image-registry-tls" not found Apr 16 22:15:56.521153 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:56.520938 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 22:15:56.521153 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:56.520987 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1c23d84-45f9-42eb-9087-7a389e8722e3-metrics-certs podName:b1c23d84-45f9-42eb-9087-7a389e8722e3 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:58.52097146 +0000 UTC m=+129.101953651 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1c23d84-45f9-42eb-9087-7a389e8722e3-metrics-certs") pod "router-default-86cf8799dd-t7q4v" (UID: "b1c23d84-45f9-42eb-9087-7a389e8722e3") : secret "router-metrics-certs-default" not found Apr 16 22:15:56.521153 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:56.521004 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b1c23d84-45f9-42eb-9087-7a389e8722e3-service-ca-bundle podName:b1c23d84-45f9-42eb-9087-7a389e8722e3 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:58.520995645 +0000 UTC m=+129.101977834 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b1c23d84-45f9-42eb-9087-7a389e8722e3-service-ca-bundle") pod "router-default-86cf8799dd-t7q4v" (UID: "b1c23d84-45f9-42eb-9087-7a389e8722e3") : configmap references non-existent config key: service-ca.crt Apr 16 22:15:57.463130 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:57.463084 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vnpn9" event={"ID":"664ba6c7-b752-4fc9-b6c6-50847231a8a0","Type":"ContainerStarted","Data":"672ad2a815dfe5b7982d562d65566b760573295a2547d73e7dbe597744e56b87"} Apr 16 22:15:57.479185 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:57.479140 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-vnpn9" podStartSLOduration=2.1512083029999998 podStartE2EDuration="3.47912672s" podCreationTimestamp="2026-04-16 22:15:54 +0000 UTC" firstStartedPulling="2026-04-16 22:15:55.103680688 +0000 UTC m=+125.684662871" lastFinishedPulling="2026-04-16 22:15:56.431599101 +0000 UTC m=+127.012581288" observedRunningTime="2026-04-16 22:15:57.478353743 +0000 UTC m=+128.059335948" 
watchObservedRunningTime="2026-04-16 22:15:57.47912672 +0000 UTC m=+128.060108936" Apr 16 22:15:58.190940 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:58.190908 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-jffnd"] Apr 16 22:15:58.194025 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:58.194010 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jffnd" Apr 16 22:15:58.196513 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:58.196482 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 22:15:58.196513 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:58.196505 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-mjcdb\"" Apr 16 22:15:58.196692 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:58.196513 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 22:15:58.204098 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:58.204068 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-jffnd"] Apr 16 22:15:58.336605 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:58.336373 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jffnd\" (UID: \"d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jffnd" Apr 16 22:15:58.336605 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:58.336505 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-jffnd\" (UID: \"d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jffnd"
Apr 16 22:15:58.437676 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:58.437639 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-jffnd\" (UID: \"d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jffnd"
Apr 16 22:15:58.437831 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:58.437808 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jffnd\" (UID: \"d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jffnd"
Apr 16 22:15:58.437991 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:58.437970 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 22:15:58.438064 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:58.438052 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07-networking-console-plugin-cert podName:d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:58.938030305 +0000 UTC m=+129.519012502 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jffnd" (UID: "d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07") : secret "networking-console-plugin-cert" not found
Apr 16 22:15:58.438417 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:58.438385 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-jffnd\" (UID: \"d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jffnd"
Apr 16 22:15:58.467164 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:58.467084 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-nf8x4" event={"ID":"e9277225-df0b-4b17-9847-c37f05b00c9d","Type":"ContainerStarted","Data":"425276a7b0ad6209798e2703477501d71522fbfe28f4b2cd6fbb5cad69aeb2d8"}
Apr 16 22:15:58.468382 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:58.468357 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6zgrq" event={"ID":"090695a5-dd56-4979-855d-5b1e3e681e54","Type":"ContainerStarted","Data":"9ed21d304b6c36b7b6dbc9fd66a59b49a38be0a3ea71c89ef301995a4343f872"}
Apr 16 22:15:58.484275 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:58.484218 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-nf8x4" podStartSLOduration=2.179119958 podStartE2EDuration="4.484201561s" podCreationTimestamp="2026-04-16 22:15:54 +0000 UTC" firstStartedPulling="2026-04-16 22:15:55.231953558 +0000 UTC m=+125.812935740" lastFinishedPulling="2026-04-16 22:15:57.53703516 +0000 UTC m=+128.118017343" observedRunningTime="2026-04-16 22:15:58.482886642 +0000 UTC m=+129.063868846" watchObservedRunningTime="2026-04-16 22:15:58.484201561 +0000 UTC m=+129.065183767"
Apr 16 22:15:58.498597 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:58.498533 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6zgrq" podStartSLOduration=2.266557812 podStartE2EDuration="4.498510921s" podCreationTimestamp="2026-04-16 22:15:54 +0000 UTC" firstStartedPulling="2026-04-16 22:15:55.308094524 +0000 UTC m=+125.889076708" lastFinishedPulling="2026-04-16 22:15:57.540047619 +0000 UTC m=+128.121029817" observedRunningTime="2026-04-16 22:15:58.498456547 +0000 UTC m=+129.079438753" watchObservedRunningTime="2026-04-16 22:15:58.498510921 +0000 UTC m=+129.079493126"
Apr 16 22:15:58.539246 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:58.539216 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1c23d84-45f9-42eb-9087-7a389e8722e3-metrics-certs\") pod \"router-default-86cf8799dd-t7q4v\" (UID: \"b1c23d84-45f9-42eb-9087-7a389e8722e3\") " pod="openshift-ingress/router-default-86cf8799dd-t7q4v"
Apr 16 22:15:58.539438 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:58.539363 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 22:15:58.539946 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:58.539912 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-tls\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5"
Apr 16 22:15:58.540066 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:58.540001 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1c23d84-45f9-42eb-9087-7a389e8722e3-service-ca-bundle\") pod \"router-default-86cf8799dd-t7q4v\" (UID: \"b1c23d84-45f9-42eb-9087-7a389e8722e3\") " pod="openshift-ingress/router-default-86cf8799dd-t7q4v"
Apr 16 22:15:58.540531 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:58.540193 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b1c23d84-45f9-42eb-9087-7a389e8722e3-service-ca-bundle podName:b1c23d84-45f9-42eb-9087-7a389e8722e3 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:02.540167529 +0000 UTC m=+133.121149720 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b1c23d84-45f9-42eb-9087-7a389e8722e3-service-ca-bundle") pod "router-default-86cf8799dd-t7q4v" (UID: "b1c23d84-45f9-42eb-9087-7a389e8722e3") : configmap references non-existent config key: service-ca.crt
Apr 16 22:15:58.540531 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:58.540227 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 22:15:58.540531 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:58.540249 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54f7d4b79c-8d9n5: secret "image-registry-tls" not found
Apr 16 22:15:58.540531 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:58.540337 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-tls podName:b1abd3ee-aae6-4e57-961f-53cc5f3d40c0 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:02.54031792 +0000 UTC m=+133.121300108 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-tls") pod "image-registry-54f7d4b79c-8d9n5" (UID: "b1abd3ee-aae6-4e57-961f-53cc5f3d40c0") : secret "image-registry-tls" not found
Apr 16 22:15:58.541162 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:58.540912 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1c23d84-45f9-42eb-9087-7a389e8722e3-metrics-certs podName:b1c23d84-45f9-42eb-9087-7a389e8722e3 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:02.540892085 +0000 UTC m=+133.121874283 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1c23d84-45f9-42eb-9087-7a389e8722e3-metrics-certs") pod "router-default-86cf8799dd-t7q4v" (UID: "b1c23d84-45f9-42eb-9087-7a389e8722e3") : secret "router-metrics-certs-default" not found
Apr 16 22:15:58.943879 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:58.943839 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jffnd\" (UID: \"d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jffnd"
Apr 16 22:15:58.944055 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:58.943954 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 22:15:58.944055 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:58.944037 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07-networking-console-plugin-cert podName:d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07 nodeName:}" failed. No retries permitted until 2026-04-16 22:15:59.944023398 +0000 UTC m=+130.525005581 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jffnd" (UID: "d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07") : secret "networking-console-plugin-cert" not found
Apr 16 22:15:59.850798 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:59.850758 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs\") pod \"network-metrics-daemon-cd2s2\" (UID: \"4ffa9102-be6a-431e-b1c8-ee3b5b01e588\") " pod="openshift-multus/network-metrics-daemon-cd2s2"
Apr 16 22:15:59.851185 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:59.850898 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 22:15:59.851185 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:59.850963 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs podName:4ffa9102-be6a-431e-b1c8-ee3b5b01e588 nodeName:}" failed. No retries permitted until 2026-04-16 22:18:01.850948442 +0000 UTC m=+252.431930629 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs") pod "network-metrics-daemon-cd2s2" (UID: "4ffa9102-be6a-431e-b1c8-ee3b5b01e588") : secret "metrics-daemon-secret" not found
Apr 16 22:15:59.951885 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:15:59.951844 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jffnd\" (UID: \"d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jffnd"
Apr 16 22:15:59.952052 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:59.951980 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 22:15:59.952091 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:15:59.952054 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07-networking-console-plugin-cert podName:d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:01.952038648 +0000 UTC m=+132.533020830 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jffnd" (UID: "d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07") : secret "networking-console-plugin-cert" not found
Apr 16 22:16:00.145415 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:00.145342 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4h7zb_340bef7a-cec6-49ff-9089-98ba19d935b8/dns-node-resolver/0.log"
Apr 16 22:16:01.545969 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:01.545939 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fcr4l_03fed82b-78d8-4490-8c58-013c3b157475/node-ca/0.log"
Apr 16 22:16:01.968700 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:01.968620 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jffnd\" (UID: \"d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jffnd"
Apr 16 22:16:01.968851 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:16:01.968766 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 22:16:01.968851 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:16:01.968839 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07-networking-console-plugin-cert podName:d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:05.968817858 +0000 UTC m=+136.549800057 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jffnd" (UID: "d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07") : secret "networking-console-plugin-cert" not found
Apr 16 22:16:02.574290 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:02.574254 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-tls\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5"
Apr 16 22:16:02.574795 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:02.574320 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1c23d84-45f9-42eb-9087-7a389e8722e3-service-ca-bundle\") pod \"router-default-86cf8799dd-t7q4v\" (UID: \"b1c23d84-45f9-42eb-9087-7a389e8722e3\") " pod="openshift-ingress/router-default-86cf8799dd-t7q4v"
Apr 16 22:16:02.574795 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:02.574366 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1c23d84-45f9-42eb-9087-7a389e8722e3-metrics-certs\") pod \"router-default-86cf8799dd-t7q4v\" (UID: \"b1c23d84-45f9-42eb-9087-7a389e8722e3\") " pod="openshift-ingress/router-default-86cf8799dd-t7q4v"
Apr 16 22:16:02.574795 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:16:02.574410 2572 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 22:16:02.574795 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:16:02.574434 2572 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54f7d4b79c-8d9n5: secret "image-registry-tls" not found
Apr 16 22:16:02.574795 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:16:02.574485 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-tls podName:b1abd3ee-aae6-4e57-961f-53cc5f3d40c0 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:10.57446895 +0000 UTC m=+141.155451134 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-tls") pod "image-registry-54f7d4b79c-8d9n5" (UID: "b1abd3ee-aae6-4e57-961f-53cc5f3d40c0") : secret "image-registry-tls" not found
Apr 16 22:16:02.574795 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:16:02.574498 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b1c23d84-45f9-42eb-9087-7a389e8722e3-service-ca-bundle podName:b1c23d84-45f9-42eb-9087-7a389e8722e3 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:10.574491777 +0000 UTC m=+141.155473959 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/b1c23d84-45f9-42eb-9087-7a389e8722e3-service-ca-bundle") pod "router-default-86cf8799dd-t7q4v" (UID: "b1c23d84-45f9-42eb-9087-7a389e8722e3") : configmap references non-existent config key: service-ca.crt
Apr 16 22:16:02.574795 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:16:02.574503 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 22:16:02.574795 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:16:02.574595 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b1c23d84-45f9-42eb-9087-7a389e8722e3-metrics-certs podName:b1c23d84-45f9-42eb-9087-7a389e8722e3 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:10.574577798 +0000 UTC m=+141.155559984 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b1c23d84-45f9-42eb-9087-7a389e8722e3-metrics-certs") pod "router-default-86cf8799dd-t7q4v" (UID: "b1c23d84-45f9-42eb-9087-7a389e8722e3") : secret "router-metrics-certs-default" not found
Apr 16 22:16:06.002428 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:06.002387 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jffnd\" (UID: \"d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jffnd"
Apr 16 22:16:06.002840 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:16:06.002496 2572 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 22:16:06.002840 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:16:06.002577 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07-networking-console-plugin-cert podName:d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:14.002545479 +0000 UTC m=+144.583527662 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-jffnd" (UID: "d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07") : secret "networking-console-plugin-cert" not found
Apr 16 22:16:10.641823 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:10.641791 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-tls\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5"
Apr 16 22:16:10.642206 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:10.641840 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1c23d84-45f9-42eb-9087-7a389e8722e3-service-ca-bundle\") pod \"router-default-86cf8799dd-t7q4v\" (UID: \"b1c23d84-45f9-42eb-9087-7a389e8722e3\") " pod="openshift-ingress/router-default-86cf8799dd-t7q4v"
Apr 16 22:16:10.642206 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:10.642001 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1c23d84-45f9-42eb-9087-7a389e8722e3-metrics-certs\") pod \"router-default-86cf8799dd-t7q4v\" (UID: \"b1c23d84-45f9-42eb-9087-7a389e8722e3\") " pod="openshift-ingress/router-default-86cf8799dd-t7q4v"
Apr 16 22:16:10.642539 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:10.642513 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1c23d84-45f9-42eb-9087-7a389e8722e3-service-ca-bundle\") pod \"router-default-86cf8799dd-t7q4v\" (UID: \"b1c23d84-45f9-42eb-9087-7a389e8722e3\") " pod="openshift-ingress/router-default-86cf8799dd-t7q4v"
Apr 16 22:16:10.644175 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:10.644152 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-tls\") pod \"image-registry-54f7d4b79c-8d9n5\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5"
Apr 16 22:16:10.644283 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:10.644203 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b1c23d84-45f9-42eb-9087-7a389e8722e3-metrics-certs\") pod \"router-default-86cf8799dd-t7q4v\" (UID: \"b1c23d84-45f9-42eb-9087-7a389e8722e3\") " pod="openshift-ingress/router-default-86cf8799dd-t7q4v"
Apr 16 22:16:10.705483 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:10.705452 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-86cf8799dd-t7q4v"
Apr 16 22:16:10.711213 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:10.711181 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5"
Apr 16 22:16:10.846648 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:10.846619 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-54f7d4b79c-8d9n5"]
Apr 16 22:16:10.852517 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:16:10.852485 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1abd3ee_aae6_4e57_961f_53cc5f3d40c0.slice/crio-49df4a908a1699eec3bcaee301344c952e8e794d5904cc6d9357823aeace5c9c WatchSource:0}: Error finding container 49df4a908a1699eec3bcaee301344c952e8e794d5904cc6d9357823aeace5c9c: Status 404 returned error can't find the container with id 49df4a908a1699eec3bcaee301344c952e8e794d5904cc6d9357823aeace5c9c
Apr 16 22:16:10.867071 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:10.867038 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-86cf8799dd-t7q4v"]
Apr 16 22:16:10.870489 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:16:10.870456 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1c23d84_45f9_42eb_9087_7a389e8722e3.slice/crio-40fbe422b825c5417b903a9ff6a5b1e41b4069c1f41d0d97f61e66c83e595537 WatchSource:0}: Error finding container 40fbe422b825c5417b903a9ff6a5b1e41b4069c1f41d0d97f61e66c83e595537: Status 404 returned error can't find the container with id 40fbe422b825c5417b903a9ff6a5b1e41b4069c1f41d0d97f61e66c83e595537
Apr 16 22:16:11.495074 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:11.495032 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" event={"ID":"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0","Type":"ContainerStarted","Data":"945c6813b7330d1184d6a58de764fcfae25fdfd491e1a15965df671d0c463a7a"}
Apr 16 22:16:11.495074 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:11.495074 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" event={"ID":"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0","Type":"ContainerStarted","Data":"49df4a908a1699eec3bcaee301344c952e8e794d5904cc6d9357823aeace5c9c"}
Apr 16 22:16:11.495335 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:11.495139 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5"
Apr 16 22:16:11.496390 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:11.496365 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-86cf8799dd-t7q4v" event={"ID":"b1c23d84-45f9-42eb-9087-7a389e8722e3","Type":"ContainerStarted","Data":"5af2139e43086c7fcae160406ef5297b82cfa4522ba303e5efabde23a466de6e"}
Apr 16 22:16:11.496390 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:11.496391 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-86cf8799dd-t7q4v" event={"ID":"b1c23d84-45f9-42eb-9087-7a389e8722e3","Type":"ContainerStarted","Data":"40fbe422b825c5417b903a9ff6a5b1e41b4069c1f41d0d97f61e66c83e595537"}
Apr 16 22:16:11.536137 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:11.536088 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" podStartSLOduration=17.536072125 podStartE2EDuration="17.536072125s" podCreationTimestamp="2026-04-16 22:15:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:16:11.535208706 +0000 UTC m=+142.116190910" watchObservedRunningTime="2026-04-16 22:16:11.536072125 +0000 UTC m=+142.117054329"
Apr 16 22:16:11.560130 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:11.560063 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-86cf8799dd-t7q4v" podStartSLOduration=17.560049304 podStartE2EDuration="17.560049304s" podCreationTimestamp="2026-04-16 22:15:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:16:11.559065704 +0000 UTC m=+142.140047909" watchObservedRunningTime="2026-04-16 22:16:11.560049304 +0000 UTC m=+142.141031537"
Apr 16 22:16:11.706225 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:11.706193 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-86cf8799dd-t7q4v"
Apr 16 22:16:11.708665 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:11.708643 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-86cf8799dd-t7q4v"
Apr 16 22:16:12.499806 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:12.499775 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-86cf8799dd-t7q4v"
Apr 16 22:16:12.501044 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:12.501023 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-86cf8799dd-t7q4v"
Apr 16 22:16:14.069829 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:14.069791 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jffnd\" (UID: \"d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jffnd"
Apr 16 22:16:14.072132 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:14.072110 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-jffnd\" (UID: \"d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-jffnd"
Apr 16 22:16:14.103036 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:14.102996 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jffnd"
Apr 16 22:16:14.233814 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:14.233782 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-jffnd"]
Apr 16 22:16:14.236690 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:16:14.236653 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd40aaf22_02bf_42ef_8eb8_0ec95d0f5d07.slice/crio-3aabe65229afecfffd0ba82687b8705d8eb174423efa64a9f12c4efe69141401 WatchSource:0}: Error finding container 3aabe65229afecfffd0ba82687b8705d8eb174423efa64a9f12c4efe69141401: Status 404 returned error can't find the container with id 3aabe65229afecfffd0ba82687b8705d8eb174423efa64a9f12c4efe69141401
Apr 16 22:16:14.504880 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:14.504848 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jffnd" event={"ID":"d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07","Type":"ContainerStarted","Data":"3aabe65229afecfffd0ba82687b8705d8eb174423efa64a9f12c4efe69141401"}
Apr 16 22:16:15.509245 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:15.509208 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jffnd" event={"ID":"d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07","Type":"ContainerStarted","Data":"2f8cfafd755834086b2373625fff0d03501ee61d2059f235d87466cf28042de6"}
Apr 16 22:16:15.526244 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:15.526196 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-jffnd" podStartSLOduration=16.617745009 podStartE2EDuration="17.526183072s" podCreationTimestamp="2026-04-16 22:15:58 +0000 UTC" firstStartedPulling="2026-04-16 22:16:14.23890779 +0000 UTC m=+144.819889974" lastFinishedPulling="2026-04-16 22:16:15.147345853 +0000 UTC m=+145.728328037" observedRunningTime="2026-04-16 22:16:15.52572157 +0000 UTC m=+146.106703775" watchObservedRunningTime="2026-04-16 22:16:15.526183072 +0000 UTC m=+146.107165274"
Apr 16 22:16:22.103874 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:22.103847 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-j42tt"]
Apr 16 22:16:22.108291 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:22.108271 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-j42tt"
Apr 16 22:16:22.110694 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:22.110672 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 22:16:22.110831 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:22.110736 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 22:16:22.111772 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:22.111753 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-k7nbm\""
Apr 16 22:16:22.119098 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:22.119073 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-j42tt"]
Apr 16 22:16:22.132745 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:22.132708 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3992fb2c-0d69-437a-a001-636cdb1e92c2-crio-socket\") pod \"insights-runtime-extractor-j42tt\" (UID: \"3992fb2c-0d69-437a-a001-636cdb1e92c2\") " pod="openshift-insights/insights-runtime-extractor-j42tt"
Apr 16 22:16:22.132866 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:22.132786 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3992fb2c-0d69-437a-a001-636cdb1e92c2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j42tt\" (UID: \"3992fb2c-0d69-437a-a001-636cdb1e92c2\") " pod="openshift-insights/insights-runtime-extractor-j42tt"
Apr 16 22:16:22.132914 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:22.132896 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6968g\" (UniqueName: \"kubernetes.io/projected/3992fb2c-0d69-437a-a001-636cdb1e92c2-kube-api-access-6968g\") pod \"insights-runtime-extractor-j42tt\" (UID: \"3992fb2c-0d69-437a-a001-636cdb1e92c2\") " pod="openshift-insights/insights-runtime-extractor-j42tt"
Apr 16 22:16:22.132949 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:22.132939 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3992fb2c-0d69-437a-a001-636cdb1e92c2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j42tt\" (UID: \"3992fb2c-0d69-437a-a001-636cdb1e92c2\") " pod="openshift-insights/insights-runtime-extractor-j42tt"
Apr 16 22:16:22.132989 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:22.132968 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3992fb2c-0d69-437a-a001-636cdb1e92c2-data-volume\") pod \"insights-runtime-extractor-j42tt\" (UID: \"3992fb2c-0d69-437a-a001-636cdb1e92c2\") " pod="openshift-insights/insights-runtime-extractor-j42tt"
Apr 16 22:16:22.172205 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:22.172177 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-54f7d4b79c-8d9n5"]
Apr 16 22:16:22.233635 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:22.233603 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6968g\" (UniqueName: \"kubernetes.io/projected/3992fb2c-0d69-437a-a001-636cdb1e92c2-kube-api-access-6968g\") pod \"insights-runtime-extractor-j42tt\" (UID: \"3992fb2c-0d69-437a-a001-636cdb1e92c2\") " pod="openshift-insights/insights-runtime-extractor-j42tt"
Apr 16 22:16:22.233797 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:22.233651 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3992fb2c-0d69-437a-a001-636cdb1e92c2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j42tt\" (UID: \"3992fb2c-0d69-437a-a001-636cdb1e92c2\") " pod="openshift-insights/insights-runtime-extractor-j42tt"
Apr 16 22:16:22.233797 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:22.233741 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3992fb2c-0d69-437a-a001-636cdb1e92c2-data-volume\") pod \"insights-runtime-extractor-j42tt\" (UID: \"3992fb2c-0d69-437a-a001-636cdb1e92c2\") " pod="openshift-insights/insights-runtime-extractor-j42tt"
Apr 16 22:16:22.233888 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:22.233875 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3992fb2c-0d69-437a-a001-636cdb1e92c2-crio-socket\") pod \"insights-runtime-extractor-j42tt\" (UID: \"3992fb2c-0d69-437a-a001-636cdb1e92c2\") " pod="openshift-insights/insights-runtime-extractor-j42tt"
Apr 16 22:16:22.233970 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:22.233948 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3992fb2c-0d69-437a-a001-636cdb1e92c2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j42tt\" (UID: \"3992fb2c-0d69-437a-a001-636cdb1e92c2\") " pod="openshift-insights/insights-runtime-extractor-j42tt"
Apr 16 22:16:22.234108 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:22.233994 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3992fb2c-0d69-437a-a001-636cdb1e92c2-crio-socket\") pod \"insights-runtime-extractor-j42tt\" (UID: \"3992fb2c-0d69-437a-a001-636cdb1e92c2\") " pod="openshift-insights/insights-runtime-extractor-j42tt"
Apr 16 22:16:22.234108 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:22.234063 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3992fb2c-0d69-437a-a001-636cdb1e92c2-data-volume\") pod \"insights-runtime-extractor-j42tt\" (UID: \"3992fb2c-0d69-437a-a001-636cdb1e92c2\") " pod="openshift-insights/insights-runtime-extractor-j42tt"
Apr 16 22:16:22.234361 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:22.234342 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3992fb2c-0d69-437a-a001-636cdb1e92c2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j42tt\" (UID: \"3992fb2c-0d69-437a-a001-636cdb1e92c2\") " pod="openshift-insights/insights-runtime-extractor-j42tt"
Apr 16 22:16:22.236117 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:22.236098 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3992fb2c-0d69-437a-a001-636cdb1e92c2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j42tt\" (UID: \"3992fb2c-0d69-437a-a001-636cdb1e92c2\") " pod="openshift-insights/insights-runtime-extractor-j42tt"
Apr 16 22:16:22.257198 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:22.257170 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6968g\" (UniqueName: \"kubernetes.io/projected/3992fb2c-0d69-437a-a001-636cdb1e92c2-kube-api-access-6968g\") pod \"insights-runtime-extractor-j42tt\" (UID: \"3992fb2c-0d69-437a-a001-636cdb1e92c2\") " pod="openshift-insights/insights-runtime-extractor-j42tt"
Apr 16 22:16:22.417579 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:22.417472 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-j42tt"
Apr 16 22:16:22.534036 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:22.534000 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-j42tt"]
Apr 16 22:16:22.537749 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:16:22.537708 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3992fb2c_0d69_437a_a001_636cdb1e92c2.slice/crio-18da4c0d57e3ff314b5fb67f3bddd26e05838d377a19533688af89e24ba40a9d WatchSource:0}: Error finding container 18da4c0d57e3ff314b5fb67f3bddd26e05838d377a19533688af89e24ba40a9d: Status 404 returned error can't find the container with id 18da4c0d57e3ff314b5fb67f3bddd26e05838d377a19533688af89e24ba40a9d
Apr 16 22:16:23.529536 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:23.529503 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j42tt"
event={"ID":"3992fb2c-0d69-437a-a001-636cdb1e92c2","Type":"ContainerStarted","Data":"309ab5bf3bc536114d6e053a41628c48ad16cc23c5de772282d82a43a0526fcf"} Apr 16 22:16:23.529536 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:23.529540 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j42tt" event={"ID":"3992fb2c-0d69-437a-a001-636cdb1e92c2","Type":"ContainerStarted","Data":"381a460239fa21c7dc157e7b1dfbdb9533c13104efcf59417c7d8465ef982262"} Apr 16 22:16:23.529944 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:23.529562 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j42tt" event={"ID":"3992fb2c-0d69-437a-a001-636cdb1e92c2","Type":"ContainerStarted","Data":"18da4c0d57e3ff314b5fb67f3bddd26e05838d377a19533688af89e24ba40a9d"} Apr 16 22:16:25.307209 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:16:25.307169 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-x4vpp" podUID="6f971928-4414-4b05-9352-534e9045942e" Apr 16 22:16:25.331298 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:16:25.331270 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-g8j9n" podUID="f12d17a9-d1af-465c-a23e-81d8f09a5156" Apr 16 22:16:25.536245 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:25.536213 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-x4vpp" Apr 16 22:16:25.536426 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:25.536218 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j42tt" event={"ID":"3992fb2c-0d69-437a-a001-636cdb1e92c2","Type":"ContainerStarted","Data":"e1a251a012921aea15ba253d22e29159779accc79375650d609feaf2ce28ff81"} Apr 16 22:16:25.577864 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:25.577772 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-j42tt" podStartSLOduration=1.392787236 podStartE2EDuration="3.577757046s" podCreationTimestamp="2026-04-16 22:16:22 +0000 UTC" firstStartedPulling="2026-04-16 22:16:22.594574147 +0000 UTC m=+153.175556329" lastFinishedPulling="2026-04-16 22:16:24.779543952 +0000 UTC m=+155.360526139" observedRunningTime="2026-04-16 22:16:25.575702415 +0000 UTC m=+156.156684620" watchObservedRunningTime="2026-04-16 22:16:25.577757046 +0000 UTC m=+156.158739268" Apr 16 22:16:26.048975 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:16:26.048934 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-cd2s2" podUID="4ffa9102-be6a-431e-b1c8-ee3b5b01e588" Apr 16 22:16:30.295866 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.295832 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert\") pod \"ingress-canary-g8j9n\" (UID: \"f12d17a9-d1af-465c-a23e-81d8f09a5156\") " pod="openshift-ingress-canary/ingress-canary-g8j9n" Apr 16 22:16:30.295866 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.295873 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls\") pod \"dns-default-x4vpp\" (UID: \"6f971928-4414-4b05-9352-534e9045942e\") " pod="openshift-dns/dns-default-x4vpp" Apr 16 22:16:30.298191 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.298166 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f12d17a9-d1af-465c-a23e-81d8f09a5156-cert\") pod \"ingress-canary-g8j9n\" (UID: \"f12d17a9-d1af-465c-a23e-81d8f09a5156\") " pod="openshift-ingress-canary/ingress-canary-g8j9n" Apr 16 22:16:30.298307 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.298173 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f971928-4414-4b05-9352-534e9045942e-metrics-tls\") pod \"dns-default-x4vpp\" (UID: \"6f971928-4414-4b05-9352-534e9045942e\") " pod="openshift-dns/dns-default-x4vpp" Apr 16 22:16:30.339674 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.339648 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fk2j6\"" Apr 16 22:16:30.347831 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.347799 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-x4vpp" Apr 16 22:16:30.392229 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.392196 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-65d65d8466-j76bf"] Apr 16 22:16:30.396714 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.396622 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:16:30.399673 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.399648 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 22:16:30.399985 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.399971 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-7vb78\"" Apr 16 22:16:30.401121 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.400827 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 22:16:30.401121 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.400892 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 22:16:30.401121 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.400903 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 22:16:30.401121 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.401051 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 22:16:30.401121 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.401084 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 22:16:30.401345 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.401224 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 22:16:30.410175 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.410128 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65d65d8466-j76bf"] Apr 16 22:16:30.468154 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.468124 
2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x4vpp"] Apr 16 22:16:30.471321 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:16:30.471292 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f971928_4414_4b05_9352_534e9045942e.slice/crio-0ba31bbbc997980bcc718501d4b807d2f7576519866558a6867f7c50febb294d WatchSource:0}: Error finding container 0ba31bbbc997980bcc718501d4b807d2f7576519866558a6867f7c50febb294d: Status 404 returned error can't find the container with id 0ba31bbbc997980bcc718501d4b807d2f7576519866558a6867f7c50febb294d Apr 16 22:16:30.498122 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.498094 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cn9jb\" (UniqueName: \"kubernetes.io/projected/0e74cbe3-7fcf-4566-9104-5d85636c8296-kube-api-access-cn9jb\") pod \"console-65d65d8466-j76bf\" (UID: \"0e74cbe3-7fcf-4566-9104-5d85636c8296\") " pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:16:30.498252 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.498144 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e74cbe3-7fcf-4566-9104-5d85636c8296-service-ca\") pod \"console-65d65d8466-j76bf\" (UID: \"0e74cbe3-7fcf-4566-9104-5d85636c8296\") " pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:16:30.498252 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.498182 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0e74cbe3-7fcf-4566-9104-5d85636c8296-console-oauth-config\") pod \"console-65d65d8466-j76bf\" (UID: \"0e74cbe3-7fcf-4566-9104-5d85636c8296\") " pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:16:30.498252 ip-10-0-140-65 
kubenswrapper[2572]: I0416 22:16:30.498209 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0e74cbe3-7fcf-4566-9104-5d85636c8296-oauth-serving-cert\") pod \"console-65d65d8466-j76bf\" (UID: \"0e74cbe3-7fcf-4566-9104-5d85636c8296\") " pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:16:30.498252 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.498237 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0e74cbe3-7fcf-4566-9104-5d85636c8296-console-config\") pod \"console-65d65d8466-j76bf\" (UID: \"0e74cbe3-7fcf-4566-9104-5d85636c8296\") " pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:16:30.498422 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.498267 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e74cbe3-7fcf-4566-9104-5d85636c8296-console-serving-cert\") pod \"console-65d65d8466-j76bf\" (UID: \"0e74cbe3-7fcf-4566-9104-5d85636c8296\") " pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:16:30.548430 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.548352 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x4vpp" event={"ID":"6f971928-4414-4b05-9352-534e9045942e","Type":"ContainerStarted","Data":"0ba31bbbc997980bcc718501d4b807d2f7576519866558a6867f7c50febb294d"} Apr 16 22:16:30.598959 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.598923 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cn9jb\" (UniqueName: \"kubernetes.io/projected/0e74cbe3-7fcf-4566-9104-5d85636c8296-kube-api-access-cn9jb\") pod \"console-65d65d8466-j76bf\" (UID: \"0e74cbe3-7fcf-4566-9104-5d85636c8296\") " 
pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:16:30.599129 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.598979 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e74cbe3-7fcf-4566-9104-5d85636c8296-service-ca\") pod \"console-65d65d8466-j76bf\" (UID: \"0e74cbe3-7fcf-4566-9104-5d85636c8296\") " pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:16:30.599129 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.599014 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0e74cbe3-7fcf-4566-9104-5d85636c8296-console-oauth-config\") pod \"console-65d65d8466-j76bf\" (UID: \"0e74cbe3-7fcf-4566-9104-5d85636c8296\") " pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:16:30.599129 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.599039 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0e74cbe3-7fcf-4566-9104-5d85636c8296-oauth-serving-cert\") pod \"console-65d65d8466-j76bf\" (UID: \"0e74cbe3-7fcf-4566-9104-5d85636c8296\") " pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:16:30.599287 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.599196 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0e74cbe3-7fcf-4566-9104-5d85636c8296-console-config\") pod \"console-65d65d8466-j76bf\" (UID: \"0e74cbe3-7fcf-4566-9104-5d85636c8296\") " pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:16:30.599287 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.599236 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e74cbe3-7fcf-4566-9104-5d85636c8296-console-serving-cert\") pod 
\"console-65d65d8466-j76bf\" (UID: \"0e74cbe3-7fcf-4566-9104-5d85636c8296\") " pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:16:30.599776 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.599747 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0e74cbe3-7fcf-4566-9104-5d85636c8296-oauth-serving-cert\") pod \"console-65d65d8466-j76bf\" (UID: \"0e74cbe3-7fcf-4566-9104-5d85636c8296\") " pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:16:30.601091 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.599857 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e74cbe3-7fcf-4566-9104-5d85636c8296-service-ca\") pod \"console-65d65d8466-j76bf\" (UID: \"0e74cbe3-7fcf-4566-9104-5d85636c8296\") " pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:16:30.601091 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.600157 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0e74cbe3-7fcf-4566-9104-5d85636c8296-console-config\") pod \"console-65d65d8466-j76bf\" (UID: \"0e74cbe3-7fcf-4566-9104-5d85636c8296\") " pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:16:30.604042 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.602106 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e74cbe3-7fcf-4566-9104-5d85636c8296-console-serving-cert\") pod \"console-65d65d8466-j76bf\" (UID: \"0e74cbe3-7fcf-4566-9104-5d85636c8296\") " pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:16:30.604042 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.602126 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/0e74cbe3-7fcf-4566-9104-5d85636c8296-console-oauth-config\") pod \"console-65d65d8466-j76bf\" (UID: \"0e74cbe3-7fcf-4566-9104-5d85636c8296\") " pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:16:30.606746 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.606725 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cn9jb\" (UniqueName: \"kubernetes.io/projected/0e74cbe3-7fcf-4566-9104-5d85636c8296-kube-api-access-cn9jb\") pod \"console-65d65d8466-j76bf\" (UID: \"0e74cbe3-7fcf-4566-9104-5d85636c8296\") " pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:16:30.709760 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.709729 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:16:30.835172 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:30.835102 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65d65d8466-j76bf"] Apr 16 22:16:30.838497 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:16:30.838471 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e74cbe3_7fcf_4566_9104_5d85636c8296.slice/crio-cb40c0959c768f07b61808e469aec3bb0f9464d8181c5ddb8d46eb127315cd09 WatchSource:0}: Error finding container cb40c0959c768f07b61808e469aec3bb0f9464d8181c5ddb8d46eb127315cd09: Status 404 returned error can't find the container with id cb40c0959c768f07b61808e469aec3bb0f9464d8181c5ddb8d46eb127315cd09 Apr 16 22:16:31.552907 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:31.552865 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65d65d8466-j76bf" event={"ID":"0e74cbe3-7fcf-4566-9104-5d85636c8296","Type":"ContainerStarted","Data":"cb40c0959c768f07b61808e469aec3bb0f9464d8181c5ddb8d46eb127315cd09"} Apr 16 22:16:32.177675 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.177642 
2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:16:32.300739 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.300672 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84ffc658ff-bz9xq" podUID="3ba2875b-6c1b-4079-bff7-93fdd5a00802" containerName="acm-agent" probeResult="failure" output="Get \"http://10.134.0.7:8000/readyz\": dial tcp 10.134.0.7:8000: connect: connection refused" Apr 16 22:16:32.332614 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.332585 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-plkbt"] Apr 16 22:16:32.335860 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.335830 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-plkbt" Apr 16 22:16:32.345544 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.345453 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 22:16:32.345718 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.345568 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 22:16:32.345718 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.345603 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 22:16:32.345846 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.345751 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-p4gj5\"" Apr 16 22:16:32.345846 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.345813 2572 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 22:16:32.345933 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.345904 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 22:16:32.360385 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.360322 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-plkbt"] Apr 16 22:16:32.415604 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.415564 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46cddf16-7aa1-4bda-910d-0514af9437a9-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-plkbt\" (UID: \"46cddf16-7aa1-4bda-910d-0514af9437a9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-plkbt" Apr 16 22:16:32.415777 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.415637 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/46cddf16-7aa1-4bda-910d-0514af9437a9-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-plkbt\" (UID: \"46cddf16-7aa1-4bda-910d-0514af9437a9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-plkbt" Apr 16 22:16:32.415777 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.415669 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/46cddf16-7aa1-4bda-910d-0514af9437a9-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-plkbt\" (UID: \"46cddf16-7aa1-4bda-910d-0514af9437a9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-plkbt" Apr 16 22:16:32.415777 ip-10-0-140-65 kubenswrapper[2572]: I0416 
22:16:32.415743 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mst8c\" (UniqueName: \"kubernetes.io/projected/46cddf16-7aa1-4bda-910d-0514af9437a9-kube-api-access-mst8c\") pod \"prometheus-operator-5676c8c784-plkbt\" (UID: \"46cddf16-7aa1-4bda-910d-0514af9437a9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-plkbt" Apr 16 22:16:32.516544 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.516515 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mst8c\" (UniqueName: \"kubernetes.io/projected/46cddf16-7aa1-4bda-910d-0514af9437a9-kube-api-access-mst8c\") pod \"prometheus-operator-5676c8c784-plkbt\" (UID: \"46cddf16-7aa1-4bda-910d-0514af9437a9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-plkbt" Apr 16 22:16:32.516736 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.516671 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46cddf16-7aa1-4bda-910d-0514af9437a9-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-plkbt\" (UID: \"46cddf16-7aa1-4bda-910d-0514af9437a9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-plkbt" Apr 16 22:16:32.516736 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.516700 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/46cddf16-7aa1-4bda-910d-0514af9437a9-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-plkbt\" (UID: \"46cddf16-7aa1-4bda-910d-0514af9437a9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-plkbt" Apr 16 22:16:32.516852 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.516733 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/46cddf16-7aa1-4bda-910d-0514af9437a9-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-plkbt\" (UID: \"46cddf16-7aa1-4bda-910d-0514af9437a9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-plkbt" Apr 16 22:16:32.517466 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.517440 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46cddf16-7aa1-4bda-910d-0514af9437a9-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-plkbt\" (UID: \"46cddf16-7aa1-4bda-910d-0514af9437a9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-plkbt" Apr 16 22:16:32.519514 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.519463 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/46cddf16-7aa1-4bda-910d-0514af9437a9-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-plkbt\" (UID: \"46cddf16-7aa1-4bda-910d-0514af9437a9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-plkbt" Apr 16 22:16:32.519715 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.519698 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/46cddf16-7aa1-4bda-910d-0514af9437a9-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-plkbt\" (UID: \"46cddf16-7aa1-4bda-910d-0514af9437a9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-plkbt" Apr 16 22:16:32.524285 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.524259 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mst8c\" (UniqueName: \"kubernetes.io/projected/46cddf16-7aa1-4bda-910d-0514af9437a9-kube-api-access-mst8c\") pod \"prometheus-operator-5676c8c784-plkbt\" (UID: \"46cddf16-7aa1-4bda-910d-0514af9437a9\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-plkbt" Apr 16 22:16:32.556997 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.556967 2572 generic.go:358] "Generic (PLEG): container finished" podID="3ba2875b-6c1b-4079-bff7-93fdd5a00802" containerID="666703ff874986678676b1ffdf36d992022dba9d6d79ad17eaaf26642bcab445" exitCode=1 Apr 16 22:16:32.557385 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.557051 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84ffc658ff-bz9xq" event={"ID":"3ba2875b-6c1b-4079-bff7-93fdd5a00802","Type":"ContainerDied","Data":"666703ff874986678676b1ffdf36d992022dba9d6d79ad17eaaf26642bcab445"} Apr 16 22:16:32.557508 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.557492 2572 scope.go:117] "RemoveContainer" containerID="666703ff874986678676b1ffdf36d992022dba9d6d79ad17eaaf26642bcab445" Apr 16 22:16:32.558537 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.558517 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x4vpp" event={"ID":"6f971928-4414-4b05-9352-534e9045942e","Type":"ContainerStarted","Data":"79a364b69e37e85aaf7d5133ff07798100350bc4a50fbf3d4a10f1581c67f870"} Apr 16 22:16:32.558642 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.558544 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x4vpp" event={"ID":"6f971928-4414-4b05-9352-534e9045942e","Type":"ContainerStarted","Data":"5797585fa7c54d514d6ea827b62d65a5bde8f65543ce0dc379c0271b246ef1c1"} Apr 16 22:16:32.558688 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.558673 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-x4vpp" Apr 16 22:16:32.591946 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.591892 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-x4vpp" podStartSLOduration=129.241771595 
podStartE2EDuration="2m10.591871668s" podCreationTimestamp="2026-04-16 22:14:22 +0000 UTC" firstStartedPulling="2026-04-16 22:16:30.473155981 +0000 UTC m=+161.054138165" lastFinishedPulling="2026-04-16 22:16:31.823256019 +0000 UTC m=+162.404238238" observedRunningTime="2026-04-16 22:16:32.590846271 +0000 UTC m=+163.171828706" watchObservedRunningTime="2026-04-16 22:16:32.591871668 +0000 UTC m=+163.172853874" Apr 16 22:16:32.646180 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:32.646102 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-plkbt" Apr 16 22:16:33.680373 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:33.680349 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-plkbt"] Apr 16 22:16:33.683730 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:16:33.683702 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46cddf16_7aa1_4bda_910d_0514af9437a9.slice/crio-1e26dbce45610fa4ce8207483728e99270b7b2aa2a629695a51ffb2c395f4124 WatchSource:0}: Error finding container 1e26dbce45610fa4ce8207483728e99270b7b2aa2a629695a51ffb2c395f4124: Status 404 returned error can't find the container with id 1e26dbce45610fa4ce8207483728e99270b7b2aa2a629695a51ffb2c395f4124 Apr 16 22:16:34.565181 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:34.565142 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84ffc658ff-bz9xq" event={"ID":"3ba2875b-6c1b-4079-bff7-93fdd5a00802","Type":"ContainerStarted","Data":"1fe219a2fbfd3384b0a5e223f9c5d2557258805ed6ae4d3d77a2cd1e82489b2a"} Apr 16 22:16:34.565494 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:34.565468 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84ffc658ff-bz9xq" Apr 16 22:16:34.566149 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:34.566128 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-84ffc658ff-bz9xq" Apr 16 22:16:34.566527 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:34.566508 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65d65d8466-j76bf" event={"ID":"0e74cbe3-7fcf-4566-9104-5d85636c8296","Type":"ContainerStarted","Data":"5ccaf3587b103ee3cdcab941a3a2c74028f8283bfa391dfab79adecef2cf79e6"} Apr 16 22:16:34.567412 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:34.567394 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-plkbt" event={"ID":"46cddf16-7aa1-4bda-910d-0514af9437a9","Type":"ContainerStarted","Data":"1e26dbce45610fa4ce8207483728e99270b7b2aa2a629695a51ffb2c395f4124"} Apr 16 22:16:34.621016 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:34.620971 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-65d65d8466-j76bf" podStartSLOduration=1.8553794780000001 podStartE2EDuration="4.620958414s" podCreationTimestamp="2026-04-16 22:16:30 +0000 UTC" firstStartedPulling="2026-04-16 22:16:30.84026653 +0000 UTC m=+161.421248712" lastFinishedPulling="2026-04-16 22:16:33.605845463 +0000 UTC m=+164.186827648" observedRunningTime="2026-04-16 22:16:34.620432793 +0000 UTC m=+165.201415008" watchObservedRunningTime="2026-04-16 22:16:34.620958414 +0000 UTC m=+165.201940618" Apr 16 22:16:35.573058 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:35.572967 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-plkbt" 
event={"ID":"46cddf16-7aa1-4bda-910d-0514af9437a9","Type":"ContainerStarted","Data":"24431e5892259721d8a2c187a75964379fc4bd470a588c077740ffcfc0e4a265"} Apr 16 22:16:35.573546 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:35.573074 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-plkbt" event={"ID":"46cddf16-7aa1-4bda-910d-0514af9437a9","Type":"ContainerStarted","Data":"6e67a3a0554dd11399d833370ad656449a5a12710355b2e73626262baf1c832c"} Apr 16 22:16:35.589290 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:35.589238 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-plkbt" podStartSLOduration=1.906169066 podStartE2EDuration="3.589221941s" podCreationTimestamp="2026-04-16 22:16:32 +0000 UTC" firstStartedPulling="2026-04-16 22:16:33.686253988 +0000 UTC m=+164.267236175" lastFinishedPulling="2026-04-16 22:16:35.369306867 +0000 UTC m=+165.950289050" observedRunningTime="2026-04-16 22:16:35.58907061 +0000 UTC m=+166.170052816" watchObservedRunningTime="2026-04-16 22:16:35.589221941 +0000 UTC m=+166.170204346" Apr 16 22:16:37.037966 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.037916 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cd2s2" Apr 16 22:16:37.721390 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.721355 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-8q4ww"] Apr 16 22:16:37.724642 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.724621 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-8q4ww" Apr 16 22:16:37.727054 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.727023 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 22:16:37.727435 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.727418 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 22:16:37.728400 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.728369 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 22:16:37.728400 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.728381 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-k9855\"" Apr 16 22:16:37.730840 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.730820 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-tw4zw"] Apr 16 22:16:37.733988 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.733972 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.736292 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.736272 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 22:16:37.736402 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.736316 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 22:16:37.737008 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.736989 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-pvvh7\"" Apr 16 22:16:37.737101 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.737027 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 22:16:37.737190 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.737175 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-8q4ww"] Apr 16 22:16:37.761993 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.761966 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/dc59af4b-1aaa-4130-832a-3fd476648af6-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-8q4ww\" (UID: \"dc59af4b-1aaa-4130-832a-3fd476648af6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8q4ww" Apr 16 22:16:37.762130 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.762013 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42jrh\" (UniqueName: \"kubernetes.io/projected/dc59af4b-1aaa-4130-832a-3fd476648af6-kube-api-access-42jrh\") pod \"kube-state-metrics-69db897b98-8q4ww\" (UID: 
\"dc59af4b-1aaa-4130-832a-3fd476648af6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8q4ww" Apr 16 22:16:37.762130 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.762037 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/dc59af4b-1aaa-4130-832a-3fd476648af6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-8q4ww\" (UID: \"dc59af4b-1aaa-4130-832a-3fd476648af6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8q4ww" Apr 16 22:16:37.762130 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.762111 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dc59af4b-1aaa-4130-832a-3fd476648af6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-8q4ww\" (UID: \"dc59af4b-1aaa-4130-832a-3fd476648af6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8q4ww" Apr 16 22:16:37.762238 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.762169 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/dc59af4b-1aaa-4130-832a-3fd476648af6-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-8q4ww\" (UID: \"dc59af4b-1aaa-4130-832a-3fd476648af6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8q4ww" Apr 16 22:16:37.762238 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.762206 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dc59af4b-1aaa-4130-832a-3fd476648af6-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-8q4ww\" (UID: \"dc59af4b-1aaa-4130-832a-3fd476648af6\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-8q4ww" Apr 16 22:16:37.862909 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.862871 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bc45ba3d-3968-4f05-a932-07526eacce50-root\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.862909 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.862909 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g7mw\" (UniqueName: \"kubernetes.io/projected/bc45ba3d-3968-4f05-a932-07526eacce50-kube-api-access-6g7mw\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.863164 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.862937 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dc59af4b-1aaa-4130-832a-3fd476648af6-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-8q4ww\" (UID: \"dc59af4b-1aaa-4130-832a-3fd476648af6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8q4ww" Apr 16 22:16:37.863164 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.862970 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/dc59af4b-1aaa-4130-832a-3fd476648af6-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-8q4ww\" (UID: \"dc59af4b-1aaa-4130-832a-3fd476648af6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8q4ww" Apr 16 22:16:37.863164 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.863000 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42jrh\" (UniqueName: 
\"kubernetes.io/projected/dc59af4b-1aaa-4130-832a-3fd476648af6-kube-api-access-42jrh\") pod \"kube-state-metrics-69db897b98-8q4ww\" (UID: \"dc59af4b-1aaa-4130-832a-3fd476648af6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8q4ww" Apr 16 22:16:37.863164 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.863031 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/dc59af4b-1aaa-4130-832a-3fd476648af6-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-8q4ww\" (UID: \"dc59af4b-1aaa-4130-832a-3fd476648af6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8q4ww" Apr 16 22:16:37.863164 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.863079 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bc45ba3d-3968-4f05-a932-07526eacce50-sys\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.863164 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.863111 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bc45ba3d-3968-4f05-a932-07526eacce50-node-exporter-accelerators-collector-config\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.863164 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.863137 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dc59af4b-1aaa-4130-832a-3fd476648af6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-8q4ww\" (UID: 
\"dc59af4b-1aaa-4130-832a-3fd476648af6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8q4ww" Apr 16 22:16:37.863481 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.863199 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc45ba3d-3968-4f05-a932-07526eacce50-metrics-client-ca\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.863481 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.863241 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bc45ba3d-3968-4f05-a932-07526eacce50-node-exporter-wtmp\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.863481 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.863283 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bc45ba3d-3968-4f05-a932-07526eacce50-node-exporter-textfile\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.863481 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.863328 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/dc59af4b-1aaa-4130-832a-3fd476648af6-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-8q4ww\" (UID: \"dc59af4b-1aaa-4130-832a-3fd476648af6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8q4ww" Apr 16 22:16:37.863481 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.863353 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bc45ba3d-3968-4f05-a932-07526eacce50-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.863481 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.863389 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bc45ba3d-3968-4f05-a932-07526eacce50-node-exporter-tls\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.863481 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.863425 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/dc59af4b-1aaa-4130-832a-3fd476648af6-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-8q4ww\" (UID: \"dc59af4b-1aaa-4130-832a-3fd476648af6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8q4ww" Apr 16 22:16:37.863760 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.863744 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dc59af4b-1aaa-4130-832a-3fd476648af6-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-8q4ww\" (UID: \"dc59af4b-1aaa-4130-832a-3fd476648af6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8q4ww" Apr 16 22:16:37.863873 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.863854 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/dc59af4b-1aaa-4130-832a-3fd476648af6-kube-state-metrics-custom-resource-state-configmap\") pod 
\"kube-state-metrics-69db897b98-8q4ww\" (UID: \"dc59af4b-1aaa-4130-832a-3fd476648af6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8q4ww" Apr 16 22:16:37.865742 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.865723 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/dc59af4b-1aaa-4130-832a-3fd476648af6-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-8q4ww\" (UID: \"dc59af4b-1aaa-4130-832a-3fd476648af6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8q4ww" Apr 16 22:16:37.865782 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.865742 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/dc59af4b-1aaa-4130-832a-3fd476648af6-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-8q4ww\" (UID: \"dc59af4b-1aaa-4130-832a-3fd476648af6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8q4ww" Apr 16 22:16:37.870898 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.870878 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42jrh\" (UniqueName: \"kubernetes.io/projected/dc59af4b-1aaa-4130-832a-3fd476648af6-kube-api-access-42jrh\") pod \"kube-state-metrics-69db897b98-8q4ww\" (UID: \"dc59af4b-1aaa-4130-832a-3fd476648af6\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8q4ww" Apr 16 22:16:37.964092 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.964056 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bc45ba3d-3968-4f05-a932-07526eacce50-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.964092 ip-10-0-140-65 
kubenswrapper[2572]: I0416 22:16:37.964093 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bc45ba3d-3968-4f05-a932-07526eacce50-node-exporter-tls\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.964326 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.964116 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bc45ba3d-3968-4f05-a932-07526eacce50-root\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.964326 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.964132 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6g7mw\" (UniqueName: \"kubernetes.io/projected/bc45ba3d-3968-4f05-a932-07526eacce50-kube-api-access-6g7mw\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.964326 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.964180 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bc45ba3d-3968-4f05-a932-07526eacce50-sys\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.964326 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.964189 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bc45ba3d-3968-4f05-a932-07526eacce50-root\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.964326 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:16:37.964214 2572 
secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 22:16:37.964326 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.964200 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bc45ba3d-3968-4f05-a932-07526eacce50-node-exporter-accelerators-collector-config\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.964326 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.964274 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc45ba3d-3968-4f05-a932-07526eacce50-metrics-client-ca\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.964326 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.964282 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bc45ba3d-3968-4f05-a932-07526eacce50-sys\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.964326 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:16:37.964287 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc45ba3d-3968-4f05-a932-07526eacce50-node-exporter-tls podName:bc45ba3d-3968-4f05-a932-07526eacce50 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:38.464266438 +0000 UTC m=+169.045248635 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/bc45ba3d-3968-4f05-a932-07526eacce50-node-exporter-tls") pod "node-exporter-tw4zw" (UID: "bc45ba3d-3968-4f05-a932-07526eacce50") : secret "node-exporter-tls" not found Apr 16 22:16:37.964755 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.964342 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bc45ba3d-3968-4f05-a932-07526eacce50-node-exporter-wtmp\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.964822 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.964755 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bc45ba3d-3968-4f05-a932-07526eacce50-node-exporter-textfile\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.964905 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.964875 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bc45ba3d-3968-4f05-a932-07526eacce50-node-exporter-wtmp\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.965152 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.965126 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bc45ba3d-3968-4f05-a932-07526eacce50-metrics-client-ca\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.965567 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.965525 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bc45ba3d-3968-4f05-a932-07526eacce50-node-exporter-textfile\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.965831 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.965812 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bc45ba3d-3968-4f05-a932-07526eacce50-node-exporter-accelerators-collector-config\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.967158 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.967134 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bc45ba3d-3968-4f05-a932-07526eacce50-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:37.976782 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:37.976730 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g7mw\" (UniqueName: \"kubernetes.io/projected/bc45ba3d-3968-4f05-a932-07526eacce50-kube-api-access-6g7mw\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:38.033829 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:38.033797 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-8q4ww" Apr 16 22:16:38.159525 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:38.159361 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-8q4ww"] Apr 16 22:16:38.162001 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:16:38.161974 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc59af4b_1aaa_4130_832a_3fd476648af6.slice/crio-9e5e570285ff3348b6410c98692f37f6732755a0ec0c387022506e140be3b8d5 WatchSource:0}: Error finding container 9e5e570285ff3348b6410c98692f37f6732755a0ec0c387022506e140be3b8d5: Status 404 returned error can't find the container with id 9e5e570285ff3348b6410c98692f37f6732755a0ec0c387022506e140be3b8d5 Apr 16 22:16:38.470233 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:38.470192 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bc45ba3d-3968-4f05-a932-07526eacce50-node-exporter-tls\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:38.470417 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:16:38.470326 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 22:16:38.470417 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:16:38.470387 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bc45ba3d-3968-4f05-a932-07526eacce50-node-exporter-tls podName:bc45ba3d-3968-4f05-a932-07526eacce50 nodeName:}" failed. No retries permitted until 2026-04-16 22:16:39.470370682 +0000 UTC m=+170.051352883 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/bc45ba3d-3968-4f05-a932-07526eacce50-node-exporter-tls") pod "node-exporter-tw4zw" (UID: "bc45ba3d-3968-4f05-a932-07526eacce50") : secret "node-exporter-tls" not found Apr 16 22:16:38.581358 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:38.581325 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-8q4ww" event={"ID":"dc59af4b-1aaa-4130-832a-3fd476648af6","Type":"ContainerStarted","Data":"9e5e570285ff3348b6410c98692f37f6732755a0ec0c387022506e140be3b8d5"} Apr 16 22:16:39.479455 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:39.479413 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bc45ba3d-3968-4f05-a932-07526eacce50-node-exporter-tls\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:39.481859 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:39.481819 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bc45ba3d-3968-4f05-a932-07526eacce50-node-exporter-tls\") pod \"node-exporter-tw4zw\" (UID: \"bc45ba3d-3968-4f05-a932-07526eacce50\") " pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:39.542956 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:39.542916 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-tw4zw" Apr 16 22:16:39.551242 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:16:39.551213 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc45ba3d_3968_4f05_a932_07526eacce50.slice/crio-b5d89deb6c937561b69689181a224d9f772b4bfbfa8d73f061fb01c85000a581 WatchSource:0}: Error finding container b5d89deb6c937561b69689181a224d9f772b4bfbfa8d73f061fb01c85000a581: Status 404 returned error can't find the container with id b5d89deb6c937561b69689181a224d9f772b4bfbfa8d73f061fb01c85000a581 Apr 16 22:16:39.586648 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:39.586602 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-8q4ww" event={"ID":"dc59af4b-1aaa-4130-832a-3fd476648af6","Type":"ContainerStarted","Data":"2abb5e0d45ad976d242067645f8c198255eb3e09434708736dd119fe4c0eda95"} Apr 16 22:16:39.586648 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:39.586652 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-8q4ww" event={"ID":"dc59af4b-1aaa-4130-832a-3fd476648af6","Type":"ContainerStarted","Data":"87b09a58f75cf1d7d5736af024dfa5870995a783aa7d37287f4db4f6aa541614"} Apr 16 22:16:39.586898 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:39.586666 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-8q4ww" event={"ID":"dc59af4b-1aaa-4130-832a-3fd476648af6","Type":"ContainerStarted","Data":"df3a4e61e877c18bb52778551989214d81cd4788f50baeaf580aaeddf2a7a23c"} Apr 16 22:16:39.588347 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:39.588313 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tw4zw" 
event={"ID":"bc45ba3d-3968-4f05-a932-07526eacce50","Type":"ContainerStarted","Data":"b5d89deb6c937561b69689181a224d9f772b4bfbfa8d73f061fb01c85000a581"} Apr 16 22:16:39.619113 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:39.619061 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-8q4ww" podStartSLOduration=1.542389103 podStartE2EDuration="2.619045141s" podCreationTimestamp="2026-04-16 22:16:37 +0000 UTC" firstStartedPulling="2026-04-16 22:16:38.163741609 +0000 UTC m=+168.744723792" lastFinishedPulling="2026-04-16 22:16:39.240397644 +0000 UTC m=+169.821379830" observedRunningTime="2026-04-16 22:16:39.610667459 +0000 UTC m=+170.191649664" watchObservedRunningTime="2026-04-16 22:16:39.619045141 +0000 UTC m=+170.200027345" Apr 16 22:16:39.990105 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:39.990021 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz"] Apr 16 22:16:39.998364 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:39.998334 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:40.007314 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.007292 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 22:16:40.007425 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.007342 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 22:16:40.007425 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.007292 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 22:16:40.007522 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.007455 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 22:16:40.013321 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.013301 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 22:16:40.013321 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.013319 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-fkr10l490ke2p\"" Apr 16 22:16:40.013452 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.013378 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-qdk7h\"" Apr 16 22:16:40.037832 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.037801 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-g8j9n" Apr 16 22:16:40.048393 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.048368 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-86pmr\"" Apr 16 22:16:40.058486 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.058449 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-g8j9n" Apr 16 22:16:40.062917 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.062892 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz"] Apr 16 22:16:40.083674 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.083641 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9j86\" (UniqueName: \"kubernetes.io/projected/afd4a484-060b-40a6-bb14-b477fd906bf1-kube-api-access-b9j86\") pod \"thanos-querier-5c945fbc4f-9dvfz\" (UID: \"afd4a484-060b-40a6-bb14-b477fd906bf1\") " pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:40.083839 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.083708 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/afd4a484-060b-40a6-bb14-b477fd906bf1-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5c945fbc4f-9dvfz\" (UID: \"afd4a484-060b-40a6-bb14-b477fd906bf1\") " pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:40.083839 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.083812 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/afd4a484-060b-40a6-bb14-b477fd906bf1-secret-thanos-querier-kube-rbac-proxy-rules\") 
pod \"thanos-querier-5c945fbc4f-9dvfz\" (UID: \"afd4a484-060b-40a6-bb14-b477fd906bf1\") " pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:40.083959 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.083912 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/afd4a484-060b-40a6-bb14-b477fd906bf1-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5c945fbc4f-9dvfz\" (UID: \"afd4a484-060b-40a6-bb14-b477fd906bf1\") " pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:40.083959 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.083948 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/afd4a484-060b-40a6-bb14-b477fd906bf1-metrics-client-ca\") pod \"thanos-querier-5c945fbc4f-9dvfz\" (UID: \"afd4a484-060b-40a6-bb14-b477fd906bf1\") " pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:40.084059 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.083974 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/afd4a484-060b-40a6-bb14-b477fd906bf1-secret-grpc-tls\") pod \"thanos-querier-5c945fbc4f-9dvfz\" (UID: \"afd4a484-060b-40a6-bb14-b477fd906bf1\") " pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:40.084059 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.084021 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/afd4a484-060b-40a6-bb14-b477fd906bf1-secret-thanos-querier-tls\") pod \"thanos-querier-5c945fbc4f-9dvfz\" (UID: \"afd4a484-060b-40a6-bb14-b477fd906bf1\") " pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 
22:16:40.084142 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.084059 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/afd4a484-060b-40a6-bb14-b477fd906bf1-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5c945fbc4f-9dvfz\" (UID: \"afd4a484-060b-40a6-bb14-b477fd906bf1\") " pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:40.185299 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.185260 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/afd4a484-060b-40a6-bb14-b477fd906bf1-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5c945fbc4f-9dvfz\" (UID: \"afd4a484-060b-40a6-bb14-b477fd906bf1\") " pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:40.185452 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.185309 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/afd4a484-060b-40a6-bb14-b477fd906bf1-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5c945fbc4f-9dvfz\" (UID: \"afd4a484-060b-40a6-bb14-b477fd906bf1\") " pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:40.185452 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.185328 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/afd4a484-060b-40a6-bb14-b477fd906bf1-metrics-client-ca\") pod \"thanos-querier-5c945fbc4f-9dvfz\" (UID: \"afd4a484-060b-40a6-bb14-b477fd906bf1\") " pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:40.185452 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.185346 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/afd4a484-060b-40a6-bb14-b477fd906bf1-secret-grpc-tls\") pod \"thanos-querier-5c945fbc4f-9dvfz\" (UID: \"afd4a484-060b-40a6-bb14-b477fd906bf1\") " pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:40.185452 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.185366 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/afd4a484-060b-40a6-bb14-b477fd906bf1-secret-thanos-querier-tls\") pod \"thanos-querier-5c945fbc4f-9dvfz\" (UID: \"afd4a484-060b-40a6-bb14-b477fd906bf1\") " pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:40.185452 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.185389 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/afd4a484-060b-40a6-bb14-b477fd906bf1-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5c945fbc4f-9dvfz\" (UID: \"afd4a484-060b-40a6-bb14-b477fd906bf1\") " pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:40.185814 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.185733 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9j86\" (UniqueName: \"kubernetes.io/projected/afd4a484-060b-40a6-bb14-b477fd906bf1-kube-api-access-b9j86\") pod \"thanos-querier-5c945fbc4f-9dvfz\" (UID: \"afd4a484-060b-40a6-bb14-b477fd906bf1\") " pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:40.185872 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.185836 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/afd4a484-060b-40a6-bb14-b477fd906bf1-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5c945fbc4f-9dvfz\" (UID: \"afd4a484-060b-40a6-bb14-b477fd906bf1\") " pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:40.186126 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.186099 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/afd4a484-060b-40a6-bb14-b477fd906bf1-metrics-client-ca\") pod \"thanos-querier-5c945fbc4f-9dvfz\" (UID: \"afd4a484-060b-40a6-bb14-b477fd906bf1\") " pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:40.188542 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.188322 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/afd4a484-060b-40a6-bb14-b477fd906bf1-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5c945fbc4f-9dvfz\" (UID: \"afd4a484-060b-40a6-bb14-b477fd906bf1\") " pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:40.188675 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.188655 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/afd4a484-060b-40a6-bb14-b477fd906bf1-secret-grpc-tls\") pod \"thanos-querier-5c945fbc4f-9dvfz\" (UID: \"afd4a484-060b-40a6-bb14-b477fd906bf1\") " pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:40.188767 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.188688 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/afd4a484-060b-40a6-bb14-b477fd906bf1-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5c945fbc4f-9dvfz\" (UID: \"afd4a484-060b-40a6-bb14-b477fd906bf1\") " 
pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:40.189011 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.188980 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/afd4a484-060b-40a6-bb14-b477fd906bf1-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5c945fbc4f-9dvfz\" (UID: \"afd4a484-060b-40a6-bb14-b477fd906bf1\") " pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:40.189117 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.189076 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/afd4a484-060b-40a6-bb14-b477fd906bf1-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5c945fbc4f-9dvfz\" (UID: \"afd4a484-060b-40a6-bb14-b477fd906bf1\") " pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:40.189177 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.189129 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/afd4a484-060b-40a6-bb14-b477fd906bf1-secret-thanos-querier-tls\") pod \"thanos-querier-5c945fbc4f-9dvfz\" (UID: \"afd4a484-060b-40a6-bb14-b477fd906bf1\") " pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:40.211359 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.211328 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9j86\" (UniqueName: \"kubernetes.io/projected/afd4a484-060b-40a6-bb14-b477fd906bf1-kube-api-access-b9j86\") pod \"thanos-querier-5c945fbc4f-9dvfz\" (UID: \"afd4a484-060b-40a6-bb14-b477fd906bf1\") " pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:40.308099 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.308069 2572 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-ingress-canary/ingress-canary-g8j9n"] Apr 16 22:16:40.309830 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.308994 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:40.315234 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:16:40.315205 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf12d17a9_d1af_465c_a23e_81d8f09a5156.slice/crio-7e82408b2a3f2eae2c728bcda90b8f3e1434499642b038fcd0a33b368e27c1df WatchSource:0}: Error finding container 7e82408b2a3f2eae2c728bcda90b8f3e1434499642b038fcd0a33b368e27c1df: Status 404 returned error can't find the container with id 7e82408b2a3f2eae2c728bcda90b8f3e1434499642b038fcd0a33b368e27c1df Apr 16 22:16:40.468475 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.468442 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz"] Apr 16 22:16:40.472934 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:16:40.472901 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafd4a484_060b_40a6_bb14_b477fd906bf1.slice/crio-844c06448920c581b8240d7b4260eab7a3832a38ad1d6b7cc998e307c3a1dbf9 WatchSource:0}: Error finding container 844c06448920c581b8240d7b4260eab7a3832a38ad1d6b7cc998e307c3a1dbf9: Status 404 returned error can't find the container with id 844c06448920c581b8240d7b4260eab7a3832a38ad1d6b7cc998e307c3a1dbf9 Apr 16 22:16:40.591710 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.591625 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" event={"ID":"afd4a484-060b-40a6-bb14-b477fd906bf1","Type":"ContainerStarted","Data":"844c06448920c581b8240d7b4260eab7a3832a38ad1d6b7cc998e307c3a1dbf9"} Apr 16 22:16:40.592896 ip-10-0-140-65 kubenswrapper[2572]: 
I0416 22:16:40.592866 2572 generic.go:358] "Generic (PLEG): container finished" podID="bc45ba3d-3968-4f05-a932-07526eacce50" containerID="24513b5dbb79214d9b5fb06e782dd39623045c47db7a1e1e4d05d544d76bb63e" exitCode=0 Apr 16 22:16:40.593019 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.592962 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tw4zw" event={"ID":"bc45ba3d-3968-4f05-a932-07526eacce50","Type":"ContainerDied","Data":"24513b5dbb79214d9b5fb06e782dd39623045c47db7a1e1e4d05d544d76bb63e"} Apr 16 22:16:40.593930 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.593907 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-g8j9n" event={"ID":"f12d17a9-d1af-465c-a23e-81d8f09a5156","Type":"ContainerStarted","Data":"7e82408b2a3f2eae2c728bcda90b8f3e1434499642b038fcd0a33b368e27c1df"} Apr 16 22:16:40.721133 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.715568 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:16:40.721133 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.716382 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:16:40.732039 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:40.732016 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:16:41.599496 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:41.599457 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tw4zw" event={"ID":"bc45ba3d-3968-4f05-a932-07526eacce50","Type":"ContainerStarted","Data":"9471832e35bc20bb2161dc69269941c79c75e4002f38bebda935d3cf6716597c"} Apr 16 22:16:41.599970 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:41.599503 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-tw4zw" event={"ID":"bc45ba3d-3968-4f05-a932-07526eacce50","Type":"ContainerStarted","Data":"76870d13170910260876f0852eebdc52cee538ad04faa9a4196ecf99ff45e738"} Apr 16 22:16:41.603974 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:41.603940 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:16:41.620971 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:41.620925 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-tw4zw" podStartSLOduration=3.9538633499999998 podStartE2EDuration="4.620910361s" podCreationTimestamp="2026-04-16 22:16:37 +0000 UTC" firstStartedPulling="2026-04-16 22:16:39.553020588 +0000 UTC m=+170.134002772" lastFinishedPulling="2026-04-16 22:16:40.220067595 +0000 UTC m=+170.801049783" observedRunningTime="2026-04-16 22:16:41.619139984 +0000 UTC m=+172.200122274" watchObservedRunningTime="2026-04-16 22:16:41.620910361 +0000 UTC m=+172.201892783" Apr 16 22:16:42.563729 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:42.563705 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-x4vpp" Apr 16 22:16:43.611945 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:43.611846 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" event={"ID":"afd4a484-060b-40a6-bb14-b477fd906bf1","Type":"ContainerStarted","Data":"fdc7ace1858fd9f8f3d89c77226c0eb689f08ac0bd82f7358eb66483b92ad4b7"} Apr 16 22:16:43.611945 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:43.611890 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" event={"ID":"afd4a484-060b-40a6-bb14-b477fd906bf1","Type":"ContainerStarted","Data":"159a05cfb25fd59ed1937d8804c6cc39275551e4c8ab00d5012783fe0483a4b0"} Apr 16 22:16:43.611945 ip-10-0-140-65 kubenswrapper[2572]: 
I0416 22:16:43.611905 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" event={"ID":"afd4a484-060b-40a6-bb14-b477fd906bf1","Type":"ContainerStarted","Data":"e0454d8ce79e2f3c91bceece61b8ed167308d95eb083d1a9fbe702ee67d285b7"} Apr 16 22:16:43.611945 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:43.611918 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" event={"ID":"afd4a484-060b-40a6-bb14-b477fd906bf1","Type":"ContainerStarted","Data":"8875f8150bc1623b251db21252e02086f37539d45d38349b1a9b71255295a01f"} Apr 16 22:16:43.611945 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:43.611932 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" event={"ID":"afd4a484-060b-40a6-bb14-b477fd906bf1","Type":"ContainerStarted","Data":"806c8d6765a71e9e10012e83297a8aaa1cdf7c3c33a0c1d4225a7350dfc4eaa5"} Apr 16 22:16:43.611945 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:43.611944 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" event={"ID":"afd4a484-060b-40a6-bb14-b477fd906bf1","Type":"ContainerStarted","Data":"cfaaf194c7f6a798ad08e32cff6ef89d608ec8fb2d76c9325365bb5867e0af17"} Apr 16 22:16:43.612590 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:43.611985 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:43.613207 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:43.613187 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-g8j9n" event={"ID":"f12d17a9-d1af-465c-a23e-81d8f09a5156","Type":"ContainerStarted","Data":"ed8ef5e3eca618b79411183c904245784e99a454a20e8e42eca2d93fa40c0c9a"} Apr 16 22:16:43.639144 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:43.639097 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" podStartSLOduration=1.769481482 podStartE2EDuration="4.639084361s" podCreationTimestamp="2026-04-16 22:16:39 +0000 UTC" firstStartedPulling="2026-04-16 22:16:40.474774616 +0000 UTC m=+171.055756799" lastFinishedPulling="2026-04-16 22:16:43.344377496 +0000 UTC m=+173.925359678" observedRunningTime="2026-04-16 22:16:43.636998035 +0000 UTC m=+174.217980275" watchObservedRunningTime="2026-04-16 22:16:43.639084361 +0000 UTC m=+174.220066566" Apr 16 22:16:43.656658 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:43.656613 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-g8j9n" podStartSLOduration=139.442460402 podStartE2EDuration="2m21.656602248s" podCreationTimestamp="2026-04-16 22:14:22 +0000 UTC" firstStartedPulling="2026-04-16 22:16:40.320066498 +0000 UTC m=+170.901048680" lastFinishedPulling="2026-04-16 22:16:42.534208329 +0000 UTC m=+173.115190526" observedRunningTime="2026-04-16 22:16:43.655678803 +0000 UTC m=+174.236661009" watchObservedRunningTime="2026-04-16 22:16:43.656602248 +0000 UTC m=+174.237584452" Apr 16 22:16:44.442750 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.442718 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:16:44.447394 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.447364 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.452698 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.452676 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 22:16:44.452698 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.452688 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 22:16:44.453385 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.453354 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-v7pzr\"" Apr 16 22:16:44.453508 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.453381 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 22:16:44.453508 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.453361 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 22:16:44.453508 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.453438 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 22:16:44.453508 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.453354 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 22:16:44.454014 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.453620 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 22:16:44.454014 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.453740 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 22:16:44.454014 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.453814 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 22:16:44.454014 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.453952 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 22:16:44.454014 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.453952 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-97isi8a6u7ip\"" Apr 16 22:16:44.454367 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.454295 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 22:16:44.458461 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.458443 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 22:16:44.464690 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.464672 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 22:16:44.485460 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.485436 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:16:44.524757 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.524726 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.524903 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.524767 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.524903 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.524798 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0727cbea-6f86-4690-a1c1-bb613bedddbe-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.524903 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.524884 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0727cbea-6f86-4690-a1c1-bb613bedddbe-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.525012 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.524932 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.525012 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.524962 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.525012 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.525000 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0727cbea-6f86-4690-a1c1-bb613bedddbe-config-out\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.525132 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.525015 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-web-config\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.525132 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.525113 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.525226 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.525186 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.525266 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.525225 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.525266 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.525255 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.525342 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.525304 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-config\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.525342 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.525328 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.525435 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.525357 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: 
\"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.525435 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.525379 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.525525 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.525458 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.525525 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.525494 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgpd7\" (UniqueName: \"kubernetes.io/projected/0727cbea-6f86-4690-a1c1-bb613bedddbe-kube-api-access-jgpd7\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.626780 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.626750 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.626780 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.626786 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.627244 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.626806 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.627244 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.626829 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.627244 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.626959 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-config\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.627244 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.626992 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.627244 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.627026 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.627244 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.627054 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.627244 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.627125 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.627244 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.627154 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jgpd7\" (UniqueName: \"kubernetes.io/projected/0727cbea-6f86-4690-a1c1-bb613bedddbe-kube-api-access-jgpd7\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.627244 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.627191 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.627244 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.627225 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.627244 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.627247 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0727cbea-6f86-4690-a1c1-bb613bedddbe-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.627796 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.627286 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0727cbea-6f86-4690-a1c1-bb613bedddbe-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.627796 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.627310 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.627796 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.627343 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.627796 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.627391 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0727cbea-6f86-4690-a1c1-bb613bedddbe-config-out\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.627796 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.627412 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-web-config\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.628038 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.627801 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.628146 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.628118 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0727cbea-6f86-4690-a1c1-bb613bedddbe-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.630006 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.629978 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.630279 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.630253 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.630279 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.630268 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.630462 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.630355 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-web-config\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.630605 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.630587 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.630789 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.630766 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.631383 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.631356 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.631476 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.631445 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.632092 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.632069 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.632438 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.632413 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.632525 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.632464 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.632675 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.632655 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-config\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.632783 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.632762 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0727cbea-6f86-4690-a1c1-bb613bedddbe-config-out\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.632836 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.632823 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0727cbea-6f86-4690-a1c1-bb613bedddbe-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.633000 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.632981 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.656560 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.656507 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jgpd7\" (UniqueName: \"kubernetes.io/projected/0727cbea-6f86-4690-a1c1-bb613bedddbe-kube-api-access-jgpd7\") pod \"prometheus-k8s-0\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.761613 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.761576 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:44.910652 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:44.910616 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:16:44.914343 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:16:44.914299 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0727cbea_6f86_4690_a1c1_bb613bedddbe.slice/crio-004bfd734a5e6e39ce23dd7011a2fa71e7039864eccc4218678c50610b89a7aa WatchSource:0}: Error finding container 004bfd734a5e6e39ce23dd7011a2fa71e7039864eccc4218678c50610b89a7aa: Status 404 returned error can't find the container with id 004bfd734a5e6e39ce23dd7011a2fa71e7039864eccc4218678c50610b89a7aa Apr 16 22:16:45.620655 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:45.620613 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0727cbea-6f86-4690-a1c1-bb613bedddbe","Type":"ContainerStarted","Data":"004bfd734a5e6e39ce23dd7011a2fa71e7039864eccc4218678c50610b89a7aa"} Apr 16 22:16:46.625615 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:46.625582 2572 generic.go:358] "Generic (PLEG): container finished" podID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerID="1c194172c92f126a69105231c384f7d501de1fdb2ce8fe274f6a54bae6a9d01e" exitCode=0 Apr 16 22:16:46.625958 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:46.625644 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0727cbea-6f86-4690-a1c1-bb613bedddbe","Type":"ContainerDied","Data":"1c194172c92f126a69105231c384f7d501de1fdb2ce8fe274f6a54bae6a9d01e"} Apr 16 22:16:47.190763 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.190719 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" podUID="b1abd3ee-aae6-4e57-961f-53cc5f3d40c0" containerName="registry" containerID="cri-o://945c6813b7330d1184d6a58de764fcfae25fdfd491e1a15965df671d0c463a7a" gracePeriod=30 Apr 16 22:16:47.444029 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.443970 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:16:47.555998 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.555956 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-image-registry-private-configuration\") pod \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " Apr 16 22:16:47.556187 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.556022 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-certificates\") pod \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " Apr 16 22:16:47.556187 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.556052 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-tls\") pod \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " Apr 16 
22:16:47.556187 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.556072 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-ca-trust-extracted\") pod \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " Apr 16 22:16:47.556187 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.556119 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gmv9\" (UniqueName: \"kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-kube-api-access-9gmv9\") pod \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " Apr 16 22:16:47.556383 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.556191 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-bound-sa-token\") pod \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " Apr 16 22:16:47.556383 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.556226 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-installation-pull-secrets\") pod \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " Apr 16 22:16:47.556383 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.556262 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-trusted-ca\") pod \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\" (UID: \"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0\") " Apr 16 22:16:47.556540 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.556485 2572 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b1abd3ee-aae6-4e57-961f-53cc5f3d40c0" (UID: "b1abd3ee-aae6-4e57-961f-53cc5f3d40c0"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:16:47.557058 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.556945 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b1abd3ee-aae6-4e57-961f-53cc5f3d40c0" (UID: "b1abd3ee-aae6-4e57-961f-53cc5f3d40c0"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:16:47.559230 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.559201 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-kube-api-access-9gmv9" (OuterVolumeSpecName: "kube-api-access-9gmv9") pod "b1abd3ee-aae6-4e57-961f-53cc5f3d40c0" (UID: "b1abd3ee-aae6-4e57-961f-53cc5f3d40c0"). InnerVolumeSpecName "kube-api-access-9gmv9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:16:47.559357 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.559254 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b1abd3ee-aae6-4e57-961f-53cc5f3d40c0" (UID: "b1abd3ee-aae6-4e57-961f-53cc5f3d40c0"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:16:47.559662 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.559618 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "b1abd3ee-aae6-4e57-961f-53cc5f3d40c0" (UID: "b1abd3ee-aae6-4e57-961f-53cc5f3d40c0"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:16:47.559757 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.559686 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b1abd3ee-aae6-4e57-961f-53cc5f3d40c0" (UID: "b1abd3ee-aae6-4e57-961f-53cc5f3d40c0"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:16:47.560682 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.560658 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b1abd3ee-aae6-4e57-961f-53cc5f3d40c0" (UID: "b1abd3ee-aae6-4e57-961f-53cc5f3d40c0"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:16:47.568122 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.568091 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b1abd3ee-aae6-4e57-961f-53cc5f3d40c0" (UID: "b1abd3ee-aae6-4e57-961f-53cc5f3d40c0"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:16:47.629704 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.629670 2572 generic.go:358] "Generic (PLEG): container finished" podID="b1abd3ee-aae6-4e57-961f-53cc5f3d40c0" containerID="945c6813b7330d1184d6a58de764fcfae25fdfd491e1a15965df671d0c463a7a" exitCode=0 Apr 16 22:16:47.630165 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.629733 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" event={"ID":"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0","Type":"ContainerDied","Data":"945c6813b7330d1184d6a58de764fcfae25fdfd491e1a15965df671d0c463a7a"} Apr 16 22:16:47.630165 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.629740 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" Apr 16 22:16:47.630165 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.629772 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54f7d4b79c-8d9n5" event={"ID":"b1abd3ee-aae6-4e57-961f-53cc5f3d40c0","Type":"ContainerDied","Data":"49df4a908a1699eec3bcaee301344c952e8e794d5904cc6d9357823aeace5c9c"} Apr 16 22:16:47.630165 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.629795 2572 scope.go:117] "RemoveContainer" containerID="945c6813b7330d1184d6a58de764fcfae25fdfd491e1a15965df671d0c463a7a" Apr 16 22:16:47.639840 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.639819 2572 scope.go:117] "RemoveContainer" containerID="945c6813b7330d1184d6a58de764fcfae25fdfd491e1a15965df671d0c463a7a" Apr 16 22:16:47.640169 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:16:47.640131 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"945c6813b7330d1184d6a58de764fcfae25fdfd491e1a15965df671d0c463a7a\": container with ID starting with 
945c6813b7330d1184d6a58de764fcfae25fdfd491e1a15965df671d0c463a7a not found: ID does not exist" containerID="945c6813b7330d1184d6a58de764fcfae25fdfd491e1a15965df671d0c463a7a" Apr 16 22:16:47.640253 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.640172 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"945c6813b7330d1184d6a58de764fcfae25fdfd491e1a15965df671d0c463a7a"} err="failed to get container status \"945c6813b7330d1184d6a58de764fcfae25fdfd491e1a15965df671d0c463a7a\": rpc error: code = NotFound desc = could not find container \"945c6813b7330d1184d6a58de764fcfae25fdfd491e1a15965df671d0c463a7a\": container with ID starting with 945c6813b7330d1184d6a58de764fcfae25fdfd491e1a15965df671d0c463a7a not found: ID does not exist" Apr 16 22:16:47.657427 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.657399 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9gmv9\" (UniqueName: \"kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-kube-api-access-9gmv9\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:16:47.657427 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.657429 2572 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-bound-sa-token\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:16:47.657643 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.657447 2572 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-installation-pull-secrets\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:16:47.657643 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.657458 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-trusted-ca\") on 
node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:16:47.657643 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.657468 2572 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-image-registry-private-configuration\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:16:47.657643 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.657477 2572 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-certificates\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:16:47.657643 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.657485 2572 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-registry-tls\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:16:47.657643 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.657493 2572 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0-ca-trust-extracted\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:16:47.663401 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.663368 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-54f7d4b79c-8d9n5"] Apr 16 22:16:47.669168 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:47.669138 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-54f7d4b79c-8d9n5"] Apr 16 22:16:48.045775 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:48.045738 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1abd3ee-aae6-4e57-961f-53cc5f3d40c0" 
path="/var/lib/kubelet/pods/b1abd3ee-aae6-4e57-961f-53cc5f3d40c0/volumes" Apr 16 22:16:49.622875 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:49.622845 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5c945fbc4f-9dvfz" Apr 16 22:16:49.638982 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:49.638952 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0727cbea-6f86-4690-a1c1-bb613bedddbe","Type":"ContainerStarted","Data":"f429255b641cf8d7b1fc36b130c3e31b813e41fb04680922e8b98eefcacf89f7"} Apr 16 22:16:49.638982 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:49.638986 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0727cbea-6f86-4690-a1c1-bb613bedddbe","Type":"ContainerStarted","Data":"cd9ddd83138882deea3f02b8eb0359501123e4d5d6c4ba2f7fab6f6c472706f0"} Apr 16 22:16:49.639175 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:49.639000 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0727cbea-6f86-4690-a1c1-bb613bedddbe","Type":"ContainerStarted","Data":"1fe39ea1d17fa8b0186bdd8ce2cc10b97930655625aed5efe0a3718ef1ad845e"} Apr 16 22:16:49.639175 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:49.639011 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0727cbea-6f86-4690-a1c1-bb613bedddbe","Type":"ContainerStarted","Data":"0a6c92440df7d1cf2ed4970e0ca2562f59c01134b1001b70a1b371e6eaacf42f"} Apr 16 22:16:49.639175 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:49.639022 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0727cbea-6f86-4690-a1c1-bb613bedddbe","Type":"ContainerStarted","Data":"8a50f72cc1bb0bd16f693312a0b710d233ad521dad946231968b59e0e8b9999e"} Apr 16 22:16:49.639175 ip-10-0-140-65 
kubenswrapper[2572]: I0416 22:16:49.639032 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0727cbea-6f86-4690-a1c1-bb613bedddbe","Type":"ContainerStarted","Data":"2f5b7efd0b0376c1464d7864e6993fd16069d916d9cc06e775e06c6015f4c50d"} Apr 16 22:16:49.762079 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:49.762028 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:16:59.421951 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:59.421888 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=11.172091055 podStartE2EDuration="15.421869416s" podCreationTimestamp="2026-04-16 22:16:44 +0000 UTC" firstStartedPulling="2026-04-16 22:16:44.920113251 +0000 UTC m=+175.501095435" lastFinishedPulling="2026-04-16 22:16:49.169891612 +0000 UTC m=+179.750873796" observedRunningTime="2026-04-16 22:16:49.688433576 +0000 UTC m=+180.269415781" watchObservedRunningTime="2026-04-16 22:16:59.421869416 +0000 UTC m=+190.002851623" Apr 16 22:16:59.422653 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:16:59.422629 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-65d65d8466-j76bf"] Apr 16 22:17:18.725262 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:18.725226 2572 generic.go:358] "Generic (PLEG): container finished" podID="090695a5-dd56-4979-855d-5b1e3e681e54" containerID="9ed21d304b6c36b7b6dbc9fd66a59b49a38be0a3ea71c89ef301995a4343f872" exitCode=0 Apr 16 22:17:18.725642 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:18.725295 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6zgrq" event={"ID":"090695a5-dd56-4979-855d-5b1e3e681e54","Type":"ContainerDied","Data":"9ed21d304b6c36b7b6dbc9fd66a59b49a38be0a3ea71c89ef301995a4343f872"} Apr 16 
22:17:18.725642 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:18.725600 2572 scope.go:117] "RemoveContainer" containerID="9ed21d304b6c36b7b6dbc9fd66a59b49a38be0a3ea71c89ef301995a4343f872" Apr 16 22:17:19.730017 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:19.729983 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-6zgrq" event={"ID":"090695a5-dd56-4979-855d-5b1e3e681e54","Type":"ContainerStarted","Data":"a8f5f693ddf950db8105851e8fd44fbca211c641deae0698ee62903b57056995"} Apr 16 22:17:24.441413 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.441348 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-65d65d8466-j76bf" podUID="0e74cbe3-7fcf-4566-9104-5d85636c8296" containerName="console" containerID="cri-o://5ccaf3587b103ee3cdcab941a3a2c74028f8283bfa391dfab79adecef2cf79e6" gracePeriod=15 Apr 16 22:17:24.671939 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.671917 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65d65d8466-j76bf_0e74cbe3-7fcf-4566-9104-5d85636c8296/console/0.log" Apr 16 22:17:24.672059 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.671976 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:17:24.746022 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.745952 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65d65d8466-j76bf_0e74cbe3-7fcf-4566-9104-5d85636c8296/console/0.log" Apr 16 22:17:24.746022 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.745992 2572 generic.go:358] "Generic (PLEG): container finished" podID="0e74cbe3-7fcf-4566-9104-5d85636c8296" containerID="5ccaf3587b103ee3cdcab941a3a2c74028f8283bfa391dfab79adecef2cf79e6" exitCode=2 Apr 16 22:17:24.746192 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.746049 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65d65d8466-j76bf" event={"ID":"0e74cbe3-7fcf-4566-9104-5d85636c8296","Type":"ContainerDied","Data":"5ccaf3587b103ee3cdcab941a3a2c74028f8283bfa391dfab79adecef2cf79e6"} Apr 16 22:17:24.746192 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.746064 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-65d65d8466-j76bf" Apr 16 22:17:24.746192 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.746082 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65d65d8466-j76bf" event={"ID":"0e74cbe3-7fcf-4566-9104-5d85636c8296","Type":"ContainerDied","Data":"cb40c0959c768f07b61808e469aec3bb0f9464d8181c5ddb8d46eb127315cd09"} Apr 16 22:17:24.746192 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.746102 2572 scope.go:117] "RemoveContainer" containerID="5ccaf3587b103ee3cdcab941a3a2c74028f8283bfa391dfab79adecef2cf79e6" Apr 16 22:17:24.753565 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.753534 2572 scope.go:117] "RemoveContainer" containerID="5ccaf3587b103ee3cdcab941a3a2c74028f8283bfa391dfab79adecef2cf79e6" Apr 16 22:17:24.753821 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:17:24.753801 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ccaf3587b103ee3cdcab941a3a2c74028f8283bfa391dfab79adecef2cf79e6\": container with ID starting with 5ccaf3587b103ee3cdcab941a3a2c74028f8283bfa391dfab79adecef2cf79e6 not found: ID does not exist" containerID="5ccaf3587b103ee3cdcab941a3a2c74028f8283bfa391dfab79adecef2cf79e6" Apr 16 22:17:24.753866 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.753831 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ccaf3587b103ee3cdcab941a3a2c74028f8283bfa391dfab79adecef2cf79e6"} err="failed to get container status \"5ccaf3587b103ee3cdcab941a3a2c74028f8283bfa391dfab79adecef2cf79e6\": rpc error: code = NotFound desc = could not find container \"5ccaf3587b103ee3cdcab941a3a2c74028f8283bfa391dfab79adecef2cf79e6\": container with ID starting with 5ccaf3587b103ee3cdcab941a3a2c74028f8283bfa391dfab79adecef2cf79e6 not found: ID does not exist" Apr 16 22:17:24.770218 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.770198 2572 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0e74cbe3-7fcf-4566-9104-5d85636c8296-console-oauth-config\") pod \"0e74cbe3-7fcf-4566-9104-5d85636c8296\" (UID: \"0e74cbe3-7fcf-4566-9104-5d85636c8296\") " Apr 16 22:17:24.770294 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.770248 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e74cbe3-7fcf-4566-9104-5d85636c8296-service-ca\") pod \"0e74cbe3-7fcf-4566-9104-5d85636c8296\" (UID: \"0e74cbe3-7fcf-4566-9104-5d85636c8296\") " Apr 16 22:17:24.770294 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.770282 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cn9jb\" (UniqueName: \"kubernetes.io/projected/0e74cbe3-7fcf-4566-9104-5d85636c8296-kube-api-access-cn9jb\") pod \"0e74cbe3-7fcf-4566-9104-5d85636c8296\" (UID: \"0e74cbe3-7fcf-4566-9104-5d85636c8296\") " Apr 16 22:17:24.770369 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.770304 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e74cbe3-7fcf-4566-9104-5d85636c8296-console-serving-cert\") pod \"0e74cbe3-7fcf-4566-9104-5d85636c8296\" (UID: \"0e74cbe3-7fcf-4566-9104-5d85636c8296\") " Apr 16 22:17:24.770369 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.770332 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0e74cbe3-7fcf-4566-9104-5d85636c8296-console-config\") pod \"0e74cbe3-7fcf-4566-9104-5d85636c8296\" (UID: \"0e74cbe3-7fcf-4566-9104-5d85636c8296\") " Apr 16 22:17:24.770468 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.770386 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/0e74cbe3-7fcf-4566-9104-5d85636c8296-oauth-serving-cert\") pod \"0e74cbe3-7fcf-4566-9104-5d85636c8296\" (UID: \"0e74cbe3-7fcf-4566-9104-5d85636c8296\") " Apr 16 22:17:24.770779 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.770750 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e74cbe3-7fcf-4566-9104-5d85636c8296-service-ca" (OuterVolumeSpecName: "service-ca") pod "0e74cbe3-7fcf-4566-9104-5d85636c8296" (UID: "0e74cbe3-7fcf-4566-9104-5d85636c8296"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:24.770894 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.770750 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e74cbe3-7fcf-4566-9104-5d85636c8296-console-config" (OuterVolumeSpecName: "console-config") pod "0e74cbe3-7fcf-4566-9104-5d85636c8296" (UID: "0e74cbe3-7fcf-4566-9104-5d85636c8296"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:24.770894 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.770779 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e74cbe3-7fcf-4566-9104-5d85636c8296-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0e74cbe3-7fcf-4566-9104-5d85636c8296" (UID: "0e74cbe3-7fcf-4566-9104-5d85636c8296"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:17:24.772448 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.772423 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e74cbe3-7fcf-4566-9104-5d85636c8296-kube-api-access-cn9jb" (OuterVolumeSpecName: "kube-api-access-cn9jb") pod "0e74cbe3-7fcf-4566-9104-5d85636c8296" (UID: "0e74cbe3-7fcf-4566-9104-5d85636c8296"). InnerVolumeSpecName "kube-api-access-cn9jb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:17:24.772448 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.772434 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e74cbe3-7fcf-4566-9104-5d85636c8296-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0e74cbe3-7fcf-4566-9104-5d85636c8296" (UID: "0e74cbe3-7fcf-4566-9104-5d85636c8296"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:24.772644 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.772513 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e74cbe3-7fcf-4566-9104-5d85636c8296-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0e74cbe3-7fcf-4566-9104-5d85636c8296" (UID: "0e74cbe3-7fcf-4566-9104-5d85636c8296"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:17:24.871478 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.871450 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0e74cbe3-7fcf-4566-9104-5d85636c8296-console-oauth-config\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:17:24.871478 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.871475 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0e74cbe3-7fcf-4566-9104-5d85636c8296-service-ca\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:17:24.871661 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.871485 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cn9jb\" (UniqueName: \"kubernetes.io/projected/0e74cbe3-7fcf-4566-9104-5d85636c8296-kube-api-access-cn9jb\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:17:24.871661 ip-10-0-140-65 
kubenswrapper[2572]: I0416 22:17:24.871494 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0e74cbe3-7fcf-4566-9104-5d85636c8296-console-serving-cert\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:17:24.871661 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.871503 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0e74cbe3-7fcf-4566-9104-5d85636c8296-console-config\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:17:24.871661 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:24.871512 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0e74cbe3-7fcf-4566-9104-5d85636c8296-oauth-serving-cert\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:17:25.064356 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:25.064326 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-65d65d8466-j76bf"] Apr 16 22:17:25.067841 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:25.067813 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-65d65d8466-j76bf"] Apr 16 22:17:26.041952 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:26.041912 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e74cbe3-7fcf-4566-9104-5d85636c8296" path="/var/lib/kubelet/pods/0e74cbe3-7fcf-4566-9104-5d85636c8296/volumes" Apr 16 22:17:28.758742 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:28.758705 2572 generic.go:358] "Generic (PLEG): container finished" podID="e9277225-df0b-4b17-9847-c37f05b00c9d" containerID="425276a7b0ad6209798e2703477501d71522fbfe28f4b2cd6fbb5cad69aeb2d8" exitCode=0 Apr 16 22:17:28.759108 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:28.758780 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-operator-585dfdc468-nf8x4" event={"ID":"e9277225-df0b-4b17-9847-c37f05b00c9d","Type":"ContainerDied","Data":"425276a7b0ad6209798e2703477501d71522fbfe28f4b2cd6fbb5cad69aeb2d8"} Apr 16 22:17:28.759149 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:28.759109 2572 scope.go:117] "RemoveContainer" containerID="425276a7b0ad6209798e2703477501d71522fbfe28f4b2cd6fbb5cad69aeb2d8" Apr 16 22:17:29.763970 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:29.763935 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-nf8x4" event={"ID":"e9277225-df0b-4b17-9847-c37f05b00c9d","Type":"ContainerStarted","Data":"03e64eb38a75236b6b07fbf92500a9307073061ce37baae60e4f9ad8b061b3c8"} Apr 16 22:17:44.762139 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:44.762101 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:44.777597 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:44.777564 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:17:44.824126 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:17:44.824100 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:01.888853 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:01.888737 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs\") pod \"network-metrics-daemon-cd2s2\" (UID: \"4ffa9102-be6a-431e-b1c8-ee3b5b01e588\") " pod="openshift-multus/network-metrics-daemon-cd2s2" Apr 16 22:18:01.891035 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:01.891012 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/4ffa9102-be6a-431e-b1c8-ee3b5b01e588-metrics-certs\") pod \"network-metrics-daemon-cd2s2\" (UID: \"4ffa9102-be6a-431e-b1c8-ee3b5b01e588\") " pod="openshift-multus/network-metrics-daemon-cd2s2" Apr 16 22:18:01.941330 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:01.941297 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7dlkb\"" Apr 16 22:18:01.949416 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:01.949394 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cd2s2" Apr 16 22:18:02.068460 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:02.068393 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cd2s2"] Apr 16 22:18:02.071006 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:18:02.070974 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ffa9102_be6a_431e_b1c8_ee3b5b01e588.slice/crio-4c8cb37e690ba52df983d56fd06a1c17d5fb83ddab4a3afd6b67d7438abc0960 WatchSource:0}: Error finding container 4c8cb37e690ba52df983d56fd06a1c17d5fb83ddab4a3afd6b67d7438abc0960: Status 404 returned error can't find the container with id 4c8cb37e690ba52df983d56fd06a1c17d5fb83ddab4a3afd6b67d7438abc0960 Apr 16 22:18:02.735673 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:02.735636 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:18:02.736195 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:02.736162 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerName="prometheus" containerID="cri-o://2f5b7efd0b0376c1464d7864e6993fd16069d916d9cc06e775e06c6015f4c50d" gracePeriod=600 Apr 16 22:18:02.736358 ip-10-0-140-65 
kubenswrapper[2572]: I0416 22:18:02.736316 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerName="kube-rbac-proxy-web" containerID="cri-o://1fe39ea1d17fa8b0186bdd8ce2cc10b97930655625aed5efe0a3718ef1ad845e" gracePeriod=600 Apr 16 22:18:02.736423 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:02.736313 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerName="kube-rbac-proxy" containerID="cri-o://cd9ddd83138882deea3f02b8eb0359501123e4d5d6c4ba2f7fab6f6c472706f0" gracePeriod=600 Apr 16 22:18:02.736423 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:02.736401 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerName="thanos-sidecar" containerID="cri-o://0a6c92440df7d1cf2ed4970e0ca2562f59c01134b1001b70a1b371e6eaacf42f" gracePeriod=600 Apr 16 22:18:02.736529 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:02.736410 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerName="config-reloader" containerID="cri-o://8a50f72cc1bb0bd16f693312a0b710d233ad521dad946231968b59e0e8b9999e" gracePeriod=600 Apr 16 22:18:02.736529 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:02.736451 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerName="kube-rbac-proxy-thanos" containerID="cri-o://f429255b641cf8d7b1fc36b130c3e31b813e41fb04680922e8b98eefcacf89f7" gracePeriod=600 Apr 16 22:18:02.868021 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:02.867979 2572 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-multus/network-metrics-daemon-cd2s2" event={"ID":"4ffa9102-be6a-431e-b1c8-ee3b5b01e588","Type":"ContainerStarted","Data":"4c8cb37e690ba52df983d56fd06a1c17d5fb83ddab4a3afd6b67d7438abc0960"} Apr 16 22:18:02.871834 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:02.871797 2572 generic.go:358] "Generic (PLEG): container finished" podID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerID="f429255b641cf8d7b1fc36b130c3e31b813e41fb04680922e8b98eefcacf89f7" exitCode=0 Apr 16 22:18:02.871834 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:02.871828 2572 generic.go:358] "Generic (PLEG): container finished" podID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerID="cd9ddd83138882deea3f02b8eb0359501123e4d5d6c4ba2f7fab6f6c472706f0" exitCode=0 Apr 16 22:18:02.871834 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:02.871836 2572 generic.go:358] "Generic (PLEG): container finished" podID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerID="0a6c92440df7d1cf2ed4970e0ca2562f59c01134b1001b70a1b371e6eaacf42f" exitCode=0 Apr 16 22:18:02.871834 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:02.871844 2572 generic.go:358] "Generic (PLEG): container finished" podID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerID="8a50f72cc1bb0bd16f693312a0b710d233ad521dad946231968b59e0e8b9999e" exitCode=0 Apr 16 22:18:02.871834 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:02.871851 2572 generic.go:358] "Generic (PLEG): container finished" podID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerID="2f5b7efd0b0376c1464d7864e6993fd16069d916d9cc06e775e06c6015f4c50d" exitCode=0 Apr 16 22:18:02.873799 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:02.871883 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0727cbea-6f86-4690-a1c1-bb613bedddbe","Type":"ContainerDied","Data":"f429255b641cf8d7b1fc36b130c3e31b813e41fb04680922e8b98eefcacf89f7"} Apr 16 22:18:02.873799 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:02.871911 
2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0727cbea-6f86-4690-a1c1-bb613bedddbe","Type":"ContainerDied","Data":"cd9ddd83138882deea3f02b8eb0359501123e4d5d6c4ba2f7fab6f6c472706f0"} Apr 16 22:18:02.873799 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:02.871925 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0727cbea-6f86-4690-a1c1-bb613bedddbe","Type":"ContainerDied","Data":"0a6c92440df7d1cf2ed4970e0ca2562f59c01134b1001b70a1b371e6eaacf42f"} Apr 16 22:18:02.873799 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:02.871937 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0727cbea-6f86-4690-a1c1-bb613bedddbe","Type":"ContainerDied","Data":"8a50f72cc1bb0bd16f693312a0b710d233ad521dad946231968b59e0e8b9999e"} Apr 16 22:18:02.873799 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:02.871952 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0727cbea-6f86-4690-a1c1-bb613bedddbe","Type":"ContainerDied","Data":"2f5b7efd0b0376c1464d7864e6993fd16069d916d9cc06e775e06c6015f4c50d"} Apr 16 22:18:03.876891 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:03.876846 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cd2s2" event={"ID":"4ffa9102-be6a-431e-b1c8-ee3b5b01e588","Type":"ContainerStarted","Data":"3c727242e5034acac189e2620a7bb8840c903000496a53d26dc6cc7d1bfdc70b"} Apr 16 22:18:03.876891 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:03.876897 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cd2s2" event={"ID":"4ffa9102-be6a-431e-b1c8-ee3b5b01e588","Type":"ContainerStarted","Data":"f0ebf45772d8ebfd87e8239448e0b211c4315aa39a2331f6325a504e3ac1fe0e"} Apr 16 22:18:03.880045 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:03.880013 
2572 generic.go:358] "Generic (PLEG): container finished" podID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerID="1fe39ea1d17fa8b0186bdd8ce2cc10b97930655625aed5efe0a3718ef1ad845e" exitCode=0 Apr 16 22:18:03.880184 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:03.880066 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0727cbea-6f86-4690-a1c1-bb613bedddbe","Type":"ContainerDied","Data":"1fe39ea1d17fa8b0186bdd8ce2cc10b97930655625aed5efe0a3718ef1ad845e"} Apr 16 22:18:03.892058 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:03.891998 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-cd2s2" podStartSLOduration=252.736542611 podStartE2EDuration="4m13.891978204s" podCreationTimestamp="2026-04-16 22:13:50 +0000 UTC" firstStartedPulling="2026-04-16 22:18:02.072945152 +0000 UTC m=+252.653927335" lastFinishedPulling="2026-04-16 22:18:03.228380742 +0000 UTC m=+253.809362928" observedRunningTime="2026-04-16 22:18:03.89173201 +0000 UTC m=+254.472714220" watchObservedRunningTime="2026-04-16 22:18:03.891978204 +0000 UTC m=+254.472960409" Apr 16 22:18:04.002778 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.002754 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:04.116636 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.116525 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-kube-rbac-proxy\") pod \"0727cbea-6f86-4690-a1c1-bb613bedddbe\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " Apr 16 22:18:04.116636 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.116589 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0727cbea-6f86-4690-a1c1-bb613bedddbe-prometheus-k8s-db\") pod \"0727cbea-6f86-4690-a1c1-bb613bedddbe\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " Apr 16 22:18:04.116636 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.116607 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"0727cbea-6f86-4690-a1c1-bb613bedddbe\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " Apr 16 22:18:04.116636 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.116630 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-grpc-tls\") pod \"0727cbea-6f86-4690-a1c1-bb613bedddbe\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " Apr 16 22:18:04.116963 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.116655 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-configmap-kubelet-serving-ca-bundle\") pod \"0727cbea-6f86-4690-a1c1-bb613bedddbe\" (UID: 
\"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " Apr 16 22:18:04.116963 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.116678 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"0727cbea-6f86-4690-a1c1-bb613bedddbe\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " Apr 16 22:18:04.116963 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.116712 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-prometheus-trusted-ca-bundle\") pod \"0727cbea-6f86-4690-a1c1-bb613bedddbe\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " Apr 16 22:18:04.116963 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.116758 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-configmap-serving-certs-ca-bundle\") pod \"0727cbea-6f86-4690-a1c1-bb613bedddbe\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " Apr 16 22:18:04.116963 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.116787 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-configmap-metrics-client-ca\") pod \"0727cbea-6f86-4690-a1c1-bb613bedddbe\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " Apr 16 22:18:04.116963 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.116813 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0727cbea-6f86-4690-a1c1-bb613bedddbe-config-out\") pod \"0727cbea-6f86-4690-a1c1-bb613bedddbe\" (UID: 
\"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " Apr 16 22:18:04.116963 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.116837 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-prometheus-k8s-rulefiles-0\") pod \"0727cbea-6f86-4690-a1c1-bb613bedddbe\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " Apr 16 22:18:04.116963 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.116861 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-thanos-prometheus-http-client-file\") pod \"0727cbea-6f86-4690-a1c1-bb613bedddbe\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " Apr 16 22:18:04.116963 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.116885 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-metrics-client-certs\") pod \"0727cbea-6f86-4690-a1c1-bb613bedddbe\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " Apr 16 22:18:04.116963 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.116921 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgpd7\" (UniqueName: \"kubernetes.io/projected/0727cbea-6f86-4690-a1c1-bb613bedddbe-kube-api-access-jgpd7\") pod \"0727cbea-6f86-4690-a1c1-bb613bedddbe\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " Apr 16 22:18:04.116963 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.116953 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-config\") pod \"0727cbea-6f86-4690-a1c1-bb613bedddbe\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " Apr 16 
22:18:04.117511 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.116977 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0727cbea-6f86-4690-a1c1-bb613bedddbe-tls-assets\") pod \"0727cbea-6f86-4690-a1c1-bb613bedddbe\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " Apr 16 22:18:04.117511 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.117010 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-web-config\") pod \"0727cbea-6f86-4690-a1c1-bb613bedddbe\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " Apr 16 22:18:04.117511 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.117042 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-prometheus-k8s-tls\") pod \"0727cbea-6f86-4690-a1c1-bb613bedddbe\" (UID: \"0727cbea-6f86-4690-a1c1-bb613bedddbe\") " Apr 16 22:18:04.117511 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.117168 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "0727cbea-6f86-4690-a1c1-bb613bedddbe" (UID: "0727cbea-6f86-4690-a1c1-bb613bedddbe"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:18:04.117511 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.117218 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "0727cbea-6f86-4690-a1c1-bb613bedddbe" (UID: "0727cbea-6f86-4690-a1c1-bb613bedddbe"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:18:04.117511 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.117332 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:18:04.117511 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.117385 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-prometheus-trusted-ca-bundle\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:18:04.118244 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.117620 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "0727cbea-6f86-4690-a1c1-bb613bedddbe" (UID: "0727cbea-6f86-4690-a1c1-bb613bedddbe"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:18:04.118244 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.117779 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0727cbea-6f86-4690-a1c1-bb613bedddbe-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "0727cbea-6f86-4690-a1c1-bb613bedddbe" (UID: "0727cbea-6f86-4690-a1c1-bb613bedddbe"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:18:04.118369 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.118156 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "0727cbea-6f86-4690-a1c1-bb613bedddbe" (UID: "0727cbea-6f86-4690-a1c1-bb613bedddbe"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:18:04.119408 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.119375 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "0727cbea-6f86-4690-a1c1-bb613bedddbe" (UID: "0727cbea-6f86-4690-a1c1-bb613bedddbe"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:18:04.120743 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.120711 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "0727cbea-6f86-4690-a1c1-bb613bedddbe" (UID: "0727cbea-6f86-4690-a1c1-bb613bedddbe"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:18:04.121269 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.121243 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0727cbea-6f86-4690-a1c1-bb613bedddbe-config-out" (OuterVolumeSpecName: "config-out") pod "0727cbea-6f86-4690-a1c1-bb613bedddbe" (UID: "0727cbea-6f86-4690-a1c1-bb613bedddbe"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:18:04.121359 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.121285 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-config" (OuterVolumeSpecName: "config") pod "0727cbea-6f86-4690-a1c1-bb613bedddbe" (UID: "0727cbea-6f86-4690-a1c1-bb613bedddbe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:18:04.121359 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.121328 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "0727cbea-6f86-4690-a1c1-bb613bedddbe" (UID: "0727cbea-6f86-4690-a1c1-bb613bedddbe"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:18:04.121481 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.121371 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "0727cbea-6f86-4690-a1c1-bb613bedddbe" (UID: "0727cbea-6f86-4690-a1c1-bb613bedddbe"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:18:04.121481 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.121385 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "0727cbea-6f86-4690-a1c1-bb613bedddbe" (UID: "0727cbea-6f86-4690-a1c1-bb613bedddbe"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:18:04.121615 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.121588 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0727cbea-6f86-4690-a1c1-bb613bedddbe-kube-api-access-jgpd7" (OuterVolumeSpecName: "kube-api-access-jgpd7") pod "0727cbea-6f86-4690-a1c1-bb613bedddbe" (UID: "0727cbea-6f86-4690-a1c1-bb613bedddbe"). InnerVolumeSpecName "kube-api-access-jgpd7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:18:04.121615 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.121597 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "0727cbea-6f86-4690-a1c1-bb613bedddbe" (UID: "0727cbea-6f86-4690-a1c1-bb613bedddbe"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:18:04.121914 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.121894 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0727cbea-6f86-4690-a1c1-bb613bedddbe-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "0727cbea-6f86-4690-a1c1-bb613bedddbe" (UID: "0727cbea-6f86-4690-a1c1-bb613bedddbe"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:18:04.122222 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.122198 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "0727cbea-6f86-4690-a1c1-bb613bedddbe" (UID: "0727cbea-6f86-4690-a1c1-bb613bedddbe"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:18:04.122301 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.122225 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "0727cbea-6f86-4690-a1c1-bb613bedddbe" (UID: "0727cbea-6f86-4690-a1c1-bb613bedddbe"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:18:04.133081 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.133053 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-web-config" (OuterVolumeSpecName: "web-config") pod "0727cbea-6f86-4690-a1c1-bb613bedddbe" (UID: "0727cbea-6f86-4690-a1c1-bb613bedddbe"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:18:04.217935 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.217891 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-configmap-metrics-client-ca\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:18:04.217935 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.217927 2572 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0727cbea-6f86-4690-a1c1-bb613bedddbe-config-out\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:18:04.217935 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.217944 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:18:04.218208 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.217957 2572 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-thanos-prometheus-http-client-file\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:18:04.218208 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.217971 2572 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-metrics-client-certs\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:18:04.218208 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.217986 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jgpd7\" (UniqueName: \"kubernetes.io/projected/0727cbea-6f86-4690-a1c1-bb613bedddbe-kube-api-access-jgpd7\") on node 
\"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:18:04.218208 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.217998 2572 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-config\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:18:04.218208 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.218009 2572 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0727cbea-6f86-4690-a1c1-bb613bedddbe-tls-assets\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:18:04.218208 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.218021 2572 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-web-config\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:18:04.218208 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.218035 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-prometheus-k8s-tls\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:18:04.218208 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.218046 2572 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-kube-rbac-proxy\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:18:04.218208 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.218058 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/0727cbea-6f86-4690-a1c1-bb613bedddbe-prometheus-k8s-db\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:18:04.218208 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.218071 2572 
reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:18:04.218208 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.218084 2572 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-grpc-tls\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:18:04.218208 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.218097 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0727cbea-6f86-4690-a1c1-bb613bedddbe-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:18:04.218208 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.218109 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0727cbea-6f86-4690-a1c1-bb613bedddbe-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:18:04.885885 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.885850 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"0727cbea-6f86-4690-a1c1-bb613bedddbe","Type":"ContainerDied","Data":"004bfd734a5e6e39ce23dd7011a2fa71e7039864eccc4218678c50610b89a7aa"} Apr 16 22:18:04.885885 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.885868 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:04.886377 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.885900 2572 scope.go:117] "RemoveContainer" containerID="f429255b641cf8d7b1fc36b130c3e31b813e41fb04680922e8b98eefcacf89f7" Apr 16 22:18:04.893308 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.893278 2572 scope.go:117] "RemoveContainer" containerID="cd9ddd83138882deea3f02b8eb0359501123e4d5d6c4ba2f7fab6f6c472706f0" Apr 16 22:18:04.900470 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.900453 2572 scope.go:117] "RemoveContainer" containerID="1fe39ea1d17fa8b0186bdd8ce2cc10b97930655625aed5efe0a3718ef1ad845e" Apr 16 22:18:04.907318 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.907251 2572 scope.go:117] "RemoveContainer" containerID="0a6c92440df7d1cf2ed4970e0ca2562f59c01134b1001b70a1b371e6eaacf42f" Apr 16 22:18:04.910545 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.910511 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:18:04.920826 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.920797 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:18:04.923746 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.923723 2572 scope.go:117] "RemoveContainer" containerID="8a50f72cc1bb0bd16f693312a0b710d233ad521dad946231968b59e0e8b9999e" Apr 16 22:18:04.931255 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.931232 2572 scope.go:117] "RemoveContainer" containerID="2f5b7efd0b0376c1464d7864e6993fd16069d916d9cc06e775e06c6015f4c50d" Apr 16 22:18:04.938361 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.938325 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:18:04.938717 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.938660 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" 
containerName="init-config-reloader" Apr 16 22:18:04.938717 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.938674 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerName="init-config-reloader" Apr 16 22:18:04.938717 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.938688 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e74cbe3-7fcf-4566-9104-5d85636c8296" containerName="console" Apr 16 22:18:04.938717 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.938693 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e74cbe3-7fcf-4566-9104-5d85636c8296" containerName="console" Apr 16 22:18:04.938717 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.938700 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1abd3ee-aae6-4e57-961f-53cc5f3d40c0" containerName="registry" Apr 16 22:18:04.938717 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.938705 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1abd3ee-aae6-4e57-961f-53cc5f3d40c0" containerName="registry" Apr 16 22:18:04.938717 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.938714 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerName="kube-rbac-proxy" Apr 16 22:18:04.939057 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.938723 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerName="kube-rbac-proxy" Apr 16 22:18:04.939057 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.938733 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerName="prometheus" Apr 16 22:18:04.939057 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.938739 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerName="prometheus" 
Apr 16 22:18:04.939057 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.938747 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerName="kube-rbac-proxy-thanos" Apr 16 22:18:04.939057 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.938753 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerName="kube-rbac-proxy-thanos" Apr 16 22:18:04.939057 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.938765 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerName="thanos-sidecar" Apr 16 22:18:04.939057 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.938772 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerName="thanos-sidecar" Apr 16 22:18:04.939057 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.938782 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerName="config-reloader" Apr 16 22:18:04.939057 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.938789 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerName="config-reloader" Apr 16 22:18:04.939057 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.938792 2572 scope.go:117] "RemoveContainer" containerID="1c194172c92f126a69105231c384f7d501de1fdb2ce8fe274f6a54bae6a9d01e" Apr 16 22:18:04.939057 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.938800 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerName="kube-rbac-proxy-web" Apr 16 22:18:04.939057 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.938877 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" 
containerName="kube-rbac-proxy-web" Apr 16 22:18:04.939057 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.938967 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerName="config-reloader" Apr 16 22:18:04.939057 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.938982 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerName="kube-rbac-proxy-web" Apr 16 22:18:04.939057 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.938993 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerName="kube-rbac-proxy" Apr 16 22:18:04.939057 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.939002 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e74cbe3-7fcf-4566-9104-5d85636c8296" containerName="console" Apr 16 22:18:04.939057 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.939012 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerName="prometheus" Apr 16 22:18:04.939057 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.939020 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerName="kube-rbac-proxy-thanos" Apr 16 22:18:04.939057 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.939029 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b1abd3ee-aae6-4e57-961f-53cc5f3d40c0" containerName="registry" Apr 16 22:18:04.939057 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.939037 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" containerName="thanos-sidecar" Apr 16 22:18:04.944971 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.944946 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:04.947404 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.947384 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 22:18:04.947503 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.947484 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 22:18:04.947591 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.947523 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 22:18:04.947656 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.947604 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 22:18:04.947778 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.947761 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 22:18:04.947849 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.947784 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 22:18:04.947849 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.947760 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 22:18:04.948059 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.948041 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 22:18:04.948114 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.948098 2572 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 22:18:04.948172 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.948100 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 22:18:04.948224 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.948099 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 22:18:04.948298 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.948281 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-v7pzr\"" Apr 16 22:18:04.948524 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.948502 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-97isi8a6u7ip\"" Apr 16 22:18:04.951093 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.951070 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 22:18:04.954618 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.954598 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 22:18:04.956404 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:04.956382 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:18:05.025404 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.025366 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.025404 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.025410 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.025652 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.025430 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/491d0fec-fa89-47cd-b750-dacc85c63253-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.025652 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.025451 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/491d0fec-fa89-47cd-b750-dacc85c63253-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.025652 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.025469 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.025652 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.025486 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.025652 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.025505 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.025652 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.025521 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/491d0fec-fa89-47cd-b750-dacc85c63253-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.025652 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.025538 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/491d0fec-fa89-47cd-b750-dacc85c63253-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.025652 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.025611 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.025652 ip-10-0-140-65 
kubenswrapper[2572]: I0416 22:18:05.025651 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-web-config\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.025998 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.025680 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-config\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.025998 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.025698 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/491d0fec-fa89-47cd-b750-dacc85c63253-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.025998 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.025715 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.025998 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.025732 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/491d0fec-fa89-47cd-b750-dacc85c63253-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: 
\"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.025998 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.025748 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq9jn\" (UniqueName: \"kubernetes.io/projected/491d0fec-fa89-47cd-b750-dacc85c63253-kube-api-access-dq9jn\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.025998 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.025823 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/491d0fec-fa89-47cd-b750-dacc85c63253-config-out\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.025998 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.025867 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/491d0fec-fa89-47cd-b750-dacc85c63253-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.126852 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.126815 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.126852 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.126855 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" 
(UniqueName: \"kubernetes.io/empty-dir/491d0fec-fa89-47cd-b750-dacc85c63253-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.127092 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.126876 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/491d0fec-fa89-47cd-b750-dacc85c63253-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.127092 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.126894 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.127092 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.126912 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.127092 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.126929 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.127092 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.126947 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/491d0fec-fa89-47cd-b750-dacc85c63253-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.127092 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.126962 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/491d0fec-fa89-47cd-b750-dacc85c63253-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.127381 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.127312 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/491d0fec-fa89-47cd-b750-dacc85c63253-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.128259 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.127460 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.128259 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.127533 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-web-config\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.128259 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.127622 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-config\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.128259 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.127651 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/491d0fec-fa89-47cd-b750-dacc85c63253-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.128259 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.127679 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.128259 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.127711 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/491d0fec-fa89-47cd-b750-dacc85c63253-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.128259 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.127735 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dq9jn\" (UniqueName: \"kubernetes.io/projected/491d0fec-fa89-47cd-b750-dacc85c63253-kube-api-access-dq9jn\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.128259 ip-10-0-140-65 
kubenswrapper[2572]: I0416 22:18:05.127772 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/491d0fec-fa89-47cd-b750-dacc85c63253-config-out\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.128259 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.127808 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/491d0fec-fa89-47cd-b750-dacc85c63253-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.128259 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.127829 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/491d0fec-fa89-47cd-b750-dacc85c63253-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.128259 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.127856 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.128873 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.128650 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/491d0fec-fa89-47cd-b750-dacc85c63253-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.130417 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.130073 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/491d0fec-fa89-47cd-b750-dacc85c63253-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.130417 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.130191 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.130417 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.130253 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.130417 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.130367 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.131223 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.131199 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/491d0fec-fa89-47cd-b750-dacc85c63253-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.131361 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.131200 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.131463 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.131393 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.132111 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.131880 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/491d0fec-fa89-47cd-b750-dacc85c63253-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.132111 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.132053 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/491d0fec-fa89-47cd-b750-dacc85c63253-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.132418 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.132388 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.132611 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.132593 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/491d0fec-fa89-47cd-b750-dacc85c63253-config-out\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.132889 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.132870 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-web-config\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.133063 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.133044 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.133114 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.133093 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/491d0fec-fa89-47cd-b750-dacc85c63253-config\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.137435 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.137392 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq9jn\" (UniqueName: 
\"kubernetes.io/projected/491d0fec-fa89-47cd-b750-dacc85c63253-kube-api-access-dq9jn\") pod \"prometheus-k8s-0\" (UID: \"491d0fec-fa89-47cd-b750-dacc85c63253\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.256639 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.256592 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:05.384805 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.384770 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 22:18:05.388642 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:18:05.388581 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod491d0fec_fa89_47cd_b750_dacc85c63253.slice/crio-5cab3c41c3dc23b413d8b128b59af5bff164e842597744fb5c67b9e6616bc440 WatchSource:0}: Error finding container 5cab3c41c3dc23b413d8b128b59af5bff164e842597744fb5c67b9e6616bc440: Status 404 returned error can't find the container with id 5cab3c41c3dc23b413d8b128b59af5bff164e842597744fb5c67b9e6616bc440 Apr 16 22:18:05.890857 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.890822 2572 generic.go:358] "Generic (PLEG): container finished" podID="491d0fec-fa89-47cd-b750-dacc85c63253" containerID="71917c53958b34c281b5760bb0713506d8749f52dc6cbcaf92aaef49b358afd1" exitCode=0 Apr 16 22:18:05.891301 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.890898 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"491d0fec-fa89-47cd-b750-dacc85c63253","Type":"ContainerDied","Data":"71917c53958b34c281b5760bb0713506d8749f52dc6cbcaf92aaef49b358afd1"} Apr 16 22:18:05.891301 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:05.890927 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"491d0fec-fa89-47cd-b750-dacc85c63253","Type":"ContainerStarted","Data":"5cab3c41c3dc23b413d8b128b59af5bff164e842597744fb5c67b9e6616bc440"} Apr 16 22:18:06.042307 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:06.042272 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0727cbea-6f86-4690-a1c1-bb613bedddbe" path="/var/lib/kubelet/pods/0727cbea-6f86-4690-a1c1-bb613bedddbe/volumes" Apr 16 22:18:06.897658 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:06.897624 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"491d0fec-fa89-47cd-b750-dacc85c63253","Type":"ContainerStarted","Data":"90950d902dba866135eb730c7e4169428a7c86bc6b2c70970e4124aef9289370"} Apr 16 22:18:06.897658 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:06.897661 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"491d0fec-fa89-47cd-b750-dacc85c63253","Type":"ContainerStarted","Data":"2baaac9b83d6451b4637616912dd776bc3dff46ca9b83f72abe6ee62484a69d3"} Apr 16 22:18:06.898079 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:06.897673 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"491d0fec-fa89-47cd-b750-dacc85c63253","Type":"ContainerStarted","Data":"baf90083f791126b407e9b215f24233bf797db23a649973cdd73763385dd43de"} Apr 16 22:18:06.898079 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:06.897682 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"491d0fec-fa89-47cd-b750-dacc85c63253","Type":"ContainerStarted","Data":"5ac7af4d48fc570f47b87ccb5cb4d1364240370701a5185de66d0730b29a904f"} Apr 16 22:18:06.898079 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:06.897692 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"491d0fec-fa89-47cd-b750-dacc85c63253","Type":"ContainerStarted","Data":"23ab88e00986118d19bd0371792de4ee62cca130aa6be376dd3c022670cae144"} Apr 16 22:18:06.898079 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:06.897700 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"491d0fec-fa89-47cd-b750-dacc85c63253","Type":"ContainerStarted","Data":"b3ce377ab2c83e36cce4432cd5529285f6201499900247f72c27c4f46d6df1eb"} Apr 16 22:18:06.925251 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:06.925189 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.925173689 podStartE2EDuration="2.925173689s" podCreationTimestamp="2026-04-16 22:18:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:18:06.922190575 +0000 UTC m=+257.503172780" watchObservedRunningTime="2026-04-16 22:18:06.925173689 +0000 UTC m=+257.506155934" Apr 16 22:18:10.257230 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:10.257168 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:18:35.089530 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:35.089490 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-766wb"] Apr 16 22:18:35.092336 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:35.092317 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-766wb" Apr 16 22:18:35.094823 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:35.094795 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 22:18:35.097354 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:35.097330 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-766wb"] Apr 16 22:18:35.179888 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:35.179847 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/aa73fd77-190c-4322-afe6-9facc2e9e8ad-kubelet-config\") pod \"global-pull-secret-syncer-766wb\" (UID: \"aa73fd77-190c-4322-afe6-9facc2e9e8ad\") " pod="kube-system/global-pull-secret-syncer-766wb" Apr 16 22:18:35.179888 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:35.179888 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/aa73fd77-190c-4322-afe6-9facc2e9e8ad-original-pull-secret\") pod \"global-pull-secret-syncer-766wb\" (UID: \"aa73fd77-190c-4322-afe6-9facc2e9e8ad\") " pod="kube-system/global-pull-secret-syncer-766wb" Apr 16 22:18:35.180125 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:35.179909 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/aa73fd77-190c-4322-afe6-9facc2e9e8ad-dbus\") pod \"global-pull-secret-syncer-766wb\" (UID: \"aa73fd77-190c-4322-afe6-9facc2e9e8ad\") " pod="kube-system/global-pull-secret-syncer-766wb" Apr 16 22:18:35.280497 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:35.280455 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/aa73fd77-190c-4322-afe6-9facc2e9e8ad-kubelet-config\") pod \"global-pull-secret-syncer-766wb\" (UID: \"aa73fd77-190c-4322-afe6-9facc2e9e8ad\") " pod="kube-system/global-pull-secret-syncer-766wb" Apr 16 22:18:35.280698 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:35.280505 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/aa73fd77-190c-4322-afe6-9facc2e9e8ad-original-pull-secret\") pod \"global-pull-secret-syncer-766wb\" (UID: \"aa73fd77-190c-4322-afe6-9facc2e9e8ad\") " pod="kube-system/global-pull-secret-syncer-766wb" Apr 16 22:18:35.280698 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:35.280534 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/aa73fd77-190c-4322-afe6-9facc2e9e8ad-dbus\") pod \"global-pull-secret-syncer-766wb\" (UID: \"aa73fd77-190c-4322-afe6-9facc2e9e8ad\") " pod="kube-system/global-pull-secret-syncer-766wb" Apr 16 22:18:35.280698 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:35.280600 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/aa73fd77-190c-4322-afe6-9facc2e9e8ad-kubelet-config\") pod \"global-pull-secret-syncer-766wb\" (UID: \"aa73fd77-190c-4322-afe6-9facc2e9e8ad\") " pod="kube-system/global-pull-secret-syncer-766wb" Apr 16 22:18:35.280869 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:35.280726 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/aa73fd77-190c-4322-afe6-9facc2e9e8ad-dbus\") pod \"global-pull-secret-syncer-766wb\" (UID: \"aa73fd77-190c-4322-afe6-9facc2e9e8ad\") " pod="kube-system/global-pull-secret-syncer-766wb" Apr 16 22:18:35.282757 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:35.282733 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/aa73fd77-190c-4322-afe6-9facc2e9e8ad-original-pull-secret\") pod \"global-pull-secret-syncer-766wb\" (UID: \"aa73fd77-190c-4322-afe6-9facc2e9e8ad\") " pod="kube-system/global-pull-secret-syncer-766wb" Apr 16 22:18:35.402966 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:35.402871 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-766wb" Apr 16 22:18:35.515362 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:35.515330 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-766wb"] Apr 16 22:18:35.518647 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:18:35.518610 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa73fd77_190c_4322_afe6_9facc2e9e8ad.slice/crio-90278e774537c5b5199ef940d8ad730147ad7b085c3eb634e11419a12ada4756 WatchSource:0}: Error finding container 90278e774537c5b5199ef940d8ad730147ad7b085c3eb634e11419a12ada4756: Status 404 returned error can't find the container with id 90278e774537c5b5199ef940d8ad730147ad7b085c3eb634e11419a12ada4756 Apr 16 22:18:35.987567 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:35.987519 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-766wb" event={"ID":"aa73fd77-190c-4322-afe6-9facc2e9e8ad","Type":"ContainerStarted","Data":"90278e774537c5b5199ef940d8ad730147ad7b085c3eb634e11419a12ada4756"} Apr 16 22:18:40.000116 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:40.000076 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-766wb" event={"ID":"aa73fd77-190c-4322-afe6-9facc2e9e8ad","Type":"ContainerStarted","Data":"2e2e45ee4e602aef68da4858ffdd0b7e267915a73428018140fa9d9f32c54776"} Apr 16 22:18:40.016407 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:40.016350 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-766wb" podStartSLOduration=1.058152175 podStartE2EDuration="5.016335331s" podCreationTimestamp="2026-04-16 22:18:35 +0000 UTC" firstStartedPulling="2026-04-16 22:18:35.520284219 +0000 UTC m=+286.101266402" lastFinishedPulling="2026-04-16 22:18:39.478467362 +0000 UTC m=+290.059449558" observedRunningTime="2026-04-16 22:18:40.015148539 +0000 UTC m=+290.596130746" watchObservedRunningTime="2026-04-16 22:18:40.016335331 +0000 UTC m=+290.597317535" Apr 16 22:18:49.930046 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:18:49.930025 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 22:19:05.257108 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:19:05.257071 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:19:05.271678 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:19:05.271653 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:19:06.088572 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:19:06.088524 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 22:21:55.108605 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:21:55.108570 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-dcbl6"] Apr 16 22:21:55.111772 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:21:55.111754 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-dcbl6" Apr 16 22:21:55.114502 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:21:55.114481 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 22:21:55.114627 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:21:55.114539 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 22:21:55.114677 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:21:55.114650 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-fkqp7\"" Apr 16 22:21:55.115468 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:21:55.115451 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 22:21:55.117894 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:21:55.117874 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-dcbl6"] Apr 16 22:21:55.209290 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:21:55.209256 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqpgk\" (UniqueName: \"kubernetes.io/projected/52f8515b-400b-4d73-b5d7-12f84d118d4e-kube-api-access-lqpgk\") pod \"s3-init-dcbl6\" (UID: \"52f8515b-400b-4d73-b5d7-12f84d118d4e\") " pod="kserve/s3-init-dcbl6" Apr 16 22:21:55.309797 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:21:55.309764 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lqpgk\" (UniqueName: \"kubernetes.io/projected/52f8515b-400b-4d73-b5d7-12f84d118d4e-kube-api-access-lqpgk\") pod \"s3-init-dcbl6\" (UID: \"52f8515b-400b-4d73-b5d7-12f84d118d4e\") " pod="kserve/s3-init-dcbl6" Apr 16 22:21:55.320823 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:21:55.320791 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqpgk\" (UniqueName: 
\"kubernetes.io/projected/52f8515b-400b-4d73-b5d7-12f84d118d4e-kube-api-access-lqpgk\") pod \"s3-init-dcbl6\" (UID: \"52f8515b-400b-4d73-b5d7-12f84d118d4e\") " pod="kserve/s3-init-dcbl6" Apr 16 22:21:55.431701 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:21:55.431607 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-dcbl6" Apr 16 22:21:55.552869 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:21:55.552832 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-dcbl6"] Apr 16 22:21:55.559267 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:21:55.559241 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:21:56.561020 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:21:56.560978 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-dcbl6" event={"ID":"52f8515b-400b-4d73-b5d7-12f84d118d4e","Type":"ContainerStarted","Data":"1bceb1aaef1b774543d863fd41fe0ac747b886f90bbf114a7b6103b6aeb572f3"} Apr 16 22:22:00.575254 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:22:00.575209 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-dcbl6" event={"ID":"52f8515b-400b-4d73-b5d7-12f84d118d4e","Type":"ContainerStarted","Data":"d3f1cd4f9e8450085ab27fe0ac9879cd2d9755198e3d1d72199d3d1dc6055824"} Apr 16 22:22:00.590630 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:22:00.590581 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-dcbl6" podStartSLOduration=1.114447985 podStartE2EDuration="5.590565119s" podCreationTimestamp="2026-04-16 22:21:55 +0000 UTC" firstStartedPulling="2026-04-16 22:21:55.559364038 +0000 UTC m=+486.140346222" lastFinishedPulling="2026-04-16 22:22:00.035481157 +0000 UTC m=+490.616463356" observedRunningTime="2026-04-16 22:22:00.590109773 +0000 UTC m=+491.171091977" watchObservedRunningTime="2026-04-16 22:22:00.590565119 +0000 UTC m=+491.171547316" Apr 16 22:22:03.585537 
ip-10-0-140-65 kubenswrapper[2572]: I0416 22:22:03.585503 2572 generic.go:358] "Generic (PLEG): container finished" podID="52f8515b-400b-4d73-b5d7-12f84d118d4e" containerID="d3f1cd4f9e8450085ab27fe0ac9879cd2d9755198e3d1d72199d3d1dc6055824" exitCode=0 Apr 16 22:22:03.585930 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:22:03.585541 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-dcbl6" event={"ID":"52f8515b-400b-4d73-b5d7-12f84d118d4e","Type":"ContainerDied","Data":"d3f1cd4f9e8450085ab27fe0ac9879cd2d9755198e3d1d72199d3d1dc6055824"} Apr 16 22:22:04.715467 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:22:04.715440 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-dcbl6" Apr 16 22:22:04.794924 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:22:04.794884 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqpgk\" (UniqueName: \"kubernetes.io/projected/52f8515b-400b-4d73-b5d7-12f84d118d4e-kube-api-access-lqpgk\") pod \"52f8515b-400b-4d73-b5d7-12f84d118d4e\" (UID: \"52f8515b-400b-4d73-b5d7-12f84d118d4e\") " Apr 16 22:22:04.797107 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:22:04.797083 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52f8515b-400b-4d73-b5d7-12f84d118d4e-kube-api-access-lqpgk" (OuterVolumeSpecName: "kube-api-access-lqpgk") pod "52f8515b-400b-4d73-b5d7-12f84d118d4e" (UID: "52f8515b-400b-4d73-b5d7-12f84d118d4e"). InnerVolumeSpecName "kube-api-access-lqpgk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:22:04.895991 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:22:04.895906 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lqpgk\" (UniqueName: \"kubernetes.io/projected/52f8515b-400b-4d73-b5d7-12f84d118d4e-kube-api-access-lqpgk\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:22:05.592322 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:22:05.592293 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-dcbl6" Apr 16 22:22:05.592490 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:22:05.592296 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-dcbl6" event={"ID":"52f8515b-400b-4d73-b5d7-12f84d118d4e","Type":"ContainerDied","Data":"1bceb1aaef1b774543d863fd41fe0ac747b886f90bbf114a7b6103b6aeb572f3"} Apr 16 22:22:05.592490 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:22:05.592403 2572 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bceb1aaef1b774543d863fd41fe0ac747b886f90bbf114a7b6103b6aeb572f3" Apr 16 22:25:39.865238 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:39.865200 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8"] Apr 16 22:25:39.865836 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:39.865668 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52f8515b-400b-4d73-b5d7-12f84d118d4e" containerName="s3-init" Apr 16 22:25:39.865836 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:39.865686 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="52f8515b-400b-4d73-b5d7-12f84d118d4e" containerName="s3-init" Apr 16 22:25:39.865836 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:39.865825 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="52f8515b-400b-4d73-b5d7-12f84d118d4e" containerName="s3-init" Apr 16 22:25:39.868618 ip-10-0-140-65 
kubenswrapper[2572]: I0416 22:25:39.868598 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8" Apr 16 22:25:39.870997 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:39.870971 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 22:25:39.871109 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:39.871027 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-b6a6b-kube-rbac-proxy-sar-config\"" Apr 16 22:25:39.871109 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:39.871040 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-b6a6b-serving-cert\"" Apr 16 22:25:39.871221 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:39.871128 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-gj7r7\"" Apr 16 22:25:39.876263 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:39.876014 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8"] Apr 16 22:25:39.990201 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:39.990168 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db8537bb-fb88-4451-bf05-ad422a1a71d9-proxy-tls\") pod \"model-chainer-raw-b6a6b-c9d8fd588-kv7k8\" (UID: \"db8537bb-fb88-4451-bf05-ad422a1a71d9\") " pod="kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8" Apr 16 22:25:39.990404 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:39.990243 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/db8537bb-fb88-4451-bf05-ad422a1a71d9-openshift-service-ca-bundle\") pod \"model-chainer-raw-b6a6b-c9d8fd588-kv7k8\" (UID: \"db8537bb-fb88-4451-bf05-ad422a1a71d9\") " pod="kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8" Apr 16 22:25:40.091209 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:40.091167 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db8537bb-fb88-4451-bf05-ad422a1a71d9-proxy-tls\") pod \"model-chainer-raw-b6a6b-c9d8fd588-kv7k8\" (UID: \"db8537bb-fb88-4451-bf05-ad422a1a71d9\") " pod="kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8" Apr 16 22:25:40.091444 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:40.091243 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db8537bb-fb88-4451-bf05-ad422a1a71d9-openshift-service-ca-bundle\") pod \"model-chainer-raw-b6a6b-c9d8fd588-kv7k8\" (UID: \"db8537bb-fb88-4451-bf05-ad422a1a71d9\") " pod="kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8" Apr 16 22:25:40.091444 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:25:40.091324 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-b6a6b-serving-cert: secret "model-chainer-raw-b6a6b-serving-cert" not found Apr 16 22:25:40.091444 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:25:40.091406 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db8537bb-fb88-4451-bf05-ad422a1a71d9-proxy-tls podName:db8537bb-fb88-4451-bf05-ad422a1a71d9 nodeName:}" failed. No retries permitted until 2026-04-16 22:25:40.591384968 +0000 UTC m=+711.172367151 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/db8537bb-fb88-4451-bf05-ad422a1a71d9-proxy-tls") pod "model-chainer-raw-b6a6b-c9d8fd588-kv7k8" (UID: "db8537bb-fb88-4451-bf05-ad422a1a71d9") : secret "model-chainer-raw-b6a6b-serving-cert" not found Apr 16 22:25:40.091955 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:40.091938 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db8537bb-fb88-4451-bf05-ad422a1a71d9-openshift-service-ca-bundle\") pod \"model-chainer-raw-b6a6b-c9d8fd588-kv7k8\" (UID: \"db8537bb-fb88-4451-bf05-ad422a1a71d9\") " pod="kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8" Apr 16 22:25:40.596410 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:40.596373 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db8537bb-fb88-4451-bf05-ad422a1a71d9-proxy-tls\") pod \"model-chainer-raw-b6a6b-c9d8fd588-kv7k8\" (UID: \"db8537bb-fb88-4451-bf05-ad422a1a71d9\") " pod="kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8" Apr 16 22:25:40.598900 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:40.598877 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db8537bb-fb88-4451-bf05-ad422a1a71d9-proxy-tls\") pod \"model-chainer-raw-b6a6b-c9d8fd588-kv7k8\" (UID: \"db8537bb-fb88-4451-bf05-ad422a1a71d9\") " pod="kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8" Apr 16 22:25:40.779892 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:40.779856 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8" Apr 16 22:25:40.895480 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:40.895446 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8"] Apr 16 22:25:40.898769 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:25:40.898741 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb8537bb_fb88_4451_bf05_ad422a1a71d9.slice/crio-7344b095781655e4ff1b2eab7091c14535690103551c18e73e0035c8032e5cb1 WatchSource:0}: Error finding container 7344b095781655e4ff1b2eab7091c14535690103551c18e73e0035c8032e5cb1: Status 404 returned error can't find the container with id 7344b095781655e4ff1b2eab7091c14535690103551c18e73e0035c8032e5cb1 Apr 16 22:25:41.211858 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:41.211764 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8" event={"ID":"db8537bb-fb88-4451-bf05-ad422a1a71d9","Type":"ContainerStarted","Data":"7344b095781655e4ff1b2eab7091c14535690103551c18e73e0035c8032e5cb1"} Apr 16 22:25:44.221223 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:44.221173 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8" event={"ID":"db8537bb-fb88-4451-bf05-ad422a1a71d9","Type":"ContainerStarted","Data":"ae9bf3a839f161577839759815ba9866bdd022f70d1d97fd3e0205647edf9e79"} Apr 16 22:25:44.221686 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:44.221330 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8" Apr 16 22:25:44.236513 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:44.236466 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8" 
podStartSLOduration=2.951068486 podStartE2EDuration="5.236454522s" podCreationTimestamp="2026-04-16 22:25:39 +0000 UTC" firstStartedPulling="2026-04-16 22:25:40.90093425 +0000 UTC m=+711.481916433" lastFinishedPulling="2026-04-16 22:25:43.186320282 +0000 UTC m=+713.767302469" observedRunningTime="2026-04-16 22:25:44.235892822 +0000 UTC m=+714.816875027" watchObservedRunningTime="2026-04-16 22:25:44.236454522 +0000 UTC m=+714.817436727" Apr 16 22:25:49.925709 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:49.925678 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8"] Apr 16 22:25:49.926161 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:49.925905 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8" podUID="db8537bb-fb88-4451-bf05-ad422a1a71d9" containerName="model-chainer-raw-b6a6b" containerID="cri-o://ae9bf3a839f161577839759815ba9866bdd022f70d1d97fd3e0205647edf9e79" gracePeriod=30 Apr 16 22:25:49.932927 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:49.932898 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8" podUID="db8537bb-fb88-4451-bf05-ad422a1a71d9" containerName="model-chainer-raw-b6a6b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:25:54.930126 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:54.930082 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8" podUID="db8537bb-fb88-4451-bf05-ad422a1a71d9" containerName="model-chainer-raw-b6a6b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:25:59.930058 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:25:59.930017 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8" 
podUID="db8537bb-fb88-4451-bf05-ad422a1a71d9" containerName="model-chainer-raw-b6a6b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:26:04.930008 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:26:04.929967 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8" podUID="db8537bb-fb88-4451-bf05-ad422a1a71d9" containerName="model-chainer-raw-b6a6b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:26:09.930062 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:26:09.930021 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8" podUID="db8537bb-fb88-4451-bf05-ad422a1a71d9" containerName="model-chainer-raw-b6a6b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:26:14.930405 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:26:14.930363 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8" podUID="db8537bb-fb88-4451-bf05-ad422a1a71d9" containerName="model-chainer-raw-b6a6b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:26:19.930545 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:26:19.930509 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8" podUID="db8537bb-fb88-4451-bf05-ad422a1a71d9" containerName="model-chainer-raw-b6a6b" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:26:20.059642 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:26:20.059617 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8" Apr 16 22:26:20.132644 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:26:20.132614 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db8537bb-fb88-4451-bf05-ad422a1a71d9-proxy-tls\") pod \"db8537bb-fb88-4451-bf05-ad422a1a71d9\" (UID: \"db8537bb-fb88-4451-bf05-ad422a1a71d9\") " Apr 16 22:26:20.132809 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:26:20.132698 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db8537bb-fb88-4451-bf05-ad422a1a71d9-openshift-service-ca-bundle\") pod \"db8537bb-fb88-4451-bf05-ad422a1a71d9\" (UID: \"db8537bb-fb88-4451-bf05-ad422a1a71d9\") " Apr 16 22:26:20.133055 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:26:20.133023 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db8537bb-fb88-4451-bf05-ad422a1a71d9-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "db8537bb-fb88-4451-bf05-ad422a1a71d9" (UID: "db8537bb-fb88-4451-bf05-ad422a1a71d9"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:26:20.134771 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:26:20.134752 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8537bb-fb88-4451-bf05-ad422a1a71d9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "db8537bb-fb88-4451-bf05-ad422a1a71d9" (UID: "db8537bb-fb88-4451-bf05-ad422a1a71d9"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:26:20.233436 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:26:20.233352 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db8537bb-fb88-4451-bf05-ad422a1a71d9-openshift-service-ca-bundle\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:26:20.233436 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:26:20.233382 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/db8537bb-fb88-4451-bf05-ad422a1a71d9-proxy-tls\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:26:20.323389 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:26:20.323351 2572 generic.go:358] "Generic (PLEG): container finished" podID="db8537bb-fb88-4451-bf05-ad422a1a71d9" containerID="ae9bf3a839f161577839759815ba9866bdd022f70d1d97fd3e0205647edf9e79" exitCode=0 Apr 16 22:26:20.323571 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:26:20.323399 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8" event={"ID":"db8537bb-fb88-4451-bf05-ad422a1a71d9","Type":"ContainerDied","Data":"ae9bf3a839f161577839759815ba9866bdd022f70d1d97fd3e0205647edf9e79"} Apr 16 22:26:20.323571 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:26:20.323421 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8" event={"ID":"db8537bb-fb88-4451-bf05-ad422a1a71d9","Type":"ContainerDied","Data":"7344b095781655e4ff1b2eab7091c14535690103551c18e73e0035c8032e5cb1"} Apr 16 22:26:20.323571 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:26:20.323429 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8" Apr 16 22:26:20.323571 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:26:20.323436 2572 scope.go:117] "RemoveContainer" containerID="ae9bf3a839f161577839759815ba9866bdd022f70d1d97fd3e0205647edf9e79" Apr 16 22:26:20.331704 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:26:20.331686 2572 scope.go:117] "RemoveContainer" containerID="ae9bf3a839f161577839759815ba9866bdd022f70d1d97fd3e0205647edf9e79" Apr 16 22:26:20.331976 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:26:20.331956 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae9bf3a839f161577839759815ba9866bdd022f70d1d97fd3e0205647edf9e79\": container with ID starting with ae9bf3a839f161577839759815ba9866bdd022f70d1d97fd3e0205647edf9e79 not found: ID does not exist" containerID="ae9bf3a839f161577839759815ba9866bdd022f70d1d97fd3e0205647edf9e79" Apr 16 22:26:20.332030 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:26:20.331984 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae9bf3a839f161577839759815ba9866bdd022f70d1d97fd3e0205647edf9e79"} err="failed to get container status \"ae9bf3a839f161577839759815ba9866bdd022f70d1d97fd3e0205647edf9e79\": rpc error: code = NotFound desc = could not find container \"ae9bf3a839f161577839759815ba9866bdd022f70d1d97fd3e0205647edf9e79\": container with ID starting with ae9bf3a839f161577839759815ba9866bdd022f70d1d97fd3e0205647edf9e79 not found: ID does not exist" Apr 16 22:26:20.344291 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:26:20.344270 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8"] Apr 16 22:26:20.350251 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:26:20.350229 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-b6a6b-c9d8fd588-kv7k8"] Apr 16 
22:26:22.041900 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:26:22.041865 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db8537bb-fb88-4451-bf05-ad422a1a71d9" path="/var/lib/kubelet/pods/db8537bb-fb88-4451-bf05-ad422a1a71d9/volumes" Apr 16 22:27:20.171924 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:20.171883 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd"] Apr 16 22:27:20.172369 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:20.172196 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db8537bb-fb88-4451-bf05-ad422a1a71d9" containerName="model-chainer-raw-b6a6b" Apr 16 22:27:20.172369 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:20.172207 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="db8537bb-fb88-4451-bf05-ad422a1a71d9" containerName="model-chainer-raw-b6a6b" Apr 16 22:27:20.172369 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:20.172262 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="db8537bb-fb88-4451-bf05-ad422a1a71d9" containerName="model-chainer-raw-b6a6b" Apr 16 22:27:20.175200 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:20.175178 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" Apr 16 22:27:20.177877 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:20.177857 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-gj7r7\"" Apr 16 22:27:20.178010 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:20.177992 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 22:27:20.178824 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:20.178807 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-23b0a-serving-cert\"" Apr 16 22:27:20.178904 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:20.178822 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-23b0a-kube-rbac-proxy-sar-config\"" Apr 16 22:27:20.184214 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:20.184194 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd"] Apr 16 22:27:20.236440 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:20.236406 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31db3325-9321-4971-9bdd-69c81e27932b-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-23b0a-75756cc884-kzzpd\" (UID: \"31db3325-9321-4971-9bdd-69c81e27932b\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" Apr 16 22:27:20.236625 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:20.236451 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31db3325-9321-4971-9bdd-69c81e27932b-proxy-tls\") pod 
\"model-chainer-raw-hpa-23b0a-75756cc884-kzzpd\" (UID: \"31db3325-9321-4971-9bdd-69c81e27932b\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" Apr 16 22:27:20.337677 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:20.337641 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31db3325-9321-4971-9bdd-69c81e27932b-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-23b0a-75756cc884-kzzpd\" (UID: \"31db3325-9321-4971-9bdd-69c81e27932b\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" Apr 16 22:27:20.337677 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:20.337680 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31db3325-9321-4971-9bdd-69c81e27932b-proxy-tls\") pod \"model-chainer-raw-hpa-23b0a-75756cc884-kzzpd\" (UID: \"31db3325-9321-4971-9bdd-69c81e27932b\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" Apr 16 22:27:20.337884 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:27:20.337770 2572 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-serving-cert: secret "model-chainer-raw-hpa-23b0a-serving-cert" not found Apr 16 22:27:20.337884 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:27:20.337846 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31db3325-9321-4971-9bdd-69c81e27932b-proxy-tls podName:31db3325-9321-4971-9bdd-69c81e27932b nodeName:}" failed. No retries permitted until 2026-04-16 22:27:20.837829069 +0000 UTC m=+811.418811252 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/31db3325-9321-4971-9bdd-69c81e27932b-proxy-tls") pod "model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" (UID: "31db3325-9321-4971-9bdd-69c81e27932b") : secret "model-chainer-raw-hpa-23b0a-serving-cert" not found Apr 16 22:27:20.338260 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:20.338239 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31db3325-9321-4971-9bdd-69c81e27932b-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-23b0a-75756cc884-kzzpd\" (UID: \"31db3325-9321-4971-9bdd-69c81e27932b\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" Apr 16 22:27:20.841434 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:20.841385 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31db3325-9321-4971-9bdd-69c81e27932b-proxy-tls\") pod \"model-chainer-raw-hpa-23b0a-75756cc884-kzzpd\" (UID: \"31db3325-9321-4971-9bdd-69c81e27932b\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" Apr 16 22:27:20.843824 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:20.843802 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31db3325-9321-4971-9bdd-69c81e27932b-proxy-tls\") pod \"model-chainer-raw-hpa-23b0a-75756cc884-kzzpd\" (UID: \"31db3325-9321-4971-9bdd-69c81e27932b\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" Apr 16 22:27:21.085658 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:21.085624 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" Apr 16 22:27:21.210629 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:21.210597 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd"] Apr 16 22:27:21.213798 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:27:21.213771 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31db3325_9321_4971_9bdd_69c81e27932b.slice/crio-45f8524d0e523b59bf3dfae68f17d77017fc55a876f393fe2bbd1b4fb4e28136 WatchSource:0}: Error finding container 45f8524d0e523b59bf3dfae68f17d77017fc55a876f393fe2bbd1b4fb4e28136: Status 404 returned error can't find the container with id 45f8524d0e523b59bf3dfae68f17d77017fc55a876f393fe2bbd1b4fb4e28136 Apr 16 22:27:21.216025 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:21.216010 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:27:21.504845 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:21.504815 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" event={"ID":"31db3325-9321-4971-9bdd-69c81e27932b","Type":"ContainerStarted","Data":"936f6f7ccee576759a544ee7fe07566f0f843328dced4dd8ec1b8d2651ed0ffe"} Apr 16 22:27:21.505012 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:21.504852 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" event={"ID":"31db3325-9321-4971-9bdd-69c81e27932b","Type":"ContainerStarted","Data":"45f8524d0e523b59bf3dfae68f17d77017fc55a876f393fe2bbd1b4fb4e28136"} Apr 16 22:27:21.505012 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:21.504879 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" Apr 16 
22:27:21.522575 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:21.522519 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" podStartSLOduration=1.52250782 podStartE2EDuration="1.52250782s" podCreationTimestamp="2026-04-16 22:27:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:27:21.521020138 +0000 UTC m=+812.102002343" watchObservedRunningTime="2026-04-16 22:27:21.52250782 +0000 UTC m=+812.103490025" Apr 16 22:27:27.513257 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:27.513226 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" Apr 16 22:27:30.241523 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:30.241486 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd"] Apr 16 22:27:30.241970 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:30.241799 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" podUID="31db3325-9321-4971-9bdd-69c81e27932b" containerName="model-chainer-raw-hpa-23b0a" containerID="cri-o://936f6f7ccee576759a544ee7fe07566f0f843328dced4dd8ec1b8d2651ed0ffe" gracePeriod=30 Apr 16 22:27:32.512479 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:32.512435 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" podUID="31db3325-9321-4971-9bdd-69c81e27932b" containerName="model-chainer-raw-hpa-23b0a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:27:37.512184 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:37.512137 2572 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" podUID="31db3325-9321-4971-9bdd-69c81e27932b" containerName="model-chainer-raw-hpa-23b0a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:27:42.513596 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:42.513534 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" podUID="31db3325-9321-4971-9bdd-69c81e27932b" containerName="model-chainer-raw-hpa-23b0a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:27:42.513987 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:42.513689 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" Apr 16 22:27:47.512122 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:47.512079 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" podUID="31db3325-9321-4971-9bdd-69c81e27932b" containerName="model-chainer-raw-hpa-23b0a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:27:52.511645 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:52.511606 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" podUID="31db3325-9321-4971-9bdd-69c81e27932b" containerName="model-chainer-raw-hpa-23b0a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:27:57.512519 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:27:57.512474 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" podUID="31db3325-9321-4971-9bdd-69c81e27932b" containerName="model-chainer-raw-hpa-23b0a" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 22:28:00.284002 ip-10-0-140-65 kubenswrapper[2572]: E0416 
22:28:00.283960 2572 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31db3325_9321_4971_9bdd_69c81e27932b.slice/crio-conmon-936f6f7ccee576759a544ee7fe07566f0f843328dced4dd8ec1b8d2651ed0ffe.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31db3325_9321_4971_9bdd_69c81e27932b.slice/crio-936f6f7ccee576759a544ee7fe07566f0f843328dced4dd8ec1b8d2651ed0ffe.scope\": RecentStats: unable to find data in memory cache]" Apr 16 22:28:00.387625 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:28:00.387601 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" Apr 16 22:28:00.566814 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:28:00.566786 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31db3325-9321-4971-9bdd-69c81e27932b-openshift-service-ca-bundle\") pod \"31db3325-9321-4971-9bdd-69c81e27932b\" (UID: \"31db3325-9321-4971-9bdd-69c81e27932b\") " Apr 16 22:28:00.566984 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:28:00.566821 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31db3325-9321-4971-9bdd-69c81e27932b-proxy-tls\") pod \"31db3325-9321-4971-9bdd-69c81e27932b\" (UID: \"31db3325-9321-4971-9bdd-69c81e27932b\") " Apr 16 22:28:00.567198 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:28:00.567171 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31db3325-9321-4971-9bdd-69c81e27932b-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "31db3325-9321-4971-9bdd-69c81e27932b" (UID: "31db3325-9321-4971-9bdd-69c81e27932b"). 
InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:28:00.568902 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:28:00.568880 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31db3325-9321-4971-9bdd-69c81e27932b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31db3325-9321-4971-9bdd-69c81e27932b" (UID: "31db3325-9321-4971-9bdd-69c81e27932b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:28:00.626987 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:28:00.626955 2572 generic.go:358] "Generic (PLEG): container finished" podID="31db3325-9321-4971-9bdd-69c81e27932b" containerID="936f6f7ccee576759a544ee7fe07566f0f843328dced4dd8ec1b8d2651ed0ffe" exitCode=0 Apr 16 22:28:00.627153 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:28:00.627012 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" event={"ID":"31db3325-9321-4971-9bdd-69c81e27932b","Type":"ContainerDied","Data":"936f6f7ccee576759a544ee7fe07566f0f843328dced4dd8ec1b8d2651ed0ffe"} Apr 16 22:28:00.627153 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:28:00.627025 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" Apr 16 22:28:00.627153 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:28:00.627040 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd" event={"ID":"31db3325-9321-4971-9bdd-69c81e27932b","Type":"ContainerDied","Data":"45f8524d0e523b59bf3dfae68f17d77017fc55a876f393fe2bbd1b4fb4e28136"} Apr 16 22:28:00.627153 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:28:00.627056 2572 scope.go:117] "RemoveContainer" containerID="936f6f7ccee576759a544ee7fe07566f0f843328dced4dd8ec1b8d2651ed0ffe" Apr 16 22:28:00.635775 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:28:00.635753 2572 scope.go:117] "RemoveContainer" containerID="936f6f7ccee576759a544ee7fe07566f0f843328dced4dd8ec1b8d2651ed0ffe" Apr 16 22:28:00.636019 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:28:00.635996 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"936f6f7ccee576759a544ee7fe07566f0f843328dced4dd8ec1b8d2651ed0ffe\": container with ID starting with 936f6f7ccee576759a544ee7fe07566f0f843328dced4dd8ec1b8d2651ed0ffe not found: ID does not exist" containerID="936f6f7ccee576759a544ee7fe07566f0f843328dced4dd8ec1b8d2651ed0ffe" Apr 16 22:28:00.636083 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:28:00.636027 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"936f6f7ccee576759a544ee7fe07566f0f843328dced4dd8ec1b8d2651ed0ffe"} err="failed to get container status \"936f6f7ccee576759a544ee7fe07566f0f843328dced4dd8ec1b8d2651ed0ffe\": rpc error: code = NotFound desc = could not find container \"936f6f7ccee576759a544ee7fe07566f0f843328dced4dd8ec1b8d2651ed0ffe\": container with ID starting with 936f6f7ccee576759a544ee7fe07566f0f843328dced4dd8ec1b8d2651ed0ffe not found: ID does not exist" Apr 16 22:28:00.648682 ip-10-0-140-65 kubenswrapper[2572]: 
I0416 22:28:00.648653 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd"] Apr 16 22:28:00.652755 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:28:00.652735 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-23b0a-75756cc884-kzzpd"] Apr 16 22:28:00.667422 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:28:00.667395 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31db3325-9321-4971-9bdd-69c81e27932b-openshift-service-ca-bundle\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:28:00.667524 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:28:00.667422 2572 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31db3325-9321-4971-9bdd-69c81e27932b-proxy-tls\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:28:02.042113 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:28:02.042078 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31db3325-9321-4971-9bdd-69c81e27932b" path="/var/lib/kubelet/pods/31db3325-9321-4971-9bdd-69c81e27932b/volumes" Apr 16 22:36:10.842630 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:10.842535 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kzl9q/must-gather-sw8p4"] Apr 16 22:36:10.843090 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:10.842844 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="31db3325-9321-4971-9bdd-69c81e27932b" containerName="model-chainer-raw-hpa-23b0a" Apr 16 22:36:10.843090 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:10.842854 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="31db3325-9321-4971-9bdd-69c81e27932b" containerName="model-chainer-raw-hpa-23b0a" Apr 16 22:36:10.843090 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:10.842919 2572 
memory_manager.go:356] "RemoveStaleState removing state" podUID="31db3325-9321-4971-9bdd-69c81e27932b" containerName="model-chainer-raw-hpa-23b0a" Apr 16 22:36:10.845921 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:10.845900 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kzl9q/must-gather-sw8p4" Apr 16 22:36:10.848775 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:10.848750 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kzl9q\"/\"openshift-service-ca.crt\"" Apr 16 22:36:10.848894 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:10.848776 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kzl9q\"/\"kube-root-ca.crt\"" Apr 16 22:36:10.848894 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:10.848792 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kzl9q\"/\"default-dockercfg-q27z6\"" Apr 16 22:36:10.854604 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:10.854578 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kzl9q/must-gather-sw8p4"] Apr 16 22:36:10.949957 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:10.949920 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d9c03ea7-bfdc-43b7-9b20-efbb307733ba-must-gather-output\") pod \"must-gather-sw8p4\" (UID: \"d9c03ea7-bfdc-43b7-9b20-efbb307733ba\") " pod="openshift-must-gather-kzl9q/must-gather-sw8p4" Apr 16 22:36:10.949957 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:10.949961 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42ds2\" (UniqueName: \"kubernetes.io/projected/d9c03ea7-bfdc-43b7-9b20-efbb307733ba-kube-api-access-42ds2\") pod \"must-gather-sw8p4\" (UID: 
\"d9c03ea7-bfdc-43b7-9b20-efbb307733ba\") " pod="openshift-must-gather-kzl9q/must-gather-sw8p4" Apr 16 22:36:11.050829 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:11.050793 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d9c03ea7-bfdc-43b7-9b20-efbb307733ba-must-gather-output\") pod \"must-gather-sw8p4\" (UID: \"d9c03ea7-bfdc-43b7-9b20-efbb307733ba\") " pod="openshift-must-gather-kzl9q/must-gather-sw8p4" Apr 16 22:36:11.051006 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:11.050836 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42ds2\" (UniqueName: \"kubernetes.io/projected/d9c03ea7-bfdc-43b7-9b20-efbb307733ba-kube-api-access-42ds2\") pod \"must-gather-sw8p4\" (UID: \"d9c03ea7-bfdc-43b7-9b20-efbb307733ba\") " pod="openshift-must-gather-kzl9q/must-gather-sw8p4" Apr 16 22:36:11.051217 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:11.051195 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d9c03ea7-bfdc-43b7-9b20-efbb307733ba-must-gather-output\") pod \"must-gather-sw8p4\" (UID: \"d9c03ea7-bfdc-43b7-9b20-efbb307733ba\") " pod="openshift-must-gather-kzl9q/must-gather-sw8p4" Apr 16 22:36:11.058675 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:11.058649 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42ds2\" (UniqueName: \"kubernetes.io/projected/d9c03ea7-bfdc-43b7-9b20-efbb307733ba-kube-api-access-42ds2\") pod \"must-gather-sw8p4\" (UID: \"d9c03ea7-bfdc-43b7-9b20-efbb307733ba\") " pod="openshift-must-gather-kzl9q/must-gather-sw8p4" Apr 16 22:36:11.171831 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:11.171739 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kzl9q/must-gather-sw8p4" Apr 16 22:36:11.285653 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:11.285622 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kzl9q/must-gather-sw8p4"] Apr 16 22:36:11.289421 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:36:11.289390 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9c03ea7_bfdc_43b7_9b20_efbb307733ba.slice/crio-8e2fbb9e63a70baacaf74cfa1393ea28b2fa3e11903e102b3a0fa0380fecfe42 WatchSource:0}: Error finding container 8e2fbb9e63a70baacaf74cfa1393ea28b2fa3e11903e102b3a0fa0380fecfe42: Status 404 returned error can't find the container with id 8e2fbb9e63a70baacaf74cfa1393ea28b2fa3e11903e102b3a0fa0380fecfe42 Apr 16 22:36:11.291075 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:11.291059 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:36:12.048653 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:12.048614 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kzl9q/must-gather-sw8p4" event={"ID":"d9c03ea7-bfdc-43b7-9b20-efbb307733ba","Type":"ContainerStarted","Data":"8e2fbb9e63a70baacaf74cfa1393ea28b2fa3e11903e102b3a0fa0380fecfe42"} Apr 16 22:36:16.062928 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:16.062871 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kzl9q/must-gather-sw8p4" event={"ID":"d9c03ea7-bfdc-43b7-9b20-efbb307733ba","Type":"ContainerStarted","Data":"83cb838d20f1239abe2b5fec1cff37b7367153bc9091c99017edab21f7e201f7"} Apr 16 22:36:17.068589 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:17.068530 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kzl9q/must-gather-sw8p4" 
event={"ID":"d9c03ea7-bfdc-43b7-9b20-efbb307733ba","Type":"ContainerStarted","Data":"bfb595d88bf092a083c4a3d491f6157dd2dff1e9506e29aaf6ec2e08046f4e9d"} Apr 16 22:36:17.085149 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:17.085085 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kzl9q/must-gather-sw8p4" podStartSLOduration=2.519615585 podStartE2EDuration="7.08506596s" podCreationTimestamp="2026-04-16 22:36:10 +0000 UTC" firstStartedPulling="2026-04-16 22:36:11.291186006 +0000 UTC m=+1341.872168188" lastFinishedPulling="2026-04-16 22:36:15.856636378 +0000 UTC m=+1346.437618563" observedRunningTime="2026-04-16 22:36:17.083973723 +0000 UTC m=+1347.664955932" watchObservedRunningTime="2026-04-16 22:36:17.08506596 +0000 UTC m=+1347.666048165" Apr 16 22:36:34.125575 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:34.125472 2572 generic.go:358] "Generic (PLEG): container finished" podID="d9c03ea7-bfdc-43b7-9b20-efbb307733ba" containerID="83cb838d20f1239abe2b5fec1cff37b7367153bc9091c99017edab21f7e201f7" exitCode=0 Apr 16 22:36:34.125575 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:34.125545 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kzl9q/must-gather-sw8p4" event={"ID":"d9c03ea7-bfdc-43b7-9b20-efbb307733ba","Type":"ContainerDied","Data":"83cb838d20f1239abe2b5fec1cff37b7367153bc9091c99017edab21f7e201f7"} Apr 16 22:36:34.125997 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:34.125854 2572 scope.go:117] "RemoveContainer" containerID="83cb838d20f1239abe2b5fec1cff37b7367153bc9091c99017edab21f7e201f7" Apr 16 22:36:34.869138 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:34.869100 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kzl9q_must-gather-sw8p4_d9c03ea7-bfdc-43b7-9b20-efbb307733ba/gather/0.log" Apr 16 22:36:38.025358 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:38.025326 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-766wb_aa73fd77-190c-4322-afe6-9facc2e9e8ad/global-pull-secret-syncer/0.log" Apr 16 22:36:38.226687 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:38.226658 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-fdq6c_33fba0e3-6c1e-4bb4-a2e7-db58d15d18a4/konnectivity-agent/0.log" Apr 16 22:36:38.273700 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:38.273667 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-65.ec2.internal_cb37a96c125ac85aff486f553494b98e/haproxy/0.log" Apr 16 22:36:40.247328 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:40.247297 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kzl9q/must-gather-sw8p4"] Apr 16 22:36:40.247811 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:40.247525 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-kzl9q/must-gather-sw8p4" podUID="d9c03ea7-bfdc-43b7-9b20-efbb307733ba" containerName="copy" containerID="cri-o://bfb595d88bf092a083c4a3d491f6157dd2dff1e9506e29aaf6ec2e08046f4e9d" gracePeriod=2 Apr 16 22:36:40.249955 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:40.249918 2572 status_manager.go:895] "Failed to get status for pod" podUID="d9c03ea7-bfdc-43b7-9b20-efbb307733ba" pod="openshift-must-gather-kzl9q/must-gather-sw8p4" err="pods \"must-gather-sw8p4\" is forbidden: User \"system:node:ip-10-0-140-65.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kzl9q\": no relationship found between node 'ip-10-0-140-65.ec2.internal' and this object" Apr 16 22:36:40.251834 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:40.251233 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kzl9q/must-gather-sw8p4"] Apr 16 22:36:40.470377 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:40.470350 2572 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-must-gather-kzl9q_must-gather-sw8p4_d9c03ea7-bfdc-43b7-9b20-efbb307733ba/copy/0.log" Apr 16 22:36:40.470738 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:40.470717 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kzl9q/must-gather-sw8p4" Apr 16 22:36:40.473444 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:40.473419 2572 status_manager.go:895] "Failed to get status for pod" podUID="d9c03ea7-bfdc-43b7-9b20-efbb307733ba" pod="openshift-must-gather-kzl9q/must-gather-sw8p4" err="pods \"must-gather-sw8p4\" is forbidden: User \"system:node:ip-10-0-140-65.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kzl9q\": no relationship found between node 'ip-10-0-140-65.ec2.internal' and this object" Apr 16 22:36:40.619272 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:40.619239 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42ds2\" (UniqueName: \"kubernetes.io/projected/d9c03ea7-bfdc-43b7-9b20-efbb307733ba-kube-api-access-42ds2\") pod \"d9c03ea7-bfdc-43b7-9b20-efbb307733ba\" (UID: \"d9c03ea7-bfdc-43b7-9b20-efbb307733ba\") " Apr 16 22:36:40.619416 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:40.619391 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d9c03ea7-bfdc-43b7-9b20-efbb307733ba-must-gather-output\") pod \"d9c03ea7-bfdc-43b7-9b20-efbb307733ba\" (UID: \"d9c03ea7-bfdc-43b7-9b20-efbb307733ba\") " Apr 16 22:36:40.620635 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:40.620606 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9c03ea7-bfdc-43b7-9b20-efbb307733ba-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d9c03ea7-bfdc-43b7-9b20-efbb307733ba" (UID: 
"d9c03ea7-bfdc-43b7-9b20-efbb307733ba"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:36:40.621348 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:40.621321 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9c03ea7-bfdc-43b7-9b20-efbb307733ba-kube-api-access-42ds2" (OuterVolumeSpecName: "kube-api-access-42ds2") pod "d9c03ea7-bfdc-43b7-9b20-efbb307733ba" (UID: "d9c03ea7-bfdc-43b7-9b20-efbb307733ba"). InnerVolumeSpecName "kube-api-access-42ds2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:36:40.720589 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:40.720539 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-42ds2\" (UniqueName: \"kubernetes.io/projected/d9c03ea7-bfdc-43b7-9b20-efbb307733ba-kube-api-access-42ds2\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:36:40.720589 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:40.720588 2572 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d9c03ea7-bfdc-43b7-9b20-efbb307733ba-must-gather-output\") on node \"ip-10-0-140-65.ec2.internal\" DevicePath \"\"" Apr 16 22:36:41.144671 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:41.144640 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kzl9q_must-gather-sw8p4_d9c03ea7-bfdc-43b7-9b20-efbb307733ba/copy/0.log" Apr 16 22:36:41.144956 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:41.144935 2572 generic.go:358] "Generic (PLEG): container finished" podID="d9c03ea7-bfdc-43b7-9b20-efbb307733ba" containerID="bfb595d88bf092a083c4a3d491f6157dd2dff1e9506e29aaf6ec2e08046f4e9d" exitCode=143 Apr 16 22:36:41.145017 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:41.144982 2572 scope.go:117] "RemoveContainer" containerID="bfb595d88bf092a083c4a3d491f6157dd2dff1e9506e29aaf6ec2e08046f4e9d" Apr 16 22:36:41.145017 
ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:41.144995 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kzl9q/must-gather-sw8p4" Apr 16 22:36:41.147448 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:41.147422 2572 status_manager.go:895] "Failed to get status for pod" podUID="d9c03ea7-bfdc-43b7-9b20-efbb307733ba" pod="openshift-must-gather-kzl9q/must-gather-sw8p4" err="pods \"must-gather-sw8p4\" is forbidden: User \"system:node:ip-10-0-140-65.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kzl9q\": no relationship found between node 'ip-10-0-140-65.ec2.internal' and this object" Apr 16 22:36:41.153531 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:41.153511 2572 scope.go:117] "RemoveContainer" containerID="83cb838d20f1239abe2b5fec1cff37b7367153bc9091c99017edab21f7e201f7" Apr 16 22:36:41.155871 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:41.155850 2572 status_manager.go:895] "Failed to get status for pod" podUID="d9c03ea7-bfdc-43b7-9b20-efbb307733ba" pod="openshift-must-gather-kzl9q/must-gather-sw8p4" err="pods \"must-gather-sw8p4\" is forbidden: User \"system:node:ip-10-0-140-65.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kzl9q\": no relationship found between node 'ip-10-0-140-65.ec2.internal' and this object" Apr 16 22:36:41.165810 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:41.165790 2572 scope.go:117] "RemoveContainer" containerID="bfb595d88bf092a083c4a3d491f6157dd2dff1e9506e29aaf6ec2e08046f4e9d" Apr 16 22:36:41.166094 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:36:41.166069 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfb595d88bf092a083c4a3d491f6157dd2dff1e9506e29aaf6ec2e08046f4e9d\": container with ID starting with bfb595d88bf092a083c4a3d491f6157dd2dff1e9506e29aaf6ec2e08046f4e9d not found: ID 
does not exist" containerID="bfb595d88bf092a083c4a3d491f6157dd2dff1e9506e29aaf6ec2e08046f4e9d" Apr 16 22:36:41.166154 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:41.166105 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfb595d88bf092a083c4a3d491f6157dd2dff1e9506e29aaf6ec2e08046f4e9d"} err="failed to get container status \"bfb595d88bf092a083c4a3d491f6157dd2dff1e9506e29aaf6ec2e08046f4e9d\": rpc error: code = NotFound desc = could not find container \"bfb595d88bf092a083c4a3d491f6157dd2dff1e9506e29aaf6ec2e08046f4e9d\": container with ID starting with bfb595d88bf092a083c4a3d491f6157dd2dff1e9506e29aaf6ec2e08046f4e9d not found: ID does not exist" Apr 16 22:36:41.166154 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:41.166123 2572 scope.go:117] "RemoveContainer" containerID="83cb838d20f1239abe2b5fec1cff37b7367153bc9091c99017edab21f7e201f7" Apr 16 22:36:41.166358 ip-10-0-140-65 kubenswrapper[2572]: E0416 22:36:41.166338 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83cb838d20f1239abe2b5fec1cff37b7367153bc9091c99017edab21f7e201f7\": container with ID starting with 83cb838d20f1239abe2b5fec1cff37b7367153bc9091c99017edab21f7e201f7 not found: ID does not exist" containerID="83cb838d20f1239abe2b5fec1cff37b7367153bc9091c99017edab21f7e201f7" Apr 16 22:36:41.166397 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:41.166364 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83cb838d20f1239abe2b5fec1cff37b7367153bc9091c99017edab21f7e201f7"} err="failed to get container status \"83cb838d20f1239abe2b5fec1cff37b7367153bc9091c99017edab21f7e201f7\": rpc error: code = NotFound desc = could not find container \"83cb838d20f1239abe2b5fec1cff37b7367153bc9091c99017edab21f7e201f7\": container with ID starting with 83cb838d20f1239abe2b5fec1cff37b7367153bc9091c99017edab21f7e201f7 not found: ID does not exist" 
Apr 16 22:36:41.950079 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:41.950000 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-8q4ww_dc59af4b-1aaa-4130-832a-3fd476648af6/kube-state-metrics/0.log" Apr 16 22:36:41.970706 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:41.970672 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-8q4ww_dc59af4b-1aaa-4130-832a-3fd476648af6/kube-rbac-proxy-main/0.log" Apr 16 22:36:41.989000 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:41.988975 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-8q4ww_dc59af4b-1aaa-4130-832a-3fd476648af6/kube-rbac-proxy-self/0.log" Apr 16 22:36:42.044861 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:42.044819 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9c03ea7-bfdc-43b7-9b20-efbb307733ba" path="/var/lib/kubelet/pods/d9c03ea7-bfdc-43b7-9b20-efbb307733ba/volumes" Apr 16 22:36:42.228423 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:42.228338 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tw4zw_bc45ba3d-3968-4f05-a932-07526eacce50/node-exporter/0.log" Apr 16 22:36:42.248779 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:42.248743 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tw4zw_bc45ba3d-3968-4f05-a932-07526eacce50/kube-rbac-proxy/0.log" Apr 16 22:36:42.267072 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:42.267045 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-tw4zw_bc45ba3d-3968-4f05-a932-07526eacce50/init-textfile/0.log" Apr 16 22:36:42.371442 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:42.371408 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_491d0fec-fa89-47cd-b750-dacc85c63253/prometheus/0.log" Apr 16 22:36:42.386353 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:42.386331 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_491d0fec-fa89-47cd-b750-dacc85c63253/config-reloader/0.log" Apr 16 22:36:42.406524 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:42.406491 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_491d0fec-fa89-47cd-b750-dacc85c63253/thanos-sidecar/0.log" Apr 16 22:36:42.429055 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:42.429025 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_491d0fec-fa89-47cd-b750-dacc85c63253/kube-rbac-proxy-web/0.log" Apr 16 22:36:42.456133 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:42.456106 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_491d0fec-fa89-47cd-b750-dacc85c63253/kube-rbac-proxy/0.log" Apr 16 22:36:42.476288 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:42.476260 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_491d0fec-fa89-47cd-b750-dacc85c63253/kube-rbac-proxy-thanos/0.log" Apr 16 22:36:42.499278 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:42.499254 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_491d0fec-fa89-47cd-b750-dacc85c63253/init-config-reloader/0.log" Apr 16 22:36:42.528993 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:42.528965 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-plkbt_46cddf16-7aa1-4bda-910d-0514af9437a9/prometheus-operator/0.log" Apr 16 22:36:42.547795 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:42.547768 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-plkbt_46cddf16-7aa1-4bda-910d-0514af9437a9/kube-rbac-proxy/0.log"
Apr 16 22:36:42.679001 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:42.678970 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c945fbc4f-9dvfz_afd4a484-060b-40a6-bb14-b477fd906bf1/thanos-query/0.log"
Apr 16 22:36:42.699812 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:42.699784 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c945fbc4f-9dvfz_afd4a484-060b-40a6-bb14-b477fd906bf1/kube-rbac-proxy-web/0.log"
Apr 16 22:36:42.724193 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:42.724166 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c945fbc4f-9dvfz_afd4a484-060b-40a6-bb14-b477fd906bf1/kube-rbac-proxy/0.log"
Apr 16 22:36:42.753146 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:42.753123 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c945fbc4f-9dvfz_afd4a484-060b-40a6-bb14-b477fd906bf1/prom-label-proxy/0.log"
Apr 16 22:36:42.780620 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:42.780590 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c945fbc4f-9dvfz_afd4a484-060b-40a6-bb14-b477fd906bf1/kube-rbac-proxy-rules/0.log"
Apr 16 22:36:42.805757 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:42.805726 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5c945fbc4f-9dvfz_afd4a484-060b-40a6-bb14-b477fd906bf1/kube-rbac-proxy-metrics/0.log"
Apr 16 22:36:43.934946 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:43.934859 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-jffnd_d40aaf22-02bf-42ef-8eb8-0ec95d0f5d07/networking-console-plugin/0.log"
Apr 16 22:36:45.112735 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.112706 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-vnpn9_664ba6c7-b752-4fc9-b6c6-50847231a8a0/volume-data-source-validator/0.log"
Apr 16 22:36:45.296926 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.296893 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wls74/perf-node-gather-daemonset-4skb4"]
Apr 16 22:36:45.297215 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.297202 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9c03ea7-bfdc-43b7-9b20-efbb307733ba" containerName="gather"
Apr 16 22:36:45.297215 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.297216 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9c03ea7-bfdc-43b7-9b20-efbb307733ba" containerName="gather"
Apr 16 22:36:45.297304 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.297231 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9c03ea7-bfdc-43b7-9b20-efbb307733ba" containerName="copy"
Apr 16 22:36:45.297304 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.297236 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9c03ea7-bfdc-43b7-9b20-efbb307733ba" containerName="copy"
Apr 16 22:36:45.297304 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.297286 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9c03ea7-bfdc-43b7-9b20-efbb307733ba" containerName="copy"
Apr 16 22:36:45.297304 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.297299 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9c03ea7-bfdc-43b7-9b20-efbb307733ba" containerName="gather"
Apr 16 22:36:45.303067 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.303049 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wls74/perf-node-gather-daemonset-4skb4"
Apr 16 22:36:45.305635 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.305609 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wls74\"/\"kube-root-ca.crt\""
Apr 16 22:36:45.306423 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.306408 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wls74\"/\"openshift-service-ca.crt\""
Apr 16 22:36:45.306491 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.306413 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wls74\"/\"default-dockercfg-zkhg7\""
Apr 16 22:36:45.310621 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.310603 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wls74/perf-node-gather-daemonset-4skb4"]
Apr 16 22:36:45.456876 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.456775 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cb7b071a-e6d1-48e9-8408-47d00eee9563-proc\") pod \"perf-node-gather-daemonset-4skb4\" (UID: \"cb7b071a-e6d1-48e9-8408-47d00eee9563\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-4skb4"
Apr 16 22:36:45.456876 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.456833 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cb7b071a-e6d1-48e9-8408-47d00eee9563-lib-modules\") pod \"perf-node-gather-daemonset-4skb4\" (UID: \"cb7b071a-e6d1-48e9-8408-47d00eee9563\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-4skb4"
Apr 16 22:36:45.457162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.456891 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cb7b071a-e6d1-48e9-8408-47d00eee9563-podres\") pod \"perf-node-gather-daemonset-4skb4\" (UID: \"cb7b071a-e6d1-48e9-8408-47d00eee9563\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-4skb4"
Apr 16 22:36:45.457162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.456964 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9g9q\" (UniqueName: \"kubernetes.io/projected/cb7b071a-e6d1-48e9-8408-47d00eee9563-kube-api-access-b9g9q\") pod \"perf-node-gather-daemonset-4skb4\" (UID: \"cb7b071a-e6d1-48e9-8408-47d00eee9563\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-4skb4"
Apr 16 22:36:45.457162 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.456992 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb7b071a-e6d1-48e9-8408-47d00eee9563-sys\") pod \"perf-node-gather-daemonset-4skb4\" (UID: \"cb7b071a-e6d1-48e9-8408-47d00eee9563\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-4skb4"
Apr 16 22:36:45.557982 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.557941 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cb7b071a-e6d1-48e9-8408-47d00eee9563-proc\") pod \"perf-node-gather-daemonset-4skb4\" (UID: \"cb7b071a-e6d1-48e9-8408-47d00eee9563\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-4skb4"
Apr 16 22:36:45.558167 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.558021 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cb7b071a-e6d1-48e9-8408-47d00eee9563-lib-modules\") pod \"perf-node-gather-daemonset-4skb4\" (UID: \"cb7b071a-e6d1-48e9-8408-47d00eee9563\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-4skb4"
Apr 16 22:36:45.558167 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.558046 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cb7b071a-e6d1-48e9-8408-47d00eee9563-podres\") pod \"perf-node-gather-daemonset-4skb4\" (UID: \"cb7b071a-e6d1-48e9-8408-47d00eee9563\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-4skb4"
Apr 16 22:36:45.558167 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.558065 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cb7b071a-e6d1-48e9-8408-47d00eee9563-proc\") pod \"perf-node-gather-daemonset-4skb4\" (UID: \"cb7b071a-e6d1-48e9-8408-47d00eee9563\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-4skb4"
Apr 16 22:36:45.558167 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.558074 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9g9q\" (UniqueName: \"kubernetes.io/projected/cb7b071a-e6d1-48e9-8408-47d00eee9563-kube-api-access-b9g9q\") pod \"perf-node-gather-daemonset-4skb4\" (UID: \"cb7b071a-e6d1-48e9-8408-47d00eee9563\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-4skb4"
Apr 16 22:36:45.558167 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.558139 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb7b071a-e6d1-48e9-8408-47d00eee9563-sys\") pod \"perf-node-gather-daemonset-4skb4\" (UID: \"cb7b071a-e6d1-48e9-8408-47d00eee9563\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-4skb4"
Apr 16 22:36:45.558421 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.558178 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cb7b071a-e6d1-48e9-8408-47d00eee9563-lib-modules\") pod \"perf-node-gather-daemonset-4skb4\" (UID: \"cb7b071a-e6d1-48e9-8408-47d00eee9563\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-4skb4"
Apr 16 22:36:45.558421 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.558192 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cb7b071a-e6d1-48e9-8408-47d00eee9563-podres\") pod \"perf-node-gather-daemonset-4skb4\" (UID: \"cb7b071a-e6d1-48e9-8408-47d00eee9563\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-4skb4"
Apr 16 22:36:45.558421 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.558254 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb7b071a-e6d1-48e9-8408-47d00eee9563-sys\") pod \"perf-node-gather-daemonset-4skb4\" (UID: \"cb7b071a-e6d1-48e9-8408-47d00eee9563\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-4skb4"
Apr 16 22:36:45.565773 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.565753 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9g9q\" (UniqueName: \"kubernetes.io/projected/cb7b071a-e6d1-48e9-8408-47d00eee9563-kube-api-access-b9g9q\") pod \"perf-node-gather-daemonset-4skb4\" (UID: \"cb7b071a-e6d1-48e9-8408-47d00eee9563\") " pod="openshift-must-gather-wls74/perf-node-gather-daemonset-4skb4"
Apr 16 22:36:45.613886 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.613850 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wls74/perf-node-gather-daemonset-4skb4"
Apr 16 22:36:45.735526 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.735448 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wls74/perf-node-gather-daemonset-4skb4"]
Apr 16 22:36:45.739089 ip-10-0-140-65 kubenswrapper[2572]: W0416 22:36:45.739053 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcb7b071a_e6d1_48e9_8408_47d00eee9563.slice/crio-aa8418b14b2100af33a196dc534cb6c6d7bf3cdbde6adc276f8fd193312d8674 WatchSource:0}: Error finding container aa8418b14b2100af33a196dc534cb6c6d7bf3cdbde6adc276f8fd193312d8674: Status 404 returned error can't find the container with id aa8418b14b2100af33a196dc534cb6c6d7bf3cdbde6adc276f8fd193312d8674
Apr 16 22:36:45.862432 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.862405 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-x4vpp_6f971928-4414-4b05-9352-534e9045942e/dns/0.log"
Apr 16 22:36:45.881544 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.881518 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-x4vpp_6f971928-4414-4b05-9352-534e9045942e/kube-rbac-proxy/0.log"
Apr 16 22:36:45.904423 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:45.904390 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-4h7zb_340bef7a-cec6-49ff-9089-98ba19d935b8/dns-node-resolver/0.log"
Apr 16 22:36:46.160792 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:46.160756 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wls74/perf-node-gather-daemonset-4skb4" event={"ID":"cb7b071a-e6d1-48e9-8408-47d00eee9563","Type":"ContainerStarted","Data":"91aa91e28d1483c50d398e17ce0a9a74eb8d02a982fb0b666d6dc051e2e8c1ad"}
Apr 16 22:36:46.160792 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:46.160794 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wls74/perf-node-gather-daemonset-4skb4" event={"ID":"cb7b071a-e6d1-48e9-8408-47d00eee9563","Type":"ContainerStarted","Data":"aa8418b14b2100af33a196dc534cb6c6d7bf3cdbde6adc276f8fd193312d8674"}
Apr 16 22:36:46.161228 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:46.160902 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wls74/perf-node-gather-daemonset-4skb4"
Apr 16 22:36:46.177258 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:46.177209 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wls74/perf-node-gather-daemonset-4skb4" podStartSLOduration=1.177192842 podStartE2EDuration="1.177192842s" podCreationTimestamp="2026-04-16 22:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:36:46.175379477 +0000 UTC m=+1376.756361682" watchObservedRunningTime="2026-04-16 22:36:46.177192842 +0000 UTC m=+1376.758175102"
Apr 16 22:36:46.384312 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:46.384285 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fcr4l_03fed82b-78d8-4490-8c58-013c3b157475/node-ca/0.log"
Apr 16 22:36:47.075765 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:47.075729 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-86cf8799dd-t7q4v_b1c23d84-45f9-42eb-9087-7a389e8722e3/router/0.log"
Apr 16 22:36:47.404696 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:47.404619 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-g8j9n_f12d17a9-d1af-465c-a23e-81d8f09a5156/serve-healthcheck-canary/0.log"
Apr 16 22:36:47.771164 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:47.771126 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-nf8x4_e9277225-df0b-4b17-9847-c37f05b00c9d/insights-operator/1.log"
Apr 16 22:36:47.771641 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:47.771611 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-nf8x4_e9277225-df0b-4b17-9847-c37f05b00c9d/insights-operator/0.log"
Apr 16 22:36:47.791188 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:47.791152 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j42tt_3992fb2c-0d69-437a-a001-636cdb1e92c2/kube-rbac-proxy/0.log"
Apr 16 22:36:47.809394 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:47.809368 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j42tt_3992fb2c-0d69-437a-a001-636cdb1e92c2/exporter/0.log"
Apr 16 22:36:47.832177 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:47.832150 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j42tt_3992fb2c-0d69-437a-a001-636cdb1e92c2/extractor/0.log"
Apr 16 22:36:50.044172 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:50.044134 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-dcbl6_52f8515b-400b-4d73-b5d7-12f84d118d4e/s3-init/0.log"
Apr 16 22:36:52.173532 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:52.173500 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wls74/perf-node-gather-daemonset-4skb4"
Apr 16 22:36:53.979040 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:53.979004 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-6zgrq_090695a5-dd56-4979-855d-5b1e3e681e54/kube-storage-version-migrator-operator/1.log"
Apr 16 22:36:53.980663 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:53.980631 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-6zgrq_090695a5-dd56-4979-855d-5b1e3e681e54/kube-storage-version-migrator-operator/0.log"
Apr 16 22:36:55.044232 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:55.044200 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6z8bb_52f439d6-f3b5-4680-9824-b2e26d67be20/kube-multus-additional-cni-plugins/0.log"
Apr 16 22:36:55.067400 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:55.067376 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6z8bb_52f439d6-f3b5-4680-9824-b2e26d67be20/egress-router-binary-copy/0.log"
Apr 16 22:36:55.089320 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:55.089285 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6z8bb_52f439d6-f3b5-4680-9824-b2e26d67be20/cni-plugins/0.log"
Apr 16 22:36:55.111936 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:55.111912 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6z8bb_52f439d6-f3b5-4680-9824-b2e26d67be20/bond-cni-plugin/0.log"
Apr 16 22:36:55.133161 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:55.133139 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6z8bb_52f439d6-f3b5-4680-9824-b2e26d67be20/routeoverride-cni/0.log"
Apr 16 22:36:55.153645 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:55.153623 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6z8bb_52f439d6-f3b5-4680-9824-b2e26d67be20/whereabouts-cni-bincopy/0.log"
Apr 16 22:36:55.173288 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:55.173231 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6z8bb_52f439d6-f3b5-4680-9824-b2e26d67be20/whereabouts-cni/0.log"
Apr 16 22:36:55.538060 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:55.538031 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-rztk7_428e9197-b90c-43be-a86a-81a7795c4f68/kube-multus/0.log"
Apr 16 22:36:55.608502 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:55.608424 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cd2s2_4ffa9102-be6a-431e-b1c8-ee3b5b01e588/network-metrics-daemon/0.log"
Apr 16 22:36:55.625297 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:55.625267 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cd2s2_4ffa9102-be6a-431e-b1c8-ee3b5b01e588/kube-rbac-proxy/0.log"
Apr 16 22:36:56.914221 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:56.914189 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qntcx_829af633-ea5e-4051-916a-45ddab265148/ovn-controller/0.log"
Apr 16 22:36:56.950723 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:56.950694 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qntcx_829af633-ea5e-4051-916a-45ddab265148/ovn-acl-logging/0.log"
Apr 16 22:36:56.973101 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:56.973070 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qntcx_829af633-ea5e-4051-916a-45ddab265148/kube-rbac-proxy-node/0.log"
Apr 16 22:36:56.991973 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:56.991948 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qntcx_829af633-ea5e-4051-916a-45ddab265148/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 22:36:57.007294 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:57.007268 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qntcx_829af633-ea5e-4051-916a-45ddab265148/northd/0.log"
Apr 16 22:36:57.023906 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:57.023883 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qntcx_829af633-ea5e-4051-916a-45ddab265148/nbdb/0.log"
Apr 16 22:36:57.041922 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:57.041898 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qntcx_829af633-ea5e-4051-916a-45ddab265148/sbdb/0.log"
Apr 16 22:36:57.192784 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:57.192700 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qntcx_829af633-ea5e-4051-916a-45ddab265148/ovnkube-controller/0.log"
Apr 16 22:36:58.150076 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:58.150042 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-jmbfd_782f4e27-ac2a-4597-87e3-319c3917a087/network-check-target-container/0.log"
Apr 16 22:36:59.049127 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:59.049098 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-4mbj2_8eecc640-8a16-4e14-9e02-e8cf75b619f9/iptables-alerter/0.log"
Apr 16 22:36:59.638129 ip-10-0-140-65 kubenswrapper[2572]: I0416 22:36:59.638097 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-2z49w_f190b2ca-0861-4d63-b3a6-20531ce22e01/tuned/0.log"